Sample records for complex sample analysis

  1. Automated Protist Analysis of Complex Samples: Recent Investigations Using Motion and Thresholding

    DTIC Science & Technology

    2012-01-01

    Report No: CG-D-15-13. Distribution Statement A: Approved for public release; distribution is unlimited. January 2012. CG-926 R&DC, Chelsea Street, New London, CT 06320. B. Nelson, et al.

  2. New approaches to the analysis of complex samples using fluorescence lifetime techniques and organized media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertz, P.R.

    Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.

  3. Navigating complex sample analysis using national survey data.

    PubMed

    Saylor, Jennifer; Friedmann, Erika; Lee, Hyeon Joo

    2012-01-01

    The National Center for Health Statistics conducts the National Health and Nutrition Examination Survey and other national surveys with probability-based complex sample designs. Goals of national surveys are to provide valid data for the population of the United States. Analyses of data from population surveys present unique challenges in the research process but are valuable avenues to study the health of the United States population. The aim of this study was to demonstrate the importance of using complex data analysis techniques for data obtained with a complex multistage sampling design and to provide an example of analysis using the SPSS Complex Samples procedure. Challenges and solutions specific to secondary data analysis of national databases are illustrated using the National Health and Nutrition Examination Survey as the exemplar. Oversampling of small or sensitive groups provides necessary estimates of variability within small groups. Use of weights without complex samples accurately estimates population means and frequency from the sample after accounting for over- or undersampling of specific groups. Weighting alone leads to inappropriate population estimates of variability, because they are computed as if the measures were from the entire population rather than a sample in the data set. The SPSS Complex Samples procedure allows inclusion of all sampling design elements: stratification, clusters, and weights. Use of national data sets allows use of extensive, expensive, and well-documented survey data for exploratory questions but limits analysis to those variables included in the data set. The large sample permits examination of multiple predictors and interactive relationships. Merging data files, availability of data in several waves of surveys, and complex sampling are techniques used to provide a representative sample but present unique challenges. With sophisticated data analysis techniques, use of these data is optimized.
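
    The variance issue described above can be made concrete with a small numerical sketch. The code below is a minimal illustration, not the SPSS Complex Samples procedure; the data frame, strata, clusters, and weights are invented. It computes a design-weighted mean, a naive standard error that treats weighted records as a simple random sample, and a Taylor-linearized standard error that uses strata and primary sampling units, which is the kind of design-based estimate complex-samples software produces.

```python
# Minimal sketch, not the SPSS Complex Samples procedure: the survey data,
# strata, clusters (PSUs), and weights below are all invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "stratum": np.repeat([1, 2], 200),
    "psu":     np.repeat(np.arange(40), 10),   # clusters / primary sampling units
    "weight":  rng.uniform(0.5, 3.0, 400),     # sampling weights
    "y":       rng.normal(100, 15, 400),       # analysis variable
})

total_w = df["weight"].sum()
mean_w = (df["weight"] * df["y"]).sum() / total_w      # design-weighted mean

# Naive SE: treats weighted records as a simple random sample, ignoring
# stratification and clustering, so it tends to understate uncertainty.
resid = df["y"] - mean_w
naive_se = np.sqrt((df["weight"] * resid ** 2).sum() / total_w / len(df))

# Taylor-linearized SE: variability is measured between PSU totals within
# each stratum, which is what design-based survey estimators report.
psu_tot = (df.assign(t=df["weight"] * df["y"])
             .groupby(["stratum", "psu"])[["t", "weight"]].sum())
z = psu_tot["t"] - mean_w * psu_tot["weight"]
var = 0.0
for _, zh in z.groupby(level="stratum"):
    nh = len(zh)
    var += nh / (nh - 1) * ((zh - zh.mean()) ** 2).sum()
design_se = np.sqrt(var) / total_w

print(f"mean={mean_w:.2f}  naive SE={naive_se:.3f}  design-based SE={design_se:.3f}")
```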

  4. David Brandner | NREL

    Science.gov Websites

    Areas of Expertise: analytical analysis of complex and bio-derived samples; lignin; chemical reaction engineering and transport phenomena.

  5. Generalized sample entropy analysis for traffic signals based on similarity measure

    NASA Astrophysics Data System (ADS)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on a similarity distance, presents a different way of matching signal patterns and reveals distinct behaviors of complexity. Simulations are conducted on synthetic data and traffic signals to provide a comparative study and show the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first is that it overcomes the limitation on the relationship between the dimension parameter and the length of the series. The second is that the modified sample entropy functions can be used to quantitatively distinguish time series from different complex systems by the similarity measure.
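
    For reference, the conventional (unmodified) sample entropy that this work generalizes can be sketched in a few lines. The code below is a standard SampEn baseline using a Chebyshev-distance template match with tolerance r; it is not the similarity-measure variant proposed in the paper, and the test signals are arbitrary.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Standard SampEn: -ln(A/B), where B counts template matches of length m
    and A counts matches of length m+1 under a Chebyshev tolerance r.
    (The paper's generalized variant swaps in a different similarity measure;
    this is only the conventional baseline.)"""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    N = len(x)

    def count_matches(mm):
        # all overlapping templates of length mm
        templates = np.array([x[i:i + mm] for i in range(N - mm)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return np.inf if A == 0 or B == 0 else -np.log(A / B)

# Example: white noise is less regular, so it has a higher SampEn than a sine.
rng = np.random.default_rng(1)
print(sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 1000))))
print(sample_entropy(rng.normal(size=1000)))
```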

  6. Revealing hidden clonal complexity in Mycobacterium tuberculosis infection by qualitative and quantitative improvement of sampling.

    PubMed

    Pérez-Lago, L; Palacios, J J; Herranz, M; Ruiz Serrano, M J; Bouza, E; García-de-Viedma, D

    2015-02-01

    The analysis of microevolution events, its functional relevance and impact on molecular epidemiology strategies, constitutes one of the most challenging aspects of the study of clonal complexity in infection by Mycobacterium tuberculosis. In this study, we retrospectively evaluated whether two improved sampling schemes could provide access to the clonal complexity that is undetected by the current standards (analysis of one isolate from one sputum). We evaluated in 48 patients the analysis by mycobacterial interspersed repetitive unit-variable number tandem repeat of M. tuberculosis isolates cultured from bronchial aspirate (BAS) or bronchoalveolar lavage (BAL) and, in another 16 cases, the analysis of a higher number of isolates from independent sputum samples. Analysis of the isolates from BAS/BAL specimens revealed clonal complexity in a very high proportion of cases (5/48); in most of these cases, complexity was not detected when the isolates from sputum samples were analysed. Systematic analysis of isolates from multiple sputum samples also improved the detection of clonal complexity. We found coexisting clonal variants in two of 16 cases that would have gone undetected in the analysis of the isolate from a single sputum specimen. Our results suggest that analysis of isolates from BAS/BAL specimens is highly efficient for recording the true clonal composition of M. tuberculosis in the lungs. When these samples are not available, we recommend increasing the number of isolates from independent sputum specimens, because they might not harbour the same pool of bacteria. Our data suggest that the degree of clonal complexity in tuberculosis has been underestimated because of the deficiencies inherent in a simplified procedure. Copyright © 2014 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  7. Letter Report: Stable Hydrogen and Oxygen Isotope Analysis of B-Complex Groundwater Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Brady D.; Moran, James J.; Nims, Megan K.

    Report summarizing stable oxygen and hydrogen isotope analysis of two groundwater samples from the B-Complex. Results from analyses were compared to perched water and pore water analyses performed previously.

  8. Complex sample survey estimation in static state-space

    Treesearch

    Raymond L. Czaplewski

    2010-01-01

    Increased use of remotely sensed data is a key strategy adopted by the Forest Inventory and Analysis Program. However, multiple sensor technologies require complex sampling units and sampling designs. The Recursive Restriction Estimator (RRE) accommodates this complexity. It is a design-consistent Empirical Best Linear Unbiased Prediction for the state-vector, which...

  9. Analysis of complex samples using a portable multi-wavelength light emitting diode (LED) fluorescence spectrometer

    USDA-ARS?s Scientific Manuscript database

    Spectroscopic analysis of chemically complex samples often requires an increase in the dimensionality of the measured response surface. This often involves the measurement of emitted light intensities as functions of both wavelengths of excitation and emission resulting in the generation of an excita...

  10. Pseudotargeted MS Method for the Sensitive Analysis of Protein Phosphorylation in Protein Complexes.

    PubMed

    Lyu, Jiawen; Wang, Yan; Mao, Jiawei; Yao, Yating; Wang, Shujuan; Zheng, Yong; Ye, Mingliang

    2018-05-15

    In this study, we presented an enrichment-free approach for the sensitive analysis of protein phosphorylation in minute amounts of samples, such as purified protein complexes. This method takes advantage of the high sensitivity of parallel reaction monitoring (PRM). Specifically, low-confidence phosphopeptides identified from the data-dependent acquisition (DDA) data set were used to build a pseudotargeted list for PRM analysis to allow the identification of additional phosphopeptides with high confidence. The development of this targeted approach is very easy as the same sample and the same LC system were used for the discovery and the targeted analysis phases. No sample fractionation or enrichment was required for the discovery phase, which allowed this method to analyze minute amounts of sample. We applied this pseudotargeted MS method to quantitatively examine phosphopeptides in affinity-purified endogenous Shc1 protein complexes at four temporal stages of EGF signaling and identified 82 phospho-sites. To our knowledge, this is the highest number of phospho-sites identified from the protein complexes. This pseudotargeted MS method is highly sensitive in the identification of low-abundance phosphopeptides and could be a powerful tool to study phosphorylation-regulated assembly of protein complexes.

  11. Analysis of macromolecules, ligands and macromolecule-ligand complexes

    DOEpatents

    Von Dreele, Robert B [Los Alamos, NM

    2008-12-23

    A method for determining atomic level structures of macromolecule-ligand complexes through high-resolution powder diffraction analysis and a method for providing suitable microcrystalline powder for diffraction analysis are provided. In one embodiment, powder diffraction data is collected from samples of polycrystalline macromolecule and macromolecule-ligand complex and the refined structure of the macromolecule is used as an approximate model for a combined Rietveld and stereochemical restraint refinement of the macromolecule-ligand complex. A difference Fourier map is calculated and the ligand position and points of interaction between the atoms of the macromolecule and the atoms of the ligand can be deduced and visualized. A suitable polycrystalline sample of macromolecule-ligand complex can be produced by physically agitating a mixture of lyophilized macromolecule, ligand and a solvent.

  12. Nicholas Cleveland | NREL

    Science.gov Websites

    Areas of Expertise: analytical tools for complex sample analysis; catalyst synthesis, characterization, and testing; reaction screening; analysis of complex organics. Affiliated Research Programs: Biochemical Catalysis Working Group.

  13. Two-dimensional fingerprinting approach for comparison of complex substances analysed by HPLC-UV and fluorescence detection.

    PubMed

    Ni, Yongnian; Liu, Ying; Kokot, Serge

    2011-02-07

    This work is concerned with the research and development of methodology for analysis of complex mixtures such as pharmaceutical or food samples, which contain many analytes. Variously treated samples (swill washed, fried and scorched) of the Rhizoma atractylodis macrocephalae (RAM) traditional Chinese medicine (TCM) as well as the common substitute, Rhizoma atractylodis (RA) TCM were chosen as examples for analysis. A combined data matrix of chromatographic 2-D HPLC-DAD-FLD (two-dimensional high performance liquid chromatography with diode array and fluorescence detectors) fingerprint profiles was constructed with the use of the HPLC-DAD and HPLC-FLD individual data matrices; the purpose was to collect maximum information and to interpret this complex data with the use of various chemometrics methods e.g. the rank-ordering multi-criteria decision making (MCDM) PROMETHEE and GAIA, K-nearest neighbours (KNN), partial least squares (PLS), back propagation-artificial neural networks (BP-ANN) methods. The chemometrics analysis demonstrated that the combined 2-D HPLC-DAD-FLD data matrix does indeed provide more information and facilitates better performing classification/prediction models for the analysis of such complex samples as the RAM and RA ones noted above. It is suggested that this fingerprint approach is suitable for analysis of other complex, multi-analyte substances.
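
    The general idea of fusing the two detector matrices and classifying the combined fingerprint can be illustrated with a short sketch. The code below (scikit-learn assumed; the matrices, dimensions, and class labels are invented) concatenates a DAD and an FLD fingerprint column-wise and cross-validates a PCA plus k-nearest-neighbours model; it does not reproduce the PROMETHEE/GAIA, PLS, or BP-ANN analyses reported in the paper.

```python
# Illustrative sketch only (scikit-learn assumed): fingerprint matrices,
# dimensions, and class labels are invented; the paper's PROMETHEE/GAIA,
# PLS and BP-ANN analyses are not reproduced here.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples = 60
X_dad = rng.normal(size=(n_samples, 200))   # HPLC-DAD fingerprints (hypothetical)
X_fld = rng.normal(size=(n_samples, 150))   # HPLC-FLD fingerprints (hypothetical)
y = rng.integers(0, 3, n_samples)           # sample classes (e.g., differently treated herbs)

# Column-wise fusion of the two detector matrices gives the combined
# 2-D fingerprint; PCA compresses it before k-NN classification.
X_2d = np.hstack([X_dad, X_fld])
model = make_pipeline(StandardScaler(), PCA(n_components=10),
                      KNeighborsClassifier(n_neighbors=3))

# With random data this only returns chance-level accuracy; with real
# fingerprints the score indicates how well the fused matrix separates classes.
print(cross_val_score(model, X_2d, y, cv=5).mean())
```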

  14. Microarray R-based analysis of complex lysate experiments with MIRACLE

    PubMed Central

    List, Markus; Block, Ines; Pedersen, Marlene Lemvig; Christiansen, Helle; Schmidt, Steffen; Thomassen, Mads; Tan, Qihua; Baumbach, Jan; Mollenhauer, Jan

    2014-01-01

    Motivation: Reverse-phase protein arrays (RPPAs) allow sensitive quantification of relative protein abundance in thousands of samples in parallel. Typical challenges involved in this technology are antibody selection, sample preparation and optimization of staining conditions. The issue of combining effective sample management and data analysis, however, has been widely neglected. Results: This motivated us to develop MIRACLE, a comprehensive and user-friendly web application bridging the gap between spotting and array analysis by conveniently keeping track of sample information. Data processing includes correction of staining bias, estimation of protein concentration from response curves, normalization for total protein amount per sample and statistical evaluation. Established analysis methods have been integrated with MIRACLE, offering experimental scientists an end-to-end solution for sample management and for carrying out data analysis. In addition, experienced users have the possibility to export data to R for more complex analyses. MIRACLE thus has the potential to further spread utilization of RPPAs as an emerging technology for high-throughput protein analysis. Availability: Project URL: http://www.nanocan.org/miracle/ Contact: mlist@health.sdu.dk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25161257
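
    One of the processing steps listed above, normalization for total protein amount per sample, is simple to illustrate. The sketch below is a hypothetical pandas illustration only (it is not MIRACLE, and the column names and values are invented): each antibody signal is divided by the total-protein stain for its sample so that differently loaded samples become comparable.

```python
# Hypothetical illustration only (not MIRACLE itself): normalize RPPA
# antibody signals to the total protein measured per sample.
import pandas as pd

spots = pd.DataFrame({
    "sample":        ["S1", "S2", "S3"],
    "antibody_sig":  [1200.0, 950.0, 400.0],   # target-antibody staining signal
    "total_protein": [300.0, 150.0, 100.0],    # total-protein stain per sample
})

# Signal per unit of loaded protein makes samples with different total
# protein amounts comparable.
spots["normalized"] = spots["antibody_sig"] / spots["total_protein"]
print(spots)
```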

  15. Microarray R-based analysis of complex lysate experiments with MIRACLE.

    PubMed

    List, Markus; Block, Ines; Pedersen, Marlene Lemvig; Christiansen, Helle; Schmidt, Steffen; Thomassen, Mads; Tan, Qihua; Baumbach, Jan; Mollenhauer, Jan

    2014-09-01

    Reverse-phase protein arrays (RPPAs) allow sensitive quantification of relative protein abundance in thousands of samples in parallel. Typical challenges involved in this technology are antibody selection, sample preparation and optimization of staining conditions. The issue of combining effective sample management and data analysis, however, has been widely neglected. This motivated us to develop MIRACLE, a comprehensive and user-friendly web application bridging the gap between spotting and array analysis by conveniently keeping track of sample information. Data processing includes correction of staining bias, estimation of protein concentration from response curves, normalization for total protein amount per sample and statistical evaluation. Established analysis methods have been integrated with MIRACLE, offering experimental scientists an end-to-end solution for sample management and for carrying out data analysis. In addition, experienced users have the possibility to export data to R for more complex analyses. MIRACLE thus has the potential to further spread utilization of RPPAs as an emerging technology for high-throughput protein analysis. Project URL: http://www.nanocan.org/miracle/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  16. Analysis of Supercritical-Extracted Chelated Metal Ions From Mixed Organic-Inorganic Samples

    NASA Technical Reports Server (NTRS)

    Sinha, Mahadeva P. (Inventor)

    1996-01-01

    Organic and inorganic contaminants of an environmental sample are analyzed by the same GC-MS instrument by adding an oxidizing agent to the sample to oxidize metal or metal compounds to form metal ions. The metal ions are converted to chelate complexes and the chelate complexes are extracted into a supercritical fluid such as CO2. The metal chelate extract after flowing through a restrictor tube is directly injected into the ionization chamber of a mass spectrometer, preferably containing a refractory metal filament such as rhenium to fragment the complex to release metal ions which are detected. This provides a fast, economical method for the analysis of metal contaminants in a sample and can be automated. An organic extract of the sample in conventional or supercritical fluid solvents can be detected in the same mass spectrometer, preferably after separation in a supercritical fluid chromatograph.

  17. Overcoming Matrix Effects in a Complex Sample: Analysis of Multiple Elements in Multivitamins by Atomic Absorption Spectroscopy

    ERIC Educational Resources Information Center

    Arnold, Randy J.; Arndt, Brett; Blaser, Emilia; Blosser, Chris; Caulton, Dana; Chung, Won Sog; Fiorenza, Garrett; Heath, Wyatt; Jacobs, Alex; Kahng, Eunice; Koh, Eun; Le, Thao; Mandla, Kyle; McCory, Chelsey; Newman, Laura; Pithadia, Amit; Reckelhoff, Anna; Rheinhardt, Joseph; Skljarevski, Sonja; Stuart, Jordyn; Taylor, Cassie; Thomas, Scott; Tse, Kyle; Wall, Rachel; Warkentien, Chad

    2011-01-01

    A multivitamin tablet and liquid are analyzed for the elements calcium, magnesium, iron, zinc, copper, and manganese using atomic absorption spectrometry. Linear calibration and standard addition are used for all elements except calcium, allowing for an estimate of the matrix effects encountered for this complex sample. Sample preparation using…

  18. Tandem Extraction/Liquid Chromatography-Mass Spectrometry Protocol for the Analysis of Acrylamide and Surfactant-related Compounds in Complex Aqueous Environmental Samples

    EPA Science Inventory

    This work describes the development of a liquid chromatography-mass spectrometry (LC-MS)-based strategy for the detection and quantitation of acrylamide and surfactant-related compounds in complex aqueous environmental samples.

  19. The use of laser-induced fluorescence or ultraviolet detectors for sensitive and selective analysis of tobramycin or erythropoietin in complex samples

    NASA Astrophysics Data System (ADS)

    Ahmed, Hytham M.; Ebeid, Wael B.

    2015-05-01

    Complex sample analysis is a challenge in pharmaceutical and biopharmaceutical analysis. In this work, tobramycin (TOB) analysis in human urine samples and recombinant human erythropoietin (rhEPO) analysis in the presence of a similar protein were selected as representative examples of such sample analysis. Assays of TOB in urine samples are difficult because of poor detectability. Therefore a laser-induced fluorescence detector (LIF) was combined with a separation technique, micellar electrokinetic chromatography (MEKC), to determine TOB through derivatization with fluorescein isothiocyanate (FITC). Borate was used as background electrolyte (BGE) with negatively charged mixed micelles as additive. The method was successfully applied to urine samples. The LOD and LOQ for tobramycin in urine were 90 and 200 ng/ml respectively and recovery was >98% (n = 5). All urine samples were analyzed by direct injection without sample pre-treatment. Another hyphenated analytical technique, capillary zone electrophoresis (CZE) connected to an ultraviolet (UV) detector, was also used for sensitive analysis of rhEPO at low levels (2000 IU) in the presence of a large amount of human serum albumin (HSA). Analysis of rhEPO was achieved by the use of electrokinetic injection (EI) with discontinuous buffers. Phosphate buffer was used as BGE with metal ions as additive. The proposed method can be used for the estimation of a large number of quality control rhEPO samples in a short period.

  20. Monosodium Glutamate Analysis in Meatballs Soup

    NASA Astrophysics Data System (ADS)

    Marlina, D.; Amran, A.; Ulianas, A.

    2018-04-01

    The analysis of monosodium glutamate (MSG) in meatball soup using the Cu2+ ion to form an MSG complex, with UV-Vis spectrophotometric detection, was carried out. The reaction of MSG with Cu2+ ions forms the complex compound [Cu(C5H8NO4)2]2+, characterized by a color change of the Cu2+ ion solution from light blue to dark blue. The absorbance maximum of the [Cu(C5H8NO4)2]2+ complex is at a wavelength of 621 nm. The results showed that the optimal conditions for the [Cu(C5H8NO4)2]2+ complex were pH 10 and a Cu2+ concentration of 0.01 M, with a complexation time of 30 minutes and stability for 170 minutes. The linear response and detection limit of the MSG analysis with Cu2+ ions are 0.0005-0.025 M (R2 = 0.994) and 0.0003 M (LOD), respectively. The repeatability and recovery of the method are quite good (RSD = 0.89% and recovery = 93%). The MSG content in meatball soup determined with the complexation method was 0.00372 M in sample A and 0.00370 M in sample B.
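
    Figures of merit like the linear range, R2, and detection limit quoted above are normally obtained from a least-squares calibration curve. The sketch below is a generic illustration with invented absorbance values (it is not the authors' data): it fits a straight line, computes R2, estimates the LOD as 3.3 times the residual standard deviation divided by the slope (one common convention), and back-calculates a sample concentration.

```python
# Generic calibration sketch with invented absorbance data; LOD is taken as
# 3.3 * (residual standard deviation) / slope, one common convention.
import numpy as np

conc = np.array([0.0005, 0.002, 0.005, 0.010, 0.015, 0.020, 0.025])  # M
absb = np.array([0.012, 0.048, 0.118, 0.231, 0.342, 0.455, 0.568])   # a.u. (hypothetical)

slope, intercept = np.polyfit(conc, absb, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((absb - pred) ** 2) / np.sum((absb - absb.mean()) ** 2)
s_res = np.sqrt(np.sum((absb - pred) ** 2) / (len(conc) - 2))
lod = 3.3 * s_res / slope

# A measured sample absorbance is converted back to concentration via the fit.
sample_abs = 0.085
c_sample = (sample_abs - intercept) / slope
print(f"R2={r2:.4f}  LOD={lod:.5f} M  c_sample={c_sample:.5f} M")
```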

  1. Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskin, Julia; Lanekoff, Ingela

    2015-11-13

    Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of molecules in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species. Some of the key advantages of liquid extraction techniques include the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to a problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample or for quantification or online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.

  2. Comparison of sample preparation techniques and data analysis for the LC-MS/MS-based identification of proteins in human follicular fluid.

    PubMed

    Lehmann, Roland; Schmidt, André; Pastuschek, Jana; Müller, Mario M; Fritzsche, Andreas; Dieterle, Stefan; Greb, Robert R; Markert, Udo R; Slevogt, Hortense

    2018-06-25

    The proteomic analysis of complex body fluids by liquid chromatography tandem mass spectrometry (LC-MS/MS) analysis requires the selection of suitable sample preparation techniques and optimal parameter settings in data analysis software packages to obtain reliable results. Proteomic analysis of follicular fluid, as a representative of a complex body fluid similar to serum or plasma, is difficult as it contains a vast amount of high abundant proteins and a variety of proteins with different concentrations. However, the accessibility of this complex body fluid for LC-MS/MS analysis is an opportunity to gain insights into the status, the composition of fertility-relevant proteins including immunological factors or for the discovery of new diagnostic and prognostic markers for, for example, the treatment of infertility. In this study, we compared different sample preparation methods (FASP, eFASP and in-solution digestion) and three different data analysis software packages (Proteome Discoverer with SEQUEST, Mascot and MaxQuant with Andromeda) combined with semi- and full-tryptic databank search options to obtain a maximum coverage of the follicular fluid proteome. We found that the most comprehensive proteome coverage is achieved by the eFASP sample preparation method using SDS in the initial denaturing step and the SEQUEST-based semi-tryptic data analysis. In conclusion, we have developed a fractionation-free methodical workflow for in depth LC-MS/MS-based analysis for the standardized investigation of human follicle fluid as an important representative of a complex body fluid. Taken together, we were able to identify a total of 1392 proteins in follicular fluid. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. Analysis of Proteins, Protein Complexes, and Organellar Proteomes Using Sheathless Capillary Zone Electrophoresis - Native Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Belov, Arseniy M.; Viner, Rosa; Santos, Marcia R.; Horn, David M.; Bern, Marshall; Karger, Barry L.; Ivanov, Alexander R.

    2017-12-01

    Native mass spectrometry (MS) is a rapidly advancing field in the analysis of proteins, protein complexes, and macromolecular species of various types. The majority of native MS experiments reported to date has been conducted using direct infusion of purified analytes into a mass spectrometer. In this study, capillary zone electrophoresis (CZE) was coupled online to Orbitrap mass spectrometers using a commercial sheathless interface to enable high-performance separation, identification, and structural characterization of limited amounts of purified proteins and protein complexes, the latter with preserved non-covalent associations under native conditions. The performance of both bare-fused silica and polyacrylamide-coated capillaries was assessed using mixtures of protein standards known to form non-covalent protein-protein and protein-ligand complexes. High-efficiency separation of native complexes is demonstrated using both capillary types, while the polyacrylamide neutral-coated capillary showed better reproducibility and higher efficiency for more complex samples. The platform was then evaluated for the determination of monoclonal antibody aggregation and for analysis of proteomes of limited complexity using a ribosomal isolate from E. coli. Native CZE-MS, using accurate single stage and tandem-MS measurements, enabled identification of proteoforms and non-covalent complexes at femtomole levels. This study demonstrates that native CZE-MS can serve as an orthogonal and complementary technique to conventional native MS methodologies with the advantages of low sample consumption, minimal sample processing and losses, and high throughput and sensitivity. This study presents a novel platform for analysis of ribosomes and other macromolecular complexes and organelles, with the potential for discovery of novel structural features defining cellular phenotypes (e.g., specialized ribosomes).

  4. Multivariate analysis: greater insights into complex systems

    USDA-ARS?s Scientific Manuscript database

    Many agronomic researchers measure and collect multiple response variables in an effort to understand the more complex nature of the system being studied. Multivariate (MV) statistical methods encompass the simultaneous analysis of all random variables (RV) measured on each experimental or sampling ...

  5. What can one sample tell us? Stable isotopes can assess complex processes in national assessments of lakes, rivers and streams.

    EPA Science Inventory

    Stable isotopes can be very useful in large-scale monitoring programs because samples for isotopic analysis are easy to collect, and isotopes integrate information about complex processes such as evaporation from water isotopes and denitrification from nitrogen isotopes. Traditi...

  6. Solid phase excitation-emission fluorescence method for the classification of complex substances: Cortex Phellodendri and other traditional Chinese medicines as examples.

    PubMed

    Gu, Yao; Ni, Yongnian; Kokot, Serge

    2012-09-13

    A novel, simple and direct fluorescence method for analysis of complex substances and their potential substitutes has been researched and developed. Measurements involved excitation and emission (EEM) fluorescence spectra of powdered, complex, medicinal herbs, Cortex Phellodendri Chinensis (CPC) and the similar Cortex Phellodendri Amurensis (CPA); these substances were compared and discriminated from each other and the potentially adulterated samples (Caulis mahoniae (CM) and David poplar bark (DPB)). Different chemometrics methods were applied for resolution of the complex spectra, and the excitation spectra were found to be the most informative; only the rank-ordering PROMETHEE method was able to classify the samples with single ingredients (CPA, CPC, CM) or those with binary mixtures (CPA/CPC, CPA/CM, CPC/CM). Interestingly, it was essential to use the geometrical analysis for interactive aid (GAIA) display for a full understanding of the classification results. However, these two methods, like the other chemometrics models, were unable to classify composite spectral matrices consisting of data from samples of single ingredients and binary mixtures; this suggested that the excitation spectra of the different samples were very similar. However, the method is useful for classification of single-ingredient samples and, separately, their binary mixtures; it may also be applied for similar classification work with other complex substances.

  7. A matrix-assisted laser desorption/ionization mass spectroscopy method for the analysis of small molecules by integrating chemical labeling with the supramolecular chemistry of cucurbituril.

    PubMed

    Ding, Jun; Xiao, Hua-Ming; Liu, Simin; Wang, Chang; Liu, Xin; Feng, Yu-Qi

    2018-10-05

    Although several methods have realized the analysis of low molecular weight (LMW) compounds using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) by overcoming the problem of interference with MS signals in the low mass region derived from conventional organic matrices, this emerging field still requires strategies to address the issue of analyzing complex samples containing LMW components in addition to the LMW compounds of interest, and solve the problem of lack of universality. The present study proposes an integrated strategy that combines chemical labeling with the supramolecular chemistry of cucurbit[n]uril (CB[n]) for the MALDI MS analysis of LMW compounds in complex samples. In this strategy, the target LMW compounds are first labeled by introducing a series of bifunctional reagents that selectively react with the target analytes and also form stable inclusion complexes with CB[n]. Then, the labeled products act as guest molecules that readily and selectively form stable inclusion complexes with CB[n]. This strategy relocates the MS signals of the LMW compounds of interest from the low mass region suffering high interference to the high mass region where interference with low mass components is absent. Experimental results demonstrate that a wide range of LMW compounds, including carboxylic acids, aldehydes, amines, thiol, and cis-diols, can be successfully detected using the proposed strategy, and the limits of detection were in the range of 0.01-1.76 nmol/mL. In addition, the high selectivity of the labeling reagents for the target analytes in conjunction with the high selectivity of the binding between the labeled products and CB[n] ensures an absence of signal interference with the non-targeted LMW components of complex samples. Finally, the feasibility of the proposed strategy for complex sample analysis is demonstrated by the accurate and rapid quantitative analysis of aldehydes in saliva and herbal medicines. As such, this work not only provides an alternative method for the detection of various LMW compounds using MALDI MS, but also can be applied to the selective and high-throughput analysis of LMW analytes in complex samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Highly Reproducible Label Free Quantitative Proteomic Analysis of RNA Polymerase Complexes*

    PubMed Central

    Mosley, Amber L.; Sardiu, Mihaela E.; Pattenden, Samantha G.; Workman, Jerry L.; Florens, Laurence; Washburn, Michael P.

    2011-01-01

    The use of quantitative proteomics methods to study protein complexes has the potential to provide in-depth information on the abundance of different protein components as well as their modification state in various cellular conditions. To interrogate protein complex quantitation using shotgun proteomic methods, we have focused on the analysis of protein complexes using label-free multidimensional protein identification technology and studied the reproducibility of biological replicates. For these studies, we focused on three highly related and essential multi-protein enzymes, RNA polymerase I, II, and III from Saccharomyces cerevisiae. We found that label-free quantitation using spectral counting is highly reproducible at the protein and peptide level when analyzing RNA polymerase I, II, and III. In addition, we show that peptide sampling does not follow a random sampling model, and we show the need for advanced computational models to predict peptide detection probabilities. In order to address these issues, we used the APEX protocol to model the expected peptide detectability based on whole cell lysate acquired using the same multidimensional protein identification technology analysis used for the protein complexes. Neither method was able to predict the peptide sampling levels that we observed using replicate multidimensional protein identification technology analyses. In addition to the analysis of the RNA polymerase complexes, our analysis provides quantitative information about several RNAP associated proteins including the RNAPII elongation factor complexes DSIF and TFIIF. Our data shows that DSIF and TFIIF are the most highly enriched RNAP accessory factors in Rpb3-TAP purifications and demonstrate our ability to measure low level associated protein abundance across biological replicates. In addition, our quantitative data supports a model in which DSIF and TFIIF interact with RNAPII in a dynamic fashion in agreement with previously published reports. PMID:21048197

  9. The use of laser-induced fluorescence or ultraviolet detectors for sensitive and selective analysis of tobramycin or erythropoietin in complex samples.

    PubMed

    Ahmed, Hytham M; Ebeid, Wael B

    2015-05-15

    Complex sample analysis is a challenge in pharmaceutical and biopharmaceutical analysis. In this work, tobramycin (TOB) analysis in human urine samples and recombinant human erythropoietin (rhEPO) analysis in the presence of a similar protein were selected as representative examples of such sample analysis. Assays of TOB in urine samples are difficult because of poor detectability. Therefore a laser-induced fluorescence detector (LIF) was combined with a separation technique, micellar electrokinetic chromatography (MEKC), to determine TOB through derivatization with fluorescein isothiocyanate (FITC). Borate was used as background electrolyte (BGE) with negatively charged mixed micelles as additive. The method was successfully applied to urine samples. The LOD and LOQ for tobramycin in urine were 90 and 200 ng/ml respectively and recovery was >98% (n=5). All urine samples were analyzed by direct injection without sample pre-treatment. Another hyphenated analytical technique, capillary zone electrophoresis (CZE) connected to an ultraviolet (UV) detector, was also used for sensitive analysis of rhEPO at low levels (2000 IU) in the presence of a large amount of human serum albumin (HSA). Analysis of rhEPO was achieved by the use of electrokinetic injection (EI) with discontinuous buffers. Phosphate buffer was used as BGE with metal ions as additive. The proposed method can be used for the estimation of a large number of quality control rhEPO samples in a short period. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Sampling from complex networks using distributed learning automata

    NASA Astrophysics Data System (ADS)

    Rezvanian, Alireza; Rahmati, Mohammad; Meybodi, Mohammad Reza

    2014-02-01

    A complex network provides a framework for modeling many real-world phenomena in the form of a network. In general, a complex network is considered as a graph of real world phenomena such as biological networks, ecological networks, technological networks, information networks and particularly social networks. Recently, major studies are reported for the characterization of social networks due to a growing trend in analysis of online social networks as dynamic complex large-scale graphs. Due to the large scale and limited access of real networks, the network model is characterized using an appropriate part of a network by sampling approaches. In this paper, a new sampling algorithm based on distributed learning automata has been proposed for sampling from complex networks. In the proposed algorithm, a set of distributed learning automata cooperate with each other in order to take appropriate samples from the given network. To investigate the performance of the proposed algorithm, several simulation experiments are conducted on well-known complex networks. Experimental results are compared with several sampling methods in terms of different measures. The experimental results demonstrate the superiority of the proposed algorithm over the others.
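
    As a point of reference for what a network sampler does, the sketch below implements a plain random-walk sampler with networkx (assumed installed). It is a simple baseline of the kind such algorithms are compared against, not the distributed-learning-automata method proposed in the paper, and the graph and sample size are arbitrary.

```python
# Baseline sketch only: simple random-walk sampling of a complex network
# (networkx assumed); the paper's distributed-learning-automata sampler
# adaptively biases which neighbours are followed, which is not shown here.
import random
import networkx as nx

def random_walk_sample(G, n_nodes, seed=0):
    rng = random.Random(seed)
    current = rng.choice(list(G.nodes))
    visited = {current}
    while len(visited) < n_nodes:
        nbrs = list(G.neighbors(current))
        if not nbrs:                         # dead end: restart the walk
            current = rng.choice(list(G.nodes))
            continue
        current = rng.choice(nbrs)
        visited.add(current)
    return G.subgraph(visited).copy()

G = nx.barabasi_albert_graph(5000, 3, seed=1)   # synthetic scale-free network
S = random_walk_sample(G, 500)
# Compare a simple property of the sample with the full network.
print(nx.density(G), nx.density(S))
```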

  11. Characterization of Factors Affecting Nanoparticle Tracking Analysis Results With Synthetic and Protein Nanoparticles.

    PubMed

    Krueger, Aaron B; Carnell, Pauline; Carpenter, John F

    2016-04-01

    In many manufacturing and research areas, the ability to accurately monitor and characterize nanoparticles is becoming increasingly important. Nanoparticle tracking analysis is rapidly becoming a standard method for this characterization, yet several key factors in data acquisition and analysis may affect results. Nanoparticle tracking analysis is prone to user input and bias on account of a high number of parameters available, contains a limited analysis volume, and individual sample characteristics such as polydispersity or complex protein solutions may affect analysis results. This study systematically addressed these key issues. The integrated syringe pump was used to increase the sample volume analyzed. It was observed that measurements recorded under flow caused a reduction in total particle counts for both polystyrene and protein particles compared to those collected under static conditions. In addition, data for polydisperse samples tended to lose peak resolution at higher flow rates, masking distinct particle populations. Furthermore, in a bimodal particle population, a bias was seen toward the larger species within the sample. The impacts of filtration on an agitated intravenous immunoglobulin sample and operating parameters including "MINexps" and "blur" were investigated to optimize the method. Taken together, this study provides recommendations on instrument settings and sample preparations to properly characterize complex samples. Copyright © 2016. Published by Elsevier Inc.

  12. Contribution of PCR Denaturing Gradient Gel Electrophoresis Combined with Mixed Chromatogram Software Separation for Complex Urinary Sample Analysis.

    PubMed

    Kotásková, Iva; Mališová, Barbora; Obručová, Hana; Holá, Veronika; Peroutková, Tereza; Růžička, Filip; Freiberger, Tomáš

    2017-01-01

    Complex samples are a challenge for sequencing-based broad-range diagnostics. We analysed 19 urinary catheter, ureteral Double-J catheter, and urine samples using 3 methodological approaches. Out of the total 84 operational taxonomic units, 37, 61, and 88% were identified by culture, PCR-DGGE-SS (PCR denaturing gradient gel electrophoresis followed by Sanger sequencing), and PCR-DGGE-RM (PCR-DGGE combined with software chromatogram separation by RipSeq Mixed tool), respectively. The latter approach was shown to be an efficient tool to complement culture in complex sample assessment. © 2017 S. Karger AG, Basel.

  13. Extractive electrospray ionization mass spectrometry toward in situ analysis without sample pretreatment.

    PubMed

    Li, Ming; Hu, Bin; Li, Jianqiang; Chen, Rong; Zhang, Xie; Chen, Huanwen

    2009-09-15

    A homemade novel nanoextractive electrospray ionization (nanoEESI) source has been characterized for in situ mass spectrometric analysis of ambient samples without sample pretreatment. The primary ions generated using a nanospray emitter interact with the neutral sample plume created by manually nebulizing liquid samples, allowing production of the analyte ions in the spatial cross section of the nanoEESI source. The performance of nanoEESI is experimentally investigated by coupling the nanoEESI source to a commercial LTQ mass spectrometer for rapid analysis of various ambient samples using positive/negative ion detection modes. Compounds of interest in actual samples such as aerosol drug preparations, beverages, milk suspensions, farmland water, and groundwater were unambiguously detected using tandem nanoEESI ion trap mass spectrometry. The limit of detection was low picogram per milliliter levels for the compounds tested. Acceptable relative standard deviation (RSD) values (5-10%) were obtained for direct measurement of analytes in complex matrixes, providing linear dynamic signal responses using manual sample introduction. A single sample analysis was completed within 1.2 s. Requiring no sheath gas for either primary ion production or neutral sample introduction, the nanoEESI has advantages including readiness for miniaturization and integration, simple maintenance, easy operation, and low cost. The experimental data demonstrate that the nanoEESI is a promising tool for high-throughput, sensitive, quantitative, in situ analysis of ambient complex samples, showing potential applications for in situ analysis in multiple disciplines including but not limited to pharmaceutical analysis, food quality control, pesticides residue detection, and homeland security.

  14. Analysis of ²³⁹Pu and ²⁴¹Am in NAEG large-sized bovine samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Major, W.J.; Lee, K.D.; Wessman, R.A.

    Methods are described for the analysis of environmental levels of ²³⁹Pu and ²⁴¹Am in large-sized bovine samples. Special procedure modifications to overcome the complexities of sample preparation and analyses and special techniques employed to prepare and analyze different types of bovine samples, such as muscle, blood, liver, and bone are discussed. (CH)

  15. LC-MS/MS signal suppression effects in the analysis of pesticides in complex environmental matrices.

    PubMed

    Choi, B K; Hercules, D M; Gusev, A I

    2001-02-01

    The application of LC separation and mobile phase additives in addressing LC-MS/MS matrix signal suppression effects for the analysis of pesticides in a complex environmental matrix was investigated. It was shown that signal suppression is most significant for analytes eluting early in the LC-MS analysis. Introduction of different buffers (e.g. ammonium formate, ammonium hydroxide, formic acid) into the LC mobile phase was effective in improving signal correlation between the matrix and standard samples. The signal improvement is dependent on buffer concentration as well as LC separation of the matrix components. The application of LC separation alone was not effective in addressing suppression effects when characterizing complex matrix samples. Overloading of the LC column by matrix components was found to significantly contribute to analyte-matrix co-elution and suppression of signal. This signal suppression effect can be efficiently compensated by 2D LC (LC-LC) separation techniques. The effectiveness of buffers and LC separation in improving signal correlation between standard and matrix samples is discussed.

  16. 16S rRNA based microarray analysis of ten periodontal bacteria in patients with different forms of periodontitis.

    PubMed

    Topcuoglu, Nursen; Kulekci, Guven

    2015-10-01

    DNA microarray analysis is a computer-based technology; a reverse-capture assay targeting 10 periodontal bacteria (ParoCheck) is available for rapid semi-quantitative determination. The aim of this three-year retrospective study was to display the microarray analysis results for the subgingival biofilm samples taken from patient cases diagnosed with different forms of periodontitis. A total of 84 patients with generalized aggressive periodontitis (GAP, n:29), generalized chronic periodontitis (GCP, n:25), peri-implantitis (PI, n:14), localized aggressive periodontitis (LAP, n:8) and refractory chronic periodontitis (RP, n:8) were consecutively selected from the archives of the Oral Microbiological Diagnostic Laboratory. The subgingival biofilm samples were analyzed by the microarray-based identification of 10 selected species. All the tested species were detected in the samples. The red complex bacteria were the most prevalent, with very high levels in all groups. Fusobacterium nucleatum was detected in all samples at high levels. The green and blue complex bacteria were less prevalent compared with the red and orange complexes, except that Aggregatibacter actinomycetemcomitans was detected in the entire LAP group. Positive correlations were found within all the red complex bacteria and between red and orange complex bacteria, especially in the GCP and GAP groups. ParoCheck enables monitoring of periodontal pathogens in all forms of periodontal disease and can be an alternative to other guiding and reliable microbiologic tests. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Bayesian linkage and segregation analysis: factoring the problem.

    PubMed

    Matthysse, S

    2000-01-01

    Complex segregation analysis and linkage methods are mathematical techniques for the genetic dissection of complex diseases. They are used to delineate complex modes of familial transmission and to localize putative disease susceptibility loci to specific chromosomal locations. The computational problem of Bayesian linkage and segregation analysis is one of integration in high-dimensional spaces. In this paper, three available techniques for Bayesian linkage and segregation analysis are discussed: Markov Chain Monte Carlo (MCMC), importance sampling, and exact calculation. The contribution of each to the overall integration will be explicitly discussed.
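
    Of the three techniques listed, importance sampling is the easiest to show in miniature. The sketch below is a generic, non-genetic illustration (SciPy assumed; the Beta prior, binomial data, and uniform proposal are invented): a posterior expectation is estimated by drawing from a proposal distribution and reweighting, and the result is checked against the known conjugate answer.

```python
# Generic importance-sampling sketch (not specific to linkage or segregation
# analysis): estimate a posterior mean under a Beta prior and binomial data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
k, n = 7, 20                      # observed successes / trials (toy data)

proposal = stats.uniform(0, 1)    # proposal distribution q(theta)
theta = proposal.rvs(size=100_000, random_state=rng)

# Unnormalized posterior: Beta(2, 2) prior times binomial likelihood.
log_w = (stats.beta(2, 2).logpdf(theta)
         + stats.binom(n, theta).logpmf(k)
         - proposal.logpdf(theta))
w = np.exp(log_w - log_w.max())   # stabilize before normalizing
w /= w.sum()

est = np.sum(w * theta)
exact = (2 + k) / (4 + n)         # posterior mean of Beta(2 + k, 2 + n - k)
print(f"IS estimate={est:.4f}  exact={exact:.4f}")
```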

  18. Online-LASIL: Laser Ablation of Solid Samples in Liquid with online-coupled ICP-OES detection for direct determination of the stoichiometry of complex metal oxide thin layers.

    PubMed

    Bonta, Maximilian; Frank, Johannes; Taibl, Stefanie; Fleig, Jürgen; Limbeck, Andreas

    2018-02-13

    Advanced materials such as complex metal oxides are used in a wide range of applications and have further promising perspectives in the form of thin films. The exact chemical composition essentially influences the electronic properties of these materials, which makes correct assessment of their composition necessary. However, due to high chemical resistance and, in the case of thin films, low absolute analyte amounts, this procedure is in most cases not straightforward and extremely time-demanding. Commonly applied techniques either lack ease of use (i.e., solution-based analysis with preceding sample dissolution) or adequately accurate quantification (i.e., solid sampling techniques). An analysis approach which combines the beneficial aspects of solution-based analysis as well as direct solid sampling is Laser Ablation of a Sample in Liquid (LASIL). In this work, it is shown that the analysis of major as well as minor sample constituents is possible using a novel online-LASIL setup, allowing sample analysis without manual sample handling after the sample is placed in an ablation chamber. Strontium titanate (STO) thin layers with different compositions were analyzed in the course of this study. The precision of the newly developed online-LASIL method is comparable to that of conventional wet chemical approaches. With only about 15-20 min required for the analysis per sample, the time demand is significantly reduced compared to the often-necessary fusion procedures, which last multiple hours. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. An application of sample entropy to precipitation in Paraíba State, Brazil

    NASA Astrophysics Data System (ADS)

    Xavier, Sílvio Fernando Alves; da Silva Jale, Jader; Stosic, Tatijana; dos Santos, Carlos Antonio Costa; Singh, Vijay P.

    2018-05-01

    A climate system is characterized as a complex non-linear system. In order to describe the complex characteristics of precipitation series in Paraíba State, Brazil, we apply sample entropy, an entropy-based algorithm, to evaluate the complexity of precipitation series. Sixty-nine meteorological stations are distributed over four macroregions: Zona da Mata, Agreste, Borborema, and Sertão. The results of the analysis show that the complexity of monthly average precipitation differs among the macroregions. Sample entropy is able to reflect the dynamic change of precipitation series, providing a new way to investigate the complexity of hydrological series. The complexity exhibits areal variation of local water resource systems, which can influence the basis for utilizing and developing resources in dry areas.

  20. Mitochondrial Respiration in Human Colorectal and Breast Cancer Clinical Material Is Regulated Differently

    PubMed Central

    Koit, Andre; Ounpuu, Lyudmila; Klepinin, Aleksandr; Chekulayev, Vladimir; Timohhina, Natalja; Tepp, Kersti; Puurand, Marju; Truu, Laura; Heck, Karoliina; Valvere, Vahur; Guzun, Rita

    2017-01-01

    We conducted quantitative cellular respiration analysis on samples taken from human breast cancer (HBC) and human colorectal cancer (HCC) patients. Respiratory capacity is not lost as a result of tumor formation, and even though, functionally, complex I in HCC was found to be suppressed, it was not evident on the protein level. Additionally, metabolic control analysis was used to quantify the role of components of the mitochondrial interactosome. The main rate-controlling steps in HBC are complex IV and the adenine nucleotide transporter, but in HCC, complexes I and III. Our kinetic measurements confirmed previous studies that respiratory chain complexes I and III in HBC and HCC can be assembled into supercomplexes with a possible partial addition from the complex IV pool. Therefore, the kinetic method can be a useful addition in studying supercomplexes in cell lines or human samples. In addition, when results from cultured cells were compared to those from clinical samples, clear differences were present, but we also detected two different types of mitochondria within clinical HBC samples, possibly linked to two-compartment metabolism. Taken together, our data show that mitochondrial respiration and regulation of mitochondrial membrane permeability have substantial differences between these two cancer types when compared to each other, to their adjacent healthy tissue, or to the respective cell cultures. PMID:28781720

  1. Chemometric and multivariate statistical analysis of time-of-flight secondary ion mass spectrometry spectra from complex Cu-Fe sulfides.

    PubMed

    Kalegowda, Yogesh; Harmer, Sarah L

    2012-03-20

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) spectra of mineral samples are complex, comprised of large mass ranges and many peaks. Consequently, characterization and classification analysis of these systems is challenging. In this study, different chemometric and statistical data evaluation methods, based on monolayer sensitive TOF-SIMS data, have been tested for the characterization and classification of copper-iron sulfide minerals (chalcopyrite, chalcocite, bornite, and pyrite) at different flotation pulp conditions (feed, conditioned feed, and Eh modified). The complex mass spectral data sets were analyzed using the following chemometric and statistical techniques: principal component analysis (PCA); principal component-discriminant functional analysis (PC-DFA); soft independent modeling of class analogy (SIMCA); and k-Nearest Neighbor (k-NN) classification. PCA was found to be an important first step in multivariate analysis, providing insight into both the relative grouping of samples and the elemental/molecular basis for those groupings. For samples exposed to oxidative conditions (at Eh ~430 mV), each technique (PCA, PC-DFA, SIMCA, and k-NN) was found to produce excellent classification. For samples at reductive conditions (at Eh ~ -200 mV SHE), k-NN and SIMCA produced the most accurate classification. Phase identification of particles that contain the same elements but a different crystal structure in a mixed multimetal mineral system has been achieved.

  2. Automating data analysis for two-dimensional gas chromatography/time-of-flight mass spectrometry non-targeted analysis of comparative samples.

    PubMed

    Titaley, Ivan A; Ogba, O Maduka; Chibwe, Leah; Hoh, Eunha; Cheong, Paul H-Y; Simonich, Staci L Massey

    2018-03-16

    Non-targeted analysis of environmental samples, using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC/ToF-MS), poses significant data analysis challenges due to the large number of possible analytes. Non-targeted data analysis of complex mixtures is prone to human bias and is laborious, particularly for comparative environmental samples such as contaminated soil pre- and post-bioremediation. To address this research bottleneck, we developed OCTpy, a Python™ script that acts as a data reduction filter to automate GC × GC/ToF-MS data analysis from LECO® ChromaTOF® software and facilitates selection of analytes of interest based on peak area comparison between comparative samples. We used data from polycyclic aromatic hydrocarbon (PAH) contaminated soil, pre- and post-bioremediation, to assess the effectiveness of OCTpy in facilitating the selection of analytes that have formed or degraded following treatment. Using datasets from the soil extracts pre- and post-bioremediation, OCTpy selected, on average, 18% of the initial suggested analytes generated by the LECO® ChromaTOF® software Statistical Compare feature. Based on this list, 63-100% of the candidate analytes identified by a highly trained individual were also selected by OCTpy. This process was accomplished in several minutes per sample, whereas manual data analysis took several hours per sample. OCTpy automates the analysis of complex mixtures of comparative samples, reduces the potential for human error during heavy data handling and decreases data analysis time by at least tenfold. Copyright © 2018 Elsevier B.V. All rights reserved.
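
    The core filtering idea, keeping only analytes whose peak areas differ meaningfully between comparative samples, can be sketched independently of the tool. The pandas code below is an illustration only: it is not OCTpy and does not parse ChromaTOF® exports, and the analyte names, areas, and 3x threshold are invented.

```python
# Illustrative pandas sketch of a peak-area comparison filter for
# comparative samples (NOT OCTpy; column names and values are hypothetical).
import pandas as pd

pre = pd.DataFrame({"analyte": ["PAH-1", "PAH-2", "PAH-3", "PAH-4"],
                    "area":    [1.2e6,   4.0e5,   9.0e4,   2.2e5]})
post = pd.DataFrame({"analyte": ["PAH-1", "PAH-2", "PAH-3", "PAH-5"],
                     "area":    [1.1e6,   2.0e4,   5.0e5,   3.0e5]})

merged = pre.merge(post, on="analyte", how="outer",
                   suffixes=("_pre", "_post")).fillna(0.0)
merged["ratio"] = (merged["area_post"] + 1) / (merged["area_pre"] + 1)

# Keep analytes that appear, disappear, or change by more than 3x after
# treatment; everything else is dropped from manual review.
changed = merged[(merged["ratio"] >= 3) | (merged["ratio"] <= 1 / 3)]
print(changed[["analyte", "area_pre", "area_post", "ratio"]])
```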

  3. Comparison of protein profiles of beech bark disease-resistant or beech bark disease-susceptible American beech

    Treesearch

    Mary E. Mason; Marek Krasowski; Judy Loo; Jennifer. Koch

    2011-01-01

    Proteomic analysis of beech bark proteins from trees resistant and susceptible to beech bark disease (BBD) was conducted. Sixteen trees from eight geographically isolated stands, 10 resistant (healthy) and 6 susceptible (diseased/infested) trees, were studied. The genetic complexity of the sample unit, the sampling across a wide geographic area, and the complexity of...

  4. Alternatives to current flow cytometry data analysis for clinical and research studies.

    PubMed

    Gondhalekar, Carmen; Rajwa, Bartek; Patsekin, Valery; Ragheb, Kathy; Sturgis, Jennifer; Robinson, J Paul

    2018-02-01

    Flow cytometry has well-established methods for data analysis based on traditional data collection techniques. These techniques typically involved manual insertion of tube samples into an instrument that, historically, could only measure 1-3 colors. The field has since evolved to incorporate new technologies for faster and highly automated sample preparation and data collection. For example, the use of microwell plates on benchtop instruments is now a standard on virtually every new instrument, so users can easily accumulate multiple data sets. Further, because the user must carefully define the layout of the plate, this information is already defined when considering the analytical process, expanding the opportunities for automated analysis. Advances in multi-parametric data collection, as demonstrated by the development of hyperspectral flow cytometry, 20-40 color polychromatic flow cytometry, and mass cytometry (CyTOF), are game-changing. As data and assay complexity increase, so too does the complexity of data analysis. Complex data analysis is already a challenge to traditional flow cytometry software. New methods for reviewing large and complex data sets can provide rapid insight into processes that are difficult to define without more advanced analytical tools. In settings such as clinical labs, where rapid and accurate data analysis is a priority, efficient and intuitive software is needed. This paper outlines opportunities for analysis of complex data sets using examples of multiplexed bead-based assays, drug screens and cell cycle analysis - any of which could become integrated into the clinical environment. Copyright © 2017. Published by Elsevier Inc.

  5. Sampling in the light of Wigner distribution.

    PubMed

    Stern, Adrian; Javidi, Bahram

    2004-03-01

    We propose a new method for analysis of the sampling and reconstruction conditions of real and complex signals by use of the Wigner domain. It is shown that the Wigner domain may provide a better understanding of the sampling process than the traditional Fourier domain. For example, it explains how certain non-bandlimited complex functions can be sampled and perfectly reconstructed. On the basis of observations in the Wigner domain, we derive a generalization of the Nyquist sampling criterion. By using this criterion, we demonstrate simple preprocessing operations that can adapt a signal that does not fulfill the Nyquist sampling criterion so that it can still be sampled and reconstructed. The preprocessing operations demonstrated can be easily implemented by optical means.
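
    For reference, the classical criterion that the paper generalizes is the Nyquist condition stated below; the Wigner-domain generalization itself rests on the time-frequency support of the signal and is not reproduced here.

      If $X(f) = 0$ for $|f| > B$, then $x(t)$ is exactly recoverable from
      uniform samples $x(nT)$ provided
      \[
        f_s = \frac{1}{T} \ge 2B ,
        \qquad
        x(t) = \sum_{n=-\infty}^{\infty} x(nT)\,
               \operatorname{sinc}\!\left(\frac{t - nT}{T}\right).
      \]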

  6. Fully Integrated Microfluidic Device for Direct Sample-to-Answer Genetic Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Robin H.; Grodzinski, Piotr

    Integration of microfluidics technology with DNA microarrays enables building complete sample-to-answer systems that are useful in many applications such as clinical diagnostics. In this chapter, a fully integrated microfluidic device [1] that consists of microfluidic mixers, valves, pumps, channels, chambers, heaters, and a DNA microarray sensor to perform DNA analysis of complex biological sample solutions is presented. This device can perform on-chip sample preparation (including magnetic bead-based cell capture, cell preconcentration and purification, and cell lysis) of complex biological sample solutions (such as whole blood), polymerase chain reaction, DNA hybridization, and electrochemical detection. A few novel microfluidic techniques were developed and employed. A micromixing technique based on a cavitation microstreaming principle was implemented to enhance target cell capture from whole blood samples using immunomagnetic beads. This technique was also employed to accelerate the DNA hybridization reaction. Thermally actuated paraffin-based microvalves were developed to regulate flows. Electrochemical pumps and thermopneumatic pumps were integrated on the chip to provide pumping of liquid solutions. The device is completely self-contained: no external pressure sources, fluid storage, mechanical pumps, or valves are necessary for fluid manipulation, thus eliminating possible sample contamination and simplifying device operation. Pathogenic bacteria detection from ~mL whole blood samples and single-nucleotide polymorphism analysis directly from diluted blood were demonstrated. The device provides a cost-effective solution to direct sample-to-answer genetic analysis, and thus has a potential impact in the fields of point-of-care genetic analysis, environmental testing, and biological warfare agent detection.

  7. Speciation and isotope dilution analysis of gadolinium-based contrast agents in wastewater.

    PubMed

    Telgmann, Lena; Wehe, Christoph A; Birka, Marvin; Künnemeyer, Jens; Nowak, Sascha; Sperling, Michael; Karst, Uwe

    2012-11-06

    The fate of gadolinium (Gd)-based contrast agents for magnetic resonance imaging (MRI) during sewage treatment was investigated. The total concentration of Gd in influent and effluent 2 and 24 h composite samples was determined by means of isotope dilution analysis. The balancing of Gd input and output of a sewage plant over seven days indicated that approximately 10% of the Gd is removed during treatment. Batch experiments simulating the aeration tank of a sewage treatment plant confirmed the Gd complex removal during activated sludge treatment. For speciation analysis of the Gd complexes in wastewater samples, high performance liquid chromatography (HPLC) was hyphenated to inductively coupled plasma sector field mass spectrometry (ICP-SFMS). Separation of the five predominantly used contrast agents was carried out on a new hydrophilic interaction liquid chromatography stationary phase in less than 15 min. A limit of detection (LOD) of 0.13 μg/L and a limit of quantification of 0.43 μg/L could be achieved for the Gd chelates without having to apply enrichment techniques. Speciation analysis of the 24 h composite samples revealed that 80% of the Gd complexes are present as Gd-BT-DO3A in the sampled treatment plant. The day-of-week dependent variation of the complex load followed the variation of the total Gd load, indicating a similar behavior. The analysis of sewage sludge did not prove the presence of anthropogenic Gd. However, in the effluent of the chamber filter press, which was used for sludge dewatering, two of the contrast agents and three other unknown Gd species were observed. This indicates that species transformation took place during anaerobic sludge treatment.

  8. [The actual possibilities of robotic microscopy in analysis automation and laboratory telemedicine].

    PubMed

    Medovyĭ, V S; Piatnitskiĭ, A M; Sokolinskiĭ, B Z; Balugian, R Sh

    2012-10-01

    The article discusses the capabilities of the automated microscopy complexes manufactured by Cellavision and MEKOS for the medical analysis of blood films and other biomaterials. Joint operation of the complex and the physician, with automated stages of slide loading, screening, sampling, and sorting of cell types with simple morphology, and visual sorting of the sub-sample with complex morphology, significantly increases the sensitivity of the method, decreases the physician's workload, and improves working conditions. The included information technologies, virtual slides, and laboratory telemedicine make it possible to assemble representative samples of rare cell types and pathologies, advancing both automation methods and medical research aims.

  9. Principles of qualitative analysis in the chromatographic context.

    PubMed

    Valcárcel, M; Cárdenas, S; Simonet, B M; Carrillo-Carrión, C

    2007-07-27

    This article presents the state of the art of qualitative analysis in the framework of chromatographic analysis. After establishing the differences between the two main classes of qualitative analysis (analyte identification and sample classification/qualification), the particularities of instrumental qualitative analysis are commented on. Qualitative chromatographic analysis for sample classification/qualification through the so-called chromatographic fingerprint (for complex samples) or the volatiles profile (through the direct coupling of headspace-mass spectrometry using the chromatograph as interface) is discussed. Next, a more technical exposition of the qualitative chromatographic information is presented, supported by a variety of representative examples.

  10. A Preliminary Analysis of the Linguistic Complexity of Numeracy Skills Test Items for Pre Service Teachers

    ERIC Educational Resources Information Center

    O'Keeffe, Lisa

    2016-01-01

    Language is frequently discussed as a barrier to mathematics word problems. Hence this paper presents the initial findings of a linguistic analysis of numeracy skills test sample items. The theoretical perspective of multi-modal text analysis underpinned this study, in which data were extracted from the ten sample numeracy test items released by the…

  11. Strategies for the structural analysis of multi-protein complexes: lessons from the 3D-Repertoire project.

    PubMed

    Collinet, B; Friberg, A; Brooks, M A; van den Elzen, T; Henriot, V; Dziembowski, A; Graille, M; Durand, D; Leulliot, N; Saint André, C; Lazar, N; Sattler, M; Séraphin, B; van Tilbeurgh, H

    2011-08-01

    Structural studies of multi-protein complexes, whether by X-ray diffraction, scattering, NMR spectroscopy or electron microscopy, require stringent quality control of the component samples. The inability to produce 'keystone' subunits in a soluble and correctly folded form is a serious impediment to the reconstitution of the complexes. Co-expression of the components offers a valuable alternative to the expression of single proteins as a route to obtain sufficient amounts of the sample of interest. Even in cases where milligram-scale quantities of purified complex of interest become available, there is still no guarantee that good quality crystals can be obtained. At this step, protein engineering of one or more components of the complex is frequently required to improve solubility, yield or the ability to crystallize the sample. Subsequent characterization of these constructs may be performed by solution techniques such as Small Angle X-ray Scattering and Nuclear Magnetic Resonance to identify 'well behaved' complexes. Herein, we recount our experiences gained in protein production and complex assembly during the European 3D Repertoire project (3DR). The goal of this consortium was to obtain structural information on multi-protein complexes from yeast by combining crystallography, electron microscopy, NMR and in silico modeling methods. We present here a representative set of case studies of complexes that were produced and analyzed within the 3DR project. Our experience provides useful insight into strategies that are more generally applicable for structural analysis of protein complexes. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. [Variable selection methods combined with local linear embedding theory used for optimization of near infrared spectral quantitative models].

    PubMed

    Hao, Yong; Sun, Xu-Dong; Yang, Qiang

    2012-12-01

    A variable selection strategy combined with local linear embedding (LLE) was introduced for the analysis of complex samples by near infrared spectroscopy (NIRS). Three methods, Monte Carlo uninformative variable elimination (MCUVE), the successive projections algorithm (SPA), and MCUVE combined with SPA, were used for eliminating redundant spectral variables. Partial least squares regression (PLSR) and LLE-PLSR were used for modeling the complex samples. The results showed that MCUVE can both extract informative variables effectively and improve the precision of the models. Compared with PLSR models, LLE-PLSR models can achieve more accurate analysis results. MCUVE combined with LLE-PLSR is an effective modeling method for NIRS quantitative analysis.
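
    A rough scikit-learn sketch of the LLE-plus-PLSR idea follows; a simple correlation filter stands in for MCUVE/SPA variable selection, and the spectra and reference values are synthetic placeholders rather than the paper's NIR data:

      # Hedged sketch: variable selection + locally linear embedding + PLS regression.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.manifold import LocallyLinearEmbedding
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(1)
      X = rng.normal(size=(200, 500))                               # stand-in NIR spectra
      y = X[:, 40] + 0.5 * X[:, 120] + 0.1 * rng.normal(size=200)   # stand-in property

      # Crude variable selection (stand-in for MCUVE/SPA): keep the wavelengths
      # most correlated with the reference values in the calibration set.
      X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=1)
      corr = np.abs([np.corrcoef(X_cal[:, j], y_cal)[0, 1] for j in range(X.shape[1])])
      keep = np.argsort(corr)[-50:]

      # Nonlinear dimension reduction of the selected variables, then PLSR.
      lle = LocallyLinearEmbedding(n_neighbors=12, n_components=8, random_state=1)
      Z_cal = lle.fit_transform(X_cal[:, keep])
      Z_val = lle.transform(X_val[:, keep])

      pls = PLSRegression(n_components=5).fit(Z_cal, y_cal)
      rmse = mean_squared_error(y_val, pls.predict(Z_val).ravel()) ** 0.5
      print("validation RMSE:", round(rmse, 3))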

  13. Current developments in forensic interpretation of mixed DNA samples (Review).

    PubMed

    Hu, Na; Cong, Bin; Li, Shujin; Ma, Chunling; Fu, Lihong; Zhang, Xiaojing

    2014-05-01

    A number of recent improvements have provided contemporary forensic investigations with a variety of tools to improve the analysis of mixed DNA samples in criminal investigations, producing notable improvements in the analysis of complex trace samples in cases of sexual assault and homicide. Mixed DNA contains DNA from two or more contributors, compounding DNA analysis by combining DNA from one or more major contributors with small amounts of DNA from potentially numerous minor contributors. These samples are characterized by a high probability of drop-out or drop-in combined with elevated stutter, significantly increasing analysis complexity. At some loci, minor contributor alleles may be completely obscured due to amplification bias or over-amplification, creating the illusion of additional contributors. Thus, estimating the number of contributors and separating contributor genotypes at a given locus is significantly more difficult in mixed DNA samples, requiring the application of specialized protocols that have only recently been widely commercialized and standardized. Over the last decade, the accuracy and repeatability of mixed DNA analyses available to conventional forensic laboratories have greatly advanced in terms of laboratory technology, mathematical models and biostatistical software, generating more accurate, rapid and readily available data for legal proceedings and criminal cases.

  14. Current developments in forensic interpretation of mixed DNA samples (Review)

    PubMed Central

    HU, NA; CONG, BIN; LI, SHUJIN; MA, CHUNLING; FU, LIHONG; ZHANG, XIAOJING

    2014-01-01

    A number of recent improvements have provided contemporary forensic investigations with a variety of tools to improve the analysis of mixed DNA samples in criminal investigations, producing notable improvements in the analysis of complex trace samples in cases of sexual assault and homicide. Mixed DNA contains DNA from two or more contributors, compounding DNA analysis by combining DNA from one or more major contributors with small amounts of DNA from potentially numerous minor contributors. These samples are characterized by a high probability of drop-out or drop-in combined with elevated stutter, significantly increasing analysis complexity. At some loci, minor contributor alleles may be completely obscured due to amplification bias or over-amplification, creating the illusion of additional contributors. Thus, estimating the number of contributors and separating contributor genotypes at a given locus is significantly more difficult in mixed DNA samples, requiring the application of specialized protocols that have only recently been widely commercialized and standardized. Over the last decade, the accuracy and repeatability of mixed DNA analyses available to conventional forensic laboratories have greatly advanced in terms of laboratory technology, mathematical models and biostatistical software, generating more accurate, rapid and readily available data for legal proceedings and criminal cases. PMID:24748965

  15. Polytopic vector analysis in igneous petrology: Application to lunar petrogenesis

    NASA Technical Reports Server (NTRS)

    Shervais, John W.; Ehrlich, R.

    1993-01-01

    Lunar samples represent a heterogeneous assemblage of rocks with complex inter-relationships that are difficult to decipher using standard petrogenetic approaches. These inter-relationships reflect several distinct petrogenetic trends as well as thermomechanical mixing of distinct components. Additional complications arise from the unequal quality of chemical analyses and from the fact that many samples (e.g., breccia clasts) are too small to be representative of the system from which they derived. Polytopic vector analysis (PVA) is a multivariate procedure used as a tool for exploratory data analysis. PVA allows the analyst to classify samples and clarifies relationships among heterogeneous samples with complex petrogenetic histories. It differs from orthogonal factor analysis in that it uses non-orthogonal multivariate sample vectors to extract sample endmember compositions. The output from a Q-mode (sample based) factor analysis is the initial step in PVA. The Q-mode analysis, using criteria established by Miesch and Klovan and Miesch, is used to determine the number of endmembers in the data system. The second step involves determination of endmembers and mixing proportions with all output expressed in the same geochemical variables as the input. The composition of endmembers is derived by analysis of the variability of the data set. Endmembers need not be present in the data set, nor is it necessary for their composition to be known a priori. Any set of endmembers defines a 'polytope' or classification figure (triangle for a three component system, tetrahedron for a four component system, a 'five-tope' in four dimensions for a five component system, et cetera).

  16. Automated image analysis for quantitative fluorescence in situ hybridization with environmental samples.

    PubMed

    Zhou, Zhi; Pons, Marie Noëlle; Raskin, Lutgarde; Zilles, Julie L

    2007-05-01

    When fluorescence in situ hybridization (FISH) analyses are performed with complex environmental samples, difficulties related to the presence of microbial cell aggregates and nonuniform background fluorescence are often encountered. The objective of this study was to develop a robust and automated quantitative FISH method for complex environmental samples, such as manure and soil. The method and duration of sample dispersion were optimized to reduce the interference of cell aggregates. An automated image analysis program that detects cells from 4',6-diamidino-2-phenylindole (DAPI) micrographs and extracts the maximum and mean fluorescence intensities for each cell from corresponding FISH images was developed with the software Visilog. Intensity thresholds were not consistent even for duplicate analyses, so alternative ways of classifying signals were investigated. In the resulting method, the intensity data were divided into clusters using fuzzy c-means clustering, and the resulting clusters were classified as target (positive) or nontarget (negative). A manual quality control confirmed this classification. With this method, 50.4, 72.1, and 64.9% of the cells in two swine manure samples and one soil sample, respectively, were positive as determined with a 16S rRNA-targeted bacterial probe (S-D-Bact-0338-a-A-18). Manual counting resulted in corresponding values of 52.3, 70.6, and 61.5%, respectively. In the two swine manure samples and one soil sample, 21.6, 12.3, and 2.5% of the cells, respectively, were positive with an archaeal probe (S-D-Arch-0915-a-A-20). Manual counting resulted in corresponding values of 22.4, 14.0, and 2.9%, respectively. This automated method should facilitate quantitative analysis of FISH images for a variety of complex environmental samples.
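
    The fuzzy c-means step can be sketched generically as below (a plain NumPy implementation on made-up per-cell intensity features, not the Visilog program used in the study):

      # Hedged sketch: fuzzy c-means clustering of per-cell FISH intensity features.
      import numpy as np

      def fuzzy_cmeans(X, c=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
          """Plain NumPy fuzzy c-means; returns cluster centers and memberships."""
          rng = np.random.default_rng(seed)
          U = rng.random((X.shape[0], c))
          U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per cell
          for _ in range(n_iter):
              Um = U ** m
              centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
              d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
              d = np.clip(d, 1e-12, None)            # guard against zero distances
              U_new = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1))).sum(axis=2)
              if np.abs(U_new - U).max() < tol:
                  return centers, U_new
              U = U_new
          return centers, U

      # Per-cell features [mean intensity, max intensity]; synthetic placeholders.
      rng = np.random.default_rng(1)
      cells = np.vstack([rng.normal([200.0, 400.0], 40.0, size=(300, 2)),   # probe-positive
                         rng.normal([60.0, 110.0], 25.0, size=(500, 2))])   # background
      centers, U = fuzzy_cmeans(cells, c=2)
      positive = int(np.argmax(centers[:, 0]))       # call the brighter cluster "target"
      print("fraction classified probe-positive:", np.mean(U.argmax(axis=1) == positive))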

  17. Aging and cardiovascular complexity: effect of the length of RR tachograms

    PubMed Central

    Nagaraj, Nithin

    2016-01-01

    As we age, our hearts undergo changes that result in a reduction in complexity of physiological interactions between different control mechanisms. This results in a potential risk of cardiovascular diseases, which are the number one cause of death globally. Since cardiac signals are nonstationary and nonlinear in nature, complexity measures are better suited to handle such data. In this study, three complexity measures are used, namely Lempel–Ziv complexity (LZ), Sample Entropy (SampEn) and Effort-To-Compress (ETC). We determined the minimum length of RR tachogram required for characterizing the complexity of healthy young and healthy old hearts. All three measures indicated significantly lower complexity values for older subjects than younger ones. However, the minimum length of heart-beat interval data needed differs for the three measures, with LZ and ETC needing as few as 10 samples, whereas SampEn requires at least 80 samples. Our study indicates that complexity measures such as LZ and ETC are good candidates for the analysis of cardiovascular dynamics since they are able to work with very short RR tachograms. PMID:27957395
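
    One of the three measures, SampEn, reduces to a few lines of NumPy; the sketch below uses the common defaults m = 2 and r = 0.2 x SD, which are not parameters reported in this record:

      # Hedged sketch: sample entropy (SampEn) of a short RR tachogram.
      import numpy as np

      def sample_entropy(x, m=2, r_factor=0.2):
          """SampEn = -ln(A/B): B counts length-m template matches within
          tolerance r (Chebyshev norm), A counts length-(m+1) matches."""
          x = np.asarray(x, dtype=float)
          r = r_factor * x.std()
          def matches(dim):
              t = np.array([x[i:i + dim] for i in range(len(x) - dim)])
              d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
              return np.sum(d <= r) - len(t)     # count pairs i != j only
          B, A = matches(m), matches(m + 1)
          return np.inf if A == 0 or B == 0 else -np.log(A / B)

      rr = np.random.default_rng(0).normal(0.8, 0.05, size=100)   # synthetic RR intervals (s)
      print("SampEn:", round(sample_entropy(rr), 3))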

  18. Generalized Likelihood Uncertainty Estimation (GLUE) Using Multi-Optimization Algorithm as Sampling Method

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    2015-12-01

    For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. Large-scale, high-precision hydrological simulation has refined spatial descriptions of hydrological behavior, but this trend is accompanied by growing model complexity and numbers of parameters, which bring new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), a Monte Carlo approach coupled with Bayesian estimation, has been widely used for uncertainty analysis of hydrological models. However, the stochastic sampling of prior parameters adopted by GLUE is inefficient, especially in high-dimensional parameter spaces. Heuristic optimization algorithms based on iterative evolution show better convergence speed and optimality-searching performance. In light of these features, this study adopted a genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets of large likelihood. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method tends to be efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
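
    A toy sketch of the approach (an evolutionary optimizer used as the sampler, with every evaluated parameter set passed to a GLUE-style behavioural filter) is shown below; the exponential-recession "model", the Nash-Sutcliffe likelihood, and the 0.5 threshold are illustrative choices, not the study's models or settings:

      # Hedged sketch: GLUE uncertainty bounds using differential evolution as sampler.
      import numpy as np
      from scipy.optimize import differential_evolution

      rng = np.random.default_rng(0)
      t = np.arange(0, 50.0)
      q_obs = 10.0 * np.exp(-0.08 * t) + rng.normal(0, 0.3, t.size)   # synthetic "observations"

      def simulate(params):
          q0, k = params
          return q0 * np.exp(-k * t)

      samples = []   # every parameter set evaluated by the optimizer is recorded

      def neg_likelihood(params):
          sim = simulate(params)
          nse = 1.0 - np.sum((sim - q_obs) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2)
          samples.append((params.copy(), nse))
          return -nse                        # DE minimizes, so negate the likelihood

      differential_evolution(neg_likelihood, bounds=[(1.0, 20.0), (0.01, 0.5)],
                             maxiter=30, popsize=20, seed=0, tol=1e-8)

      # GLUE step: keep "behavioural" sets above a likelihood threshold and
      # compute likelihood-weighted 5-95% prediction bounds at each time step.
      behavioural = [(p, L) for p, L in samples if L > 0.5]
      preds = np.array([simulate(p) for p, _ in behavioural])
      weights = np.array([L for _, L in behavioural])
      weights = weights / weights.sum()

      order = np.argsort(preds, axis=0)
      lower, upper = [], []
      for j in range(t.size):
          cw = np.cumsum(weights[order[:, j]])
          sorted_pred = preds[order[:, j], j]
          lower.append(sorted_pred[np.searchsorted(cw, 0.05)])
          upper.append(sorted_pred[np.searchsorted(cw, 0.95)])
      print(f"{len(behavioural)} behavioural sets; bounds at t=0:",
            round(lower[0], 2), round(upper[0], 2))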

  19. Complexity quantification of dense array EEG using sample entropy analysis.

    PubMed

    Ramanand, Pravitha; Nampoori, V P N; Sreenivasan, R

    2004-09-01

    In this paper, a time series complexity analysis of dense array electroencephalogram signals is carried out using the recently introduced Sample Entropy (SampEn) measure. This statistic quantifies the regularity in signals recorded from systems that can vary from the purely deterministic to the purely stochastic realm. The present analysis is conducted with the objective of gaining insight into complexity variations related to changing brain dynamics, using EEG recorded in three conditions: a passive, eyes-closed state; a mental arithmetic task; and the same mental task carried out after a physical exertion task. It is observed that the statistic is a robust quantifier of complexity suited for short physiological signals such as the EEG, and that it points to the specific brain regions that exhibit lowered complexity during the mental task state as compared to a passive, relaxed state. In the case of mental tasks carried out before and after the performance of a physical exercise, the statistic can detect the variations brought in by the intermediate fatigue-inducing exercise period. This enhances its utility in detecting subtle changes in the brain state that can find wider scope for applications in EEG-based brain studies.

  20. Sample Manipulation System for Sample Analysis at Mars

    NASA Technical Reports Server (NTRS)

    Mumm, Erik; Kennedy, Tom; Carlson, Lee; Roberts, Dustyn

    2008-01-01

    The Sample Analysis at Mars (SAM) instrument will analyze Martian samples collected by the Mars Science Laboratory Rover with a suite of spectrometers. This paper discusses the driving requirements, design, and lessons learned in the development of the Sample Manipulation System (SMS) within SAM. The SMS stores and manipulates 74 sample cups to be used for solid sample pyrolysis experiments. Focus is given to the unique mechanism architecture developed to deliver a high packing density of sample cups in a reliable, fault tolerant manner while minimizing system mass and control complexity. Lessons learned are presented on contamination control, launch restraint mechanisms for fragile sample cups, and mechanism test data.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gentry, T.; Schadt, C.; Zhou, J.

    Microarray technology has the unparalleled potential to simultaneously determine the dynamics and/or activities of most, if not all, of the microbial populations in complex environments such as soils and sediments. Researchers have developed several types of arrays that characterize the microbial populations in these samples based on their phylogenetic relatedness or functional genomic content. Several recent studies have used these microarrays to investigate ecological issues; however, most have only analyzed a limited number of samples, with relatively few experiments utilizing the full high-throughput potential of microarray analysis. This is due in part to the unique analytical challenges that these samples present with regard to sensitivity, specificity, quantitation, and data analysis. This review discusses specific applications of microarrays to microbial ecology research along with some of the latest studies addressing the difficulties encountered during analysis of complex microbial communities within environmental samples. With continued development, microarray technology may ultimately achieve its potential for comprehensive, high-throughput characterization of microbial populations in near real-time.

  2. Unattended reaction monitoring using an automated microfluidic sampler and on-line liquid chromatography.

    PubMed

    Patel, Darshan C; Lyu, Yaqi Fara; Gandarilla, Jorge; Doherty, Steve

    2018-04-03

    In-process sampling and analysis is an important aspect of monitoring kinetic profiles and impurity formation or rejection, both in development and during commercial manufacturing. In pharmaceutical process development, the technology of choice for a substantial portion of this analysis is high-performance liquid chromatography (HPLC). Traditionally, the sample extraction and preparation for reaction characterization have been performed manually. This can be time consuming, laborious, and impractical for long processes. Depending on the complexity of the sample preparation, there can be variability introduced by different analysts, and in some cases, the integrity of the sample can be compromised during handling. While there are commercial instruments available for on-line monitoring with HPLC, they lack capabilities in many key areas. Some do not provide integration of the sampling and analysis, while others afford limited flexibility in sample preparation. The current offerings provide a limited number of unit operations available for sample processing and no option for workflow customizability. This work describes development of a microfluidic automated program (MAP) which fully automates the sample extraction, manipulation, and on-line LC analysis. The flexible system is controlled using an intuitive Microsoft Excel based user interface. The autonomous system is capable of unattended reaction monitoring that allows flexible unit operations and workflow customization to enable complex operations and on-line sample preparation. The automated system is shown to offer advantages over manual approaches in key areas while providing consistent and reproducible in-process data. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces a new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failure of parallel subsystems are correlated.
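
    The point about correlated failures can be illustrated with a small Monte Carlo calculation; the Gaussian-copula coupling and the failure probabilities below are made-up examples, not SMART's propositional-logic engine:

      # Hedged sketch: value of a redundant subsystem when failures are correlated.
      import numpy as np
      from scipy.stats import norm

      def p_success_one_of_two(p_fail=0.1, rho=0.0, n=200_000, seed=0):
          """P(at least one of two redundant units works), with failure events
          coupled through a Gaussian copula with correlation rho."""
          rng = np.random.default_rng(seed)
          cov = [[1.0, rho], [rho, 1.0]]
          z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
          fails = z < norm.ppf(p_fail)          # marginal failure probability p_fail each
          return 1.0 - np.mean(fails.all(axis=1))

      for rho in (0.0, 0.5, 0.9):
          print(f"rho={rho:.1f}  P(success)={p_success_one_of_two(rho=rho):.4f}")
      # Independent failures give ~0.99 (1 - 0.1**2); strong correlation erodes much of
      # the benefit of redundancy, which is why such trade tools must model it.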

  4. Optical Properties of Natural Minerals in the Far-Infrared

    NASA Astrophysics Data System (ADS)

    Long, Larry Lavern

    The reflectivity of natural mineral powders was measured in the far infrared. The complex indices of refraction were then determined by Kramers-Kronig analysis or dispersive analysis. The samples were prepared by pressing the powdered sample into a 13 mm diameter pellet. A few of the samples measured were kaolin, illite, and montmorillonite, clay samples that could not be obtained as large single crystals. For the calcite and gypsum crystals, a comparison between single-crystal measurements and powder measurements was made to determine the effect of sample preparation on the measured spectra.
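
    For orientation, the standard Kramers-Kronig route from a measured normal-incidence reflectance spectrum R(omega) to the complex refractive index n + ik can be written as follows (sign conventions vary between texts):

      \[
        \theta(\omega) = -\frac{\omega}{\pi}\,\mathcal{P}\!\int_{0}^{\infty}
                         \frac{\ln R(\omega')}{\omega'^{2}-\omega^{2}}\,d\omega',
        \qquad
        r(\omega) = \sqrt{R(\omega)}\,e^{i\theta(\omega)},
      \]
      \[
        n = \frac{1-R}{1+R-2\sqrt{R}\cos\theta},
        \qquad
        k = \frac{2\sqrt{R}\sin\theta}{1+R-2\sqrt{R}\cos\theta}.
      \]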

  5. GeLC-MRM quantitation of mutant KRAS oncoprotein in complex biological samples.

    PubMed

    Halvey, Patrick J; Ferrone, Cristina R; Liebler, Daniel C

    2012-07-06

    Tumor-derived mutant KRAS (v-Ki-ras-2 Kirsten rat sarcoma viral oncogene) oncoprotein is a critical driver of cancer phenotypes and a potential biomarker for many epithelial cancers. Targeted mass spectrometry analysis by multiple reaction monitoring (MRM) enables selective detection and quantitation of wild-type and mutant KRAS proteins in complex biological samples. A recently described immunoprecipitation approach (Proc. Natl. Acad. Sci. 2011, 108, 2444-2449) can be used to enrich KRAS for MRM analysis, but requires large protein inputs (2-4 mg). Here, we describe sodium dodecyl sulfate-polyacrylamide gel electrophoresis-based enrichment of KRAS in a low molecular weight (20-25 kDa) protein fraction prior to MRM analysis (GeLC-MRM). This approach reduces background proteome complexity, thus allowing mutant KRAS to be reliably quantified in low protein inputs (5-50 μg). GeLC-MRM detected KRAS mutant variants (G12D, G13D, G12V, G12S) in a panel of cancer cell lines. GeLC-MRM analysis of wild-type and mutant KRAS was linear with respect to protein input and showed low variability across process replicates (CV = 14%). Concomitant analysis of a peptide from the highly similar HRAS and NRAS proteins enabled correction of KRAS-targeted measurements for contributions from these other proteins. KRAS peptides were also quantified in fluid from benign pancreatic cysts and pancreatic cancers at concentrations from 0.08 to 1.1 fmol/μg protein. GeLC-MRM provides a robust, sensitive approach to quantitation of mutant proteins in complex biological samples.

  6. Quantification of acidic compounds in complex biomass-derived streams

    DOE PAGES

    Karp, Eric M.; Nimlos, Claire T.; Deutch, Steve; ...

    2016-05-10

    Biomass-derived streams that contain acidic compounds from the degradation of lignin and polysaccharides (e.g. black liquor, pyrolysis oil, pyrolytic lignin, etc.) are chemically complex solutions prone to instability and degradation during analysis, making quantification of compounds within them challenging. Here we present a robust analytical method to quantify acidic compounds in complex biomass-derived mixtures using ion exchange, sample reconstitution in pyridine and derivatization with BSTFA. The procedure is based on an earlier method originally reported for kraft black liquors and, in this work, is applied to identify and quantify a large slate of acidic compounds in corn stover derived alkaline pretreatment liquor (APL) as a function of pretreatment severity. Analysis of the samples is conducted with GCxGC-TOFMS to achieve good resolution of the components within the complex mixture. The results reveal the dominant low molecular weight components and their concentrations as a function of pretreatment severity. Application of this method is also demonstrated in the context of lignin conversion technologies by applying it to track the microbial conversion of an APL substrate. Here as well excellent results are achieved, and the appearance and disappearance of compounds is observed in agreement with the known metabolic pathways of two bacteria, indicating the sample integrity was maintained throughout analysis. Finally, it is shown that this method applies more generally to lignin-rich materials by demonstrating its usefulness in analysis of pyrolysis oil and pyrolytic lignin.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1990. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic analyses of solid, liquid, and gaseous samples and provides specialized analytical services. The Instrumental Analysis Group uses nuclear counting techniques in radiochemical analyses over a wide range of sample types from low-level environmental samples to samples of high radioactivity. The Organic Analysis Group uses a number of complementary techniques to separate and to quantitatively and qualitatively analyze complex organic mixtures and compounds at the trace level, including synthetic fuels, toxic substances, fossil-fuel residues and emissions, pollutants, biologically active compounds, pesticides, and drugs. The Environmental Analysis Group performs analyses of inorganic environmental and hazardous waste and coal samples.

  8. Improvement of DGGE analysis by modifications of PCR protocols for analysis of microbial community members with low abundance.

    PubMed

    Wang, Yong-Feng; Zhang, Fang-Qiu; Gu, Ji-Dong

    2014-06-01

    Denaturing gradient gel electrophoresis (DGGE) is a powerful technique to reveal the community structures and composition of microorganisms in complex natural environments and samples. However, positive and reproducible polymerase chain reaction (PCR) products are difficult to acquire for some samples due to the low abundance of the target microorganisms, which significantly impairs the effective application of DGGE. Thus, nested PCR is often introduced to generate positive PCR products from such complex samples, but another problem is also introduced: the total number of thermocycles in nested PCR is usually unacceptably high, which results in skewed community structures through the generation of random or mismatched PCR products on the DGGE gel, as demonstrated in this study. Furthermore, nested PCR cannot resolve the problem of uneven representation in PCR products from complex samples with unequal richness of microbial populations. In order to solve these two problems of nested PCR, the general protocol was modified and improved in this study. First, a general PCR procedure was used to amplify the target genes with the PCR primers without any guanine cytosine (GC) clamp, and the resultant PCR products were purified and diluted to 0.01 μg/ml. Subsequently, the diluted PCR products were used as templates for re-amplification with the same PCR primers carrying the GC clamp for 17 cycles, and the products were finally subjected to DGGE analysis. We demonstrated that this is a much more reliable approach to obtain a high quality DGGE profile with high reproducibility. Thus, we recommend the adoption of this improved protocol for analyzing microorganisms of low abundance in complex samples when applying the DGGE fingerprinting technique, to avoid biased results.

  9. Mood states modulate complexity in heartbeat dynamics: A multiscale entropy analysis

    NASA Astrophysics Data System (ADS)

    Valenza, G.; Nardelli, M.; Bertschy, G.; Lanata, A.; Scilingo, E. P.

    2014-07-01

    This paper demonstrates that the complex dynamics of the heartbeat are modulated by different pathological mental states. Multiscale entropy analysis was performed on R-R interval series gathered from the electrocardiograms of eight bipolar patients who exhibited mood states among depression, hypomania, and euthymia, i.e., good affective balance. Three different methodologies for the choice of the sample entropy radius value were also compared. We show that the complexity level can be used as a marker of mental state, able to discriminate among the three pathological mood states, suggesting the use of heartbeat complexity as a more objective clinical biomarker for mental disorders.
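
    Multiscale entropy amounts to coarse-graining the R-R series at successive scales and recomputing sample entropy at each scale; a compact generic sketch follows, with m = 2 and r = 0.15 x SD as typical literature defaults rather than values from this record:

      # Hedged sketch: multiscale entropy = coarse-graining + sample entropy per scale.
      import numpy as np

      def coarse_grain(x, scale):
          """Average non-overlapping windows of length `scale`."""
          n = (len(x) // scale) * scale
          return np.asarray(x[:n], dtype=float).reshape(-1, scale).mean(axis=1)

      def sample_entropy(x, m=2, r=0.1):
          """Simple O(N^2) SampEn: -ln(A/B) with Chebyshev tolerance r."""
          x = np.asarray(x, dtype=float)
          def matches(dim):
              t = np.array([x[i:i + dim] for i in range(len(x) - dim)])
              d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
              return np.sum(d <= r) - len(t)        # exclude self-matches
          B, A = matches(m), matches(m + 1)
          return np.inf if A == 0 or B == 0 else -np.log(A / B)

      rr = np.random.default_rng(2).normal(0.85, 0.06, size=1000)   # synthetic R-R series
      r = 0.15 * rr.std()                  # tolerance fixed from the original series
      mse_curve = [sample_entropy(coarse_grain(rr, s), m=2, r=r) for s in range(1, 6)]
      print([round(v, 3) for v in mse_curve])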

  10. Weighted fractional permutation entropy and fractional sample entropy for nonlinear Potts financial dynamics

    NASA Astrophysics Data System (ADS)

    Xu, Kaixuan; Wang, Jun

    2017-02-01

    In this paper, the recently introduced permutation entropy and sample entropy are further developed to their fractional cases, weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional-order generalization of information entropy is utilized in the above two complexity approaches to detect the statistical characteristics of fractional-order information in complex systems. The effectiveness analysis of the proposed methods on synthetic data and real-world data reveals that tuning the fractional order allows higher sensitivity and more accurate characterization of the signal evolution, which is useful in describing the dynamics of complex systems. Moreover, the numerical research on nonlinear complexity behaviors compares the return series of the Potts financial model with those of actual stock markets, and the empirical results confirm the feasibility of the proposed model.
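
    As background, the non-fractional base measure, weighted permutation entropy, can be sketched as below; the fractional-order generalization introduced in the paper replaces the Shannon functional and is not reproduced here. Embedding dimension and delay are arbitrary illustrative choices.

      # Hedged sketch: weighted permutation entropy (the non-fractional base measure).
      import math
      from itertools import permutations
      import numpy as np

      def weighted_permutation_entropy(x, m=3, tau=1):
          """Shannon entropy of ordinal patterns, each occurrence weighted by the
          variance of its window; normalized by log(m!) to lie in [0, 1]."""
          x = np.asarray(x, dtype=float)
          weights = {p: 0.0 for p in permutations(range(m))}
          for i in range(len(x) - (m - 1) * tau):
              window = x[i:i + (m - 1) * tau + 1:tau]
              weights[tuple(np.argsort(window))] += window.var()   # variance weighting
          w = np.array(list(weights.values()))
          p = w[w > 0] / w.sum()
          return float(-np.sum(p * np.log(p)) / math.log(math.factorial(m)))

      returns = np.random.default_rng(3).normal(size=2000)   # stand-in return series
      print("WPE:", round(weighted_permutation_entropy(returns), 3))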

  11. Analysis of solvent dyes in refined petroleum products by electrospray ionization mass spectrometry

    USGS Publications Warehouse

    Rostad, C.E.

    2010-01-01

    Solvent dyes are used to color refined petroleum products to enable differentiation between gasoline, diesel, and jet fuels. Analysis for these dyes in the hydrocarbon product is difficult due to their very low concentrations in such a complex matrix. Flow injection analysis/electrospray ionization/mass spectrometry in both negative and positive mode was used to optimize ionization of ten typical solvent dyes. Samples of hydrocarbon product were analyzed under similar conditions. Positive electrospray ionization produced very complex spectra, which were not suitably specific for targeting only the dyes. Negative electrospray ionization produced simple spectra because aliphatic and aromatic moieties were not ionized. This enabled screening for a target dye in samples of hydrocarbon product from a spill.

  12. Determination of the total concentration and speciation of metal ions in river, estuarine and seawater samples.

    PubMed

    Alberti, Giancarla; Biesuz, Raffaela; Pesavento, Maria

    2008-12-01

    Different natural water samples were investigated to determine the total concentration and the distribution of species for Cu(II), Pb(II), Al(III) and U(VI). The proposed method, named resin titration (RT), was developed in our laboratory to investigate the distribution of species for metal ions in complex matrices. It is a competition method, in which a complexing resin competes with natural ligands present in the sample to combine with the metal ions. In the present paper, river, estuarine and seawater samples, collected during a cruise in the Adriatic Sea, were investigated. For each sample, two RTs were performed, using different complexing resins: the iminodiacetic Chelex 100 and the carboxylic Amberlite CG50. In this way, it was possible to detect different classes of ligands. Satisfactory results have been obtained and are commented on critically. They were summarized by principal component analysis (PCA), and the correlations with physicochemical parameters allowed one to follow the evolution of the metals along the considered transect. It should be pointed out that, according to our findings, the ligands responsible for metal ion complexation are not the major components of the water system, since they form considerably weaker complexes.

  13. An overview of the genetic dissection of complex traits.

    PubMed

    Rao, D C

    2008-01-01

    Thanks to the recent revolutionary genomic advances such as the International HapMap consortium, resolution of the genetic architecture of common complex traits is beginning to look hopeful. While demonstrating the feasibility of genome-wide association (GWA) studies, the pathbreaking Wellcome Trust Case Control Consortium (WTCCC) study also serves to underscore the critical importance of very large sample sizes and draws attention to potential problems, which need to be addressed as part of the study design. Even the large WTCCC study had vastly inadequate power for several of the associations reported (and confirmed) and, therefore, most of the regions harboring relevant associations may not be identified anytime soon. This chapter provides an overview of some of the key developments in the methodological approaches to genetic dissection of common complex traits. Constrained Bayesian networks are suggested as especially useful for analysis of pathway-based SNPs. Likewise, composite likelihood is suggested as a promising method for modeling complex systems. It discusses the key steps in a study design, with an emphasis on GWA studies. Potential limitations highlighted by the WTCCC GWA study are discussed, including problems associated with massive genotype imputation, analysis of pooled national samples, shared controls, and the critical role of interactions. GWA studies clearly need massive sample sizes that are only possible through genuine collaborations. After all, for common complex traits, the question is not whether we can find some pieces of the puzzle, but how large and what kind of a sample we need to (nearly) solve the genetic puzzle.

  14. High and low frequency unfolded partial least squares regression based on empirical mode decomposition for quantitative analysis of fuel oil samples.

    PubMed

    Bian, Xihui; Li, Shujuan; Lin, Ligang; Tan, Xiaoyao; Fan, Qingjie; Li, Ming

    2016-06-21

    Accurate model prediction is fundamental to the successful analysis of complex samples. To utilize the abundant information embedded in the frequency and time domains, a novel regression model is presented for quantitative analysis of hydrocarbon contents in fuel oil samples. The proposed method, named high and low frequency unfolded PLSR (HLUPLSR), integrates empirical mode decomposition (EMD) and an unfolding strategy with partial least squares regression (PLSR). In the proposed method, the original signals are first decomposed into a finite number of intrinsic mode functions (IMFs) and a residue by EMD. Second, the earlier, high-frequency IMFs are summed into a high-frequency matrix, and the later IMFs and the residue are summed into a low-frequency matrix. Finally, the two matrices are unfolded into an extended matrix along the variable dimension, and the PLSR model is built between the extended matrix and the target values. Coupled with ultraviolet (UV) spectroscopy, HLUPLSR has been applied to determine the hydrocarbon contents of light gas oil and diesel fuel samples. Compared with single PLSR and other signal processing techniques, the proposed method shows superior prediction ability and better model interpretation. Therefore, the HLUPLSR method provides a promising tool for quantitative analysis of complex samples. Copyright © 2016 Elsevier B.V. All rights reserved.
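
    A rough sketch of the unfold-then-PLSR structure is given below; to keep it dependency-free, a moving-average split stands in for the EMD step (the paper sums early IMFs into the high-frequency matrix and the remaining IMFs plus residue into the low-frequency matrix), and the spectra are synthetic placeholders:

      # Hedged sketch of the high/low-frequency unfold-then-PLSR structure.
      # A moving-average split stands in for EMD; data are synthetic placeholders.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(4)
      X = rng.normal(size=(60, 400))                       # stand-in UV spectra
      y = X[:, 100] + 0.3 * X[:, 250] + 0.05 * rng.normal(size=60)

      def high_low_split(spectrum, window=25):
          """Low-frequency part = moving average; high-frequency part = remainder.
          (The paper instead uses EMD: early IMFs = high, later IMFs + residue = low.)"""
          kernel = np.ones(window) / window
          low = np.convolve(spectrum, kernel, mode="same")
          return spectrum - low, low

      highs, lows = zip(*(high_low_split(row) for row in X))
      X_unfolded = np.hstack([np.vstack(highs), np.vstack(lows)])  # unfold along variables

      pls = PLSRegression(n_components=6).fit(X_unfolded, y)
      print("calibration R^2:", round(pls.score(X_unfolded, y), 3))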

  15. Preliminary construction of integral analysis for characteristic components in complex matrices by in-house fabricated solid-phase microextraction fibers combined with gas chromatography-mass spectrometry.

    PubMed

    Tang, Zhentao; Hou, Wenqian; Liu, Xiuming; Wang, Mingfeng; Duan, Yixiang

    2016-08-26

    Integral analysis plays an important role in the study and quality control of substances with complex matrices in our daily life. As a preliminary step toward such integral analysis, developing a relatively comprehensive and sensitive methodology can offer more informative and reliable characteristic components. Flavoring mixtures, representative substances with complex matrices, are now widely used in various fields. To better study and control the quality of flavoring mixtures as additives in the food industry, an in-house fabricated solid-phase microextraction (SPME) fiber was prepared based on sol-gel technology in this work. The active organic component of the fiber coating was multi-walled carbon nanotubes (MWCNTs) functionalized with hydroxyl-terminated polydimethyldiphenylsiloxane, which integrates the non-polar and polar chains of both materials. In this way, more sensitive extraction of a wider range of compounds can be obtained in comparison with commercial SPME fibers. Preliminary integral analysis of three similar types of samples was realized by the optimized SPME-GC-MS method. With the obtained GC-MS data, a valid and well-fitting model was established by partial least squares discriminant analysis (PLS-DA) for classification of these samples (R2X=0.661, R2Y=0.996, Q2=0.986). The validation of the model (R2=0.266, Q2=-0.465) also supported its potential to predict the "belongingness" of new samples. With the PLS-DA and SPSS methods, further screening of marker compounds among the three similar batches of samples may be helpful for monitoring and controlling the quality of flavoring mixtures as additives in the food industry. In turn, the reliability and effectiveness of the GC-MS data verified the comprehensive and efficient extraction performance of the in-house fabricated fiber. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. DNA binding and biological studies of some novel water-soluble polymer-copper(II)-phenanthroline complexes.

    PubMed

    Kumar, Rajendran Senthil; Arunachalam, Sankaralingam; Periasamy, Vaiyapuri Subbarayan; Preethy, Christo Paul; Riyasdeen, Anvarbatcha; Akbarsha, Mohammad Abdulkader

    2008-10-01

    Some novel water-soluble polymer-copper(II)-phenanthroline complex samples, [Cu(phen)2(BPEI)]Cl2·4H2O (phen = 1,10-phenanthroline, BPEI = branched polyethyleneimine), with different degrees of copper complex content in the polymer chain, have been prepared by a ligand substitution method in water-ethanol medium and characterized by infrared, UV-visible and EPR spectral methods and elemental analysis. The binding of these complex samples with DNA has been investigated by electronic absorption spectroscopy, emission spectroscopy and gel retardation assay. Electrostatic interactions between the DNA molecule and the polymer-copper(II) complex, which carries many positive charges, have been observed. Besides these ionic interactions, van der Waals interactions, hydrogen bonding and other partial intercalation binding modes may also exist in this system. The polymer-copper(II) complex with the higher degree of copper complex content was screened for its antimicrobial activity and antitumor activity.

  17. Flow injection gas chromatography with sulfur chemiluminescence detection for the analysis of total sulfur in complex hydrocarbon matrixes.

    PubMed

    Hua, Yujuan; Hawryluk, Myron; Gras, Ronda; Shearer, Randall; Luong, Jim

    2018-01-01

    A fast and reliable analytical technique for the determination of total sulfur levels in complex hydrocarbon matrices is introduced. The method employed flow injection technique using a gas chromatograph as a sample introduction device and a gas phase dual-plasma sulfur chemiluminescence detector for sulfur quantification. Using the technique described, total sulfur measurement in challenging hydrocarbon matrices can be achieved in less than 10 s with sample-to-sample time <2 min. The high degree of selectivity and sensitivity toward sulfur compounds of the detector offers the ability to measure low sulfur levels with a detection limit in the range of 20 ppb w/w S. The equimolar response characteristic of the detector allows the quantitation of unknown sulfur compounds and simplifies the calibration process. Response is linear over a concentration range of five orders of magnitude, with a high degree of repeatability. The detector's lack of response to hydrocarbons enables direct analysis without the need for time-consuming sample preparation and chromatographic separation processes. This flow injection-based sulfur chemiluminescence detection technique is ideal for fast analysis or trace sulfur analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. a Chiral Tag Study of the Absolute Configuration of Camphor

    NASA Astrophysics Data System (ADS)

    Pratt, David; Evangelisti, Luca; Smart, Taylor; Holdren, Martin S.; Mayer, Kevin J.; West, Channing; Pate, Brooks

    2017-06-01

    The chiral tagging method for rotational spectroscopy uses an established approach in chiral analysis of creating a complex with an enantiopure tag so that enantiomers of the molecule of interest are converted to diastereomer complexes. Since the diastereomers have distinct structures, they give distinguishable rotational spectra. Camphor was chosen as an example for the chiral tag method because it has spectral properties that could pose challenges to the use of three wave mixing rotational spectroscopy to establish absolute configuration. Specifically, one of the dipole moment components of camphor is small, making three wave mixing measurements challenging and placing high accuracy requirements on computational chemistry for calculating the dipole moment direction in the principal axis system. The chiral tag measurements of camphor used the hydrogen bond donor 3-butyn-2-ol. Quantum chemistry calculations using the B3LYP-D3BJ method and the def2TZVP basis set identified 7 low energy isomers of the chiral complex. The two lowest energy complexes of the homochiral and heterochiral complexes are observed in a measurement using a racemic tag. Absolute configuration is confirmed by the use of an enantiopure tag sample. Spectra with 13C-sensitivity were acquired so that the carbon substitution structure of the complex could be obtained to provide a structure of camphor with correct stereochemistry. The chiral tag complex spectra can also be used to estimate the enantiomeric excess of the sample, and analysis of the broadband spectrum indicates that the sample enantiopurity is higher than 99.5%. The structure of the complex is analyzed to determine the extent of geometry modification that occurs upon formation of the complex. These results show that initial isomer searches with fixed geometries will be accurate. The reduction in computation time from fixed geometry assumptions will be discussed.

  19. SPICE: exploration and analysis of post-cytometric complex multivariate datasets.

    PubMed

    Roederer, Mario; Nozzi, Joshua L; Nason, Martha C

    2011-02-01

    Polychromatic flow cytometry results in complex, multivariate datasets. To date, tools for the aggregate analysis of these datasets across multiple specimens grouped by different categorical variables, such as demographic information, have not been optimized. Often, the exploration of such datasets is accomplished by visualization of patterns with pie charts or bar charts, without easy access to statistical comparisons of measurements that comprise multiple components. Here we report on algorithms and a graphical interface we developed for these purposes. In particular, we discuss thresholding necessary for accurate representation of data in pie charts, the implications for display and comparison of normalized versus unnormalized data, and the effects of averaging when samples with significant background noise are present. Finally, we define a statistic for the nonparametric comparison of complex distributions to test for difference between groups of samples based on multi-component measurements. While originally developed to support the analysis of T cell functional profiles, these techniques are amenable to a broad range of datatypes. Published 2011 Wiley-Liss, Inc.

  20. An enhanced cluster analysis program with bootstrap significance testing for ecological community analysis

    USGS Publications Warehouse

    McKenna, J.E.

    2003-01-01

    The biosphere is filled with complex living patterns and important questions about biodiversity and community and ecosystem ecology are concerned with structure and function of multispecies systems that are responsible for those patterns. Cluster analysis identifies discrete groups within multivariate data and is an effective method of coping with these complexities, but often suffers from subjective identification of groups. The bootstrap testing method greatly improves objective significance determination for cluster analysis. The BOOTCLUS program makes cluster analysis that reliably identifies real patterns within a data set more accessible and easier to use than previously available programs. A variety of analysis options and rapid re-analysis provide a means to quickly evaluate several aspects of a data set. Interpretation is influenced by sampling design and a priori designation of samples into replicate groups, and ultimately relies on the researcher's knowledge of the organisms and their environment. However, the BOOTCLUS program provides reliable, objectively determined groupings of multivariate data.
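
    A minimal illustration of the bootstrap idea (not the BOOTCLUS algorithm itself) is to compare an observed cluster-separation statistic against a null distribution obtained by resampling that destroys among-sample structure; the statistic, the community matrix, and the resampling scheme below are illustrative choices:

      # Hedged sketch: bootstrap significance test for a two-group cluster split.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist, squareform

      def separation(X):
          """Between-cluster minus within-cluster mean distance for a 2-cluster cut."""
          d = pdist(X)
          groups = fcluster(linkage(d, method="average"), t=2, criterion="maxclust")
          dm = squareform(d)
          same = groups[:, None] == groups[None, :]
          off_diag = ~np.eye(len(X), dtype=bool)
          within, between = dm[same & off_diag], dm[~same]
          if within.size == 0 or between.size == 0:
              return 0.0
          return between.mean() - within.mean()

      rng = np.random.default_rng(5)
      # Site-by-species abundance matrix with two genuine community types.
      sites = np.vstack([rng.poisson(lam=[10, 1, 6, 0.5, 3], size=(12, 5)),
                         rng.poisson(lam=[1, 9, 0.5, 7, 3], size=(12, 5))]).astype(float)

      observed = separation(sites)
      # Null model: bootstrap each species column independently, breaking site structure.
      null = []
      for _ in range(999):
          shuffled = np.column_stack([rng.choice(col, size=len(col), replace=True)
                                      for col in sites.T])
          null.append(separation(shuffled))
      p_value = (1 + np.sum(np.array(null) >= observed)) / (1 + len(null))
      print(f"observed separation={observed:.2f}, bootstrap p={p_value:.3f}")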

  1. Droplet-Based Segregation and Extraction of Concentrated Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buie, C R; Buckley, P; Hamilton, J

    2007-02-23

    Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.

  2. Speciation analysis of aluminium in plant parts of Betula pendula and in soil.

    PubMed

    Zioła-Frankowska, Anetta; Frankowski, Marcin

    2018-03-01

    The research presents the first results of aluminium speciation analysis in aqueous extracts of individual plant parts of Betula pendula and soil samples, using High Performance Ion Chromatography with Diode Array Detection (HPIC-DAD). The applied method allowed us to carry out a full speciation analysis of aluminium in the form of the predominant aluminium-fluoride complexes: AlFx(3-x) with x = 2, 3, 4 (first analytical signal), AlF2+ (second analytical signal) and Al3+ (third analytical signal) in samples of lateral roots, tap roots, twigs, stem, leaf and soil collected under the roots of B. pendula. Concentrations of aluminium and its complexes were determined for two types of environment characterised by different degrees of human impact: the contaminated site of the Chemical Plant in Luboń and the protected area of the Wielkopolski National Park. For all the analysed samples of B. pendula and soil, AlFx(3-x) had the largest contribution, followed by Al3+ and AlF2+. Significant differences in the concentration and contribution of the Al-F complexes and the Al3+ form, depending on the place of sampling (different anthropogenic pressure) and the plant part of B. pendula, were observed. Based on the obtained results, it was found that the transport of aluminium is "blocked" by lateral roots and is closely related to the Al content of the soil. Copyright © 2017. Published by Elsevier B.V.

  3. The potential of materials analysis by electron rutherford backscattering as illustrated by a case study of mouse bones and related compounds.

    PubMed

    Vos, Maarten; Tökési, Károly; Benkö, Ilona

    2013-06-01

    Electron Rutherford backscattering (ERBS) is a new technique that could be developed into a tool for materials analysis. Here we try to establish a methodology for the use of ERBS for materials analysis of more complex samples, using bone minerals as a test case. For this purpose, we studied several Ca-containing reference samples, calcium carbonate (CaCO₃) and hydroxyapatite, in addition to mouse bone powder. A very good understanding of the spectra of CaCO₃ and hydroxyapatite was obtained. Quantitative interpretation of the bone spectrum is more challenging. A good fit of these spectra is obtained with the same peak widths as used for the hydroxyapatite sample only if one allows for the presence of impurity atoms with a mass close to that of Na and Mg. Our conclusion is that a meaningful interpretation of spectra of more complex samples in terms of composition is indeed possible, but only if the widths of the peaks contributing to the spectra are known. Knowledge of the peak widths can either be developed by the study of reference samples (as was done here) or potentially be derived from theory.

  4. A practical approach to language complexity: a Wikipedia case study.

    PubMed

    Yasseri, Taha; Kornai, András; Kertész, János

    2012-01-01

    In this paper we present a statistical analysis of English texts from Wikipedia. We try to address the issue of language complexity empirically by comparing the simple English Wikipedia (Simple) to comparable samples of the main English Wikipedia (Main). Simple is supposed to use a more simplified language with a limited vocabulary, and editors are explicitly requested to follow this guideline, yet in practice the vocabulary richness of both samples is at the same level. Detailed analysis of longer units (n-grams of words and part of speech tags) shows that the language of Simple is less complex than that of Main primarily due to the use of shorter sentences, as opposed to drastically simplified syntax or vocabulary. Comparing the two language varieties by the Gunning readability index supports this conclusion. We also report on the topical dependence of language complexity, that is, that the language is more advanced in conceptual articles compared to person-based (biographical) and object-based articles. Finally, we investigate the relation between conflict and language complexity by analyzing the content of the talk pages associated with controversial and peacefully developing articles, concluding that controversy has the effect of reducing language complexity.
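    The readability comparison rests on a simple formula. A minimal sketch of the Gunning fog index is shown below; the naive vowel-run syllable counter is an approximation introduced for illustration, not the tool used by the authors.

```python
import re

def count_syllables(word):
    """Crude syllable estimate: runs of vowels (adequate for a rough index)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def gunning_fog(text):
    """Gunning fog index: 0.4 * (average words per sentence + 100 * fraction of
    'complex' words, i.e. words with three or more syllables)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    complex_words = [w for w in words if count_syllables(w) >= 3]
    return 0.4 * (len(words) / max(1, len(sentences))
                  + 100 * len(complex_words) / max(1, len(words)))

print(gunning_fog("The cat sat. It was content to remain undisturbed indefinitely."))
```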

  5. Fractal analysis of behaviour in a wild primate: behavioural complexity in health and disease

    PubMed Central

    MacIntosh, Andrew J. J.; Alados, Concepción L.; Huffman, Michael A.

    2011-01-01

    Parasitism and other stressors are ubiquitous in nature but their effects on animal behaviour can be difficult to identify. We investigated the effects of nematode parasitism and other indicators of physiological impairment on the sequential complexity of foraging and locomotion behaviour among wild Japanese macaques (Macaca fuscata yakui). We observed all sexually mature individuals (n = 28) in one macaque study group between October 2007 and August 2008, and collected two faecal samples/month/individual (n = 362) for parasitological examination. We used detrended fluctuation analysis (DFA) to investigate long-range autocorrelation in separate, binary sequences of foraging (n = 459) and locomotion (n = 446) behaviour collected via focal sampling. All behavioural sequences exhibited long-range autocorrelation, and linear mixed-effects models suggest that increasing infection with the nodular worm Oesophagostomum aculeatum, clinically impaired health, reproductive activity, ageing and low dominance status were associated with reductions in the complexity of locomotion, and to a lesser extent foraging, behaviour. Furthermore, the sequential complexity of behaviour increased with environmental complexity. We argue that a reduction in complexity in animal behaviour characterizes individuals in impaired or ‘stressed’ states, and may have consequences if animals cannot cope with heterogeneity in their natural habitats. PMID:21429908
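    Detrended fluctuation analysis, the method used here to quantify long-range autocorrelation, can be sketched compactly: integrate the mean-subtracted sequence, detrend it in windows of increasing size, and read the scaling exponent from the slope of the log-log fluctuation curve. The window sizes, linear detrending and toy binary sequence below are assumptions, not the authors' settings.

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: returns the scaling exponent alpha
    estimated from the slope of log F(n) versus log n."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                        # integrated profile
    fluct = []
    for n in scales:
        n_windows = len(y) // n
        f2 = []
        for i in range(n_windows):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrending
            f2.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

# Binary behavioural sequence (1 = foraging, 0 = other), illustrative only
rng = np.random.default_rng(0)
sequence = (rng.random(2000) < 0.5).astype(int)
print(dfa_alpha(sequence))   # ~0.5 for an uncorrelated sequence
```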

  6. Analysis of regional brain mitochondrial bioenergetics and susceptibility to mitochondrial inhibition utilizing a microplate based system

    PubMed Central

    Sauerbeck, Andrew; Pandya, Jignesh; Singh, Indrapal; Bittman, Kevin; Readnower, Ryan; Bing, Guoying; Sullivan, Patrick

    2012-01-01

    The analysis of mitochondrial bioenergetic function typically has required 50–100 μg of protein per sample and at least 15 min per run when utilizing a Clark-type oxygen electrode. In the present work we describe a method utilizing the Seahorse Biosciences XF24 Flux Analyzer for measuring mitochondrial oxygen consumption simultaneously from multiple samples and utilizing only 5 μg of protein per sample. Utilizing this method we have investigated whether regionally based differences exist in mitochondria isolated from the cortex, striatum, hippocampus, and cerebellum. Analysis of basal mitochondrial bioenergetics revealed that minimal differences exist between the cortex, striatum, and hippocampus. However, the cerebellum exhibited significantly slower basal rates of Complex I and Complex II dependent oxygen consumption (p < 0.05). Mitochondrial inhibitors affected enzyme activity proportionally across all samples tested and only small differences existed in the effect of inhibitors on oxygen consumption. Investigation of the effect of rotenone administration on Complex I dependent oxygen consumption revealed that exposure to 10 pM rotenone led to a clear time dependent decrease in oxygen consumption beginning 12 min after administration (p < 0.05). These studies show that the utilization of this microplate based method for analysis of mitochondrial bioenergetics is effective at quantifying oxygen consumption simultaneously from multiple samples. Additionally, these studies indicate that minimal regional differences exist in mitochondria isolated from the cortex, striatum, or hippocampus. Furthermore, utilization of the mitochondrial inhibitors suggests that previous work indicating regionally specific deficits following systemic mitochondrial toxin exposure may not be the result of differences in the individual mitochondria from the affected regions. PMID:21402103

  7. Evaluation of physiologic complexity in time series using generalized sample entropy and surrogate data analysis

    NASA Astrophysics Data System (ADS)

    Eduardo Virgilio Silva, Luiz; Otavio Murta, Luiz

    2012-12-01

    Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on a generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by the Tsallis generalized entropy, as a function of the parameter q (qSampEn). qSDiff curves were calculated, which consist of the differences between the qSampEn of the original and surrogate series. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series of stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiffmax) at q ≠ 1. Values of q where the maximum point occurs and where qSDiff is zero were also evaluated. Only the qSDiffmax values were capable of distinguishing the HRV groups (p-values 5.10×10⁻³, 1.11×10⁻⁷, and 5.50×10⁻⁷ for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity, and the method suggests a potential use for chaotic system analysis.
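    The quantity being generalized here is sample entropy. A minimal sketch of the standard SampEn(m, r) is given below; the q-generalization described in the paper replaces the natural logarithm with the Tsallis q-logarithm and is not reproduced, and the parameter choices are assumptions.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Standard SampEn(m, r) = -ln(A/B): B counts pairs of length-m templates that
    match within tolerance r (Chebyshev distance), A counts matches of length m+1."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n_templates = len(x) - m                     # same template count for m and m+1

    def match_count(length):
        templates = np.array([x[i:i + length] for i in range(n_templates)])
        count = 0
        for i in range(n_templates - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    B = match_count(m)
    A = match_count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
print(sample_entropy(rng.normal(size=1000)))            # higher for uncorrelated noise
print(sample_entropy(np.sin(np.arange(1000) / 10.0)))   # lower for a regular signal
```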

  8. L-edge sum rule analysis on 3d transition metal sites: from d10 to d0 and towards application to extremely dilute metallo-enzymes.

    PubMed

    Wang, Hongxin; Friedrich, Stephan; Li, Lei; Mao, Ziliang; Ge, Pinghua; Balasubramanian, Mahalingam; Patil, Daulat S

    2018-03-28

    According to L-edge sum rules, the number of 3d vacancies at a transition metal site is directly proportional to the integrated intensity of the L-edge X-ray absorption spectrum (XAS) for the corresponding metal complex. In this study, the numbers of 3d holes are characterized quantitatively or semi-quantitatively for a series of manganese (Mn) and nickel (Ni) complexes, including the electron configurations 3d¹⁰ → 3d⁰. In addition, extremely dilute (<0.1% wt/wt) Ni enzymes were examined by two different approaches: (1) by using a high resolution superconducting tunnel junction X-ray detector to obtain XAS spectra with a very high signal-to-noise ratio, especially in the non-variant edge jump region; and (2) by adding an inert tracer to the sample that provides a prominent spectral feature to replace the weak edge jump for intensity normalization. In this publication, we present for the first time: (1) L-edge sum rule analysis for a series of Mn and Ni complexes that include electron configurations from an open shell 3d⁰ to a closed shell 3d¹⁰; (2) a systematic analysis on the uncertainties, especially on that from the edge jump, which was missing in all previous reports; (3) a clearly-resolved edge jump between pre-L₃ and post-L₂ regions from an extremely dilute sample; (4) an evaluation of an alternative normalization standard for L-edge sum rule analysis. XAS from two copper (Cu) proteins measured using a conventional semiconductor X-ray detector are also repeated as bridges between Ni complexes and dilute Ni enzymes. The differences between measuring 1% Cu enzymes and measuring <0.1% Ni enzymes are compared and discussed. This study extends L-edge sum rule analysis to virtually any 3d metal complex and any dilute biological samples that contain 3d metals.
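    The sum rule itself reduces to a proportionality: the number of 3d holes scales with the integrated, edge-jump-normalized L-edge intensity, with the constant fixed by a reference compound of known hole count. The heavily simplified sketch below only illustrates that bookkeeping; the synthetic spectrum, edge-jump value and reference numbers are placeholders, not data from the study.

```python
import numpy as np

def integrated_l_edge_intensity(energy, absorption, edge_jump):
    """Integrate the edge-jump-normalized L-edge absorption over the measured range."""
    return np.trapz(np.asarray(absorption) / edge_jump, np.asarray(energy))

def holes_from_sum_rule(intensity, ref_intensity, ref_holes):
    """Sum rule: the 3d hole count scales linearly with the integrated L-edge
    intensity; the constant is fixed by a reference complex of known hole count."""
    return ref_holes * intensity / ref_intensity

# Illustrative numbers only (not from the study): two Gaussian white lines near the Ni L-edge
energy = np.linspace(845, 885, 401)                                  # eV
spectrum = np.exp(-((energy - 853) / 1.5) ** 2) + 0.5 * np.exp(-((energy - 870) / 1.5) ** 2)
i_sample = integrated_l_edge_intensity(energy, spectrum, edge_jump=0.05)
print(holes_from_sum_rule(i_sample, ref_intensity=i_sample * 0.8, ref_holes=2.0))
```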

  9. Comparison of deformation mechanics for two different carbonates: oolitic limestone and laminites

    NASA Astrophysics Data System (ADS)

    Zihms, Stephanie; Lewis, Helen; Couples, Gary; Hall, Stephen; Somerville, Jim

    2016-04-01

    Carbonate rocks form under a range of conditions, which leads to a diverse rock group. Even though carbonates are overall mineralogically simple, the solid-space distribution ranges from simple compositions such as oolitic limestones to highly complex networks of pores and solids as seen in coquinas. Their fundamental mechanical behaviour has been identified to be like that of clastic rocks (Vajdova 2004, Brantut, Heap et al. 2014). However, it is very likely that this observation is not true for more complex carbonates. Triaxial tests were performed on cylindrical samples of two different carbonates: a) oolitic limestone (Bicqueley quarry, France) and b) laminite (Ariripe basin, Brazil). The samples were deformed under confining pressures of 8, 12 and 20 MPa, and 20, 30 and 40 MPa, respectively. All tests were stopped as soon as peak load was observed to preserve as many deformation characteristics as possible. Photographs of the samples were taken before and after deformation to allow surface analysis of deformation features. Additionally, samples were analysed post-deformation with X-ray tomography (XRT) (using the Zeiss XRadia XRM 520 at the 4D Imaging Lab at Lund University). The 3D tomography images represent the post-deformation samples' density distribution, allowing detailed, non-destructive, 3D analysis of the deformation features that developed in the triaxial testing, including the complex geometries and interactions of fractures, deformation bands and sedimentary layering. They also provide an insight into the complexity of deformation features produced due to the carbonate response. Initial results show that the oolitic limestone forms single shear bands almost the length of the sample, exhibiting similar characteristics to sandstones deformed under similar conditions. These features are observed for all three applied loads. The laminite sample deformed at the lowest confining pressure exhibits compactive features. However, the laminite samples deformed at the two higher confining pressures both show highly complex fracture networks comprising open fractures and fracture propagation. This suggests that the laminite changes from compactive to dilational responses over the selected confining conditions. The XRT analysis indicates that a more complex fracture distribution could be linked to rock component properties, e.g. grain size and composition. For the laminite these are variable with the layers. This is in agreement with field observations of laminite microfabrics (Calvo, Rodriguez-Pascua et al. 1998). Additionally, the typical grain size of the laminite (μm) is much smaller than that of the oolitic limestone (mm), which suggests that fracture network complexity can also be linked to bulk system complexity, i.e. the pore and grain network. These deformation experiments show that, as previously observed, oolitic limestones seem to behave similarly to sandstones. However, this observation is not true for laminites, and it is very likely that more complex carbonates will develop even more complicated deformation behaviour. It is therefore necessary to systematically test different carbonate rocks to understand the impact of geometry and composition, as well as the interplay with the pore network. Brantut, N., et al. (2014). Journal of Geophysical Research: Solid Earth 119(7): 5444-5463. Calvo, J. P., et al. (1998). Sedimentology 45: 279-292. Vajdova, V. (2004). Journal of Geophysical Research 109(B5).

  10. Miniaturized and direct spectrophotometric multi-sample analysis of trace metals in natural waters.

    PubMed

    Albendín, Gemma; López-López, José A; Pinto, Juan J

    2016-03-15

    Trends in the analysis of trace metals in natural waters are mainly based on the development of sample treatment methods to isolate and pre-concentrate the metal from the matrix in a simpler extract for further instrumental analysis. However, direct analysis is often possible using more accessible techniques such as spectrophotometry. In this case, a proper ligand is required to form a complex that absorbs radiation in the ultraviolet-visible (UV-Vis) spectrum. In this sense, the hydrazone derivative di-2-pyridylketone benzoylhydrazone (dPKBH) forms complexes with copper (Cu) and vanadium (V) that absorb light at 370 and 395 nm, respectively. Although spectrophotometric methods are considered time- and reagent-consuming, this work focused on their miniaturization by reducing the sample volume as well as the time and cost of analysis. In both methods, a micro-amount of sample is placed into a microplate reader with a capacity for 96 samples, which can be analyzed in times ranging from 5 to 10 min. The proposed methods have been optimized using a Box-Behnken design of experiments. For Cu determination, the concentrations of the phosphate buffer solution at pH 8.33, the masking agents (ammonium fluoride and sodium citrate), and dPKBH were optimized. For V analysis, the sample pH (4.5) was set using an acetic acid/sodium acetate buffer, and the masking agents were ammonium fluoride and 1,2-cyclohexanediaminetetraacetic acid. Under optimal conditions, both methods were applied to the analysis of certified reference materials TMDA-62 (lake water), LGC-6016 (estuarine water), and LGC-6019 (river water). In all cases, results proved the accuracy of the method. Copyright © 2015 Elsevier Inc. All rights reserved.
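    Quantification in such direct spectrophotometric methods ultimately rests on a linear calibration of absorbance against concentration for the metal-dPKBH complex. A minimal sketch of that step is shown below; the standards and absorbance values are made up for illustration.

```python
import numpy as np

# Hypothetical calibration standards: concentration (ug/L) vs. absorbance at 370 nm
conc = np.array([0.0, 10.0, 25.0, 50.0, 100.0])
absorbance = np.array([0.002, 0.051, 0.124, 0.249, 0.501])

slope, intercept = np.polyfit(conc, absorbance, 1)   # least-squares calibration line

def concentration(a):
    """Invert the calibration line to estimate concentration from measured absorbance."""
    return (a - intercept) / slope

print(concentration(0.180))   # sample absorbance -> estimated concentration (ug/L)
```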

  11. The composition-explicit distillation curve technique: Relating chemical analysis and physical properties of complex fluids.

    PubMed

    Bruno, Thomas J; Ott, Lisa S; Lovestead, Tara M; Huber, Marcia L

    2010-04-16

    The analysis of complex fluids such as crude oils, fuels, vegetable oils and mixed waste streams poses significant challenges arising primarily from the multiplicity of components, the different properties of the components (polarity, polarizability, etc.) and matrix properties. We have recently introduced an analytical strategy that simplifies many of these analyses, and provides the added potential of linking compositional information with physical property information. This aspect can be used to facilitate equation of state development for the complex fluids. In addition to chemical characterization, the approach provides the ability to calculate thermodynamic properties for such complex heterogeneous streams. The technique is based on the advanced distillation curve (ADC) metrology, which separates a complex fluid by distillation into fractions that are sampled, and for which thermodynamically consistent temperatures are measured at atmospheric pressure. The collected sample fractions can be analyzed by any method that is appropriate. The analytical methods we have applied include gas chromatography (with flame ionization, mass spectrometric and sulfur chemiluminescence detection), thin layer chromatography, FTIR, corrosivity analysis, neutron activation analysis and cold neutron prompt gamma activation analysis. By far, the most widely used analytical technique we have used with the ADC is gas chromatography. This has enabled us to study finished fuels (gasoline, diesel fuels, aviation fuels, rocket propellants), crude oils (including a crude oil made from swine manure) and waste oil streams (used automotive and transformer oils). In this special issue of the Journal of Chromatography, specifically dedicated to extraction technologies, we describe the essential features of the advanced distillation curve metrology as an analytical strategy for complex fluids. Published by Elsevier B.V.

  12. The assessment of pi-pi selective stationary phases for two-dimensional HPLC analysis of foods: application to the analysis of coffee.

    PubMed

    Mnatsakanyan, Mariam; Stevenson, Paul G; Shock, David; Conlan, Xavier A; Goodie, Tiffany A; Spencer, Kylie N; Barnett, Neil W; Francis, Paul S; Shalliker, R Andrew

    2010-09-15

    Differences between alkyl, dipole-dipole, hydrogen bonding, and pi-pi selective surfaces represented by non-resonance and resonance pi-stationary phases have been assessed for the separation of 'Ristretto' café espresso by employing 2DHPLC techniques with C18 phase selectivity detection. A geometric approach to factor analysis (GAFA) was used to measure the number of detected peaks (N), spreading angle (β), correlation, practical peak capacity (nₚ) and percentage usage of the separation space, as an assessment of selectivity differences between regional quadrants of the two-dimensional separation plane. Although all tested systems were correlated to some degree to the C18 dimension, regional measurement of separation divergence revealed that the performance of specific systems was better for certain sample components. The results illustrate that, because of the complexity of the 'real' sample, obtaining a truly orthogonal two-dimensional system for complex samples of natural origin may be practically impossible. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  13. A quantitative approach for pesticide analysis in grape juice by direct interfacing of a matrix compatible SPME phase to dielectric barrier discharge ionization-mass spectrometry.

    PubMed

    Mirabelli, Mario F; Gionfriddo, Emanuela; Pawliszyn, Janusz; Zenobi, Renato

    2018-02-12

    We evaluated the performance of a dielectric barrier discharge ionization (DBDI) source for pesticide analysis in grape juice, a fairly complex matrix due to the high content of sugars (≈20% w/w) and pigments. A fast sample preparation method based on direct immersion solid-phase microextraction (SPME) was developed, and novel matrix compatible SPME fibers were used to reduce in-source matrix suppression effects. A high resolution LTQ Orbitrap mass spectrometer allowed for rapid quantification in full scan mode. This direct SPME-DBDI-MS approach was proven to be effective for the rapid and direct analysis of complex sample matrices, with limits of detection in the parts-per-trillion (ppt) range and inter- and intra-day precision below 30% relative standard deviation (RSD) for samples spiked at 1, 10 and 10 ng ml⁻¹, with overall performance comparable or even superior to existing chromatographic approaches.

  14. DART - LTQ ORBITRAP as an expedient tool for the identification of synthetic cannabinoids.

    PubMed

    Habala, Ladislav; Valentová, Jindra; Pechová, Iveta; Fuknová, Mária; Devínsky, Ferdinand

    2016-05-01

    Synthetic cannabinoids as designer drugs constitute a major problem due to their rapid increase in number and the difficulties connected with their identification in complex mixtures. DART (Direct Analysis in Real Time) has emerged as an advantageous tool for the direct and rapid analysis of complex samples by mass spectrometry. Here we report on the identification of six synthetic cannabinoids originating from seized material in various matrices, employing the combination of ambient pressure ion source DART and hybrid ion trap - LTQ ORBITRAP mass spectrometer. This report also describes the sampling techniques for the provided herbal material containing the cannabinoids, either directly as plant parts or as an extract in methanol and their influence on the outcome of the analysis. The high resolution mass spectra supplied by the LTQ ORBITRAP instrument allowed for an unambiguous assignment of target compounds. The utilized instrumental coupling proved to be a convenient way for the identification of synthetic cannabinoids in real-world samples. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  15. Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.

    1981-01-01

    Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.
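    For the fault tree side of such a comparison, system effectiveness reduces to propagating basic-event probabilities through AND/OR gates. The toy example below illustrates that calculation only; the architecture and probabilities are illustrative, not taken from the study.

```python
def or_gate(*p):
    """Probability that at least one of several independent input events occurs."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):
    """Probability that all independent input events occur."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Toy system: loss of the computing function if both redundant channels fail,
# where each channel fails from a permanent fault OR a power fault.
channel_failure = or_gate(1e-4, 5e-5)          # per-mission-phase probabilities (illustrative)
top_event = and_gate(channel_failure, channel_failure)
print(top_event)
```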

  16. Nano-LC/MALDI-MS using a column-integrated spotting probe for analysis of complex biomolecule samples.

    PubMed

    Hioki, Yusaku; Tanimura, Ritsuko; Iwamoto, Shinichi; Tanaka, Koichi

    2014-03-04

    Nanoflow liquid chromatography (nano-LC) is an essential technique for highly sensitive analysis of complex biological samples, and matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is advantageous for rapid identification of proteins and in-depth analysis of post-translational modifications (PTMs). A combination of nano-LC and MALDI-MS (nano-LC/MALDI-MS) is useful for highly sensitive and detailed analysis in life sciences. However, the existing system does not fully utilize the advantages of each technique, especially in the interface of eluate transfer from nano-LC to a MALDI plate. To effectively combine nano-LC with MALDI-MS, we integrated a nano-LC column and a deposition probe for the first time (column probe) and incorporated it into a nano-LC/MALDI-MS system. Spotting nanoliter eluate droplets directly from the column onto the MALDI plate prevents postcolumn diffusion and preserves the chromatographic resolution. A DHB prespotted plate was prepared to suit the fabricated column probe to concentrate the droplets of nano-LC eluate. The performance of the advanced nano-LC/MALDI-MS system was substantiated by analyzing protein digests. When the system was coupled with multidimensional liquid chromatography (MDLC), trace amounts of glycopeptides spiked into complex samples were successfully detected. Thus, a nano-LC/MALDI-MS direct-spotting system that eliminates postcolumn diffusion was constructed, and the efficacy of the system was demonstrated through highly sensitive analysis of the protein digests or spiked glycopeptides.

  17. Bacterial complexes of a high moor related to different elements of microrelief

    NASA Astrophysics Data System (ADS)

    Dobrovol'skaya, T. G.; Golovchenko, A. V.; Yakushev, A. V.; Yurchenko, E. N.; Manucharov, N. A.; Chernov, I. Yu.

    2017-04-01

    The analysis of bacterial complexes, including the number, taxonomic composition, physiological state, and proportion of ecological trophic groups, was performed in a high moorland in relation to different elements of the microrelief. The abundance of bacteria, their ability for hydrolysis of polymers and the share of r-strategists were found to be higher in the sphagnum hillocks than on the flat surfaces. The total prokaryote biomass was 4 times greater in the sphagnum samples from microhighs (hillocks). On these elements of the microrelief, the density of actinomycetal mycelium was higher. Bacteria of the hydrolytic complex (Cytophaga and Chitinophaga genera) were found only in microhigh samples.

  18. Factors Associated with the Participation of Children with Complex Communication Needs

    ERIC Educational Resources Information Center

    Clarke, M. T.; Newton, C.; Griffiths, T.; Price, K.; Lysley, A.; Petrides, K. V.

    2011-01-01

    The aim of this study was to conduct a preliminary analysis of relations between child and environmental variables, including factors related to communication aid provision, and participation in informal everyday activities in a sample of children with complex communication needs. Ninety-seven caregivers of children provided with communication…

  19. Chiral Tagging of Verbenone with 3-BUTYN-2-OL for Establishing Absolute Configuration and Determining Enantiomeric Excess

    NASA Astrophysics Data System (ADS)

    Evangelisti, Luca; Mayer, Kevin J.; Holdren, Martin S.; Smart, Taylor; West, Channing; Pate, Brooks; Sedo, Galen; Marshall, Frank E.; Grubbs, G. S., II

    2017-06-01

    Chiral analysis of a commercial sample of (1S)-(-)-verbenone has been performed using the chiral tag approach. The chirped-pulse Fourier transform microwave spectrum of the verbenone-butynol complex is measured in the 2-8 GHz frequency range. Verbenone is placed in a nozzle reservoir heated to 333 K (about 1 Torr vapor pressure). The complex is formed by using a carrier gas of neon with approximately 0.1% butynol. The expansion pressure is about 2 atm. A measurement using racemic butynol is performed to identify isomers of both diastereomer complexes. Quantum chemistry calculations using the B3LYP-D3BJ method with the def2TZVP basis set provided estimated spectroscopic constants for the homochiral and heterochiral complexes. This analysis included 8 isomers for each diastereomer. Four rotational spectra are identified for isomers of the homochiral complex and correspond to the four lowest energy isomers from the theoretical study. Three heterochiral complexes are identified and also correspond to the lowest energy isomers from theory. Subsequent measurements were made with enantiopure tags (both (R)-(+)-3-butyn-2-ol and (S)-(-)-3-butyn-2-ol) to establish the absolute configuration of verbenone. The sensitivity of the measurement was sufficient to perform ¹³C-isotopologue analysis of three of the homochiral complexes and two of the heterochiral complexes. These results provide definitive structures of verbenone with correct stereochemistry. The commercial sample has a relatively low enantiomeric excess, with the certificate of analysis reporting an EE of 53.6%. Using the intensities of assigned transitions of the chiral tag complexes, the enantiomeric excess was determined from the broadband rotational spectrum through the ratio of the intensities of pairs of transitions. A total of 2617 pairs of transitions were analyzed. The average EE was found to be 53.6% with a standard deviation of 2%.

  20. Viewpoints: Interactive Exploration of Large Multivariate Earth and Space Science Data Sets

    NASA Astrophysics Data System (ADS)

    Levit, C.; Gazis, P. R.

    2006-05-01

    Analysis and visualization of extremely large and complex data sets may be one of the most significant challenges facing earth and space science investigators in the forthcoming decades. While advances in hardware speed and storage technology have roughly kept up with (indeed, have driven) increases in database size, the same is not true of our abilities to manage the complexity of these data. Current missions, instruments, and simulations produce so much data of such high dimensionality that they outstrip the capabilities of traditional visualization and analysis software. This problem can only be expected to get worse as data volumes increase by orders of magnitude in future missions and in ever-larger supercomputer simulations. For large multivariate data (more than 10⁵ samples or records with more than 5 variables per sample) the interactive graphics response of most existing statistical analysis, machine learning, exploratory data analysis, and/or visualization tools such as Torch, MLC++, Matlab, S++/R, and IDL stutters, stalls, or stops working altogether. Fortunately, the graphics processing units (GPUs) built in to all professional desktop and laptop computers currently on the market are capable of transforming, filtering, and rendering hundreds of millions of points per second. We present a prototype open-source cross-platform application which leverages much of the power latent in the GPU to enable smooth interactive exploration and analysis of large high-dimensional data using a variety of classical and recent techniques. The targeted application is the interactive analysis of large, complex, multivariate data sets, with dimensionalities that may surpass 100 and sample sizes that may exceed 10⁶-10⁸.

  1. Varieties of Stimulus Control in Matching-to-Sample: A Kernel Analysis

    ERIC Educational Resources Information Center

    Fields, Lanny; Garruto, Michelle; Watanabe, Mari

    2010-01-01

    Conditional discrimination or matching-to-sample procedures have been used to study a wide range of complex psychological phenomena with infrahuman and human subjects. In most studies, the percentage of trials in which a subject selects the comparison stimulus that is related to the sample stimulus is used to index the control exerted by the…

  2. Electrical, structural and thermal studies of carbon nanotubes from natural legume seeds: kala chana

    NASA Astrophysics Data System (ADS)

    Ranu, Rachana; Chauhan, Yatishwar; Singh, Pramod K.; Bhattacharya, B.; Tomar, S. K.

    2016-12-01

    Carbon nanotubes (CNTs) are carbon materials structured at the nanoscale and are classified into two types according to the number of concentric layers: single-layer tubes are single-walled nanotubes, while multi-layer tubes are called multi-walled nanotubes. The green synthesis of CNTs begins with the crushing of the legume seed kala chana, which then forms a complex with a cobalt salt. Desiccation of the complex yields the cobalt salt together with seed protein. The complex is then decomposed at 625 °C in a muffle furnace for 20 min. The decomposed sample is purified by an acid wash treatment and dried in a vacuum oven. The formation of CNTs is confirmed by nuclear magnetic resonance and Fourier transform infrared spectroscopy, which analyze the denatured protein bound to the metal salt. X-ray diffraction identifies multi-walled nanotubes (MWNTs), while transmission electron microscopy (TEM) reveals the network structure of the CNTs. Thermogravimetric analysis (TGA), differential thermal analysis (DTA) and derivative thermogravimetry (DTG) track the mass of the sample under thermal treatment. Vibrating sample magnetometry shows the paramagnetic nature of the CNTs. The CNTs thus prepared can be used in mechanical applications, solar cells, electronics, etc. because of their multidisciplinary properties. The synthesized CNTs are eco-friendly in nature, being prepared from a natural legume-seed precursor.

  3. Molecular Analyzer for Complex Refractory Organic-Rich Surfaces (MACROS)

    NASA Technical Reports Server (NTRS)

    Getty, Stephanie A.; Cook, Jamie E.; Balvin, Manuel; Brinckerhoff, William B.; Li, Xiang; Grubisic, Andrej; Cornish, Timothy; Ferrance, Jerome; Southard, Adrian

    2017-01-01

    The Molecular Analyzer for Complex Refractory Organic-rich Surfaces, MACROS, is a novel instrument package being developed at NASA Goddard Space Flight Center. MACROS enables the in situ characterization of a sample's composition by coupling two powerful techniques into one compact instrument package: (1) laser desorption/ionization time-of-flight mass spectrometry (LDMS) for broad detection of inorganic mineral composition and non-volatile organics, and (2) liquid-phase extraction methods to gently isolate the soluble organic and inorganic fraction of a planetary powder for enrichment and detailed analysis by liquid chromatographic separation coupled to LDMS. The LDMS is capable of positive and negative ion detection, precision mass selection, and fragment analysis. Two modes are included for LDMS: single laser LDMS as the broad survey mode and two step laser mass spectrometry (L2MS). The liquid-phase extraction will be done in a newly designed extraction module (EM) prototype, providing selectivity in the analysis of a complex sample. For the sample collection, a diamond drill front end will be used to collect rock/icy powder. With all these components and capabilities together, MACROS offers a versatile analytical instrument for a mission targeting an icy moon, carbonaceous asteroid, or comet, to fully characterize the surface composition and advance our understanding of the chemical inventory present on that body.

  4. Recent trends in sorption-based sample preparation and liquid chromatography techniques for food analysis.

    PubMed

    V Soares Maciel, Edvaldo; de Toffoli, Ana Lúcia; Lanças, Fernando Mauro

    2018-04-20

    The accelerating growth of the world's population has increased food consumption, thus demanding more rigorous control of residues and contaminants in food-based products marketed for human consumption. In view of the complexity of most food matrices, including fruits, vegetables, different types of meat, beverages, among others, a sample preparation step is important to provide more reliable results when combined with HPLC separations. An adequate sample preparation step before the chromatographic analysis is mandatory for obtaining higher precision and accuracy and for improving the extraction of the target analytes, one of the priorities in analytical chemistry. The recent discovery of new materials such as ionic liquids, graphene-derived materials, molecularly imprinted polymers, restricted access media, magnetic nanoparticles, and carbonaceous nanomaterials has provided high sensitivity and selectivity in an extensive variety of applications. These materials, as well as their several possible combinations, have been demonstrated to be highly appropriate for the extraction of different analytes in complex samples such as food products. The main characteristics and applications of these new materials in food analysis are presented and discussed in this paper. Another topic discussed in this review covers the main advantages and limitations of sample preparation microtechniques, as well as their off-line and on-line combination with HPLC for food analysis. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Search for Chemical Biomarkers on Mars Using the Sample Analysis at Mars Instrument Suite on the Mars Science Laboratory

    NASA Technical Reports Server (NTRS)

    Glavin, D. P.; Conrad, P.; Dworkin, J. P.; Eigenbrode, J.; Mahaffy, P. R.

    2011-01-01

    One key goal for the future exploration of Mars is the search for chemical biomarkers, including complex organic compounds important in life on Earth. The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) will provide the most sensitive measurements of the organic composition of rocks and regolith samples ever carried out in situ on Mars. SAM consists of a gas chromatograph (GC), quadrupole mass spectrometer (QMS), and tunable laser spectrometer to measure volatiles in the atmosphere and released from rock powders heated up to 1000 °C. The measurement of organics in solid samples will be accomplished by three experiments: (1) pyrolysis QMS to identify alkane fragments and simple aromatic compounds; (2) pyrolysis GCMS to separate and identify complex mixtures of larger hydrocarbons; and (3) chemical derivatization and GCMS to extract less volatile compounds including amino and carboxylic acids that are not detectable by the other two experiments.

  6. Gas-phase detection of solid-state fission product complexes for post-detonation nuclear forensic analysis

    DOE PAGES

    Stratz, S. Adam; Jones, Steven A.; Oldham, Colton J.; ...

    2016-06-27

    This study presents the first known detection of fission products commonly found in post-detonation nuclear debris samples using solid sample introduction and a uniquely coupled gas chromatography inductively-coupled plasma time-of-flight mass spectrometer. Rare earth oxides were chemically altered to incorporate a ligand that enhances the volatility of the samples. These samples were injected (as solids) into the aforementioned instrument and detected for the first time. Repeatable results indicate the validity of the methodology, and this capability, when refined, will prove to be a valuable asset for rapid post-detonation nuclear forensic analysis.

  8. Modeling ultrasound propagation through material of increasing geometrical complexity.

    PubMed

    Odabaee, Maryam; Odabaee, Mostafa; Pelekanos, Matthew; Leinenga, Gerhard; Götz, Jürgen

    2018-06-01

    Ultrasound is increasingly being recognized as a neuromodulatory and therapeutic tool, inducing a broad range of bio-effects in the tissue of experimental animals and humans. To achieve these effects in a predictable manner in the human brain, the thick cancellous skull presents a problem, causing attenuation. In order to overcome this challenge, as a first step, the acoustic properties of a set of simple bone-modeling resin samples that displayed an increasing geometrical complexity (increasing step sizes) were analyzed. Using two Non-Destructive Testing (NDT) transducers, we found that Wiener deconvolution predicted the Ultrasound Acoustic Response (UAR) and attenuation caused by the samples. However, whereas the UAR of samples with step sizes larger than the wavelength could be accurately estimated, the prediction was not accurate when the sample had a smaller step size. Furthermore, a Finite Element Analysis (FEA) performed in ANSYS determined that the scattering and refraction of sound waves was significantly higher in complex samples with smaller step sizes compared to simple samples with a larger step size. Together, this reveals an interaction of frequency and geometrical complexity in predicting the UAR and attenuation. These findings could in future be applied to poro-visco-elastic materials that better model the human skull. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
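    Wiener deconvolution, the tool used here to predict the UAR, can be sketched in a few lines of frequency-domain arithmetic. The reference pulse, the simulated measurement and the noise-to-signal ratio below are assumptions for illustration.

```python
import numpy as np

def wiener_deconvolution(measured, reference, nsr=0.01):
    """Estimate the sample's impulse response from a measured pulse and a
    reference pulse: H_est = Y * conj(X) / (|X|^2 + NSR)."""
    n = len(measured)
    Y = np.fft.rfft(measured, n)
    X = np.fft.rfft(reference, n)
    H = Y * np.conj(X) / (np.abs(X) ** 2 + nsr)
    return np.fft.irfft(H, n)

# Illustrative signals: a reference pulse and a delayed, attenuated echo through a sample
t = np.arange(512)
reference = np.exp(-((t - 50) / 5.0) ** 2)
measured = 0.4 * np.roll(reference, 80) + 0.01 * np.random.default_rng(0).normal(size=512)
response = wiener_deconvolution(measured, reference)
print(np.argmax(response))   # ~80 samples: the delay introduced by the sample
```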

  9. Magnetic separation techniques in sample preparation for biological analysis: a review.

    PubMed

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with the advantages of superparamagnetism, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis are reviewed. The strategy of magnetic separation techniques is summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles are reviewed in detail. Characterization of magnetic materials is also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of proteins, nucleic acids, cells and bioactive compounds and the immobilization of enzymes are described. Finally, the existing problems and possible future trends of magnetic separation techniques for biological analysis are proposed. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Textbooks for Responsible Data Analysis in Excel

    ERIC Educational Resources Information Center

    Garrett, Nathan

    2015-01-01

    With 27 million users, Excel (Microsoft Corporation, Seattle, WA) is the most common business data analysis software. However, audits show that almost all complex spreadsheets have errors. The author examined textbooks to understand how responsible data analysis is taught. A purposeful sample of 10 textbooks was coded, and then compared against…

  11. Current trends in sample preparation for cosmetic analysis.

    PubMed

    Zhong, Zhixiong; Li, Gongke

    2017-01-01

    The widespread applications of cosmetics in modern life make their analysis particularly important from a safety point of view. There is a wide variety of restricted ingredients and prohibited substances that primarily influence the safety of cosmetics. Sample preparation for cosmetic analysis is a crucial step as the complex matrices may seriously interfere with the determination of target analytes. In this review, some new developments (2010-2016) in sample preparation techniques for cosmetic analysis, including liquid-phase microextraction, solid-phase microextraction, matrix solid-phase dispersion, pressurized liquid extraction, cloud point extraction, ultrasound-assisted extraction, and microwave digestion, are presented. Furthermore, the research and progress in sample preparation techniques and their applications in the separation and purification of allowed ingredients and prohibited substances are reviewed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Classification of time-of-flight secondary ion mass spectrometry spectra from complex Cu-Fe sulphides by principal component analysis and artificial neural networks.

    PubMed

    Kalegowda, Yogesh; Harmer, Sarah L

    2013-01-08

    Artificial neural network (ANN) and hybrid principal component analysis-artificial neural network (PCA-ANN) classifiers have been successfully implemented for the classification of static time-of-flight secondary ion mass spectrometry (ToF-SIMS) mass spectra collected from complex Cu-Fe sulphides (chalcopyrite, bornite, chalcocite and pyrite) at different flotation conditions. ANNs are very good pattern classifiers because of their ability to learn and generalise patterns that are not linearly separable, their fault and noise tolerance, and their high parallelism. In the first approach, fragments from the whole ToF-SIMS spectrum were used as input to the ANN; the model yielded high overall correct classification rates of 100% for feed samples, 88% for conditioned feed samples and 91% for Eh modified samples. In the second approach, the hybrid pattern classifier PCA-ANN was integrated. PCA is a very effective multivariate data analysis tool applied to enhance species features and reduce data dimensionality. Principal component (PC) scores, which accounted for 95% of the raw spectral data variance, were used as input to the ANN; the model yielded high overall correct classification rates of 88% for conditioned feed samples and 95% for Eh modified samples. Copyright © 2012 Elsevier B.V. All rights reserved.
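    The hybrid PCA-ANN pipeline described here maps naturally onto standard tooling. The sketch below is one way to assemble it with scikit-learn; the synthetic spectra, the 95% retained-variance criterion used as the component cutoff, and the network size are assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for ToF-SIMS spectra: 120 spectra x 500 fragment intensities,
# from three nominal mineral classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(40, 500)) for c in (0.0, 0.5, 1.0)])
y = np.repeat([0, 1, 2], 40)

# PCA keeps enough components to explain 95% of the variance; the PC scores then
# feed a small feed-forward neural network classifier
model = make_pipeline(StandardScaler(),
                      PCA(n_components=0.95),
                      MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))
print(cross_val_score(model, X, y, cv=5).mean())   # cross-validated classification rate
```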

  13. Synthesis and Properties of Ortho-Nitro-Fe Complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, A.; Mishra, Niyati; Sharma, R.

    2011-07-15

    The Ortho-Nitro-Fe complex (a transition metal complex) was synthesized by a chemical route, and the properties of the resulting complex were characterized by X-ray diffraction (XRD), Moessbauer spectroscopy, Fourier transform infrared spectroscopy (FTIR) and X-ray photoelectron spectroscopy (XPS). XRD analysis shows that the sample is crystalline in nature, with particle sizes in the range of a few nanometers. Moessbauer spectroscopy at room temperature shows the oxidation state of iron (the central metal ion) after complexation. The FTIR spectra of the complex confirm the coordination of the metal ion with the ligand.

  14. Ground-Based Aerosol Measurements | Science Inventory ...

    EPA Pesticide Factsheets

    Atmospheric particulate matter (PM) is a complex chemical mixture of liquid and solid particles suspended in air (Seinfeld and Pandis 2016). Measurements of this complex mixture form the basis of our knowledge regarding particle formation, source-receptor relationships, data to test and verify complex air quality models, and how PM impacts human health, visibility, global warming, and ecological systems (EPA 2009). Historically, PM samples have been collected on filters or other substrates with subsequent chemical analysis in the laboratory, and this is still the major approach for routine networks (Chow 2005; Solomon et al. 2014) as well as in research studies. In this approach, air, at a specified flow rate and time period, is typically drawn through an inlet, usually a size selective inlet, and then drawn through filters.

  15. Direct mass spectrometry approaches to characterize polyphenol composition of complex samples.

    PubMed

    Fulcrand, Hélène; Mané, Carine; Preys, Sébastien; Mazerolles, Gérard; Bouchut, Claire; Mazauric, Jean-Paul; Souquet, Jean-Marc; Meudec, Emmanuelle; Li, Yan; Cole, Richard B; Cheynier, Véronique

    2008-12-01

    Lower molecular weight polyphenols including proanthocyanidin oligomers can be analyzed after HPLC separation on either reversed-phase or normal phase columns. However, these techniques are time consuming and can have poor resolution as polymer chain length and structural diversity increase. The detection of higher molecular weight compounds, as well as the determination of molecular weight distributions, remain major challenges in polyphenol analysis. Approaches based on direct mass spectrometry (MS) analysis that are proposed to help overcome these problems are reviewed. Thus, direct flow injection electrospray ionization mass spectrometry analysis can be used to establish polyphenol fingerprints of complex extracts such as in wine. This technique enabled discrimination of samples on the basis of their phenolic (i.e. anthocyanin, phenolic acid and flavan-3-ol) compositions, but larger oligomers and polymers were poorly detectable. Detection of higher molecular weight proanthocyanidins was also restricted with matrix-assisted laser desorption ionization (MALDI) MS, suggesting that they are difficult to desorb as gas-phase ions. The mass distribution of polymeric fractions could, however, be determined by analyzing the mass distributions of bovine serum albumin/proanthocyanidin complexes using MALDI-TOF-MS.

  16. Complex mixture analysis by photoionization mass spectrometry with a VUV hydrogen laser source

    NASA Astrophysics Data System (ADS)

    Huth, T. C.; Denton, M. B.

    1985-12-01

    Trace organic analysis in a complex matrix presents one of the most challenging problems in analytical mass spectrometry. When ionization is accomplished non-selectively using electron impact, extensive sample clean-up is often necessary in order to isolate the analyte from the matrix. Sample preparation can be greatly reduced when the VUV H2 laser is used to selectively photoionize only a small fraction of the compounds introduced into the ion source. This device produces only parent ions, for all compounds whose ionization potentials lie below a threshold value determined by the photon energy of 7.8 eV. The only observed interference arises from electron impact ionization, when scattered laser radiation interacts with metal surfaces, producing electrons which are then accelerated by potential fields inside the source. These can be suppressed to levels acceptable for practical analysis through proper instrumental design. Results are presented which indicate the ability of this ion source to discriminate against interfering matrix components in simple extracts from a variety of complex real-world matrices, such as brewed coffee, beer, and urine.

  17. Analysis of interstellar cloud structure based on IRAS images

    NASA Technical Reports Server (NTRS)

    Scalo, John M.

    1992-01-01

    The goal of this project was to develop new tools for the analysis of the structure of densely sampled maps of interstellar star-forming regions. A particular emphasis was on the recognition and characterization of nested hierarchical structure and fractal irregularity, and their relation to the level of star formation activity. The panoramic IRAS images provided data with the required range in spatial scale, greater than a factor of 100, and in column density, greater than a factor of 50. In order to construct densely sampled column density maps of star-forming clouds, column density images of four nearby cloud complexes were constructed from IRAS data. The regions have various degrees of star formation activity, and most of them have probably not been affected much by the disruptive effects of young massive stars. The largest region, the Scorpius-Ophiuchus cloud complex, covers about 1000 square degrees (it was subdivided into a few smaller regions for analysis). Much of the work during the early part of the project focused on an 80 square degree region in the core of the Taurus complex, a well-studied region of low-mass star formation.

  18. Temporal and spatial variation of trace elements in atmospheric deposition around the industrial area of Puchuncaví-Ventanas (Chile) and its influence on exceedances of lead and cadmium critical loads in soils.

    PubMed

    Rueda-Holgado, F; Calvo-Blázquez, L; Cereceda-Balic, F; Pinilla-Gil, E

    2016-02-01

    Fractionation of elemental contents in atmospheric samples is useful to evaluate pollution levels for risk assessment and pollution source assignment. We present here the main results of a long-term characterization of atmospheric deposition using a recently developed atmospheric elemental fractionation sampler (AEFS) for major and trace element monitoring around an important industrial complex located in the Puchuncaví region (Chile). Atmospheric deposition samples were collected during two sampling campaigns (2010 and 2011) at four sampling locations: La Greda (LG), Los Maitenes (LM), Puchuncaví (PU) and Valle Alegre (VA). Sample digestion and ICP-MS gave elemental deposition values (Al, As, Ba, Cd, Co, Cu, Fe, K, Mn, Pb, Sb, Ti, V and Zn) in the insoluble fraction of the total atmospheric deposition. Results showed that the LG location, the closest to the industrial complex, was the most polluted sampling site, having the highest values for the analyzed elements. PU and LM were the next most polluted and, finally, the lowest element concentrations were registered at VA. The application of Principal Component Analysis and Cluster Analysis identified industrial, traffic and mineral-crustal factors. We found critical load exceedances for Pb at all sampling locations in the area affected by the industrial emissions, more significant at LG close to the industrial complex, with a trend to decrease in 2011, whereas no exceedances due to atmospheric deposition were detected for Cd. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Preconcentration and Determination of Trace Vanadium(V) in Beverages by Combination of Ultrasound Assisted-cloud Point Extraction with Spectrophotometry.

    PubMed

    Kartal Temel, Nuket; Gürkan, Ramazan

    2018-03-01

    A novel ultrasound assisted-cloud point extraction method was developed for the preconcentration and determination of V(V) in beverage samples. After complexation by pyrogallol in the presence of safranin T at pH 6.0, V(V) ions as a ternary complex are extracted into the micellar phase of Triton X-114. The complex was monitored at 533 nm by spectrophotometry. The matrix effect on the recovery of V(V) from samples spiked at 50 μg L⁻¹ was evaluated. Under optimized conditions, the limits of detection and quantification of the method were 0.58 and 1.93 μg L⁻¹, respectively, over a linear range of 2-500 μg L⁻¹, with sensitivity enhancement and preconcentration factors of 47.7 and 40 for preconcentration from 15 mL of sample solution. The recoveries from spiked samples were in the range of 93.8-103.2% with a relative standard deviation ranging from 2.6% to 4.1% (25, 100 and 250 μg L⁻¹, n = 5). The accuracy was verified by analysis of two certified samples, and the results were in good agreement with the certified values. The intra-day and inter-day precision were tested by reproducibility (3.3-3.4%) and repeatability (3.4-4.1%) analysis for five replicate measurements of V(V) in quality control samples spiked with 5, 10 and 15 μg L⁻¹. Trace V(V) contents of the selected beverage samples were successfully determined by the developed method.

  20. Universal microfluidic automaton for autonomous sample processing: application to the Mars Organic Analyzer.

    PubMed

    Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A

    2013-08-20

    A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.

  1. The contribution of cluster and discriminant analysis to the classification of complex aquifer systems.

    PubMed

    Panagopoulos, G P; Angelopoulou, D; Tzirtzilakis, E E; Giannoulopoulos, P

    2016-10-01

    This paper presents an innovated method for the discrimination of groundwater samples in common groups representing the hydrogeological units from where they have been pumped. This method proved very efficient even in areas with complex hydrogeological regimes. The proposed method requires chemical analyses of water samples only for major ions, meaning that it is applicable to most of cases worldwide. Another benefit of the method is that it gives a further insight of the aquifer hydrogeochemistry as it provides the ions that are responsible for the discrimination of the group. The procedure begins with cluster analysis of the dataset in order to classify the samples in the corresponding hydrogeological unit. The feasibility of the method is proven from the fact that the samples of volcanic origin were separated into two different clusters, namely the lava units and the pyroclastic-ignimbritic aquifer. The second step is the discriminant analysis of the data which provides the functions that distinguish the groups from each other and the most significant variables that define the hydrochemical composition of the aquifer. The whole procedure was highly successful as the 94.7 % of the samples were classified to the correct aquifer system. Finally, the resulted functions can be safely used to categorize samples of either unknown or doubtful origin improving thus the quality and the size of existing hydrochemical databases.

  2. Comparative Characterization of Crofelemer Samples Using Data Mining and Machine Learning Approaches With Analytical Stability Data Sets.

    PubMed

    Nariya, Maulik K; Kim, Jae Hyun; Xiong, Jian; Kleindl, Peter A; Hewarathna, Asha; Fisher, Adam C; Joshi, Sangeeta B; Schöneich, Christian; Forrest, M Laird; Middaugh, C Russell; Volkin, David B; Deeds, Eric J

    2017-11-01

    There is growing interest in generating physicochemical and biological analytical data sets to compare complex mixture drugs, for example, products from different manufacturers. In this work, we compare various crofelemer samples prepared from a single lot by filtration with varying molecular weight cutoffs combined with incubation for different times at different temperatures. The 2 preceding articles describe experimental data sets generated from analytical characterization of fractionated and degraded crofelemer samples. In this work, we use data mining techniques such as principal component analysis and mutual information scores to help visualize the data and determine discriminatory regions within these large data sets. The mutual information score identifies chemical signatures that differentiate crofelemer samples. These signatures, in many cases, would likely be missed by traditional data analysis tools. We also found that supervised learning classifiers robustly discriminate samples with around 99% classification accuracy, indicating that mathematical models of these physicochemical data sets are capable of identifying even subtle differences in crofelemer samples. Data mining and machine learning techniques can thus identify fingerprint-type attributes of complex mixture drugs that may be used for comparative characterization of products. Copyright © 2017 American Pharmacists Association®. All rights reserved.

  3. Detection of pesticides and dioxins in tissue fats and rendering oils using laser-induced breakdown spectroscopy (LIBS).

    PubMed

    Multari, Rosalie A; Cremers, David A; Scott, Thomas; Kendrick, Peter

    2013-03-13

    In laser-induced breakdown spectroscopy (LIBS), a series of powerful laser pulses are directed at a surface to form microplasmas from which light is collected and spectrally analyzed to identify the surface material. In most cases, no sample preparation is needed, and results can be automated and made available within seconds to minutes. Advances in LIBS spectral data analysis using multivariate regression techniques have led to the ability to detect organic chemicals in complex matrices such as foods. Here, the use of LIBS to differentiate samples contaminated with aldrin, 1,2,3,4,6,7,8-heptachlorodibenzo-p-dioxin, chlorpyrifos, and dieldrin in the complex matrices of tissue fats and rendering oils is described. The pesticide concentrations in the samples ranged from 0.005 to 0.1 μg/g. All samples were successfully differentiated from each other and from control samples. Sample concentrations could also be differentiated for all of the pesticides and the dioxin included in this study. The results presented here provide first proof-of-principle data for the ability to create LIBS-based instrumentation for the rapid analysis of pesticide and dioxin contamination in tissue fat and rendered oils.

  4. Environmental Screening for the Scedosporium apiospermum Species Complex in Public Parks in Bangkok, Thailand.

    PubMed

    Luplertlop, Natthanej; Pumeesat, Potjaman; Muangkaew, Watcharamat; Wongsuk, Thanwa; Alastruey-Izquierdo, Ana

    2016-01-01

    The Scedosporium apiospermum species complex, comprising filamentous fungal species S. apiospermum sensu stricto, S. boydii, S. aurantiacum, S. dehoogii and S. minutispora, are important pathogens that cause a wide variety of infections. Although some species (S. boydii and S. apiospermum) have been isolated from patients in Thailand, no environmental surveys of these fungi have been performed in Thailand or surrounding countries. In this study, we isolated and identified species of these fungi from 68 soil and 16 water samples randomly collected from 10 parks in Bangkok. After filtration and subsequent inoculation of samples on Scedo-Select III medium, colony morphological examinations and microscopic observations were performed. Scedosporium species were isolated from soil in 8 of the 10 parks, but were only detected in one water sample. Colony morphologies of isolates from 41 of 68 soil samples (60.29%) and 1 of 15 water samples (6.67%) were consistent with that of the S. apiospermum species complex. Each morphological type was selected for species identification based on DNA sequencing and phylogenetic analysis of the β-tubulin gene. Three species of the S. apiospermum species complex were identified: S. apiospermum (71 isolates), S. aurantiacum (6 isolates) and S. dehoogii (5 isolates). In addition, 16 sequences could not be assigned to an exact Scedosporium species. According to our environmental survey, the S. apiospermum species complex is widespread in soil in Bangkok, Thailand.

  5. Environmental Screening for the Scedosporium apiospermum Species Complex in Public Parks in Bangkok, Thailand

    PubMed Central

    Pumeesat, Potjaman; Muangkaew, Watcharamat; Wongsuk, Thanwa; Alastruey-Izquierdo, Ana

    2016-01-01

    The Scedosporium apiospermum species complex, comprising filamentous fungal species S. apiospermum sensu stricto, S. boydii, S. aurantiacum, S. dehoogii and S. minutispora, are important pathogens that cause a wide variety of infections. Although some species (S. boydii and S. apiospermum) have been isolated from patients in Thailand, no environmental surveys of these fungi have been performed in Thailand or surrounding countries. In this study, we isolated and identified species of these fungi from 68 soil and 16 water samples randomly collected from 10 parks in Bangkok. After filtration and subsequent inoculation of samples on Scedo-Select III medium, colony morphological examinations and microscopic observations were performed. Scedosporium species were isolated from soil in 8 of the 10 parks, but were only detected in one water sample. Colony morphologies of isolates from 41 of 68 soil samples (60.29%) and 1 of 15 water samples (6.67%) were consistent with that of the S. apiospermum species complex. Each morphological type was selected for species identification based on DNA sequencing and phylogenetic analysis of the β-tubulin gene. Three species of the S. apiospermum species complex were identified: S. apiospermum (71 isolates), S. aurantiacum (6 isolates) and S. dehoogii (5 isolates). In addition, 16 sequences could not be assigned to an exact Scedosporium species. According to our environmental survey, the S. apiospermum species complex is widespread in soil in Bangkok, Thailand. PMID:27467209

  6. nES GEMMA Analysis of Lectins and Their Interactions with Glycoproteins - Separation, Detection, and Sampling of Noncovalent Biospecific Complexes

    NASA Astrophysics Data System (ADS)

    Engel, Nicole Y.; Weiss, Victor U.; Marchetti-Deschmann, Martina; Allmaier, Günter

    2017-01-01

    In order to better understand biological events, lectin-glycoprotein interactions are of interest. The possibility to gather more information than the mere positive or negative response for interactions brought mass spectrometry into the center of many research fields. The presented work shows the potential of a nano-electrospray gas-phase electrophoretic mobility molecular analyzer (nES GEMMA) to detect weak, noncovalent, biospecific interactions besides still unbound glycoproteins and unreacted lectins without prior liquid phase separation. First results for Sambucus nigra agglutinin, concanavalin A, and wheat germ agglutinin and their retained noncovalent interactions with glycoproteins in the gas phase are presented. Electrophoretic mobility diameters (EMDs) were obtained by nES GEMMA for all interaction partners correlating very well with molecular masses determined by matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) of the individual molecules. Moreover, EMDs measured for the lectin-glycoprotein complexes were in good accordance with theoretically calculated mass values. Special focus was laid on complex formation for different lectin concentrations and binding specificities to evaluate the method with respect to results obtained in the liquid phase. The latter was addressed by capillary electrophoresis on-a-chip (CE-on-a-chip). Of exceptional interest was the fact that the formed complexes could be sampled according to their size onto nitrocellulose membranes after gas-phase separation. Subsequent immunological investigation further proved that the collected complex actually retained its native structure throughout nES GEMMA analysis and sampling.

  7. Great Salt Lake Composition and Rare Earth Speciation Analysis

    DOE Data Explorer

    Jiao, Yongqin; Lammers, Laura; Brewer, Aaron

    2017-04-19

    We have conducted aqueous speciation analyses of the Great Salt Lake (GSL) brine sample (Table 1) and a mock geo sample (Table 2) spiked with 1 ppb Tb and 100 ppb Tb. The GSL speciation (Figure 1) aligns with our basic speciation expectations that strong carbonate complexes would form at mid to higher pH's. Although we expected strong aqueous complexes with fluorides at neutral pH and with chlorides, and hydroxides at low pH, we observe that the dominant species in the low to mid pH range to be Tb3+ as a free ion. Still, we do see the presence of fluoride and chloride complexes within the expected low to mid pH range.

  8. Sample entropy analysis of cervical neoplasia gene-expression signatures

    PubMed Central

    Botting, Shaleen K; Trzeciakowski, Jerome P; Benoit, Michelle F; Salama, Salama A; Diaz-Arrastia, Concepcion R

    2009-01-01

    Background We introduce Approximate Entropy as a mathematical method of analysis for microarray data. Approximate entropy is applied here as a method to classify the complex gene expression patterns resultant of a clinical sample set. Since Entropy is a measure of disorder in a system, we believe that by choosing genes which display minimum entropy in normal controls and maximum entropy in the cancerous sample set we will be able to distinguish those genes which display the greatest variability in the cancerous set. Here we describe a method of utilizing Approximate Sample Entropy (ApSE) analysis to identify genes of interest with the highest probability of producing an accurate, predictive, classification model from our data set. Results In the development of a diagnostic gene-expression profile for cervical intraepithelial neoplasia (CIN) and squamous cell carcinoma of the cervix, we identified 208 genes which are unchanging in all normal tissue samples, yet exhibit a random pattern indicative of the genetic instability and heterogeneity of malignant cells. This may be measured in terms of the ApSE when compared to normal tissue. We have validated 10 of these genes on 10 Normal and 20 cancer and CIN3 samples. We report that the predictive value of the sample entropy calculation for these 10 genes of interest is promising (75% sensitivity, 80% specificity for prediction of cervical cancer over CIN3). Conclusion The success of the Approximate Sample Entropy approach in discerning alterations in complexity from biological system with such relatively small sample set, and extracting biologically relevant genes of interest hold great promise. PMID:19232110

  9. High-resolution, submicron particle size distribution analysis using gravitational-sweep sedimentation.

    PubMed Central

    Mächtle, W

    1999-01-01

    Sedimentation velocity is a powerful tool for the analysis of complex solutions of macromolecules. However, sample turbidity imposes an upper limit to the size of molecular complexes currently amenable to such analysis. Furthermore, the breadth of the particle size distribution, combined with possible variations in the density of different particles, makes it difficult to analyze extremely complex mixtures. These same problems are faced in the polymer industry, where dispersions of latices, pigments, lacquers, and emulsions must be characterized. There is a rich history of methods developed for the polymer industry finding use in the biochemical sciences. Two such methods are presented. These use analytical ultracentrifugation to determine the density and size distributions for submicron-sized particles. Both methods rely on Stokes' equations to estimate particle size and density, whereas turbidity, corrected using Mie's theory, provides the concentration measurement. The first method uses the sedimentation time in dispersion media of different densities to evaluate the particle density and size distribution. This method works provided the sample is chemically homogeneous. The second method splices together data gathered at different sample concentrations, thus permitting the high-resolution determination of the size distribution of particle diameters ranging from 10 to 3000 nm. By increasing the rotor speed exponentially from 0 to 40,000 rpm over a 1-h period, size distributions may be measured for extremely broadly distributed dispersions. Presented here is a short history of particle size distribution analysis using the ultracentrifuge, along with a description of the newest experimental methods. Several applications of the methods are provided that demonstrate the breadth of its utility, including extensions to samples containing nonspherical and chromophoric particles. PMID:9916040

  10. Determination of Reaction Stoichiometries by Flow Injection Analysis.

    ERIC Educational Resources Information Center

    Rios, Angel; And Others

    1986-01-01

    Describes a method of flow injection analysis intended for calculation of complex-formation and redox reaction stoichiometries based on a closed-loop configuration. The technique is suitable for use in undergraduate laboratories. Information is provided for equipment, materials, procedures, and sample results. (JM)

  11. Kissing loop interaction in adenine riboswitch: insights from umbrella sampling simulations.

    PubMed

    Di Palma, Francesco; Bottaro, Sandro; Bussi, Giovanni

    2015-01-01

    Riboswitches are cis-acting regulatory RNA elements prevalently located in the leader sequences of bacterial mRNA. An adenine sensing riboswitch cis-regulates adeninosine deaminase gene (add) in Vibrio vulnificus. The structural mechanism regulating its conformational changes upon ligand binding mostly remains to be elucidated. In this open framework it has been suggested that the ligand stabilizes the interaction of the distal "kissing loop" complex. Using accurate full-atom molecular dynamics with explicit solvent in combination with enhanced sampling techniques and advanced analysis methods it could be possible to provide a more detailed perspective on the formation of these tertiary contacts. In this work, we used umbrella sampling simulations to study the thermodynamics of the kissing loop complex in the presence and in the absence of the cognate ligand. We enforced the breaking/formation of the loop-loop interaction restraining the distance between the two loops. We also assessed the convergence of the results by using two alternative initialization protocols. A structural analysis was performed using a novel approach to analyze base contacts. Contacts between the two loops were progressively lost when larger inter-loop distances were enforced. Inter-loop Watson-Crick contacts survived at larger separation when compared with non-canonical pairing and stacking interactions. Intra-loop stacking contacts remained formed upon loop undocking. Our simulations qualitatively indicated that the ligand could stabilize the kissing loop complex. We also compared with previously published simulation studies. Kissing complex stabilization given by the ligand was compatible with available experimental data. However, the dependence of its value on the initialization protocol of the umbrella sampling simulations posed some questions on the quantitative interpretation of the results and called for better converged enhanced sampling simulations.

  12. Methyl-CpG island-associated genome signature tags

    DOEpatents

    Dunn, John J

    2014-05-20

    Disclosed is a method for analyzing the organismic complexity of a sample through analysis of the nucleic acid in the sample. In the disclosed method, through a series of steps, including digestion with a type II restriction enzyme, ligation of capture adapters and linkers and digestion with a type IIS restriction enzyme, genome signature tags are produced. The sequences of a statistically significant number of the signature tags are determined and the sequences are used to identify and quantify the organisms in the sample. Various embodiments of the invention described herein include methods for using single point genome signature tags to analyze the related families present in a sample, methods for analyzing sequences associated with hyper- and hypo-methylated CpG islands, methods for visualizing organismic complexity change in a sampling location over time and methods for generating the genome signature tag profile of a sample of fragmented DNA.

  13. XAP, a program for deconvolution and analysis of complex X-ray spectra

    USGS Publications Warehouse

    Quick, James E.; Haleby, Abdul Malik

    1989-01-01

    The X-ray analysis program (XAP) is a spectral-deconvolution program written in BASIC and specifically designed to analyze complex spectra produced by energy-dispersive X-ray analytical systems (EDS). XAP compensates for spectrometer drift, utilizes digital filtering to remove background from spectra, and solves for element abundances by least-squares, multiple-regression analysis. Rather than base analyses on only a few channels, broad spectral regions of a sample are reconstructed from standard reference spectra. The effects of this approach are (1) elimination of tedious spectrometer adjustments, (2) removal of background independent of sample composition, and (3) automatic correction for peak overlaps. Although the program was written specifically to operate a KEVEX 7000 X-ray fluorescence analytical system, it could be adapted (with minor modifications) to analyze spectra produced by scanning electron microscopes, electron microprobes, and probes, and X-ray defractometer patterns obtained from whole-rock powders.

  14. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    PubMed

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    Abstract To overcome the problems of significant difference among samples and nonlinearity between the property and spectra of samples in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, net signal analysis method(NAS) was firstly used to obtain the net analyte signal of the calibration samples and unknown samples, then the Euclidean distance between net analyte signal of the sample and net analyte signal of calibration samples was calculated and utilized as similarity index. According to the defined similarity index, the local calibration sets were individually selected for each unknown sample. Finally, a local PLS regression model was built on each local calibration sets for each unknown sample. The proposed method was applied to a set of near infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to global PLS regression method and conventional local regression algorithm based on spectral Euclidean distance.

  15. Nanoengineered capsules for selective SERS analysis of biological samples

    NASA Astrophysics Data System (ADS)

    You, Yil-Hwan; Schechinger, Monika; Locke, Andrea; Coté, Gerard; McShane, Mike

    2018-02-01

    Metal nanoparticles conjugated with DNA oligomers have been intensively studied for a variety of applications, including optical diagnostics. Assays based on aggregation of DNA-coated particles in proportion to the concentration of target analyte have not been widely adopted for clinical analysis, however, largely due to the nonspecific responses observed in complex biofluids. While sample pre-preparation such as dialysis is helpful to enable selective sensing, here we sought to prove that assay encapsulation in hollow microcapsules could remove this requirement and thereby facilitate more rapid analysis on complex samples. Gold nanoparticle-based assays were incorporated into capsules comprising polyelectrolyte multilayer (PEMs), and the response to small molecule targets and larger proteins were compared. Gold nanoparticles were able to selectively sense small Raman dyes (Rhodamine 6G) in the presence of large protein molecules (BSA) when encapsulated. A ratiometric based microRNA-17 sensing assay exhibited drastic reduction in response after encapsulation, with statistically-significant relative Raman intensity changes only at a microRNA-17 concentration of 10 nM compared to a range of 0-500 nM for the corresponding solution-phase response.

  16. Cumulative trauma and symptom complexity in children: a path analysis.

    PubMed

    Hodges, Monica; Godbout, Natacha; Briere, John; Lanktree, Cheryl; Gilbert, Alicia; Kletzka, Nicole Taylor

    2013-11-01

    Multiple trauma exposures during childhood are associated with a range of psychological symptoms later in life. In this study, we examined whether the total number of different types of trauma experienced by children (cumulative trauma) is associated with the complexity of their subsequent symptomatology, where complexity is defined as the number of different symptom clusters simultaneously elevated into the clinical range. Children's symptoms in six different trauma-related areas (e.g., depression, anger, posttraumatic stress) were reported both by child clients and their caretakers in a clinical sample of 318 children. Path analysis revealed that accumulated exposure to multiple different trauma types predicts symptom complexity as reported by both children and their caretakers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. L-edge sum rule analysis on 3d transition metal sites: from d 10 to d 0 and towards application to extremely dilute metallo-enzymes

    DOE PAGES

    Wang, Hongxin; Friedrich, Stephan; Li, Lei; ...

    2018-02-13

    According to L-edge sum rules, the number of 3d vacancies at a transition metal site is directly proportional to the integrated intensity of the L-edge X-ray absorption spectrum (XAS) for the corresponding metal complex. In this study, the numbers of 3d holes are characterized quantitatively or semi-quantitatively for a series of manganese (Mn) and nickel (Ni) complexes, including the electron configurations 3d 10 → 3d 0. In addition, extremely dilute (<0.1% wt/wt) Ni enzymes were examined by two different approaches: (1) by using a high resolution superconducting tunnel junction X-ray detector to obtain XAS spectra with a very high signal-to-noisemore » ratio, especially in the non-variant edge jump region; and (2) by adding an inert tracer to the sample that provides a prominent spectral feature to replace the weak edge jump for intensity normalization. In this publication, we present for the first time: (1) L-edge sum rule analysis for a series of Mn and Ni complexes that include electron configurations from an open shell 3d0 to a closed shell 3d 10; (2) a systematic analysis on the uncertainties, especially on that from the edge jump, which was missing in all previous reports; (3) a clearly-resolved edge jump between pre-L 3 and post-L 2 regions from an extremely dilute sample; (4) an evaluation of an alternative normalization standard for L-edge sum rule analysis. XAS from two copper (Cu) proteins measured using a conventional semiconductor X-ray detector are also repeated as bridges between Ni complexes and dilute Ni enzymes. The differences between measuring 1% Cu enzymes and measuring <0.1% Ni enzymes are compared and discussed. As a result, this study extends L-edge sum rule analysis to virtually any 3d metal complex and any dilute biological samples that contain 3d metals.« less

  18. L-edge sum rule analysis on 3d transition metal sites: from d 10 to d 0 and towards application to extremely dilute metallo-enzymes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hongxin; Friedrich, Stephan; Li, Lei

    According to L-edge sum rules, the number of 3d vacancies at a transition metal site is directly proportional to the integrated intensity of the L-edge X-ray absorption spectrum (XAS) for the corresponding metal complex. In this study, the numbers of 3d holes are characterized quantitatively or semi-quantitatively for a series of manganese (Mn) and nickel (Ni) complexes, including the electron configurations 3d 10 → 3d 0. In addition, extremely dilute (<0.1% wt/wt) Ni enzymes were examined by two different approaches: (1) by using a high resolution superconducting tunnel junction X-ray detector to obtain XAS spectra with a very high signal-to-noisemore » ratio, especially in the non-variant edge jump region; and (2) by adding an inert tracer to the sample that provides a prominent spectral feature to replace the weak edge jump for intensity normalization. In this publication, we present for the first time: (1) L-edge sum rule analysis for a series of Mn and Ni complexes that include electron configurations from an open shell 3d0 to a closed shell 3d 10; (2) a systematic analysis on the uncertainties, especially on that from the edge jump, which was missing in all previous reports; (3) a clearly-resolved edge jump between pre-L 3 and post-L 2 regions from an extremely dilute sample; (4) an evaluation of an alternative normalization standard for L-edge sum rule analysis. XAS from two copper (Cu) proteins measured using a conventional semiconductor X-ray detector are also repeated as bridges between Ni complexes and dilute Ni enzymes. The differences between measuring 1% Cu enzymes and measuring <0.1% Ni enzymes are compared and discussed. As a result, this study extends L-edge sum rule analysis to virtually any 3d metal complex and any dilute biological samples that contain 3d metals.« less

  19. Kinetics and mechanism of catalytic hydroprocessing of components of coal-derived liquids. Sixteenth quarterly report, February 16, 1983-May 15, 1983.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gates, B. C.; Olson, H. H.; Schuit, G. C.A.

    1983-08-22

    A new method of structural analysis is applied to a group of hydroliquefied coal samples. The method uses elemental analysis and NMR data to estimate the concentrations of functional groups in the samples. The samples include oil and asphaltene fractions obtained in a series of hydroliquefaction experiments, and a set of 9 fractions separated from a coal-derived oil. The structural characterization of these samples demonstrates that estimates of functional group concentrations can be used to provide detailed structural profiles of complex mixtures and to obtain limited information about reaction pathways. 11 references, 1 figure, 7 tables.

  20. Sample-space-based feature extraction and class preserving projection for gene expression data.

    PubMed

    Wang, Wenjun

    2013-01-01

    In order to overcome the problems of high computational complexity and serious matrix singularity for feature extraction using Principal Component Analysis (PCA) and Fisher's Linear Discrinimant Analysis (LDA) in high-dimensional data, sample-space-based feature extraction is presented, which transforms the computation procedure of feature extraction from gene space to sample space by representing the optimal transformation vector with the weighted sum of samples. The technique is used in the implementation of PCA, LDA, Class Preserving Projection (CPP) which is a new method for discriminant feature extraction proposed, and the experimental results on gene expression data demonstrate the effectiveness of the method.

  1. Microplate-reader method for the rapid analysis of copper in natural waters with chemiluminescence detection.

    PubMed

    Durand, Axel; Chase, Zanna; Remenyi, Tomas; Quéroué, Fabien

    2012-01-01

    We have developed a method for the determination of copper in natural waters at nanomolar levels. The use of a microplate-reader minimizes sample processing time (~25 s per sample), reagent consumption (~120 μL per sample), and sample volume (~700 μL). Copper is detected by chemiluminescence. This technique is based on the formation of a complex between copper and 1,10-phenanthroline and the subsequent emission of light during the oxidation of the complex by hydrogen peroxide. Samples are acidified to pH 1.7 and then introduced directly into a 24-well plate. Reagents are added during data acquisition via two reagent injectors. When trace metal clean protocols are employed, the reproducibility is generally less than 7% on blanks and the detection limit is 0.7 nM for seawater and 0.4 nM for freshwater. More than 100 samples per hour can be analyzed with this technique, which is simple, robust, and amenable to at-sea analysis. Seawater samples from Storm Bay in Tasmania illustrate the utility of the method for environmental science. Indeed other trace metals for which optical detection methods exist (e.g., chemiluminescence, fluorescence, and absorbance) could be adapted to the microplate-reader.

  2. On-line approaches for the determination of residues and contaminants in complex samples.

    PubMed

    Fumes, Bruno Henrique; Andrade, Mariane Aissa; Franco, Maraíssa Silva; Lanças, Fernando Mauro

    2017-01-01

    The determination of residues and contaminants in complex matrices such as in the case of food, environmental, and biological samples requires a combination of several steps to succeed in the aimed goal. At least three independent steps are integrated to provide the best available situation to deal with such matrices: (1) a sample preparation technique is employed to isolate the target compounds from the rest of the matrix; (2) a chromatographic (second) step further "purifies" the isolated compounds from the co-extracted matrix interferences; (3) a spectroscopy-based device acts as chromatographic detector (ideally containing a tandem high-resolution mass analyzer) for the qualitative and quantitative analysis. These techniques can be operated in different modes including the off-line and the on-line modes. The present report focus the on-line coupling techniques aiming the determination of analytes present in complex matrices. The fundamentals of these approaches as well as the most common set ups are presented and discussed, as well as a review on the recent applications of these two approaches to the fields of bioanalytical, environmental, and food analysis are critically discussed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egorov, Oleg; O'Hara, Matthew J.; Grate, Jay W.

    An automated fluidic instrument is described that rapidly determines the total 99Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the 99Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of 99Tc. We developed amore » preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of 99Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.« less

  4. Nanoliter-Scale Oil-Air-Droplet Chip-Based Single Cell Proteomic Analysis.

    PubMed

    Li, Zi-Yi; Huang, Min; Wang, Xiu-Kun; Zhu, Ying; Li, Jin-Song; Wong, Catherine C L; Fang, Qun

    2018-04-17

    Single cell proteomic analysis provides crucial information on cellular heterogeneity in biological systems. Herein, we describe a nanoliter-scale oil-air-droplet (OAD) chip for achieving multistep complex sample pretreatment and injection for single cell proteomic analysis in the shotgun mode. By using miniaturized stationary droplet microreaction and manipulation techniques, our system allows all sample pretreatment and injection procedures to be performed in a nanoliter-scale droplet with minimum sample loss and a high sample injection efficiency (>99%), thus substantially increasing the analytical sensitivity for single cell samples. We applied the present system in the proteomic analysis of 100 ± 10, 50 ± 5, 10, and 1 HeLa cell(s), and protein IDs of 1360, 612, 192, and 51 were identified, respectively. The OAD chip-based system was further applied in single mouse oocyte analysis, with 355 protein IDs identified at the single oocyte level, which demonstrated its special advantages of high enrichment of sequence coverage, hydrophobic proteins, and enzymatic digestion efficiency over the traditional in-tube system.

  5. Automatic differential analysis of NMR experiments in complex samples.

    PubMed

    Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André

    2018-06-01

    Liquid state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species, and such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires to resort to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal processing steps, as well as peak-picking and bucketing of 1D and 2D spectra, the program and its components are fully available. In an experiment mimicking the search of a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program called Plasmodesma is a novel tool that should be useful to decipher complex mixtures, particularly in the discovery of biologically active natural products from plants extracts but can also in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.

  6. [Recent advances in analysis of petroleum geological samples by comprehensive two-dimensional gas chromatography].

    PubMed

    Gao, Xuanbo; Chang, Zhenyang; Dai, Wei; Tong, Ting; Zhang, Wanfeng; He, Sheng; Zhu, Shukui

    2014-10-01

    Abundant geochemical information can be acquired by analyzing the chemical compositions of petroleum geological samples. The information obtained from the analysis provides scientifical evidences for petroleum exploration. However, these samples are complicated and can be easily influenced by physical (e. g. evaporation, emulsification, natural dispersion, dissolution and sorption), chemical (photodegradation) and biological (mainly microbial degradation) weathering processes. Therefore, it is very difficult to analyze the petroleum geological samples and they cannot be effectively separated by traditional gas chromatography/mass spectrometry. A newly developed separation technique, comprehensive two-dimensional gas chromatography (GC x GC), has unique advantages in complex sample analysis, and recently it has been applied to petroleum geological samples. This article mainly reviews the research progres- ses in the last five years, the main problems and the future research about GC x GC applied in the area of petroleum geology.

  7. Continuous flow analysis combined with a light-absorption ratio variation approach for determination of copper at ng/ml level in natural water.

    PubMed

    Gao, Hong-Wen; Wang, Chun-Lei; Jia, Jiang-Yan; Zhang, Ya-Lei

    2007-06-01

    The complexation between Cu(II) and naphthochrome green (NG) is very sensitive at pH 4.09 with the formation of complex ion [Cu(NG)2(H2O)2](2-). It can thus used for the determination of Cu(II) by the light-absorption ratio variation approach (LARVA) with a good selectivity. Both the ordinary detection procedure and continuous flow analysis (CFA) were carried out, where the latter is fit for continuous and rapid analysis of samples. The limit of detection (LOD) of Cu(II) is only 1 ng/ml, which is favorable for direct monitoring of natural water. About 30 samples could be analyzed per hour by CFA. Cu(II) contents in Yangtze River, West Lake, Taihu Lake of China and seawater near Shanghai were determined with satisfactory results. The CFA-LARVA spectrophotometry was the first to be coupled and it will play an important role in the in-situ analysis of natural water quality.

  8. Method of analysis of polymerizable monomeric species in a complex mixture

    DOEpatents

    Hermes, Robert E

    2014-03-18

    Method of selective quantitation of a polymerizable monomeric species in a well spacer fluid, said method comprising the steps of adding at least one solvent having a refractive index of less than about 1.33 to a sample of the complex mixture to produce a solvent phase, and measuring the refractive index of the solvent phase.

  9. Surface complexation and precipitate geometry for aqueous Zn(II) sorption on ferrihydrite I: X-ray absorption extended fine structure spectroscopy analysis

    USGS Publications Warehouse

    Waychunas, G.A.; Fuller, C.C.; Davis, J.A.

    2002-01-01

    "Two-line" ferrihydrite samples precipitated and then exposed to a range of aqueous Zn solutions (10-5 to 10-3 M), and also coprecipitated in similar Zn solutions (pH 6.5), have been examined by Zn and Fe K-edge X-ray absorption spectroscopy. Typical Zn complexes on the surface have Zn-O distances of 1.97(0.2) A?? and coordination numbers of about 4.0(0.5), consistent with tetrahedral oxygen coordination. This contrasts with Zn-O distances of 2.11(.02) A?? and coordination numbers of 6 to 7 in the aqueous Zn solutions used in sample preparation. X-ray absorption extended fine structure spectroscopy (EXAFS) fits to the second shell of cation neighbors indicate as many as 4 Zn-Fe neighbors at 3.44(.04) A?? in coprecipitated samples, and about two Zn-Fe neighbors at the same distance in adsorption samples. In both sets of samples, the fitted coordination number of second shell cations decreases as sorption density increases, indicating changes in the number and type of available complexing sites or the onset of competitive precipitation processes. Comparison of our results with the possible geometries for surface complexes and precipitates suggests that the Zn sorption complexes are inner sphere and at lowest adsorption densities are bidentate, sharing apical oxygens with adjacent edge-sharing Fe(O,OH)6 octahedra. Coprecipitation samples have complexes with similar geometry, but these are polydentate, sharing apices with more than two adjacent edge-sharing Fe(O,OH)6 polyhedra. The results are inconsistent with Zn entering the ferrihydrite structure (i.e., solid solution formation) or formation of other Zn-Fe precipitates. The fitted Zn-Fe coordination numbers drop with increasing Zn density with a minimum of about 0.8(.2) at Zn/(Zn + Fe) of 0.08 or more. This change appears to be attributable to the onset of precipitation of zinc hydroxide polymers with mainly tetrahedral Zn coordination. At the highest loadings studied, the nature of the complexes changes further, and a second type of precipitate forms. This has a structure based on a brucite layer topology, with mainly octahedral Zn coordination. Amorphous zinc hydroxide samples prepared for comparison had a closely similar local structure. Analysis of the Fe K-edge EXAFS is consistent with surface complexation reactions and surface precipitation at high Zn loadings with little or no Fe-Zn solid solution formation. The formation of Zn-containing precipitates at solution conditions two or more orders of magnitude below their solubility limit is compared with other sorption and spectroscopic studies that describe similar behavior. Copyright ?? 2002 Elsevier Science Ltd.

  10. A novel conformation of gel grown biologically active cadmium nicotinate

    NASA Astrophysics Data System (ADS)

    Nair, Lekshmi P.; Bijini, B. R.; Divya, R.; Nair, Prabitha B.; Eapen, S. M.; Dileep Kumar, B. S.; Nishanth Kumar, S.; Nair, C. M. K.; Deepa, M.; Rajendra Babu, K.

    2017-11-01

    The elimination of toxic heavy metals by the formation of stable co-ordination compounds with biologically active ligands is applicable in drug designing. A new crystalline complex of cadmium with nicotinic acid is grown at ambient temperature using the single gel diffusion method in which the crystal structure is different from those already reported. Single crystal x-ray diffraction reveals the identity of crystal structure belonging to monoclinic system, P21/c space group with cell dimensions a = 17.220 (2) Å, b = 10.2480 (2) Å, c = 7.229(9) Å, β = 91.829(4)°. Powder x-ray diffraction analysis confirmed the crystallinity of the sample. The unidentate mode of co-ordination between the metal atom and the carboxylate group is supported by the Fourier Transform Infra Red spectral data. Thermal analysis ensures the thermal stability of the complex. Kinetic and thermodynamic parameters are also calculated. The stoichiometry of the complex is confirmed by the elemental analysis. The UV-visible spectral analysis shows the wide transparency window of the complex in the visible region. The band gap of the complex is found to be 3.92 eV. The complex shows excellent antibacterial and antifungal activity.

  11. Application of targeted quantitative proteomics analysis in human cerebrospinal fluid using a liquid chromatography matrix-assisted laser desorption/ionization time-of-flight tandem mass spectrometer (LC MALDI TOF/TOF) platform.

    PubMed

    Pan, Sheng; Rush, John; Peskind, Elaine R; Galasko, Douglas; Chung, Kathryn; Quinn, Joseph; Jankovic, Joseph; Leverenz, James B; Zabetian, Cyrus; Pan, Catherine; Wang, Yan; Oh, Jung Hun; Gao, Jean; Zhang, Jianpeng; Montine, Thomas; Zhang, Jing

    2008-02-01

    Targeted quantitative proteomics by mass spectrometry aims to selectively detect one or a panel of peptides/proteins in a complex sample and is particularly appealing for novel biomarker verification/validation because it does not require specific antibodies. Here, we demonstrated the application of targeted quantitative proteomics in searching, identifying, and quantifying selected peptides in human cerebrospinal spinal fluid (CSF) using a matrix-assisted laser desorption/ionization time-of-flight tandem mass spectrometer (MALDI TOF/TOF)-based platform. The approach involved two major components: the use of isotopic-labeled synthetic peptides as references for targeted identification and quantification and a highly selective mass spectrometric analysis based on the unique characteristics of the MALDI instrument. The platform provides high confidence for targeted peptide detection in a complex system and can potentially be developed into a high-throughput system. Using the liquid chromatography (LC) MALDI TOF/TOF platform and the complementary identification strategy, we were able to selectively identify and quantify a panel of targeted peptides in the whole proteome of CSF without prior depletion of abundant proteins. The effectiveness and robustness of the approach associated with different sample complexity, sample preparation strategies, as well as mass spectrometric quantification were evaluated. Other issues related to chromatography separation and the feasibility for high-throughput analysis were also discussed. Finally, we applied targeted quantitative proteomics to analyze a subset of previously identified candidate markers in CSF samples of patients with Parkinson's disease (PD) at different stages and Alzheimer's disease (AD) along with normal controls.

  12. Local kernel nonparametric discriminant analysis for adaptive extraction of complex structures

    NASA Astrophysics Data System (ADS)

    Li, Quanbao; Wei, Fajie; Zhou, Shenghan

    2017-05-01

    The linear discriminant analysis (LDA) is one of popular means for linear feature extraction. It usually performs well when the global data structure is consistent with the local data structure. Other frequently-used approaches of feature extraction usually require linear, independence, or large sample condition. However, in real world applications, these assumptions are not always satisfied or cannot be tested. In this paper, we introduce an adaptive method, local kernel nonparametric discriminant analysis (LKNDA), which integrates conventional discriminant analysis with nonparametric statistics. LKNDA is adept in identifying both complex nonlinear structures and the ad hoc rule. Six simulation cases demonstrate that LKNDA have both parametric and nonparametric algorithm advantages and higher classification accuracy. Quartic unilateral kernel function may provide better robustness of prediction than other functions. LKNDA gives an alternative solution for discriminant cases of complex nonlinear feature extraction or unknown feature extraction. At last, the application of LKNDA in the complex feature extraction of financial market activities is proposed.

  13. Language Sample Analysis and Elicitation Technique Effects in Bilingual Children with and without Language Impairment

    ERIC Educational Resources Information Center

    Kapantzoglou, Maria; Fergadiotis, Gerasimos; Restrepo, M. Adelaida

    2017-01-01

    Purpose: This study examined whether the language sample elicitation technique (i.e., storytelling and story-retelling tasks with pictorial support) affects lexical diversity (D), grammaticality (grammatical errors per communication unit [GE/CU]), sentence length (mean length of utterance in words [MLUw]), and sentence complexity (subordination…

  14. Lexical decision as an endophenotype for reading comprehension: An exploration of an association

    PubMed Central

    NAPLES, ADAM; KATZ, LEN; GRIGORENKO, ELENA L.

    2012-01-01

    Based on numerous suggestions in the literature, we evaluated lexical decision (LD) as a putative endophenotype for reading comprehension by investigating heritability estimates and segregation analyses parameter estimates for both of these phenotypes. Specifically, in a segregation analysis of a large sample of families, we established that there is little to no overlap between genes contributing to LD and reading comprehension and that the genetic mechanism behind LD derived from this analysis appears to be more complex than that for reading comprehension. We conclude that in our sample, LD is not a good candidate as an endophenotype for reading comprehension, despite previous suggestions from the literature. Based on this conclusion, we discuss the role and benefit of the endophenotype approach in studies of complex human cognitive functions. PMID:23062302

  15. Deferiprone, a non-toxic reagent for determination of iron in samples via sequential injection analysis

    NASA Astrophysics Data System (ADS)

    Pragourpun, Kraivinee; Sakee, Uthai; Fernandez, Carlos; Kruanetr, Senee

    2015-05-01

    We present for the first time the use of deferiprone as a non-toxic complexing agent for the determination of iron by sequential injection analysis in pharmaceuticals and food samples. The method was based on the reaction of Fe(III) and deferiprone in phosphate buffer at pH 7.5 to give a Fe(III)-deferiprone complex, which showed a maximum absorption at 460 nm. Under the optimum conditions, the linearity range for iron determination was found over the range of 0.05-3.0 μg mL-1 with a correlation coefficient (r2) of 0.9993. The limit of detection and limit of quantitation were 0.032 μg mL-1 and 0.055 μg mL-1, respectively. The relative standard deviation (%RSD) of the method was less than 5.0% (n = 11), and the percentage recovery was found in the range of 96.0-104.0%. The proposed method was satisfactorily applied for the determination of Fe(III) in pharmaceuticals, water and food samples with a sampling rate of 60 h-1.

  16. The Analysis of Cyanide and Its Breakdown Products in Biological Samples

    DTIC Science & Technology

    2010-01-01

    simultaneous GC-mass spectrometric (MS) analysis of cyanide and thiocyanate, and Funazo et al. (53) quantita- tively methylated cyanide and thiocyanate for...selective membrane electrode for thiocyanate ion based on a bis-taurine- salicylic binuclear copper(II) complex as ionophore. Chinese Journal of Chemistry

  17. Multiple parallel mass spectrometry for lipid and vitamin D analysis

    USDA-ARS?s Scientific Manuscript database

    Liquid chromatography (LC) coupled to mass spectrometry (MS) has become the method of choice for analysis of complex lipid samples. Two types of ionization sources have emerged as the most commonly used to couple LC to MS: atmospheric pressure chemical ionization (APCI) and electrospray ionization ...

  18. Spatial analysis of NDVI readings with difference sampling density

    USDA-ARS?s Scientific Manuscript database

    Advanced remote sensing technologies provide research an innovative way of collecting spatial data for use in precision agriculture. Sensor information and spatial analysis together allow for a complete understanding of the spatial complexity of a field and its crop. The objective of the study was...

  19. Meta-analysis, complexity, and heterogeneity: a qualitative interview study of researchers' methodological values and practices.

    PubMed

    Lorenc, Theo; Felix, Lambert; Petticrew, Mark; Melendez-Torres, G J; Thomas, James; Thomas, Sian; O'Mara-Eves, Alison; Richardson, Michelle

    2016-11-16

    Complex or heterogeneous data pose challenges for systematic review and meta-analysis. In recent years, a number of new methods have been developed to meet these challenges. This qualitative interview study aimed to understand researchers' understanding of complexity and heterogeneity and the factors which may influence the choices researchers make in synthesising complex data. We conducted interviews with a purposive sample of researchers (N = 19) working in systematic review or meta-analysis across a range of disciplines. We analysed data thematically using a framework approach. Participants reported using a broader range of methods and data types in complex reviews than in traditional reviews. A range of techniques are used to explore heterogeneity, but there is some debate about their validity, particularly when applied post hoc. Technical considerations of how to synthesise complex evidence cannot be isolated from questions of the goals and contexts of research. However, decisions about how to analyse data appear to be made in a largely informal way, drawing on tacit expertise, and their relation to these broader questions remains unclear.

  20. Capillary zone electrophoresis for analysis of phytochelatins and other thiol peptides in complex biological samples derivatized with monobromobimane.

    PubMed

    Perez-Rama, Mónica; Torres Vaamonde, Enrique; Abalde Alonso, Julio

    2005-02-01

    A new method to improve the analysis of phytochelatins and their precursors (cysteine, gamma-Glu-Cys, and glutathione) derivatized with monobromobimane (mBrB) in complex biological samples by capillary zone electrophoresis is described. The effects of the background electrolyte pH, concentration, and different organic additives (acetonitrile, methanol, and trifluoroethanol) on the separation were studied to achieve optimum resolution and number of theoretical plates of the analyzed compounds in the electropherograms. Optimum separation of the thiol peptides was obtained with 150 mM phosphate buffer at pH 1.60. Separation efficiency was improved when 2.5% v/v methanol was added to the background electrolyte. The electrophoretic conditions were 13 kV and capillary dimensions with 30 cm length from the inlet to the detector (38 cm total length) and 50 microm inner diameter. The injection was by pressure at 50 mbar for 17 s. Under these conditions, the separation between desglycyl-peptides and phytochelatins was also achieved. We also describe the optimum conditions for the derivatization of biological samples with mBrB to increase electrophoretic sensitivity and number of theoretical plates. The improved method was shown to be simple, reproducible, selective, and accurate in measuring thiol peptides in complex biological samples, the detection limit being 2.5 microM glutathione at a wavelength of 390 nm.

  1. Separation of Gd-humic complexes and Gd-based magnetic resonance imaging contrast agent in river water with QAE-Sephadex A-25 for the fractionation analysis.

    PubMed

    Matsumiya, Hiroaki; Inoue, Hiroto; Hiraide, Masataka

    2014-10-01

    Gadolinium complexed with naturally occurring, negatively charged humic substances (humic and fulvic acids) was collected from 500 mL of sample solution onto a column packed with 150 mg of a strongly basic anion-exchanger (QAE-Sephadex A-25). A Gd-based magnetic resonance imaging contrast agent (diethylenetriamine-N,N,N',N″,N″-pentaacetato aquo gadolinium(III), Gd-DTPA(2-)) was simultaneously collected on the same column. The Gd-DTPA complex was desorbed by anion-exchange with 50mM tetramethylammonium sulfate, leaving the Gd-humic complexes on the column. The Gd-humic complexes were subsequently dissociated with 1M nitric acid to desorb the humic fraction of Gd. The two-step desorption with small volumes of the eluting agents allowed the 100-fold preconcentration for the fractionation analysis of Gd at low ng L(-1) levels by inductively coupled plasma-mass spectrometry (ICP-MS). On the other hand, Gd(III) neither complexed with humic substances nor DTPA, i.e., free species, was not sorbed on the column. The free Gd in the effluent was preconcentrated 100-fold by a conventional solid-phase extraction with an iminodiacetic acid-type chelating resin and determined by ICP-MS. The proposed analytical fractionation method was applied to river water samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Metagenomic analyses of the late Pleistocene permafrost - additional tools for reconstruction of environmental conditions

    NASA Astrophysics Data System (ADS)

    Rivkina, Elizaveta; Petrovskaya, Lada; Vishnivetskaya, Tatiana; Krivushin, Kirill; Shmakova, Lyubov; Tutukina, Maria; Meyers, Arthur; Kondrashov, Fyodor

    2016-04-01

    A comparative analysis of the metagenomes from two 30 000-year-old permafrost samples, one of lake-alluvial origin and the other from late Pleistocene Ice Complex sediments, revealed significant differences within microbial communities. The late Pleistocene Ice Complex sediments (which have been characterized by the absence of methane with lower values of redox potential and Fe2+ content) showed a low abundance of methanogenic archaea and enzymes from both the carbon and nitrogen cycles, but a higher abundance of enzymes associated with the sulfur cycle. The metagenomic and geochemical analyses described in the paper provide evidence that the formation of the sampled late Pleistocene Ice Complex sediments likely took place under much more aerobic conditions than lake-alluvial sediments.

  3. Novel immunoassay formats for integrated microfluidic circuits: diffusion immunoassays (DIA)

    NASA Astrophysics Data System (ADS)

    Weigl, Bernhard H.; Hatch, Anson; Kamholz, Andrew E.; Yager, Paul

    2000-03-01

    Novel designs of integrated fluidic microchips allow separations, chemical reactions, and calibration-free analytical measurements to be performed directly in very small quantities of complex samples such as whole blood and contaminated environmental samples. This technology lends itself to applications such as clinical diagnostics, including tumor marker screening, and environmental sensing in remote locations. Lab-on-a-Chip based systems offer many advantages over traditional analytical devices: They consume extremely low volumes of both samples and reagents. Each chip is inexpensive and small. The sampling-to-result time is extremely short. They perform all analytical functions, including sampling, sample pretreatment, separation, dilution, and mixing steps, chemical reactions, and detection in an integrated microfluidic circuit. Lab-on-a-Chip systems enable the design of small, portable, rugged, low-cost, easy to use, yet extremely versatile and capable diagnostic instruments. In addition, fluids flowing in microchannels exhibit unique characteristics ('microfluidics'), which allow the design of analytical devices and assay formats that would not function on a macroscale. Existing Lab-on-a-chip technologies work very well for highly predictable and homogeneous samples common in genetic testing and drug discovery processes. One of the biggest challenges for current Labs-on-a-chip, however, is to perform analysis in the presence of the complexity and heterogeneity of actual samples such as whole blood or contaminated environmental samples. Micronics has developed a variety of Lab-on-a-Chip assays that can overcome those shortcomings. We will now present various types of novel Lab-on-a-Chip-based immunoassays, including the so-called Diffusion Immunoassays (DIA) that are based on the competitive laminar diffusion of analyte molecules and tracer molecules into a region of the chip containing antibodies that target the analyte molecules. Advantages of this technique are a reduction in reagents, higher sensitivity, minimal preparation of complex samples such as blood, real-time calibration, and extremely rapid analysis.

  4. Effect-directed analysis supporting monitoring of aquatic ...

    EPA Pesticide Factsheets

    Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone but tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and faces increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxi

  5. Absolute Configuration of 3-METHYLCYCLOHEXANONE by Chiral Tag Rotational Spectroscopy and Vibrational Circular Dichroism

    NASA Astrophysics Data System (ADS)

    Evangelisti, Luca; Holdren, Martin S.; Mayer, Kevin J.; Smart, Taylor; West, Channing; Pate, Brooks

    2017-06-01

    The absolute configuration of 3-methylcyclohexanone was established by chiral tag rotational spectroscopy measurements using 3-butyn-2-ol as the tag partner. This molecule was chosen because it is a benchmark molecule for vibrational circular dichroism (VCD) measurements. A comparison of the analysis approaches of chiral tag rotational spectroscopy and VCD will be presented. One important issue in chiral analysis by both methods is the conformational flexibility of the molecule being analyzed. The analysis of conformational composition of samples will be illustrated. In this case, the high spectral resolution of molecular rotational spectroscopy and potential for spectral simplification by conformational cooling in the pulsed jet expansion are advantages for chiral tag spectroscopy. The computational chemistry requirements for the two methods will also be discussed. In this case, the need to perform conformer searches for weakly bound complexes and to perform reasonably high level quantum chemistry geometry optimizations on these complexes makes the computational time requirements less favorable for chiral tag rotational spectroscopy. Finally, the issue of reliability of the determination of the absolute configuration will be considered. In this case, rotational spectroscopy offers a "gold standard" analysis method through the determination of the ^{13}C-substitution structure of the complex between 3-methylcyclohexanone and an enantiopure sample of the 3-butyn-2-ol tag.

  6. An Enhanced K-Means Algorithm for Water Quality Analysis of The Haihe River in China.

    PubMed

    Zou, Hui; Zou, Zhihong; Wang, Xiaojing

    2015-11-12

    The increasing volume and complexity of data arising from uncertain environments is today's reality. In order to identify water quality effectively and reliably, this paper presents a modified fast clustering algorithm for water quality analysis. The approach adopts a varying-weights K-means clustering algorithm to analyze water monitoring data. The varying-weights scheme uses the best weighting indicator selected by a modified indicator weight self-adjustment algorithm based on K-means, named MIWAS-K-means. The new clustering algorithm avoids cases in which the iteration margin is not calculated. With the fast clustering analysis, we can identify the quality of water samples. The algorithm is applied in water quality analysis of the Haihe River (China) data obtained by the monitoring network over a period of eight years (2006-2013) with four indicators at seven different sites (2078 samples). Both the theoretical and simulated results demonstrate that the algorithm is efficient and reliable for water quality analysis of the Haihe River. In addition, the algorithm can be applied to more complex data matrices with high dimensionality.
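
    As a rough illustration of the weighted K-means idea described above, the sketch below clusters a synthetic monitoring matrix after applying per-indicator weights through feature scaling. The data, weights, and cluster count are assumptions made for illustration; this is not the MIWAS-K-means implementation.

    ```python
    # Indicator-weighted K-means on a synthetic water-quality matrix.
    # Weights, cluster count, and data are illustrative assumptions.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Hypothetical monitoring matrix: rows = samples, columns = 4 indicators,
    # mimicking the 2078-sample x 4-indicator layout described above.
    X = rng.normal(size=(2078, 4))

    weights = np.array([0.35, 0.30, 0.25, 0.10])   # assumed indicator weights

    # Scaling standardized indicators by sqrt(weight) makes ordinary Euclidean
    # K-means behave like K-means with a weighted distance metric.
    X_weighted = StandardScaler().fit_transform(X) * np.sqrt(weights)

    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X_weighted)
    print(np.bincount(labels))   # number of samples assigned to each quality cluster
    ```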

  7. Comparison of Metabolomics Approaches for Evaluating the Variability of Complex Botanical Preparations: Green Tea (Camellia sinensis) as a Case Study.

    PubMed

    Kellogg, Joshua J; Graf, Tyler N; Paine, Mary F; McCune, Jeannine S; Kvalheim, Olav M; Oberlies, Nicholas H; Cech, Nadja B

    2017-05-26

    A challenge that must be addressed when conducting studies with complex natural products is how to evaluate their complexity and variability. Traditional methods of quantifying a single or a small range of metabolites may not capture the full chemical complexity of multiple samples. Different metabolomics approaches were evaluated to discern how they facilitated comparison of the chemical composition of commercial green tea [Camellia sinensis (L.) Kuntze] products, with the goal of capturing the variability of commercially used products and selecting representative products for in vitro or clinical evaluation. Three metabolomics-related methods, namely untargeted ultraperformance liquid chromatography-mass spectrometry (UPLC-MS), targeted UPLC-MS, and untargeted quantitative ¹H NMR, were employed to characterize 34 commercially available green tea samples. Of these methods, untargeted UPLC-MS was most effective at discriminating between green tea, green tea supplement, and non-green-tea products. A method using reproduced correlation coefficients calculated from principal component analysis models was developed to quantitatively compare differences among samples. The obtained results demonstrated the utility of metabolomics employing UPLC-MS data for evaluating similarities and differences between complex botanical products.
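
    The comparison via "reproduced correlation coefficients" is described only at a high level above; the sketch below shows one plausible reading of the general idea, correlating sample profiles reconstructed from a low-rank PCA model. The feature matrix and component count are synthetic assumptions, not the published data or procedure.

    ```python
    # Hedged sketch: comparing botanical samples via profiles reproduced from a
    # PCA model. The 34 x 500 feature matrix and 5-component model are synthetic
    # placeholders, not the published green tea data or method.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    X = rng.normal(size=(34, 500))          # 34 tea samples x 500 UPLC-MS features

    X_std = StandardScaler().fit_transform(X)
    pca = PCA(n_components=5).fit(X_std)
    X_reproduced = pca.inverse_transform(pca.transform(X_std))   # rank-5 reconstruction

    # Pairwise Pearson correlations between reproduced sample profiles:
    # higher values suggest chemically more similar products.
    corr = np.corrcoef(X_reproduced)
    print(corr.shape)              # (34, 34) sample-by-sample similarity matrix
    print(round(corr[0, 1], 3))    # similarity of sample 0 vs sample 1
    ```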

  8. Sample entropy and regularity dimension in complexity analysis of cortical surface structure in early Alzheimer's disease and aging.

    PubMed

    Chen, Ying; Pham, Tuan D

    2013-05-15

    We apply for the first time the sample entropy (SampEn) and regularity dimension model for measuring signal complexity to quantify the structural complexity of the brain on MRI. The concept of the regularity dimension is based on the theory of chaos for studying nonlinear dynamical systems, where power laws and an entropy measure are adopted to develop the regularity dimension for modeling a mathematical relationship between the frequencies with which information about signal regularity changes across scales. The sample entropy and regularity dimension of MRI-based brain structural complexity are computed for elder adults with early Alzheimer's disease (AD) and age- and gender-matched non-demented controls, as well as for a wide range of ages from young people to elder adults. A significantly higher global cortical structure complexity is detected in AD individuals (p<0.001). Increases in SampEn and the regularity dimension are also found to accompany aging, which might indicate an age-related exacerbation of cortical structural irregularity. The provided model can potentially be used as an imaging biomarker for early prediction of AD and age-related cognitive decline. Copyright © 2013 Elsevier B.V. All rights reserved.
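
    Sample entropy itself is a generic regularity statistic; as a hedged sketch, the compact SampEn implementation below is applied to a synthetic one-dimensional signal rather than cortical surface data, with conventional default choices for the embedding dimension and tolerance.

    ```python
    # Simplified sample entropy (SampEn) sketch applied to a synthetic signal.
    import numpy as np

    def sample_entropy(x, m=2, r=0.2):
        """SampEn(m, r) of a 1-D array; r is given as a fraction of the SD."""
        x = np.asarray(x, dtype=float)
        r *= x.std()
        n = len(x)

        def count_matches(length):
            # Embed the signal into overlapping templates of the given length.
            templates = np.array([x[i:i + length] for i in range(n - length)])
            count = 0
            for i in range(len(templates)):
                # Chebyshev distance to later templates (self-matches excluded).
                d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += np.sum(d <= r)
            return count

        b, a = count_matches(m), count_matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    rng = np.random.default_rng(2)
    signal = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.1 * rng.normal(size=1000)
    print(round(sample_entropy(signal), 3))   # lower values indicate a more regular signal
    ```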

  9. Comparison of Metabolomics Approaches for Evaluating the Variability of Complex Botanical Preparations: Green Tea (Camellia sinensis) as a Case Study

    PubMed Central

    2017-01-01

    A challenge that must be addressed when conducting studies with complex natural products is how to evaluate their complexity and variability. Traditional methods of quantifying a single or a small range of metabolites may not capture the full chemical complexity of multiple samples. Different metabolomics approaches were evaluated to discern how they facilitated comparison of the chemical composition of commercial green tea [Camellia sinensis (L.) Kuntze] products, with the goal of capturing the variability of commercially used products and selecting representative products for in vitro or clinical evaluation. Three metabolomic-related methods—untargeted ultraperformance liquid chromatography–mass spectrometry (UPLC-MS), targeted UPLC-MS, and untargeted, quantitative ¹H NMR—were employed to characterize 34 commercially available green tea samples. Of these methods, untargeted UPLC-MS was most effective at discriminating between green tea, green tea supplement, and non-green-tea products. A method using reproduced correlation coefficients calculated from principal component analysis models was developed to quantitatively compare differences among samples. The obtained results demonstrated the utility of metabolomics employing UPLC-MS data for evaluating similarities and differences between complex botanical products. PMID:28453261

  10. Nontargeted Screening Method for Illegal Additives Based on Ultrahigh-Performance Liquid Chromatography-High-Resolution Mass Spectrometry.

    PubMed

    Fu, Yanqing; Zhou, Zhihui; Kong, Hongwei; Lu, Xin; Zhao, Xinjie; Chen, Yihui; Chen, Jia; Wu, Zeming; Xu, Zhiliang; Zhao, Chunxia; Xu, Guowang

    2016-09-06

    Identification of illegal additives in complex matrixes is important in the food safety field. In this study a nontargeted screening strategy was developed to find illegal additives based on ultrahigh-performance liquid chromatography-high-resolution mass spectrometry (UHPLC-HRMS). First, an analytical method for possible illegal additives in complex matrixes was established including fast sample pretreatment, accurate UHPLC separation, and HRMS detection. Second, an efficient data processing and differential analysis workflow was suggested and applied to find potential risk compounds. Third, structure elucidation of risk compounds was performed by (1) searching online databases [Metlin and the Human Metabolome Database (HMDB)] and an in-house database which was established under the above-defined conditions of UHPLC-HRMS analysis and contains information on retention time, mass spectra (MS), and tandem mass spectra (MS/MS) of 475 illegal additives, (2) analyzing fragment ions, and (3) referring to fragmentation rules. Fish was taken as an example to show the usefulness of the nontargeted screening strategy, and six additives were found in suspected fish samples. Quantitative analysis was further carried out to determine the contents of these compounds. The satisfactory application of this strategy in fish samples means that it can also be used in the screening of illegal additives in other kinds of food samples.
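
    The structure-elucidation step above hinges on matching measured features against an in-house database by retention time and mass spectra. The sketch below shows only the simplest accurate-mass plus retention-time matching step; the library entries and tolerances are hypothetical, not the 475-compound database.

    ```python
    # Hedged sketch of the database-matching step in a nontargeted screen:
    # detected UHPLC-HRMS features are matched to a small, made-up additive
    # library by accurate mass and retention time.
    from dataclasses import dataclass

    @dataclass
    class LibraryEntry:
        name: str
        mz: float        # [M+H]+ accurate mass
        rt_min: float    # retention time, minutes

    # Hypothetical entries (placeholder values, not the real in-house library).
    library = [
        LibraryEntry("malachite green", 329.2012, 7.8),
        LibraryEntry("chloramphenicol", 323.0196, 5.2),
    ]

    def match_feature(mz, rt, ppm_tol=5.0, rt_tol=0.2):
        """Return library entries within the m/z (ppm) and RT tolerances."""
        hits = []
        for e in library:
            ppm_error = abs(mz - e.mz) / e.mz * 1e6
            if ppm_error <= ppm_tol and abs(rt - e.rt_min) <= rt_tol:
                hits.append((e.name, round(ppm_error, 2)))
        return hits

    print(match_feature(329.2020, 7.75))   # -> [('malachite green', 2.43)]
    ```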

  11. Sample preparation techniques for the determination of trace residues and contaminants in foods.

    PubMed

    Ridgway, Kathy; Lalljie, Sam P D; Smith, Roger M

    2007-06-15

    The determination of trace residues and contaminants in complex matrices, such as food, often requires extensive sample extraction and preparation prior to instrumental analysis. Sample preparation is often the bottleneck in analysis and there is a need to minimise the number of steps to reduce both time and sources of error. There is also a move towards more environmentally friendly techniques, which use less solvent and smaller sample sizes. Smaller sample size becomes important when dealing with real life problems, such as consumer complaints and alleged chemical contamination. Optimal sample preparation can reduce analysis time, sources of error, enhance sensitivity and enable unequivocal identification, confirmation and quantification. This review considers all aspects of sample preparation, covering general extraction techniques, such as Soxhlet and pressurised liquid extraction, microextraction techniques such as liquid phase microextraction (LPME) and more selective techniques, such as solid phase extraction (SPE), solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The applicability of each technique in food analysis, particularly for the determination of trace organic contaminants in foods is discussed.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Distler, T. M.; Wong, C. M.

    Runoff-water samples for the first, third, and fourth quarters of 1975 were analyzed for pesticide residues at LLL and independently by the LFE Environmental Analysis Laboratories. For the compounds analyzed, upper limits to possible contamination were placed conservatively at the low parts-per-billion level. In addition, soil samples were also analyzed. Future work will continue to include quarterly sampling and will be broadened in scope to include quantitative analysis of a larger number of compounds. A study of recovery efficiency is planned. Because of the high backgrounds on soil samples together with the uncertainties introduced by the cleanup procedures, there is little hope of evaluating the distribution of a complex mixture of pesticides among the aqueous and solid phases in a drainage sample. No further sampling of soil from the streambed is therefore contemplated.

  13. Conventional and Advanced Separations in Mass Spectrometry-Based Metabolomics: Methodologies and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heyman, Heino M.; Zhang, Xing; Tang, Keqi

    2016-02-16

    Metabolomics is the quantitative analysis of all metabolites in a given sample. Due to the chemical complexity of the metabolome, optimal separations are required for comprehensive identification and quantification of sample constituents. This chapter provides an overview of both conventional and advanced separations methods in practice for reducing the complexity of metabolite extracts delivered to the mass spectrometer detector, and covers gas chromatography (GC), liquid chromatography (LC), capillary electrophoresis (CE), supercritical fluid chromatography (SFC) and ion mobility spectrometry (IMS) separation techniques coupled with mass spectrometry (MS) as both uni-dimensional and multi-dimensional approaches.

  14. Evanescent wave sensing and absorption analysis of herbal tea floral extracts in the presence of silver metal complexes

    NASA Astrophysics Data System (ADS)

    Priyamvada, V. C.; Radhakrishnan, P.

    2017-06-01

    Fiber optic evanescent wave sensors are used for studying the absorption properties of biochemical samples. The studies give precise information regarding the actual ingredients of the samples. Recent studies report the corrosion of silver in the presence of glucose dissolved in water and heated to a temperature of 70°C. Based on this report, evanescent absorption studies were carried out on hibiscus herbal tea floral extracts in the presence of silver metal complexes. These studies can also lead to the evaluation of the purity of the herbal tea extract.

  15. Lunar sample analysis

    NASA Technical Reports Server (NTRS)

    Housley, R. M.

    1983-01-01

    The evolution of the lunar regolith under solar wind and micrometeorite bombardment is discussed as well as the size distribution of ultrafine iron in lunar soil. The most important characteristics of complex graphite, sulfide, arsenide, palladium, and platinum mineralization in a pegmatoid pyroxenite of the Stillwater Complex in Montana are examined. Oblique reflected light micrographs and backscattered electron SEM images of the graphite associations are included.

  16. Synthesis, characterization and solid-state properties of [Zn(Hdmmthiol)2]·2H2O complex

    NASA Astrophysics Data System (ADS)

    Dagdelen, Fethi; Aydogdu, Yildirim; Dey, Kamalendu; Biswas, Susobhan

    2016-05-01

    The zinc(II) complex with a tridentate thiohydrazone ligand was prepared by a metal template reaction: the reaction of diacetylmonoxime and morpholine N-thiohydrazide with Zn(OAc)2·2H2O under reflux yielded the [Zn(Hdmmthiol)2]·2H2O complex. The complex was characterized by a combination of protocols including elemental analysis, UV-vis, FT-IR, TG and PXRD. The temperature dependence of the electrical conductivity and the optical properties of the [Zn(Hdmmthiol)2]·2H2O complex, whose ligand is referred to as H2dammthiol, were studied. The powder X-ray diffraction (PXRD) method was used to investigate the crystal structure of the sample, which showed the zinc complex to be a member of the triclinic system. The zinc complex was determined to have n-type conductivity, as demonstrated by the hot probe measurements, and to display a direct optical transition with a band gap of 2.52 eV as determined by optical absorption analysis.

  17. Growth and characterization of barium complex of 1,3,5-triazinane-2,4,6-trione in gel: a corrosion inhibiting material

    NASA Astrophysics Data System (ADS)

    Divya, R.; Nair, Lekshmi P.; Bijini, B. R.; Nair, C. M. K.; Babu, K. Rajendra

    2018-05-01

    Good quality prismatic crystals of an industrially applicable, corrosion-inhibiting barium complex of 1,3,5-triazinane-2,4,6-trione have been grown by the conventional gel method. The crystal structure, packing, and nature of bonds are revealed by single crystal X-ray diffraction analysis. The crystal has a three-dimensional polymeric structure with a triclinic crystal system and the space group P-1. Powder X-ray diffraction analysis confirms its crystalline nature. The functional groups present in the crystal are identified by Fourier transform infrared spectroscopy. Elemental analysis confirms the stoichiometry of the elements present in the complex. Thermogravimetric analysis and differential thermal analysis reveal its good thermal stability. Optical properties such as the band gap, refractive index and extinction coefficient are evaluated from the UV-visible spectral analysis. The material's singular property, its corrosion inhibition efficiency achieved through adsorption of the sample molecules, is determined by the weight loss method.

  18. Utility of the summation chromatographic peak integration function to avoid manual reintegrations in the analysis of targeted analytes

    USDA-ARS?s Scientific Manuscript database

    As sample preparation and analytical techniques have improved, data handling has become the main limitation in automated high-throughput analysis of targeted chemicals in many applications. Conventional chromatographic peak integration functions rely on complex software and settings, but untrustwor...

  19. Impact analysis of off-road-vehicle use on vegetation in the Grand Mere dune environment. [Lake Michigan

    NASA Technical Reports Server (NTRS)

    Schultink, G. (Principal Investigator)

    1977-01-01

    The author has identified the following significant results. A linear regression between percent nonvegetative land and the time variable was completed for the two sample areas. Sample area no. 1 showed an average vegetation loss of 1.901% per year, while the loss for sample area no. 2 amounted to 5.889% per year. Two basic reasons for the difference were assumed to play a role: the difference in access potential and the amount of already fragmented vegetation complexes in existence during the first year of the comparative analysis - 1970. Sample area no. 2 was located closer to potential access points and was more fragmented initially.

  20. Structural system reliability calculation using a probabilistic fault tree analysis method

    NASA Technical Reports Server (NTRS)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computationally intensive calculations. A computer program has been developed to implement the PFTA.
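
    As a toy illustration of the importance-sampling step, the sketch below estimates the probability of a two-event AND-gate top event defined on standard normal variables. The reliability indices, the fixed (non-adaptive) sampling density, and the gate structure are assumptions for illustration, not the PFTA procedure itself.

    ```python
    # Toy importance-sampling estimate of a fault-tree top event (A AND B), where
    # each bottom event fails when a standard normal variable exceeds its assumed
    # reliability index. Plain (non-adaptive) importance sampling, for illustration only.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)
    beta = np.array([3.0, 2.5])            # assumed reliability indices of the bottom events

    n = 50_000
    u = rng.normal(size=(n, 2)) + beta     # sample from a density shifted toward the failure region
    failed = (u[:, 0] > beta[0]) & (u[:, 1] > beta[1])   # AND gate: both bottom events fail

    # Likelihood ratio of the nominal N(0, I) density to the shifted sampling density.
    w = np.exp(-(u @ beta) + 0.5 * beta @ beta)
    p_top = np.mean(failed * w)

    print(f"importance-sampling estimate: {p_top:.2e}")
    print(f"exact value for this toy case: {norm.cdf(-beta[0]) * norm.cdf(-beta[1]):.2e}")
    ```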

  1. Poisoning: fact or fiction?

    PubMed

    Flanagan, Robert J

    2012-01-01

    Analytical toxicology is a complex discipline. Simply detecting a poison in a biological sample does not necessarily mean that the individual from whom the sample was obtained had been poisoned. An analysis can prove exposure and perhaps give an indication of the magnitude of exposure, but the results have to be placed in proper context. Even if sampling was ante-mortem an analysis does not necessarily prove the effects that the drug or poison had on the victim immediately before or at the time of sampling. Tolerance is one big issue, the mechanism of exposure (how the drug got into the body) is another, and of course with post-mortem work there are always additional considerations such as site of sample collection and the possibility of post-mortem change in analyte concentration. There are also questions of quality and reliability, and whether a particular analysis and the interpretation placed upon the result are appropriate in a particular case.

  2. MASS SPECTROMETRY-BASED METABOLOMICS

    PubMed Central

    Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.

    2007-01-01

    This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and specifically mass spectrometry has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475

  3. Assessment of antibody library diversity through next generation sequencing and technical error compensation

    PubMed Central

    Lisi, Simonetta; Chirichella, Michele; Arisi, Ivan; Goracci, Martina; Cremisi, Federico; Cattaneo, Antonino

    2017-01-01

    Antibody libraries are important resources to derive antibodies to be used for a wide range of applications, from structural and functional studies to intracellular protein interference studies to developing new diagnostics and therapeutics. Whatever the goal, the key parameter for an antibody library is its complexity (also known as diversity), i.e. the number of distinct elements in the collection, which directly reflects the probability of finding in the library an antibody against a given antigen, of sufficiently high affinity. Quantitative evaluation of antibody library complexity and quality has been for a long time inadequately addressed, due to the high similarity and length of the sequences of the library. Complexity was usually inferred by the transformation efficiency and tested by fingerprinting and/or sequencing of a few hundred random library elements. Inferring complexity from such a small sampling is, however, very rudimentary and gives limited information about the real diversity, because complexity does not scale linearly with sample size. Next-generation sequencing (NGS) has opened new ways to tackle antibody library complexity and quality assessment. However, much remains to be done to fully exploit the potential of NGS for the quantitative analysis of antibody repertoires and to overcome current limitations. To obtain a more reliable antibody library complexity estimate here we show a new, PCR-free, NGS approach to sequence antibody libraries on Illumina platform, coupled to a new bioinformatic analysis and software (Diversity Estimator of Antibody Library, DEAL) that allows reliable estimation of the complexity, taking the sequencing error into consideration. PMID:28505201

  4. Assessment of antibody library diversity through next generation sequencing and technical error compensation.

    PubMed

    Fantini, Marco; Pandolfini, Luca; Lisi, Simonetta; Chirichella, Michele; Arisi, Ivan; Terrigno, Marco; Goracci, Martina; Cremisi, Federico; Cattaneo, Antonino

    2017-01-01

    Antibody libraries are important resources to derive antibodies to be used for a wide range of applications, from structural and functional studies to intracellular protein interference studies to developing new diagnostics and therapeutics. Whatever the goal, the key parameter for an antibody library is its complexity (also known as diversity), i.e. the number of distinct elements in the collection, which directly reflects the probability of finding in the library an antibody against a given antigen, of sufficiently high affinity. Quantitative evaluation of antibody library complexity and quality has been for a long time inadequately addressed, due to the high similarity and length of the sequences of the library. Complexity was usually inferred by the transformation efficiency and tested by fingerprinting and/or sequencing of a few hundred random library elements. Inferring complexity from such a small sampling is, however, very rudimentary and gives limited information about the real diversity, because complexity does not scale linearly with sample size. Next-generation sequencing (NGS) has opened new ways to tackle antibody library complexity and quality assessment. However, much remains to be done to fully exploit the potential of NGS for the quantitative analysis of antibody repertoires and to overcome current limitations. To obtain a more reliable antibody library complexity estimate here we show a new, PCR-free, NGS approach to sequence antibody libraries on Illumina platform, coupled to a new bioinformatic analysis and software (Diversity Estimator of Antibody Library, DEAL) that allows reliable estimation of the complexity, taking the sequencing error into consideration.
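
    To illustrate why complexity "does not scale linearly with sample size", the sketch below draws increasingly deep random samples from a synthetic clone library and applies a generic Chao1-style richness estimator. The library, read depths, and estimator are illustrative assumptions; this is not the DEAL software or its error model.

    ```python
    # Why shallow sampling under-estimates library complexity: observed richness
    # grows sub-linearly with read depth, and a Chao1-style estimator only
    # approaches the true value at sufficient depth. All values are synthetic.
    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(4)
    true_complexity = 1_000_000                                   # distinct clones
    library = rng.integers(0, true_complexity, size=5_000_000)    # clone id per cell

    def chao1(sample):
        counts = Counter(sample)
        s_obs = len(counts)                                  # distinct clones seen
        f1 = sum(1 for c in counts.values() if c == 1)       # singletons
        f2 = sum(1 for c in counts.values() if c == 2)       # doubletons
        return s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))        # bias-corrected Chao1

    for n_reads in (1_000, 50_000, 2_000_000):
        sample = rng.choice(library, size=n_reads, replace=False)
        print(n_reads, len(set(sample)), int(chao1(sample)))
    ```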

  5. Online combination of reversed-phase/reversed-phase and porous graphitic carbon liquid chromatography for multicomponent separation of proteomics and glycoproteomics samples.

    PubMed

    Lam, Maggie P Y; Lau, Edward; Siu, S O; Ng, Dominic C M; Kong, Ricky P W; Chiu, Philip C N; Yeung, William S B; Lo, Clive; Chu, Ivan K

    2011-11-01

    In this paper, we describe an online combination of reversed-phase/reversed-phase (RP-RP) and porous graphitic carbon (PGC) liquid chromatography (LC) for multicomponent analysis of proteomics and glycoproteomics samples. The online RP-RP portion of this system provides comprehensive 2-D peptide separation based on sequence hydrophobicity at pH 2 and 10. Hydrophilic components (e.g. glycans, glycopeptides) that are not retained by RP are automatically diverted downstream to a PGC column for further trapping and separation. Furthermore, the RP-RP/PGC system can provide simultaneous extension of the hydropathy range and peak capacity for analysis. Using an 11-protein mixture, we found that the system could efficiently separate native peptides and released N-glycans from a single sample. We evaluated the applicability of the system to the analysis of complex biological samples using 25 μg of the lysate of a human choriocarcinoma cell line (BeWo), confidently identifying a total of 1449 proteins from a single experiment and up to 1909 distinct proteins from technical triplicates. The PGC fraction increased the sequence coverage through the inclusion of additional hydrophilic sequences that accounted for up to 6.9% of the total identified peptides from the BeWo lysate, with apparent preference for the detection of hydrophilic motifs and proteins. In addition, RP-RP/PGC is applicable to the analysis of complex glycomics samples, as demonstrated by our analysis of a concanavalin A-extracted glycoproteome from human serum; in total, 134 potentially N-glycosylated serum proteins, 151 possible N-glycosylation sites, and more than 40 possible N-glycan structures recognized by concanavalin A were simultaneously detected. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Preservation of commonly applied fluorescent tracers in complex water samples

    NASA Astrophysics Data System (ADS)

    Cao, Viet; Schaffer, Mario; Jin, Yulan; Licha, Tobias

    2017-06-01

    Water sample preservation and pre-treatment are important steps for achieving accurate and reproducible results from tracer tests. However, this is particularly challenging for complex water mixtures prior to fluorescence analysis. In this study, the interference of iron and calcium precipitation with nine commonly applied conservative tracers, uranine, eosin, 1-naphthalene sulfonate, 1,5-naphthalene disulfonate, 2,6-naphthalene disulfonate, 4-amino-1-naphthalene sulfonate, 6-hydroxy-2-naphthalene sulfonate, 1,3,6-naphthalene trisulfonate, and 1,3,6,8-pyrene tetrasulfonate, was investigated in batch experiments. In general, the observed results are influenced by precipitates. A technique consisting of pH adjustment and centrifugation is described for preserving samples and avoiding the impact of these precipitates on the tracer test results.

  7. Challenges to evaluating complex interventions: a content analysis of published papers

    PubMed Central

    2013-01-01

    Background There is continuing interest among practitioners, policymakers and researchers in the evaluation of complex interventions stemming from the need to further develop the evidence base on the effectiveness of healthcare and public health interventions, and an awareness that evaluation becomes more challenging if interventions are complex. We undertook an analysis of published journal articles in order to identify aspects of complexity described by writers, the fields in which complex interventions are being evaluated and the challenges experienced in design, implementation and evaluation. This paper outlines the findings of this documentary analysis. Methods The PubMed electronic database was searched for the ten year period, January 2002 to December 2011, using the term “complex intervention*” in the title and/or abstract of a paper. We extracted text from papers to a table and carried out a thematic analysis to identify authors’ descriptions of challenges faced in developing, implementing and evaluating complex interventions. Results The search resulted in a sample of 221 papers of which full text of 216 was obtained and 207 were included in the analysis. The 207 papers broadly cover clinical, public health and methodological topics. Challenges described included the content and standardisation of interventions, the impact of the people involved (staff and patients), the organisational context of implementation, the development of outcome measures, and evaluation. Conclusions Our analysis of these papers suggests that more detailed reporting of information on outcomes, context and intervention is required for complex interventions. Future revisions to reporting guidelines for both primary and secondary research may need to take aspects of complexity into account to enhance their value to both researchers and users of research. PMID:23758638

  8. [Social-professional status, identity, social participation and media utilization. Analysis of a complex dynamics].

    PubMed

    Laflamme, Simon; Roggero, Pascal; Southcott, Chris

    2010-08-01

    This article examines the link between the domain and level of occupation, on the one hand, and use of media, including internet, on the other. It adds to this investigation an analysis of identity in its relation to media use and accessibility. It challenges the hypothesis of a strong correlation between level of occupation and use and accessibility to media. It reveals complex phenomena of social homogenization and differentiation. Data is extracted from a sample of workers who completed a questionnaire which focused on use of media.

  9. Simultaneous separation of inorganic anions and metal-citrate complexes on a zwitterionic stationary phase with on-column complexation.

    PubMed

    Nesterenko, Ekaterina P; Nesterenko, Pavel N; Paull, Brett

    2008-12-05

    The retention and separation selectivity of inorganic anions and on-column derivatised negatively charged citrate or oxalate metal complexes on reversed-phase stationary phases dynamically coated with N-(dodecyl-N,N-dimethylammonio)undecanoate (DDMAU) has been investigated. The retention mechanism for the metal-citrate complexes was predominantly anion exchange, although the amphoteric/zwitterionic nature of the stationary phase coating undoubtedly also contributed to the unusual separation selectivity shown. Separation of a mixture of 10 inorganic anions and metal cations was achieved using a 20 cm monolithic DDMAU-modified column and a 1 mM citrate eluent at pH 4.0 and a flow rate of 0.8 mL/min. Selectivity was found to be strongly pH dependent, allowing additional scope for manipulation of solute retention, and thus application to complex samples. This is illustrated with the analysis of an acidic mine drainage sample with a range of inorganic anions and transition metal cations, varying significantly in their concentration levels.

  10. Biomolecular signatures of diabetic wound healing by structural mass spectrometry

    PubMed Central

    Hines, Kelly M.; Ashfaq, Samir; Davidson, Jeffrey M.; Opalenik, Susan R.; Wikswo, John P.; McLean, John A.

    2013-01-01

    Wound fluid is a complex biological sample containing byproducts associated with the wound repair process. Contemporary techniques, such as immunoblotting and enzyme immunoassays, require extensive sample manipulation and do not permit the simultaneous analysis of multiple classes of biomolecular species. Structural mass spectrometry, implemented as ion mobility-mass spectrometry (IM-MS), comprises two sequential, gas-phase dispersion techniques well suited for the study of complex biological samples due to its ability to separate and simultaneously analyze multiple classes of biomolecules. As a model of diabetic wound healing, polyvinyl alcohol (PVA) sponges were inserted subcutaneously into non-diabetic (control) and streptozotocin-induced diabetic rats to elicit a granulation tissue response and to collect acute wound fluid. Sponges were harvested at days 2 or 5 to capture different stages of the early wound healing process. Utilizing IM-MS, statistical analysis, and targeted ultra-performance liquid chromatography (UPLC) analysis, biomolecular signatures of diabetic wound healing have been identified. The protein S100-A8 was highly enriched in the wound fluids collected from day 2 diabetic rats. Lysophosphatidylcholine (20:4) and cholic acid also contributed significantly to the differences between diabetic and control groups. This report provides a generalized workflow for wound fluid analysis demonstrated with a diabetic rat model. PMID:23452326

  11. Sample preparation for SFM imaging of DNA, proteins, and DNA-protein complexes.

    PubMed

    Ristic, Dejan; Sanchez, Humberto; Wyman, Claire

    2011-01-01

    Direct imaging is invaluable for understanding the mechanism of complex genome transactions where proteins work together to organize, transcribe, replicate, and repair DNA. Scanning (or atomic) force microscopy is an ideal tool for this, providing 3D information on molecular structure at nanometer resolution from defined components. This is a convenient and practical addition to in vitro studies as readily obtainable amounts of purified proteins and DNA are required. The images reveal structural details on the size and location of DNA-bound proteins as well as protein-induced arrangement of the DNA, which are directly correlated in the same complexes. In addition, even from static images, the different forms observed and their relative distributions can be used to deduce the variety and stability of different complexes that are necessarily involved in dynamic processes. Recently available instruments that combine fluorescence with topographic imaging allow the identification of specific molecular components in complex assemblies, which broadens the applications and increases the information obtained from direct imaging of molecular complexes. We describe here basic methods for preparing samples of proteins, DNA, and complexes of the two for topographic imaging and quantitative analysis. We also describe special considerations for combined fluorescence and topographic imaging of molecular complexes.

  12. Corrosion in drinking water pipes: the importance of green rusts.

    PubMed

    Swietlik, Joanna; Raczyk-Stanisławiak, Urszula; Piszora, Paweł; Nawrocki, Jacek

    2012-01-01

    The complex crystallographic composition of corrosion products is studied by diffraction methods, and results obtained after different pre-treatments of samples are compared. Green rusts are found to be much more abundant in corrosion scales than has been assumed so far. The characteristics and crystallographic composition of corrosion scales and deposits suspended in steady waters were analyzed by X-ray diffraction (XRD). The necessity of examining corrosion products under wet conditions is indicated. Drying the samples before analysis is shown to substantially change the crystallographic phases originally present in corrosion products. On sample drying, the unstable green rusts are converted into more stable phases such as goethite and lepidocrocite, while the content of magnetite and siderite decreases. Three types of green rusts were identified in wet materials sampled from tubercles. Unexpectedly, in almost all corrosion scale samples significant amounts of the least stable green rust, in its chloride form, were detected. Analysis of corrosion products suspended in steady water, which remained between tubercles and possibly in their interiors, revealed a complex crystallographic composition of the sampled material. Goethite, lepidocrocite and magnetite as well as low amounts of siderite and quartz were present in all samples. Six different forms of green rusts were identified in the deposits separated from steady waters, the most abundant being carbonate green rust GR(CO(3)(2-))(I). Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. A Multiplex PCR for the Simultaneous Detection and Genotyping of the Echinococcus granulosus Complex

    PubMed Central

    Boubaker, Ghalia; Macchiaroli, Natalia; Prada, Laura; Cucher, Marcela A.; Rosenzvit, Mara C.; Ziadinov, Iskender; Deplazes, Peter; Saarma, Urmas; Babba, Hamouda; Gottstein, Bruno; Spiliotis, Markus

    2013-01-01

    Echinococcus granulosus is characterized by high intra-specific variability (genotypes G1–G10) and according to the new molecular phylogeny of the genus Echinococcus, the E. granulosus complex has been divided into E. granulosus sensu stricto (G1–G3), E. equinus (G4), E. ortleppi (G5), and E. canadensis (G6–G10). The molecular characterization of E. granulosus isolates is fundamental to understand the spatio-temporal epidemiology of this complex in many endemic areas with the simultaneous occurrence of different Echinococcus species and genotypes. To simplify the genotyping of the E. granulosus complex we developed a single-tube multiplex PCR (mPCR) allowing three levels of discrimination: (i) Echinococcus genus, (ii) E. granulosus complex in common, and (iii) the specific genotype within the E. granulosus complex. The methodology was established with known DNA samples of the different strains/genotypes, confirmed on 42 already genotyped samples (Spain: 22 and Bulgaria: 20) and then successfully applied on 153 unknown samples (Tunisia: 114, Algeria: 26 and Argentina: 13). The sensitivity threshold of the mPCR was found to be 5 ng Echinococcus DNA in a mixture of up to 1 µg of foreign DNA and the specificity was 100% when template DNA from closely related members of the genus Taenia was used. In addition to DNA samples, the mPCR can be carried out directly on boiled hydatid fluid or on alkaline-lysed frozen or fixed protoscoleces, thus avoiding classical DNA extractions. However, when using Echinococcus eggs obtained from fecal samples of infected dogs, the sensitivity of the mPCR was low (<40%). Thus, except for copro analysis, the mPCR described here has a high potential for a worldwide application in large-scale molecular epidemiological studies on the Echinococcus genus. PMID:23350011

  14. A multiplex PCR for the simultaneous detection and genotyping of the Echinococcus granulosus complex.

    PubMed

    Boubaker, Ghalia; Macchiaroli, Natalia; Prada, Laura; Cucher, Marcela A; Rosenzvit, Mara C; Ziadinov, Iskender; Deplazes, Peter; Saarma, Urmas; Babba, Hamouda; Gottstein, Bruno; Spiliotis, Markus

    2013-01-01

    Echinococcus granulosus is characterized by high intra-specific variability (genotypes G1-G10) and according to the new molecular phylogeny of the genus Echinococcus, the E. granulosus complex has been divided into E. granulosus sensu stricto (G1-G3), E. equinus (G4), E. ortleppi (G5), and E. canadensis (G6-G10). The molecular characterization of E. granulosus isolates is fundamental to understand the spatio-temporal epidemiology of this complex in many endemic areas with the simultaneous occurrence of different Echinococcus species and genotypes. To simplify the genotyping of the E. granulosus complex we developed a single-tube multiplex PCR (mPCR) allowing three levels of discrimination: (i) Echinococcus genus, (ii) E. granulosus complex in common, and (iii) the specific genotype within the E. granulosus complex. The methodology was established with known DNA samples of the different strains/genotypes, confirmed on 42 already genotyped samples (Spain: 22 and Bulgaria: 20) and then successfully applied on 153 unknown samples (Tunisia: 114, Algeria: 26 and Argentina: 13). The sensitivity threshold of the mPCR was found to be 5 ng Echinococcus DNA in a mixture of up to 1 µg of foreign DNA and the specificity was 100% when template DNA from closely related members of the genus Taenia was used. In addition to DNA samples, the mPCR can be carried out directly on boiled hydatid fluid or on alkaline-lysed frozen or fixed protoscoleces, thus avoiding classical DNA extractions. However, when using Echinococcus eggs obtained from fecal samples of infected dogs, the sensitivity of the mPCR was low (<40%). Thus, except for copro analysis, the mPCR described here has a high potential for a worldwide application in large-scale molecular epidemiological studies on the Echinococcus genus.

  15. Tile-Based Fisher Ratio Analysis of Comprehensive Two-Dimensional Gas Chromatography Time-of-Flight Mass Spectrometry (GC × GC – TOFMS) Data using a Null Distribution Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsons, Brendon A.; Marney, Luke C.; Siegler, William C.

    Multi-dimensional chromatographic instrumentation produces information-rich and chemically complex data containing meaningful chemical signals and/or chemical patterns. Two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC – TOFMS) is a prominent instrumental platform that has been applied extensively for discovery-based experimentation, where samples are sufficiently volatile or amenable to derivatization. Use of GC × GC – TOFMS and associated data analysis strategies aims to uncover meaningful chemical signals or chemical patterns. However, for complex samples, meaningful chemical information is often buried in a background of less meaningful chemical signal and noise. In this report, we utilize the tile-based F-ratio software in concert with the standard addition method by spiking non-native chemicals into a diesel fuel matrix at low concentrations. While the previous work studied the concentration range of 100-1000 ppm, the current study focuses on the 0 ppm to 100 ppm analyte spike range. This study demonstrates the sensitivity and selectivity of the tile-based F-ratio software for discovery of true positives in the non-targeted analysis of a chemically complex and analytically challenging sample matrix. By exploring the low concentration spike levels, we gain a better understanding of the limit of detection (LOD) of the tile-based F-ratio software with GC × GC – TOFMS data.
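
    As a hedged sketch of the underlying Fisher-ratio idea (the ratio of between-class to within-class variance computed per tile), the code below ranks synthetic tile signals from spiked versus unspiked runs. The tile count, noise level, and spiked analyte location are invented; this is not the authors' tile-based software.

    ```python
    # Per-tile Fisher ratio (between-class / within-class variance) for ranking
    # tiles by their ability to distinguish spiked from unspiked runs.
    import numpy as np

    rng = np.random.default_rng(5)
    # Summed signal per tile: 6 spiked and 6 unspiked runs x 200 tiles (synthetic).
    spiked   = rng.normal(100, 5, size=(6, 200))
    unspiked = rng.normal(100, 5, size=(6, 200))
    spiked[:, 17] += 40          # one tile carries a genuine spiked analyte

    def fisher_ratio(a, b):
        """Two-class F-ratio per tile: between-class over pooled within-class variance."""
        grand = np.vstack([a, b]).mean(axis=0)
        n_a, n_b = len(a), len(b)
        between = n_a * (a.mean(0) - grand) ** 2 + n_b * (b.mean(0) - grand) ** 2   # df = 1
        within = (((a - a.mean(0)) ** 2).sum(0) + ((b - b.mean(0)) ** 2).sum(0)) / (n_a + n_b - 2)
        return between / within

    f = fisher_ratio(spiked, unspiked)
    print(int(np.argmax(f)))     # -> 17, the tile containing the true positive
    ```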

  16. Fusarium diversity in soil using a specific molecular approach and a cultural approach.

    PubMed

    Edel-Hermann, Véronique; Gautheron, Nadine; Mounier, Arnaud; Steinberg, Christian

    2015-04-01

    Fusarium species are ubiquitous in soil. They cause plant and human diseases and can produce mycotoxins. Surveys of Fusarium species diversity in environmental samples usually rely on laborious culture-based methods. In the present study, we have developed a molecular method to analyze Fusarium diversity directly from soil DNA. We designed primers targeting the translation elongation factor 1-alpha (EF-1α) gene and demonstrated their specificity toward Fusarium using a large collection of fungi. We used the specific primers to construct a clone library from three contrasting soils. Sequence analysis confirmed the specificity of the assay, with 750 clones identified as Fusarium and distributed among eight species or species complexes. The Fusarium oxysporum species complex (FOSC) was the most abundant one in the three soils, followed by the Fusarium solani species complex (FSSC). We then compared our molecular approach results with those obtained by isolating Fusarium colonies on two culture media and identifying species by sequencing part of the EF-1α gene. The 750 isolates were distributed into eight species or species complexes, with the same dominant species as with the cloning method. Sequence diversity was much higher in the clone library than in the isolate collection. The molecular approach proved to be a valuable tool to assess Fusarium diversity in environmental samples. Combined with high throughput sequencing, it will allow for in-depth analysis of large numbers of samples. Published by Elsevier B.V.

  17. Slow histidine H/D exchange protocol for thermodynamic analysis of protein folding and stability using mass spectrometry.

    PubMed

    Tran, Duc T; Banerjee, Sambuddha; Alayash, Abdu I; Crumbliss, Alvin L; Fitzgerald, Michael C

    2012-02-07

    Described here is a mass spectrometry-based protocol to study the thermodynamic stability of proteins and protein-ligand complexes using the chemical denaturant dependence of the slow H/D exchange reaction of the imidazole C(2) proton in histidine side chains. The protocol is developed using several model protein systems including: ribonuclease (Rnase) A, myoglobin, bovine carbonic anhydrase (BCA) II, hemoglobin (Hb), and the hemoglobin-haptoglobin (Hb-Hp) protein complex. Folding free energies consistent with those previously determined by other more conventional techniques were obtained for the two-state folding proteins, Rnase A and myoglobin. The protocol successfully detected a previously observed partially unfolded intermediate stabilized in the BCA II folding/unfolding reaction, and it could be used to generate a K(d) value of 0.24 nM for the Hb-Hp complex. The compatibility of the protocol with conventional mass spectrometry-based proteomic sample preparation and analysis methods was also demonstrated in an experiment in which the protocol was used to detect the binding of zinc to superoxide dismutase in the yeast cell lysate sample. The yeast cell sample analyses also helped define the scope of the technique, which requires the presence of globally protected histidine residues in a protein's three-dimensional structure for successful application. © 2011 American Chemical Society

  18. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gritsenko, Marina A.; Xu, Zhe; Liu, Tao

    Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labelling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  19. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS.

    PubMed

    Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D

    2016-01-01

    Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
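
    The "fractionation and concatenation" step is described only in words above; as a minimal sketch of one common concatenation scheme (assumed here, not taken from this protocol), the snippet below pools every Nth high-pH fraction so that each concatenated fraction spans the full elution range.

    ```python
    # Minimal sketch of fraction concatenation: 96 collected high-pH RP fractions
    # are pooled into 24 concatenated fractions by combining every 24th fraction,
    # so each pool mixes early-, mid-, and late-eluting peptides. Counts are assumed.
    n_collected, n_pools = 96, 24
    pools = {p: [f for f in range(n_collected) if f % n_pools == p] for p in range(n_pools)}
    print(pools[0])    # -> [0, 24, 48, 72]: collected fractions combined into pool 0
    ```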

  20. A Tool for Determining the Number of Contributors: Interpreting Complex, Compromised Low-Template Dna Samples

    DTIC Science & Technology

    2017-09-28

    In forensic DNA analysis, the interpretation of a sample acquired from the environment may be dependent upon the assumption on the number of individuals from which the evidence arose. Degraded and... NOCIt results to those obtained when allele counting or maximum likelihood estimator (MLE) methods are employed. NOCIt does not depend upon an AT and

  1. Coaxial test fixture

    DOEpatents

    Praeg, W.F.

    1984-03-30

    This invention pertains to arrangements for performing electrical tests on contact material samples, and in particular for testing contact material test samples in an evacuated environment under high current loads. Frequently, when developing high-current separable contact materials, it is desirable to have at least a preliminary analysis of selected candidate conductor materials. Testing of material samples will hopefully identify materials unsuitable for high current electrical contact without requiring incorporation of the materials into a completed and oftentimes complex structure.

  2. Quantitation of heat-shock proteins in clinical samples using mass spectrometry.

    PubMed

    Kaur, Punit; Asea, Alexzander

    2011-01-01

    Mass spectrometry (MS) is a powerful analytical tool for proteomics research and drug and biomarker discovery. MS enables identification and quantification of known and unknown compounds by revealing their structural and chemical properties. Proper sample preparation for MS-based analysis is a critical step in the proteomics workflow because the quality and reproducibility of sample extraction and preparation for downstream analysis significantly impact the separation and identification capabilities of mass spectrometers. The highly expressed proteins represent potential biomarkers that could aid in diagnosis, therapy, or drug development. Because the proteome is so complex, there is no one standard method for preparing protein samples for MS analysis. Protocols differ depending on the type of sample, source, experiment, and method of analysis. Molecular chaperones play significant roles in almost all biological functions due to their capacity for detecting intracellular denatured/unfolded proteins, initiating refolding or denaturation of such malfolded protein sequences and more recently for their role in the extracellular milieu as chaperokines. In this chapter, we describe the latest techniques for quantitating the expression of molecular chaperones in human clinical samples.

  3. Noise and complexity in human postural control: interpreting the different estimations of entropy.

    PubMed

    Rhea, Christopher K; Silver, Tobin A; Hong, S Lee; Ryu, Joong Hyun; Studenka, Breanna E; Hughes, Charmayne M L; Haddad, Jeffrey M

    2011-03-17

    Over the last two decades, various measures of entropy have been used to examine the complexity of human postural control. In general, entropy measures provide information regarding the health, stability and adaptability of the postural system that is not captured when using more traditional analytical techniques. The purpose of this study was to examine how noise, sampling frequency and time series length influence various measures of entropy when applied to human center of pressure (CoP) data, as well as in synthetic signals with known properties. Such a comparison is necessary to interpret data between and within studies that use different entropy measures, equipment, sampling frequencies or data collection durations. The complexity of synthetic signals with known properties and standing CoP data was calculated using Approximate Entropy (ApEn), Sample Entropy (SampEn) and Recurrence Quantification Analysis Entropy (RQAEn). All signals were examined at varying sampling frequencies and with varying amounts of added noise. Additionally, an increment time series of the original CoP data was examined to remove long-range correlations. Of the three measures examined, ApEn was the least robust to sampling frequency and noise manipulations. Additionally, increased noise led to an increase in SampEn, but a decrease in RQAEn. Thus, noise can yield inconsistent results between the various entropy measures. Finally, the differences between the entropy measures were minimized in the increment CoP data, suggesting that long-range correlations should be removed from CoP data prior to calculating entropy. The various algorithms typically used to quantify the complexity (entropy) of CoP may yield very different results, particularly when sampling frequency and noise are different. The results of this study are discussed within the context of the neural noise and loss of complexity hypotheses.

  4. Precambrian evolution of the Salalah Crystalline Basement from structural analysis and 40Ar/39Ar geochronology

    NASA Astrophysics Data System (ADS)

    Al-Doukhi, Hanadi Abulateef

    The Salalah Crystalline Basement (SCB) is the largest Precambrian exposure in Oman located on the southern margin of the Arabian Plate at the Arabian Sea shore. This work used remote sensing, detailed structural analysis and the analysis of ten samples using 40Ar/39Ar age dating to establish the Precambrian evolution of the SCB by focusing on its central and southwestern parts. This work found that the SCB evolved through four deformational events that shaped its final architecture: (1) Folding and thrusting event that resulted in the emplacement of the Sadh complex atop the Juffa complex. This event resulted in the formation of possibly N-verging nappe structure; (2) Regional folding event around SE- and SW-plunging axes that deformed the regional fabric developed during the N-verging nappe structure and produced map-scale SE- and SW-plunging antiforms shaping the complexes into a semi-dome structure; (3) Strike-slip shearing event that produced a conjugate set of NE-trending sinistral and NW-trending dextral strike-slip shear zones; and (4) Localized SE-directed gravitational collapse manifested by top-to-the-southeast kinematic indicators. Deformation within the SCB might have ceased by 752.2+/-2.7 Ma as indicated by an age given by an undeformed granite. The thermochron of samples collected throughout the SCB complexes shows a single cooling event that occurred between about 800 and 760 Ma. This cooling event could be accomplished by crustal exhumation resulting in regional collapse following the prolonged period of the contractional deformation of the SCB. This makes the SCB a possible metamorphic core complex.

  5. Applicability of direct total reflection X-ray fluorescence analysis for selenium determination in solutions related to environmental and geochemical studies

    NASA Astrophysics Data System (ADS)

    Marguí, E.; Floor, G. H.; Hidalgo, M.; Kregsamer, P.; Roman-Ross, G.; Streli, C.; Queralt, I.

    2010-12-01

    A significant amount of environmental studies related to selenium determination in different environmental compartments have been published in the last years due to the narrow range between the Se nutritious requirement as essential element and toxic effects upon exposure. However, the direct analysis of complex liquid samples like natural waters and extraction solutions presents significant problems related to the low Se concentrations and the complicated matrix of this type of samples. The goal of the present research was to study the applicability of direct TXRF analysis of different type of solutions commonly used in environmental and geochemical studies, confirm the absence or presence of matrix effects and evaluate the limits of detection and accuracy for Se determination in the different matrices. Good analytical results were obtained for the direct analysis of ground and rain water samples with limits of detection for Se two orders of magnitude lower than the permissible Se concentration in drinking waters ([Se] = 10 μg/L) according to the WHO. However, the Se detection limits for more complex liquid samples such as thermal waters and extraction solutions were in the μg/L range due to the presence of high contents of other elements present in the matrix (i.e., Br, Fe, Zn) or the high background of the TXRF spectrum that hamper the Se determination at trace levels. Our results give insight into the possibilities and drawbacks of direct TXRF analysis and to a certain extent the potential applications in the environmental and geochemical field.

  6. Analysis of Metals Concentration in the Soils of SIPCOT Industrial Complex, Cuddalore, Tamil Nadu

    PubMed Central

    Mathivanan, V.; Prabavathi, R.; Prithabai, C.; Selvisabhanayakam

    2010-01-01

    Phytoremediation is a promising area of new research, both for its low cost and great benefit to society in the clean retrieval of contaminated sites. Phytoremediation is the use of living green plants for in situ risk reduction and/or removal of contaminants from contaminated soil, water, sediments, and air. Specially selected or engineered plants are used in the process. The soil samples were taken from Cuddalore Old Town (OT) and the samples from SIPCOT industrial complex, which was the study area and analyzed for various metals concentrations. Fifteen metals have been analyzed by adopting standard procedure. The detection limits of metal concentration are drawn as control. The various (15) metal concentrations in the soil samples were found higher in soil taken from SIPCOT industrial complex, compared with samples taken from Cuddalore OT. In all the observations, it was found that most of the metals like calcium, cadmium, chromium, cobalt, nickel, and zinc showed maximum concentrations, whereas arsenic, antimony, lead, magnesium, sodium have shown minimum concentrations, both when compared with control. From the present study, it was found that the soil collected from SIPCOT complex area were more polluted due to the presence of various industrial effluents, municipal wastes, and sewages when compared with the soil collected from Cuddalore OT. PMID:21170256

  7. Chemical analyses of provided samples

    NASA Technical Reports Server (NTRS)

    Becker, Christopher H.

    1993-01-01

    A batch of four samples were received and chemical analysis was performed of the surface and near surface regions of the samples by the surface analysis by laser ionization (SALI) method. The samples included four one-inch diameter optics labeled windows no. PR14 and PR17 and MgF2 mirrors 9-93 PPPC exp. and control DMES 26-92. The analyses emphasized surface contamination or modification. In these studies, pulsed desorption by 355 nm laser light and single-photon ionization (SPI) above the sample by coherent 118 nm radiation (at approximately 5 x 10(exp 5) W/cm(sup 2)) were used, emphasizing organic analysis. For the two windows with an apparent yellowish contaminant film, higher desorption laser power was needed to provide substantial signals, indicating a less volatile contamination than for the two mirrors. Window PR14 and the 9-93 mirror showed more hydrocarbon components than the other two samples. The mass spectra, which show considerable complexity, are discussed in terms of various potential chemical assignments.

  8. Use of Multivariate Linkage Analysis for Dissection of a Complex Cognitive Trait

    PubMed Central

    Marlow, Angela J.; Fisher, Simon E.; Francks, Clyde; MacPhie, I. Laurence; Cherny, Stacey S.; Richardson, Alex J.; Talcott, Joel B.; Stein, John F.; Monaco, Anthony P.; Cardon, Lon R.

    2003-01-01

    Replication of linkage results for complex traits has been exceedingly difficult, owing in part to the inability to measure the precise underlying phenotype, small sample sizes, genetic heterogeneity, and statistical methods employed in analysis. Often, in any particular study, multiple correlated traits have been collected, yet these have been analyzed independently or, at most, in bivariate analyses. Theoretical arguments suggest that full multivariate analysis of all available traits should offer more power to detect linkage; however, this has not yet been evaluated on a genomewide scale. Here, we conduct multivariate genomewide analyses of quantitative-trait loci that influence reading- and language-related measures in families affected with developmental dyslexia. The results of these analyses are substantially clearer than those of previous univariate analyses of the same data set, helping to resolve a number of key issues. These outcomes highlight the relevance of multivariate analysis for complex disorders for dissection of linkage results in correlated traits. The approach employed here may aid positional cloning of susceptibility genes in a wide spectrum of complex traits. PMID:12587094

  9. Identifying relationships between baseflow geochemistry and land use with synoptic sampling and R-Mode factor analysis

    USGS Publications Warehouse

    Wayland, Karen G.; Long, David T.; Hyndman, David W.; Pijanowski, Bryan C.; Woodhams, Sarah M.; Haak, Sheridan K.

    2003-01-01

    The relationship between land use and stream chemistry is often explored through synoptic sampling rivers at baseflow condition. However, base flow chemistry is likely to vary temporally and spatially with land use. The purpose of our study is to examine the usefulness of the synoptic sampling approach for identifying the relationship between complex land use configurations and stream water quality. This study compares biogeochemical data from three synoptic sampling events representing the temporal variability of baseflow chemistry and land use using R-mode factor analysis. Separate R-mode factor analyses of the data from individual sampling events yielded only two consistent factors. Agricultural activity was associated with elevated levels of Ca2+, Mg2+, alkalinity, and frequently K+, SO42-, and NO3-. Urban areas were associated with higher concentrations of Na+, K+, and Cl-. Other retained factors were not  consistent among sampling events, and some factors were difficult to interpret in the context of biogeochemical sources and processes. When all data were combined, further associations were revealed such as an inverse relationship between the proportion of wetlands and stream nitrate concentrations. We also found that barren lands were associated with elevated sulfate levels. This research suggests that an individual sampling event is unlikely to characterize adequately the complex processes controlling interactions between land uses and stream chemistry. Combining data collected over two years during three synoptic sampling events appears to enhance our ability to understand processes linking stream chemistry and land use.  

  10. An enhanced droplet-based liquid microjunction surface sampling system coupled with HPLC-ESI-MS/MS for spatially resolved analysis

    DOE PAGES

    Van Berkel, Gary J.; Weiskittel, Taylor M.; Kertesz, Vilmos

    2014-11-07

    Droplet-based liquid microjunction surface sampling coupled with high-performance liquid chromatography (HPLC)-electrospray ionization (ESI)-tandem mass spectrometry (MS/MS) for spatially resolved analysis provides the possibility of effective analysis of complex matrix samples and can provide a greater degree of chemical information from a single spot sample than is typically possible with a direct analysis of an extract. Described here is the setup and enhanced capabilities of a discrete droplet liquid microjunction surface sampling system employing a commercially available CTC PAL autosampler. The system enhancements include incorporation of a laser distance sensor enabling unattended analysis of samples and sample locations of dramatically disparatemore » height as well as reliably dispensing just 0.5 μL of extraction solvent to make the liquid junction to the surface, wherein the extraction spot size was confined to an area about 0.7 mm in diameter; software modifications improving the spatial resolution of sampling spot selection from 1.0 to 0.1 mm; use of an open bed tray system to accommodate samples as large as whole-body rat thin tissue sections; and custom sample/solvent holders that shorten sampling time to approximately 1 min per sample. Lastly, the merit of these new features was demonstrated by spatially resolved sampling, HPLC separation, and mass spectral detection of pharmaceuticals and metabolites from whole-body rat thin tissue sections and razor blade (“crude”) cut mouse tissue.« less

  11. An enhanced droplet-based liquid microjunction surface sampling system coupled with HPLC-ESI-MS/MS for spatially resolved analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Berkel, Gary J.; Weiskittel, Taylor M.; Kertesz, Vilmos

    Droplet-based liquid microjunction surface sampling coupled with high-performance liquid chromatography (HPLC)-electrospray ionization (ESI)-tandem mass spectrometry (MS/MS) for spatially resolved analysis provides the possibility of effective analysis of complex matrix samples and can provide a greater degree of chemical information from a single spot sample than is typically possible with a direct analysis of an extract. Described here is the setup and enhanced capabilities of a discrete droplet liquid microjunction surface sampling system employing a commercially available CTC PAL autosampler. The system enhancements include incorporation of a laser distance sensor enabling unattended analysis of samples and sample locations of dramatically disparatemore » height as well as reliably dispensing just 0.5 μL of extraction solvent to make the liquid junction to the surface, wherein the extraction spot size was confined to an area about 0.7 mm in diameter; software modifications improving the spatial resolution of sampling spot selection from 1.0 to 0.1 mm; use of an open bed tray system to accommodate samples as large as whole-body rat thin tissue sections; and custom sample/solvent holders that shorten sampling time to approximately 1 min per sample. Lastly, the merit of these new features was demonstrated by spatially resolved sampling, HPLC separation, and mass spectral detection of pharmaceuticals and metabolites from whole-body rat thin tissue sections and razor blade (“crude”) cut mouse tissue.« less

  12. An evaluation of information-theoretic methods for detecting structural microbial biosignatures.

    PubMed

    Wagstaff, Kiri L; Corsetti, Frank A

    2010-05-01

    The first observations of extraterrestrial environments will most likely be in the form of digital images. Given an image of a rock that contains layered structures, is it possible to determine whether the layers were created by life (biogenic)? While conclusive judgments about biogenicity are unlikely to be made solely on the basis of image features, an initial assessment of the importance of a given sample can inform decisions about follow-up searches for other types of possible biosignatures (e.g., isotopic or chemical analysis). In this study, we evaluated several quantitative measures that capture the degree of complexity in visible structures, in terms of compressibility (to detect order) and the entropy (spread) of their intensity distributions. Computing complexity inside a sliding analysis window yields a map of each of these features that indicates how they vary spatially across the sample. We conducted experiments on both biogenic and abiogenic terrestrial stromatolites and on laminated structures found on Mars. The degree to which each feature separated biogenic from abiogenic samples (separability) was assessed quantitatively. None of the techniques provided a consistent, statistically significant distinction between all biogenic and abiogenic samples. However, the PNG compression ratio provided the strongest distinction (2.80 in standard deviation units) and could inform future techniques. Increasing the analysis window size or the magnification level, or both, improved the separability of the samples. Finally, data from all four Mars samples plotted well outside the biogenic field suggested by the PNG analyses, although we caution against a direct comparison of terrestrial stromatolites and martian non-stromatolites.

  13. One-step analysis of DNA/chitosan complexes by field-flow fractionation reveals particle size and free chitosan content.

    PubMed

    Ma, Pei Lian; Buschmann, Michael D; Winnik, Françoise M

    2010-03-08

    The composition of samples obtained upon complexation of DNA with chitosan was analyzed by asymmetrical flow field flow fractionation (AF4) with online UV-visible, multiangle light scattering (MALS), and dynamic light scattering (DLS) detectors. A chitosan labeled with rhodamine B to facilitate UV-vis detection of the polycation was complexed with DNA under conditions commonly used for transfection (chitosan glucosamine to DNA phosphate molar ratio of 5). AF4 analysis revealed that 73% of the chitosan-rhodamine remained free in the dispersion and that the DNA/chitosan complexes had a broad size distribution ranging from 20 to 160 nm in hydrodynamic radius. The accuracy of the data was assessed by comparison with data from batch-mode DLS and scanning electron microscopy. The AF4 combined with DLS allowed the characterization of small particles that were not detected by conventional batch-mode DLS. The AF4 analysis will prove to be an important tool in the field of gene therapy because it readily provides, in a single measurement, three important physicochemical parameters of the complexes: the amount of unbound polycation, the hydrodynamic size of the complexes, and their size distribution.

  14. Development of methodology for identification the nature of the polyphenolic extracts by FTIR associated with multivariate analysis

    NASA Astrophysics Data System (ADS)

    Grasel, Fábio dos Santos; Ferrão, Marco Flôres; Wolf, Carlos Rodolfo

    2016-01-01

    Tannins are polyphenolic compounds of complex structures formed by secondary metabolism in several plants. These polyphenolic compounds have different applications, such as drugs, anti-corrosion agents, flocculants, and tanning agents. This study analyses six different type of polyphenolic extracts by Fourier transform infrared spectroscopy (FTIR) combined with multivariate analysis. Through both principal component analysis (PCA) and hierarchical cluster analysis (HCA), we observed well-defined separation between condensed (quebracho and black wattle) and hydrolysable (valonea, chestnut, myrobalan, and tara) tannins. For hydrolysable tannins, it was also possible to observe the formation of two different subgroups between samples of chestnut and valonea and between samples of tara and myrobalan. Among all samples analysed, the chestnut and valonea showed the greatest similarity, indicating that these extracts contain equivalent chemical compositions and structure and, therefore, similar properties.

  15. The Beck Depression Inventory, Second Edition (BDI-II): A Cross-Sample Structural Analysis

    ERIC Educational Resources Information Center

    Strunk, Kamden K.; Lane, Forrest C.

    2017-01-01

    A common concern about the Beck Depression Inventory, Second edition (BDI-II) among researchers in the area of depression has long been the single-factor scoring scheme. Methods exist for making cross-sample comparisons of latent structure but tend to rely on estimation methods that can be imprecise and unnecessarily complex. This study presents a…

  16. Stepping Back to Gain Perspective: Pregnancy Loss History, Depression, and Parenting Capacity in the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B)

    ERIC Educational Resources Information Center

    Price, Sarah Kye

    2008-01-01

    Previous empirical studies of pregnancy loss have predominantly focused on complex grief response and emergent problems associated with future parenting in self-selected samples of bereaved women. This article presents findings from a retrospective secondary data analysis conducted with a racially and ethnically diverse sample of currently…

  17. DGGE and multivariate analysis of a yeast community in spontaneous cocoa fermentation process.

    PubMed

    Ferreira, A C R; Marques, E L S; Dias, J C T; Rezende, R P

    2015-12-28

    Cocoa bean is the main raw material used in the production of chocolate. In southern Bahia, Brazil, cocoa farming and processing is an important economic activity. The fermentation of cocoa is the processing stage that yields important chocolate flavor precursors and complex microbial involvement is essential for this process. In this study, PCR-denaturing gradient gel electrophoreses (DGGE) was used to investigate the diversity of yeasts present during the spontaneous fermentation of cocoa in southern Bahia. The DGGE analysis revealed a richness of 8 to 13 distinct bands of varied intensities among the samples; and samples taken at 24, 36, and 48 h into the fermentation process were found to group with 70% similarity and showed the greatest diversity of bands. Hierarchical clustering showed that all samples had common operational taxonomic units (OTUs) and the highest number of OTUs was found in the 48 h sample. Variations in pH and temperature observed within the fermenting mass over time possibly had direct effects on the composition of the existing microbial community. The findings reported here indicate that a heterogeneous yeast community is involved in the complex cocoa fermentation process, which is known to involve a succession of specialized microorganisms.

  18. A comprehensive and scalable database search system for metaproteomics.

    PubMed

    Chatterjee, Sandip; Stupp, Gregory S; Park, Sung Kyu Robin; Ducom, Jean-Christophe; Yates, John R; Su, Andrew I; Wolan, Dennis W

    2016-08-16

    Mass spectrometry-based shotgun proteomics experiments rely on accurate matching of experimental spectra against a database of protein sequences. Existing computational analysis methods are limited in the size of their sequence databases, which severely restricts the proteomic sequencing depth and functional analysis of highly complex samples. The growing amount of public high-throughput sequencing data will only exacerbate this problem. We designed a broadly applicable metaproteomic analysis method (ComPIL) that addresses protein database size limitations. Our approach to overcome this significant limitation in metaproteomics was to design a scalable set of sequence databases assembled for optimal library querying speeds. ComPIL was integrated with a modified version of the search engine ProLuCID (termed "Blazmass") to permit rapid matching of experimental spectra. Proof-of-principle analysis of human HEK293 lysate with a ComPIL database derived from high-quality genomic libraries was able to detect nearly all of the same peptides as a search with a human database (~500x fewer peptides in the database), with a small reduction in sensitivity. We were also able to detect proteins from the adenovirus used to immortalize these cells. We applied our method to a set of healthy human gut microbiome proteomic samples and showed a substantial increase in the number of identified peptides and proteins compared to previous metaproteomic analyses, while retaining a high degree of protein identification accuracy and allowing for a more in-depth characterization of the functional landscape of the samples. The combination of ComPIL with Blazmass allows proteomic searches to be performed with database sizes much larger than previously possible. These large database searches can be applied to complex meta-samples with unknown composition or proteomic samples where unexpected proteins may be identified. The protein database, proteomic search engine, and the proteomic data files for the 5 microbiome samples characterized and discussed herein are open source and available for use and additional analysis.

  19. Structural Analysis of N- and O-glycans Using ZIC-HILIC/Dialysis Coupled to NMR Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qu, Yi; Feng, Ju; Deng, Shuang

    2014-11-19

    Protein glycosylation, an important and complex post-translational modification (PTM), is involved in various biological processes including the receptor-ligand and cell-cell interaction, and plays a crucial role in many biological functions. However, little is known about the glycan structures of important biological complex samples, and the conventional glycan enrichment strategy (i.e., size-exclusion column [SEC] separation,) prior to nuclear magnetic resonance (NMR) detection is time-consuming and tedious. In this study, we employed SEC, Zwitterionic hydrophilic interaction liquid chromatography (ZIC-HILIC), and ZIC-HILIC coupled with dialysis strategies to enrich the glycopeptides from the pronase E digests of RNase B, followed by NMR analysis ofmore » the glycoconjugate. Our results suggest that the ZIC-HILIC enrichment coupled with dialysis is the most efficient, which was thus applied to the analysis of biological complex sample, the pronase E digest of the secreted proteins from the fungi Aspergillus niger. The NMR spectra revealed that the secreted proteins from A. niger contain both N-linked glycans with a high-mannose core and O-linked glycans bearing mannose and glucose with 1->3 and 1->6 linkages. In all, our study provides compelling evidence that ZIC-HILIC separation coupled to dialysis is superior to the commonly used SEC separation to prepare glycopeptides for the downstream NMR analysis, which could greatly facilitate the future NMR-based glycoproteomics research.« less

  20. Nonlinear Complexity Analysis of Brain fMRI Signals in Schizophrenia

    PubMed Central

    Sokunbi, Moses O.; Gradin, Victoria B.; Waiter, Gordon D.; Cameron, George G.; Ahearn, Trevor S.; Murray, Alison D.; Steele, Douglas J.; Staff, Roger T.

    2014-01-01

    We investigated the differences in brain fMRI signal complexity in patients with schizophrenia while performing the Cyberball social exclusion task, using measures of Sample entropy and Hurst exponent (H). 13 patients meeting diagnostic and Statistical Manual of Mental Disorders, 4th Edition (DSM IV) criteria for schizophrenia and 16 healthy controls underwent fMRI scanning at 1.5 T. The fMRI data of both groups of participants were pre-processed, the entropy characterized and the Hurst exponent extracted. Whole brain entropy and H maps of the groups were generated and analysed. The results after adjusting for age and sex differences together show that patients with schizophrenia exhibited higher complexity than healthy controls, at mean whole brain and regional levels. Also, both Sample entropy and Hurst exponent agree that patients with schizophrenia have more complex fMRI signals than healthy controls. These results suggest that schizophrenia is associated with more complex signal patterns when compared to healthy controls, supporting the increase in complexity hypothesis, where system complexity increases with age or disease, and also consistent with the notion that schizophrenia is characterised by a dysregulation of the nonlinear dynamics of underlying neuronal systems. PMID:24824731

  1. Arsenic distribution and valence state variation studied by fast hierarchical length-scale morphological, compositional, and speciation imaging at the Nanoscopium, Synchrotron Soleil

    NASA Astrophysics Data System (ADS)

    Somogyi, Andrea; Medjoubi, Kadda; Sancho-Tomas, Maria; Visscher, P. T.; Baranton, Gil; Philippot, Pascal

    2017-09-01

    The understanding of real complex geological, environmental and geo-biological processes depends increasingly on in-depth non-invasive study of chemical composition and morphology. In this paper we used scanning hard X-ray nanoprobe techniques in order to study the elemental composition, morphology and As speciation in complex highly heterogeneous geological samples. Multivariate statistical analytical techniques, such as principal component analysis and clustering were used for data interpretation. These measurements revealed the quantitative and valance state inhomogeneity of As and its relation to the total compositional and morphological variation of the sample at sub-μm scales.

  2. Sample size and power considerations in network meta-analysis

    PubMed Central

    2012-01-01

    Background Network meta-analysis is becoming increasingly popular for establishing comparative effectiveness among multiple interventions for the same disease. Network meta-analysis inherits all methodological challenges of standard pairwise meta-analysis, but with increased complexity due to the multitude of intervention comparisons. One issue that is now widely recognized in pairwise meta-analysis is the issue of sample size and statistical power. This issue, however, has so far only received little attention in network meta-analysis. To date, no approaches have been proposed for evaluating the adequacy of the sample size, and thus power, in a treatment network. Findings In this article, we develop easy-to-use flexible methods for estimating the ‘effective sample size’ in indirect comparison meta-analysis and network meta-analysis. The effective sample size for a particular treatment comparison can be interpreted as the number of patients in a pairwise meta-analysis that would provide the same degree and strength of evidence as that which is provided in the indirect comparison or network meta-analysis. We further develop methods for retrospectively estimating the statistical power for each comparison in a network meta-analysis. We illustrate the performance of the proposed methods for estimating effective sample size and statistical power using data from a network meta-analysis on interventions for smoking cessation including over 100 trials. Conclusion The proposed methods are easy to use and will be of high value to regulatory agencies and decision makers who must assess the strength of the evidence supporting comparative effectiveness estimates. PMID:22992327

  3. All-integrated and highly sensitive paper based device with sample treatment platform for Cd2+ immunodetection in drinking/tap waters.

    PubMed

    López Marzo, Adaris M; Pons, Josefina; Blake, Diane A; Merkoçi, Arben

    2013-04-02

    Nowadays, the development of systems, devices, or methods that integrate several process steps into one multifunctional step for clinical, environmental, or industrial purposes constitutes a challenge for many ongoing research projects. Here, we present a new integrated paper based cadmium (Cd(2+)) immunosensing system in lateral flow format, which integrates the sample treatment process with the analyte detection process. The principle of Cd(2+) detection is based on competitive reaction between the cadmium-ethylenediaminetetraacetic acid-bovine serum albumin-gold nanoparticles (Cd-EDTA-BSA-AuNP) conjugate deposited on the conjugation pad strip and the Cd-EDTA complex formed in the analysis sample for the same binding sites of the 2A81G5 monoclonal antibody (mAb), specific to Cd-EDTA but not Cd(2+) free, which is immobilized onto the test line. This platform operates without any sample pretreatment step for Cd(2+) detection thanks to an extra conjugation pad that ensures Cd(2+) complexation with EDTA and interference masking through ovalbumin (OVA). The detection and quantification limits found for the device were 0.1 and 0.4 ppb, respectively, these being the lowest limits reported up to now for metal sensors based on paper. The accuracy of the device was evaluated by addition of known quantities of Cd(2+) to different drinking water samples and subsequent Cd(2+) content analysis. Sample recoveries ranged from 95 to 105% and the coefficient of variation for the intermediate precision assay was less than 10%. In addition, the results obtained here were compared with those obtained with the well-established inductively coupled plasma emission spectroscopy (ICPES) and the analysis of certificate standard samples.

  4. Determination of hexavalent chromium in industrial hygiene samples using ultrasonic extraction and flow injection analysis.

    PubMed

    Wang, J; Ashley, K; Kennedy, E R; Neumeister, C

    1997-11-01

    A simple, fast, and sensitive method was developed for the determination of hexavalent chromium (CrVI) in workplace samples. Ultrasonic extraction in alkaline solutions with 0.05 M (NH4)2SO4-0.05 M NH3 provided good extraction efficiency of CrVI from the sample and allowed the retention of CrVI on an ion-exchange resin (95%). The CrVI in the sample solution was then separated as an anion from trivalent chromium [CrIII] and other cations by elution from the anion-exchange resin with 0.5 M (NH4)2SO4 in 0.1 M NH3 (pH 8) buffer solution. The eluate was then acidified with hydrochloric acid and complexed with 1,5-diphenylcarbazide reagent prior to flow injection analysis. By analyzing samples with and without oxidation of CrIII to CrVI using CeIV, the method can measure CrVI and total Cr. For optimizing the separation and determination procedure, preliminary trials conducted with two certified reference materials (CRMs 013-050 and NIST 1633a) and three spiked samples (ammonia buffer solution, cellulose ester filters and acid washed sand) indicated that the recovery of CrVI was quantitative (> 90%) with this method. The limit of detection for FIA-UV/VIS determination of the Cr-diphenylcarbazone complex was in the sub-nanogram range (0.11 ng). The technique was also applied successfully to a workplace coal fly ash sample that was collected from a power plant and paint chips that were collected from a heating gas pipe and a university building. The principal advantages of this method are its simplicity, sensitivity, speed and potential portability for field analysis.

  5. Effects of ultrasonic treatment on amylose-lipid complex formation and properties of sweet potato starch-based films.

    PubMed

    Liu, Pengfei; Wang, Rui; Kang, Xuemin; Cui, Bo; Yu, Bin

    2018-06-01

    To investigate the effect of ultrasonic treatment on the properties of sweet potato starch and sweet potato starch-based films, the complexing index, thermograms and diffractograms of the sweet potato starch-lauric acid composite were tested, and light transmission, microstructure, and mechanical and moisture barrier properties of the films were measured. The results indicated that the low power density ultrasound was beneficial to the formation of an inclusion complex. In thermograms, the gelatinization enthalpies of the ultrasonically treated starches were lower than those of the untreated sample. With the ultrasonic amplitude increased from 40% to 70%, the melting enthalpy (ΔH) of the inclusion complex gradually decreased. X-ray diffraction revealed that the diffraction intensity of the untreated samples was weaker than that of the ultrasonically treated samples. When the ultrasonic amplitude was above 40%, the diffraction intensity and relative crystallinity of inclusion complex gradually decreased. The scanning electronic microscope showed that the surface of the composite films became smooth after being treated by ultrasonication. Ultrasonication led to a reduction in film surface roughness under atomic force microscopy analysis. The films with ultrasonic treatment exhibited higher light transmission, lower elongation at break, higher tensile strength and better moisture barrier property than those without ultrasonic treatment. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Spermine oxidase (SMO) activity in breast tumor tissues and biochemical analysis of the anticancer spermine analogues BENSpm and CPENSpm.

    PubMed

    Cervelli, Manuela; Bellavia, Gabriella; Fratini, Emiliano; Amendola, Roberto; Polticelli, Fabio; Barba, Marco; Federico, Rodolfo; Signore, Fabrizio; Gucciardo, Giacomo; Grillo, Rosalba; Woster, Patrick M; Casero, Robert A; Mariottini, Paolo

    2010-10-14

    Polyamine metabolism has a critical role in cell death and proliferation representing a potential target for intervention in breast cancer (BC). This study investigates the expression of spermine oxidase (SMO) and its prognostic significance in BC. Biochemical analysis of Spm analogues BENSpm and CPENSpm, utilized in anticancer therapy, was also carried out to test their property in silico and in vitro on the recombinant SMO enzyme. BC tissue samples were analyzed for SMO transcript level and SMO activity. Student's t test was applied to evaluate the significance of the differences in value observed in T and NT samples. The structure modeling analysis of BENSpm and CPENSpm complexes formed with the SMO enzyme and their inhibitory activity, assayed by in vitro experiments, were examined. Both the expression level of SMO mRNA and SMO enzyme activity were significantly lower in BC samples compared to NT samples. The modeling of BENSpm and CPENSpm complexes formed with SMO and their inhibition properties showed that both were good inhibitors. This study shows that underexpression of SMO is a negative marker in BC. The SMO induction is a remarkable chemotherapeutical target. The BENSpm and CPENSpm are efficient SMO inhibitors. The inhibition properties shown by these analogues could explain their poor positive outcomes in Phases I and II of clinical trials.

  7. Spermine oxidase (SMO) activity in breast tumor tissues and biochemical analysis of the anticancer spermine analogues BENSpm and CPENSpm

    PubMed Central

    2010-01-01

    Background Polyamine metabolism has a critical role in cell death and proliferation representing a potential target for intervention in breast cancer (BC). This study investigates the expression of spermine oxidase (SMO) and its prognostic significance in BC. Biochemical analysis of Spm analogues BENSpm and CPENSpm, utilized in anticancer therapy, was also carried out to test their property in silico and in vitro on the recombinant SMO enzyme. Methods BC tissue samples were analyzed for SMO transcript level and SMO activity. Student's t test was applied to evaluate the significance of the differences in value observed in T and NT samples. The structure modeling analysis of BENSpm and CPENSpm complexes formed with the SMO enzyme and their inhibitory activity, assayed by in vitro experiments, were examined. Results Both the expression level of SMO mRNA and SMO enzyme activity were significantly lower in BC samples compared to NT samples. The modeling of BENSpm and CPENSpm complexes formed with SMO and their inhibition properties showed that both were good inhibitors. Conclusions This study shows that underexpression of SMO is a negative marker in BC. The SMO induction is a remarkable chemotherapeutical target. The BENSpm and CPENSpm are efficient SMO inhibitors. The inhibition properties shown by these analogues could explain their poor positive outcomes in Phases I and II of clinical trials. PMID:20946629

  8. A Preliminary Bayesian Analysis of Incomplete Longitudinal Data from a Small Sample: Methodological Advances in an International Comparative Study of Educational Inequality

    ERIC Educational Resources Information Center

    Hsieh, Chueh-An; Maier, Kimberly S.

    2009-01-01

    The capacity of Bayesian methods in estimating complex statistical models is undeniable. Bayesian data analysis is seen as having a range of advantages, such as an intuitive probabilistic interpretation of the parameters of interest, the efficient incorporation of prior information to empirical data analysis, model averaging and model selection.…

  9. Genomic Analysis of Complex Microbial Communities in Wounds

    DTIC Science & Technology

    2012-01-01

    thoroughly in the ecology literature. Permutation Multivariate Analysis of Variance ( PerMANOVA ). We used PerMANOVA to test the null-hypothesis of no...difference between the bacterial communities found within a single wound compared to those from different patients (α = 0.05). PerMANOVA is a...permutation-based version of the multivariate analysis of variance (MANOVA). PerMANOVA uses the distances between samples to partition variance and

  10. Shame, Dissociation, and Complex PTSD Symptoms in Traumatized Psychiatric and Control Groups: Direct and Indirect Associations With Relationship Distress.

    PubMed

    Dorahy, Martin J; Corry, Mary; Black, Rebecca; Matheson, Laura; Coles, Holly; Curran, David; Seager, Lenaire; Middleton, Warwick; Dyer, Kevin F W

    2017-04-01

    Elevated shame and dissociation are common in dissociative identity disorder (DID) and chronic posttraumatic stress disorder (PTSD) and are part of the constellation of symptoms defined as complex PTSD. Previous work examined the relationship between shame, dissociation, and complex PTSD and whether they are associated with intimate relationship anxiety, relationship depression, and fear of relationships. This study investigated these variables in traumatized clinical samples and a nonclinical community group. Participants were drawn from the DID (n = 20), conflict-related chronic PTSD (n = 65), and nonclinical (n = 125) populations and completed questionnaires assessing the variables of interest. A model examining the direct impact of shame and dissociation on relationship functioning, and their indirect effect via complex PTSD symptoms, was tested through path analysis. The DID sample reported significantly higher dissociation, shame, complex PTSD symptom severity, relationship anxiety, relationship depression, and fear of relationships than the other two samples. Support was found for the proposed model, with shame directly affecting relationship anxiety and fear of relationships, and pathological dissociation directly affecting relationship anxiety and relationship depression. The indirect effect of shame and dissociation via complex PTSD symptom severity was evident on all relationship variables. Shame and pathological dissociation are important for not only the effect they have on the development of other complex PTSD symptoms, but also their direct and indirect effects on distress associated with relationships. © 2016 Wiley Periodicals, Inc.

  11. Complexes of Small Chiral Molecules: Propylene Oxide and 3-BUTYN-2OL

    NASA Astrophysics Data System (ADS)

    Evangelisti, Luca; West, Channing; Coles, Ellie; Pate, Brooks

    2017-06-01

    Complexes of propylene oxide with 3-butyn-2-ol were observed in the molecular rotational spectra, and isotopologue analysis allowed for structural determination of the complexes. Using a gas mixture of 0.1% propylene oxide and 0.1% 3-butyn-2-ol in neon, the broadband rotational spectrum was measured in the 2-8 GHz frequency range using a chirped-pulse Fourier transform microwave spectrometer. Four isomers of each diastereomer pair, formed by a hydrogen bond between the two monomers, are identified in quantum chemistry study of the complex using B3LYP-D3BJ with the def2TZVP basis set. The initial measurement used racemic samples of both molecules in order to obtain all possible isomers of the complex in the pulsed jet expansion. A total of six distinct spectra were assigned in the racemic measurement - three for both the homochiral and heterochiral complex. Substitution structures for the most intense homochiral and heterochiral complexes were obtained. These complexes use the two lowest energy conformations of butynol despite conformational cooling of the monomer, resulting in a single identified isomer. This result shows that a wide range monomer conformational geometries need to be examined when performing searches for the lowest energy geometry. Analysis of the diastereomer spectra was used to develop a method for determining the enantiomeric excess of 3-butyn-2-ol and propylene oxide for use as a chiral tag, which could be used in subsequent measurements to determine enantiomeric excess. The sensitivity limits for enantiomeric excess determination and the linearity of the rotational spectroscopy signals as a function of sample enantiomeric excess will be presented.

  12. An Enhanced K-Means Algorithm for Water Quality Analysis of The Haihe River in China

    PubMed Central

    Zou, Hui; Zou, Zhihong; Wang, Xiaojing

    2015-01-01

    The increase and the complexity of data caused by the uncertain environment is today’s reality. In order to identify water quality effectively and reliably, this paper presents a modified fast clustering algorithm for water quality analysis. The algorithm has adopted a varying weights K-means cluster algorithm to analyze water monitoring data. The varying weights scheme was the best weighting indicator selected by a modified indicator weight self-adjustment algorithm based on K-means, which is named MIWAS-K-means. The new clustering algorithm avoids the margin of the iteration not being calculated in some cases. With the fast clustering analysis, we can identify the quality of water samples. The algorithm is applied in water quality analysis of the Haihe River (China) data obtained by the monitoring network over a period of eight years (2006–2013) with four indicators at seven different sites (2078 samples). Both the theoretical and simulated results demonstrate that the algorithm is efficient and reliable for water quality analysis of the Haihe River. In addition, the algorithm can be applied to more complex data matrices with high dimensionality. PMID:26569283

  13. Electrochemical pesticide detection with AutoDip--a portable platform for automation of crude sample analyses.

    PubMed

    Drechsel, Lisa; Schulz, Martin; von Stetten, Felix; Moldovan, Carmen; Zengerle, Roland; Paust, Nils

    2015-02-07

    Lab-on-a-chip devices hold promise for automation of complex workflows from sample to answer with minimal consumption of reagents in portable devices. However, complex, inhomogeneous samples as they occur in environmental or food analysis may block microchannels and thus often cause malfunction of the system. Here we present the novel AutoDip platform which is based on the movement of a solid phase through the reagents and sample instead of transporting a sequence of reagents through a fixed solid phase. A ball-pen mechanism operated by an external actuator automates unit operations such as incubation and washing by consecutively dipping the solid phase into the corresponding liquids. The platform is applied to electrochemical detection of organophosphorus pesticides in real food samples using an acetylcholinesterase (AChE) biosensor. Minimal sample preparation and an integrated reagent pre-storage module hold promise for easy handling of the assay. Detection of the pesticide chlorpyrifos-oxon (CPO) spiked into apple samples at concentrations of 10(-7) M has been demonstrated. This concentration is below the maximum residue level for chlorpyrifos in apples defined by the European Commission.

  14. Exploiting Complexity Information for Brain Activation Detection

    PubMed Central

    Zhang, Yan; Liang, Jiali; Lin, Qiang; Hu, Zhenghui

    2016-01-01

    We present a complexity-based approach for the analysis of fMRI time series, in which sample entropy (SampEn) is introduced as a quantification of the voxel complexity. Under this hypothesis the voxel complexity could be modulated in pertinent cognitive tasks, and it changes through experimental paradigms. We calculate the complexity of sequential fMRI data for each voxel in two distinct experimental paradigms and use a nonparametric statistical strategy, the Wilcoxon signed rank test, to evaluate the difference in complexity between them. The results are compared with the well known general linear model based Statistical Parametric Mapping package (SPM12), where a decided difference has been observed. This is because SampEn method detects brain complexity changes in two experiments of different conditions and the data-driven method SampEn evaluates just the complexity of specific sequential fMRI data. Also, the larger and smaller SampEn values correspond to different meanings, and the neutral-blank design produces higher predictability than threat-neutral. Complexity information can be considered as a complementary method to the existing fMRI analysis strategies, and it may help improving the understanding of human brain functions from a different perspective. PMID:27045838

  15. BIOMONITORING OF EXPOSURE IN FARMWORKER STUDIES

    EPA Science Inventory

    Though biomonitoring has been used in many occupational and environmental health and exposure studies, we are only beginning to understand the complexities and uncertainties involved with the biomonitoring process -- from study design, to sample collection, to chemical analysis -...

  16. Argumentation: A Methodology to Facilitate Critical Thinking.

    PubMed

    Makhene, Agnes

    2017-06-20

    Caring is a difficult nursing activity that involves a complex nature of a human being in need of complex decision-making and problem solving through the critical thinking process. It is mandatory that critical thinking is facilitated in general and in nursing education particularly in order to render care in diverse multicultural patient care settings. This paper aims to describe how argumentation can be used to facilitate critical thinking in learners. A qualitative, exploratory and descriptive design that is contextual was used. Purposive sampling method was used to draw a sample and Miles and Huberman methodology of qualitative analysis was used to analyse data. Lincoln and Guba's strategies were employed to ensure trustworthiness, while Dhai and McQuoid-Mason's principles of ethical consideration were used. Following data analysis the findings were integrated within literature which culminated into the formulation of guidelines that can be followed when using argumentation as a methodology to facilitate critical thinking.

  17. Gradient elution moving boundary electrophoresis enables rapid analysis of acids in complex biomass-derived streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munson, Matthew S.; Karp, Eric M.; Nimlos, Claire T.

    Biomass conversion processes such as pretreatment, liquefaction, and pyrolysis often produce complex mixtures of intermediates that are a substantial challenge to analyze rapidly and reliably. To characterize these streams more comprehensively and efficiently, new techniques are needed to track species through biomass deconstruction and conversion processes. Here, we present the application of an emerging analytical method, gradient elution moving boundary electrophoresis (GEMBE), to quantify a suite of acids in a complex, biomass-derived streams from alkaline pretreatment of corn stover. GEMBE offers distinct advantages over common chromatography-spectrometry analytical approaches in terms of analysis time, sample preparation requirements, and cost of equipment.more » As demonstrated here, GEMBE is able to track 17 distinct compounds (oxalate, formate, succinate, malate, acetate, glycolate, protocatechuate, 3-hydroxypropanoate, lactate, glycerate, 2-hydroxybutanoate, 4-hydroxybenzoate, vanillate, p-coumarate, ferulate, sinapate, and acetovanillone). The lower limit of detection was compound dependent and ranged between 0.9 and 3.5 umol/L. Results from GEMBE were similar to recent results from an orthogonal method based on GCxGC-TOF/MS. Altogether, GEMBE offers a rapid, robust approach to analyze complex biomass-derived samples, and given the ease and convenience of deployment, may offer an analytical solution for online tracking of multiple types of biomass streams.« less

  18. Gradient elution moving boundary electrophoresis enables rapid analysis of acids in complex biomass-derived streams

    DOE PAGES

    Munson, Matthew S.; Karp, Eric M.; Nimlos, Claire T.; ...

    2016-09-27

    Biomass conversion processes such as pretreatment, liquefaction, and pyrolysis often produce complex mixtures of intermediates that are a substantial challenge to analyze rapidly and reliably. To characterize these streams more comprehensively and efficiently, new techniques are needed to track species through biomass deconstruction and conversion processes. Here, we present the application of an emerging analytical method, gradient elution moving boundary electrophoresis (GEMBE), to quantify a suite of acids in a complex, biomass-derived streams from alkaline pretreatment of corn stover. GEMBE offers distinct advantages over common chromatography-spectrometry analytical approaches in terms of analysis time, sample preparation requirements, and cost of equipment.more » As demonstrated here, GEMBE is able to track 17 distinct compounds (oxalate, formate, succinate, malate, acetate, glycolate, protocatechuate, 3-hydroxypropanoate, lactate, glycerate, 2-hydroxybutanoate, 4-hydroxybenzoate, vanillate, p-coumarate, ferulate, sinapate, and acetovanillone). The lower limit of detection was compound dependent and ranged between 0.9 and 3.5 umol/L. Results from GEMBE were similar to recent results from an orthogonal method based on GCxGC-TOF/MS. Altogether, GEMBE offers a rapid, robust approach to analyze complex biomass-derived samples, and given the ease and convenience of deployment, may offer an analytical solution for online tracking of multiple types of biomass streams.« less

  19. Waste Sampling & Characterization Facility (WSCF) Complex Safety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MELOY, R.T.

    2002-04-01

    This document was prepared to analyze the Waste Sampling and Characterization Facility for safety consequences by: Determining radionuclide and highly hazardous chemical inventories; Comparing these inventories to the appropriate regulatory limits; Documenting the compliance status with respect to these limits; and Identifying the administrative controls necessary to maintain this status. The primary purpose of the Waste Sampling and Characterization Facility (WSCF) is to perform low-level radiological and chemical analyses on various types of samples taken from the Hanford Site. These analyses will support the fulfillment of federal, Washington State, and Department of Energy requirements.

  20. Inverse supercritical fluid extraction as a sample preparation method for the analysis of the nanoparticle content in sunscreen agents.

    PubMed

    Müller, David; Cattaneo, Stefano; Meier, Florian; Welz, Roland; de Vries, Tjerk; Portugal-Cohen, Meital; Antonio, Diana C; Cascio, Claudia; Calzolai, Luigi; Gilliland, Douglas; de Mello, Andrew

    2016-04-01

    We demonstrate the use of inverse supercritical carbon dioxide (scCO2) extraction as a novel method of sample preparation for the analysis of complex nanoparticle-containing samples, in our case a model sunscreen agent with titanium dioxide nanoparticles. The sample was prepared for analysis in a simplified process using a lab scale supercritical fluid extraction system. The residual material was easily dispersed in an aqueous solution and analyzed by Asymmetrical Flow Field-Flow Fractionation (AF4) hyphenated with UV- and Multi-Angle Light Scattering detection. The obtained results allowed an unambiguous determination of the presence of nanoparticles within the sample, with almost no background from the matrix itself, and showed that the size distribution of the nanoparticles is essentially maintained. These results are especially relevant in view of recently introduced regulatory requirements concerning the labeling of nanoparticle-containing products. The novel sample preparation method is potentially applicable to commercial sunscreens or other emulsion-based cosmetic products and has important ecological advantages over currently used sample preparation techniques involving organic solvents. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Characterization of microbial communities in heavy crude oil from Saudi Arabia.

    PubMed

    Albokari, Majed; Mashhour, Ibrahim; Alshehri, Mohammed; Boothman, Chris; Al-Enezi, Mousa

    The complete mineralization of crude oil into carbon dioxide, water, inorganic compounds and cellular constituents can be carried out as part of a bioremediation strategy. This involves the transformation of complex organic contaminants into simpler organic compounds by microbial communities, mainly bacteria. A crude oil sample and an oil sludge sample were obtained from the Saudi ARAMCO Oil Company and investigated to identify the microbial communities present using PCR-based culture-independent techniques. In total, analysis of 177 clones yielded 30 distinct bacterial sequences. Clone library analysis showed that the oil sample contained Bacillus, Clostridia and Gammaproteobacteria species, while the sludge sample revealed the presence of members of the Alphaproteobacteria, Betaproteobacteria, Gammaproteobacteria, Clostridia, Sphingobacteria and Flavobacteria. The dominant bacterial classes identified in the oil and sludge samples were Bacilli and Flavobacteria, respectively. Phylogenetic analysis showed that the dominant bacterium in the oil sample has the closest sequence identity to Enterococcus aquimarinus and the dominant bacterium in the sludge sample is most closely related to the uncultured Bacteroidetes bacterium designated AH.KK.

  2. Automated Desalting Apparatus

    NASA Technical Reports Server (NTRS)

    Spencer, Maegan K.; Liu, De-Ling; Kanik, Isik; Beegle, Luther

    2010-01-01

    Because salt and metals can mask the signature of a variety of organic molecules (like amino acids) in any given sample, an automated system to purify complex field samples has been created for the analytical techniques of electrospray ionization/mass spectroscopy (ESI/MS), capillary electrophoresis (CE), and biological assays where unique identification requires at least some processing of complex samples. This development allows for automated sample preparation in the laboratory and analysis of complex samples in the field with multiple types of analytical instruments. Rather than using tedious, exacting protocols for desalting samples by hand, this innovation, called the Automated Sample Processing System (ASPS), takes analytes that have been extracted through high-temperature solvent extraction and introduces them into the desalting column. After 20 minutes, the eluent is produced. This clear liquid can then be directly analyzed by the techniques listed above. The current apparatus, including the computer and power supplies, is sturdy, has an approximate mass of 10 kg and dimensions of about 20 × 20 × 20 cm, and is undergoing further miniaturization. This system currently targets amino acids. For these molecules, a slurry of 1 g cation exchange resin in deionized water is packed into a column of the apparatus. Initial generation of the resin is done by sequentially flowing 2.3 bed volumes of 2N NaOH and 2N HCl (1 mL each) to rinse the resin, followed by 0.5 mL of deionized water. This makes the pH of the resin near neutral and eliminates cross-sample contamination. Afterward, 2.3 mL of extracted sample is loaded into the column onto the top of the resin bed. Because the column is packed tightly, the sample can be applied without disturbing the resin bed. This is a vital step needed to ensure that the analytes adhere to the resin. After the sample is drained, oxalic acid (1 mL, pH 1.6-1.8, adjusted with NH4OH) is pumped into the column. Oxalic acid works as a chelating reagent to bring out metal ions, such as calcium and iron, which would otherwise interfere with amino acid analysis. After oxalic acid, 1 mL 0.01 N HCl and 1 mL deionized water are used to sequentially rinse the resin. Finally, the amino acids, which remain attached to the resin, are eluted using 2.5 M NH4OH (1 mL), and the NH4OH eluent is collected in a vial for analysis.

  3. Synthesis and characterization of ligational behavior of curcumin drug towards some transition metal ions: Chelation effect on their thermal stability and biological activity

    NASA Astrophysics Data System (ADS)

    Refat, Moamen S.

    2013-03-01

    Complexes of Cr(III), Mn(II), Fe(III), Co(II), Ni(II), Cu(II) and Zn(II) with the curcumin ligand, of interest for antitumor activity, were synthesized and characterized by elemental analysis, conductometry, magnetic susceptibility, UV-Vis, IR, Raman, ESR and 1H-NMR spectroscopy, X-ray diffraction analysis of powdered samples and thermal analysis, and screened for antimicrobial activity. The IR spectral data suggested that the ligand behaves as a monobasic bidentate ligand towards the central metal ion, coordinating through the oxygen donor atoms of both the -OH and C=O groups in the keto-enol structure. From the microanalytical data, a 1:2 (metal:ligand) stoichiometry was found for the complexes. The ligand and its metal complexes were screened for antibacterial activity against Escherichia coli, Staphylococcus aureus, Bacillus subtilis and Pseudomonas aeruginosa and for fungicidal activity against Aspergillus flavus and Candida albicans.

  4. Population transcriptomics with single-cell resolution: a new field made possible by microfluidics: a technology for high throughput transcript counting and data-driven definition of cell types.

    PubMed

    Plessy, Charles; Desbois, Linda; Fujii, Teruo; Carninci, Piero

    2013-02-01

    Tissues contain complex populations of cells. Like countries, which are comprised of mixed populations of people, tissues are not homogeneous. Gene expression studies that analyze entire populations of cells from tissues as a mixture are blind to this diversity. Thus, critical information is lost when studying samples rich in specialized but diverse cells such as tumors, iPS colonies, or brain tissue. High throughput methods are needed to address, model and understand the constitutive and stochastic differences between individual cells. Here, we describe microfluidics technologies that utilize a combination of molecular biology and miniaturized labs on chips to study gene expression at the single cell level. We discuss how the characterization of the transcriptome of each cell in a sample will open a new field in gene expression analysis, population transcriptomics, that will change the academic and biomedical analysis of complex samples by defining them as quantified populations of single cells. Copyright © 2013 WILEY Periodicals, Inc.

  5. Bayesian Analysis of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey

    2007-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background. Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analysis (as demonstrated on WMAP 1 and 3 year temperature and polarization data). Development continues for Planck; the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to the total uncertainty in cosmological parameters.
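
    A minimal toy Gibbs sampler may make the scheme above concrete: the signal map and its variance (a stand-in for the power spectrum) are drawn in alternation, conditioned on the data. The data model (d = s + n with white noise and a single signal variance), the prior and all parameter values are illustrative assumptions, not the WMAP or Planck pipeline.

```python
# Toy Gibbs sampler: alternate draws of signal given "spectrum" and "spectrum" given signal.
import numpy as np

rng = np.random.default_rng(0)

n_pix = 500
true_C = 4.0          # true signal variance (toy stand-in for the power spectrum)
noise_var = 1.0       # known white-noise variance per pixel

signal = rng.normal(0.0, np.sqrt(true_C), n_pix)
data = signal + rng.normal(0.0, np.sqrt(noise_var), n_pix)

n_steps = 2000
C = 1.0               # initial guess for the signal variance
C_samples = np.empty(n_steps)

for step in range(n_steps):
    # 1) Signal given spectrum: Wiener-filter mean plus a random fluctuation.
    post_var = 1.0 / (1.0 / C + 1.0 / noise_var)
    post_mean = post_var * data / noise_var
    s = rng.normal(post_mean, np.sqrt(post_var))

    # 2) Spectrum given signal: inverse-gamma conditional (Jeffreys prior on C).
    C = (s @ s) / rng.chisquare(n_pix)
    C_samples[step] = C

burn = 500
print(f"posterior mean of C ~ {C_samples[burn:].mean():.2f} (true value {true_C})")
```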

  6. Surface complexation and precipitate geometry for aqueous Zn(II) sorption on ferrihydrite: II. XANES analysis and simulation

    USGS Publications Warehouse

    Waychunas, G.A.; Fuller, C.C.; Davis, J.A.; Rehr, J.J.

    2003-01-01

    X-ray absorption near-edge spectroscopy (XANES) analysis of sorption complexes has the advantages of high sensitivity (10- to 20-fold greater than extended X-ray absorption fine structure [EXAFS] analysis) and relative ease and speed of data collection (because of the short k-space range). It is thus a potentially powerful tool for characterization of environmentally significant surface complexes and precipitates at very low surface coverages. However, quantitative analysis has been limited largely to "fingerprint" comparison with model spectra because of the difficulty of obtaining accurate multiple-scattering amplitudes for small clusters with high confidence. In the present work, calculations of the XANES for 50- to 200-atom clusters derived from the structures of Zn model compounds using the full multiple-scattering code Feff 8.0 accurately replicate experimental spectra and display features characteristic of specific first-neighbor anion coordination geometry and second-neighbor cation geometry and number. Analogous calculations of the XANES for small molecular clusters indicative of precipitation and sorption geometries for aqueous Zn on ferrihydrite, and suggested by EXAFS analysis, are in good agreement with observed spectral trends with sample composition, with Zn-oxygen coordination and with changes in second-neighbor cation coordination as a function of sorption coverage. Empirical analysis of experimental XANES features further verifies the validity of the calculations. The findings agree well with a complete EXAFS analysis previously reported for the same sample set, namely, that octahedrally coordinated aqueous Zn2+ species sorb as a tetrahedral complex on ferrihydrite with varying local geometry depending on sorption density. At significantly higher densities but below those at which Zn hydroxide is expected to precipitate, a mainly octahedrally coordinated Zn2+ precipitate is observed. An analysis of the multiple scattering paths contributing to the XANES demonstrates the importance of scattering paths involving the anion sublattice. We also describe the specific advantages of complementary quantitative XANES and EXAFS analysis and estimate limits on the extent of structural information obtainable from XANES analysis. © 2003 Elsevier Science Ltd.

  7. A Latent Profile Analysis of Math Achievement, Numerosity, and Math Anxiety in Twins

    ERIC Educational Resources Information Center

    Hart, Sara A.; Logan, Jessica A. R.; Thompson, Lee; Kovas, Yulia; McLoughlin, Gráinne; Petrill, Stephen A.

    2016-01-01

    Underperformance in math is a problem with increasing prevalence, complex etiology, and severe repercussions. This study examined the etiological heterogeneity of math performance in a sample of 264 pairs of 12-year-old twins assessed on measures of math achievement, numerosity, and math anxiety. Latent profile analysis indicated 5 groupings of…

  8. Identification of discriminant proteins through antibody profiling, methods and apparatus for identifying an individual

    DOEpatents

    Apel, William A.; Thompson, Vicki S; Lacey, Jeffrey A.; Gentillon, Cynthia A.

    2016-08-09

    A method for determining a plurality of proteins for discriminating and positively identifying an individual from a biological sample. The method may include profiling a biological sample from a plurality of individuals against a protein array including a plurality of proteins. The protein array may include proteins attached to a support in a preselected pattern such that locations of the proteins are known. The biological sample may be contacted with the protein array such that a portion of antibodies in the biological sample reacts with and binds to the proteins forming immune complexes. A statistical analysis method, such as discriminant analysis, may be performed to determine discriminating proteins for distinguishing individuals. Proteins of interest may be used to form a protein array. Such a protein array may be used, for example, to compare a forensic sample from an unknown source with a sample from a known source.
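
    The statistical step described above can be sketched with an off-the-shelf discriminant analysis; the patent does not specify an implementation, so the scikit-learn call, the synthetic antibody-binding intensities and the class labels below are illustrative assumptions only.

```python
# Hedged sketch: linear discriminant analysis on synthetic antibody-profile data to rank
# proteins by how strongly they separate individuals.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n_proteins = 20
individuals = ["A", "B", "C"]

profiles, labels = [], []
for i, person in enumerate(individuals):
    # Each individual gets a characteristic mean binding intensity per spotted protein.
    mean_profile = rng.gamma(shape=2.0, scale=1.0 + i, size=n_proteins)
    for _ in range(10):                       # 10 replicate arrays per individual
        profiles.append(mean_profile + rng.normal(0, 0.5, n_proteins))
        labels.append(person)

X, y = np.array(profiles), np.array(labels)
lda = LinearDiscriminantAnalysis().fit(X, y)

# Proteins with the largest absolute discriminant coefficients are candidates for a
# reduced forensic array.
top = np.argsort(np.abs(lda.coef_).max(axis=0))[::-1][:5]
print("most discriminating protein indices:", top)
print("resubstitution accuracy:", lda.score(X, y))
```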

  9. Identification of discriminant proteins through antibody profiling, methods and apparatus for identifying an individual

    DOEpatents

    Thompson, Vicki S; Lacey, Jeffrey A; Gentillon, Cynthia A; Apel, William A

    2015-03-03

    A method for determining a plurality of proteins for discriminating and positively identifying an individual from a biological sample. The method may include profiling a biological sample from a plurality of individuals against a protein array including a plurality of proteins. The protein array may include proteins attached to a support in a preselected pattern such that locations of the proteins are known. The biological sample may be contacted with the protein array such that a portion of antibodies in the biological sample reacts with and binds to the proteins forming immune complexes. A statistical analysis method, such as discriminant analysis, may be performed to determine discriminating proteins for distinguishing individuals. Proteins of interest may be used to form a protein array. Such a protein array may be used, for example, to compare a forensic sample from an unknown source with a sample from a known source.

  10. Hydrogen calibration of GD-spectrometer using Zr-1Nb alloy

    NASA Astrophysics Data System (ADS)

    Mikhaylov, Andrey A.; Priamushko, Tatiana S.; Babikhina, Maria N.; Kudiiarov, Victor N.; Heller, Rene; Laptev, Roman S.; Lider, Andrey M.

    2018-02-01

    To study the hydrogen distribution in Zr-1Nb alloy (Э110 alloy), GD-OES was applied in this work. Quantitative analysis requires standard samples containing hydrogen; however, standard samples with high concentrations of hydrogen in the zirconium alloy that meet the shape and size requirements are not available. In this work, a method for producing Zr + H calibration samples was demonstrated for the first time. An automated complex, the Gas Reaction Controller, was used for sample hydrogenation. Diffusion equations were used to calculate the parameters for post-hydrogenation incubation of the samples in an inert gas atmosphere. Absolute hydrogen concentrations in the samples were determined by melting in an inert gas atmosphere using an RHEN602 analyzer (LECO Company). Hydrogen distribution was studied using nuclear reaction analysis (HZDR, Dresden, Germany). RF GD-OES was used for calibration. The depth of the craters was measured with a Hommel-Etamic profilometer by Jenoptik, Germany.

  11. Exploring the applicability of analysing X chromosome STRs in Brazilian admixed population.

    PubMed

    Auler-Bittencourt, Eloisa; Iwamura, Edna Sadayo Miazato; Lima, Maria Jenny Mitraud; da Silva, Ismael Dale Cotrim Guerreiro; dos Santos, Sidney Emannuel Batista

    2015-09-01

    Kinship and parentage analyses always involve one sample being compared to another sample or a few samples with a specific relationship question in mind. In most cases, the analysis of autosomal STR markers is sufficient to determine the genetic kinship. However, when genetic profiles are reconstructed from supposed relatives for whom the family configuration available for analysis is deficient, the examination may be inconclusive. This study reports practical examples of actual cases analysing the efficiency of chromosome X STR (STR-ChrX) markers. Three cases with different degrees of efficiency and impact were selected as follows: the identification of two charred bodies in a traffic accident, in which the family setting available was not complete, and one filiation analysis resulting from rape. This is the first paper reporting the use of the multiplex STR 12 ChrX in actual cases using the software Familias 1.8 and Brazilian regional frequency data. Our study clarifies the complex analysis using this powerful tool for professionals in the forensic science community, for both civil and criminal justice. We also discuss state-of-the-art ChrX STR markers and their implications and applications for legal procedures. The data presented here should be used in other studies of complex cases to improve the progress of the current justice system. Copyright © 2015 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  12. 3D printed e-tongue

    NASA Astrophysics Data System (ADS)

    Gaál, Gabriel; da Silva, Tatiana A.; Gaál, Vladimir; Hensel, Rafael C.; Amaral, Lucas R.; Rodrigues, Varlei; Riul, Antonio

    2018-05-01

    Nowadays, one of the biggest issues in electronic sensor fabrication is the build-up of efficient electrodes as an alternative to the expensive, complex and multistage processes required by traditional techniques. Printed electronics arises as an interesting alternative to fulfill this task due to the simplicity and speed with which electrodes can be stamped on various surfaces. Within this context, Fused Deposition Modeling 3D printing is an emerging, cost-effective and alternative technology to fabricate complex structures, potentiating several fields with more creative ideas and new materials for rapid prototyping of devices. We show here the fabrication of interdigitated electrodes using a standard home-made CoreXY 3D printer with transparent and graphene-based PLA filaments. Macro 3D printed electrodes were easily assembled within 6 minutes with outstanding reproducibility. The electrodes were also functionalized with different nanostructured thin films via the dip-coating Layer-by-Layer technique to develop a 3D printed e-tongue setup. As a proof of concept, the printed e-tongue was applied to soil analysis. A control soil sample was enriched with several plant macro-nutrients (N, P, K, S, Mg and Ca), and discrimination was performed by electrical impedance spectroscopy of aqueous solutions of the soil samples. The data were analyzed by Principal Component Analysis, and the 3D printed sensor clearly distinguished all enriched samples despite the complexity of the soil chemical composition. The 3D printed e-tongue successfully used in soil analysis encourages further investment in developing new sensory tools for precision agriculture and other fields, exploiting the simplicity and flexibility offered by 3D printing techniques.
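
    The chemometric step reported above can be sketched in a few lines; the impedance spectra, frequency grid and soil-extract labels below are invented surrogates, so only the Principal Component Analysis workflow itself is meant to be illustrative.

```python
# Hedged sketch: PCA scores computed from fake impedance-magnitude spectra for a
# control soil extract and three hypothetical nutrient-enriched extracts.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
freqs = np.logspace(0, 5, 40)                  # 1 Hz .. 100 kHz, hypothetical grid
samples = ["control", "control+N", "control+P", "control+K"]

spectra = []
for k, name in enumerate(samples):
    for _ in range(5):                         # 5 repeat measurements per solution
        base = 1e4 / (1 + freqs / (50.0 * (k + 1)))          # fake dispersion curve
        spectra.append(base * (1 + rng.normal(0, 0.02, freqs.size)))
X = np.log10(np.array(spectra))                # log scale is common for impedance data

scores = PCA(n_components=2).fit_transform(X)
for name, row in zip(np.repeat(samples, 5), scores):
    print(f"{name:12s} PC1={row[0]:+.3f} PC2={row[1]:+.3f}")
```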

  13. Novel CE-MS technique for detection of high explosives using perfluorooctanoic acid as a MEKC and mass spectrometric complexation reagent.

    PubMed

    Brensinger, Karen; Rollman, Christopher; Copper, Christine; Genzman, Ashton; Rine, Jacqueline; Lurie, Ira; Moini, Mehdi

    2016-01-01

    To address the need for the forensic analysis of high explosives, a novel capillary electrophoresis mass spectrometry (CE-MS) technique has been developed for high resolution, sensitivity, and mass accuracy detection of these compounds. The technique uses perfluorooctanoic acid (PFOA) as both a micellar electrokinetic chromatography (MEKC) reagent for separation of neutral explosives and as the complexation reagent for mass spectrometric detection of PFOA-explosive complexes in the negative ion mode. High explosives that formed complexes with PFOA included RDX, HMX, tetryl, and PETN. Some nitroaromatics were detected as molecular ions. Detection limits in the high parts per billion range and linear calibration responses over two orders of magnitude were obtained. For proof of concept, the technique was applied to the quantitative analysis of high explosives in sand samples. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  14. Class-conditional feature modeling for ignitable liquid classification with substantial substrate contribution in fire debris analysis.

    PubMed

    Lopatka, Martin; Sigman, Michael E; Sjerps, Marjan J; Williams, Mary R; Vivó-Truyols, Gabriel

    2015-07-01

    Forensic chemical analysis of fire debris addresses the question of whether ignitable liquid residue is present in a sample and, if so, what type. Evidence evaluation regarding this question is complicated by interference from pyrolysis products of the substrate materials present in a fire. A method is developed to derive a set of class-conditional features for the evaluation of such complex samples. The use of a forensic reference collection allows characterization of the variation in complex mixtures of substrate materials and ignitable liquids even when the dominant feature is not specific to an ignitable liquid. Making use of a novel method for data imputation under complex mixing conditions, a distribution is modeled for the variation between pairs of samples containing similar ignitable liquid residues. Examining the covariance of variables within the different classes allows different weights to be placed on features more important in discerning the presence of a particular ignitable liquid residue. Performance of the method is evaluated using a database of total ion spectrum (TIS) measurements of ignitable liquid and fire debris samples. These measurements include 119 nominal masses measured by GC-MS and averaged across a chromatographic profile. Ignitable liquids are labeled using the American Society for Testing and Materials (ASTM) E1618 standard class definitions. Statistical analysis is performed in the class-conditional feature space wherein new forensic traces are represented based on their likeness to known samples contained in a forensic reference collection. The demonstrated method uses forensic reference data as the basis of probabilistic statements concerning the likelihood of the obtained analytical results given the presence of ignitable liquid residue of each of the ASTM classes (including a substrate only class). When prior probabilities of these classes can be assumed, these likelihoods can be connected to class probabilities. In order to compare the performance of this method to previous work, a uniform prior was assumed, resulting in an 81% accuracy for an independent test of 129 real burn samples. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
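
    The final inference step described above reduces to Bayes' rule; the sketch below shows only that bookkeeping, with made-up class-conditional likelihood values, the uniform prior mentioned in the abstract, and a shortened, hypothetical subset of the ASTM E1618 class names.

```python
# Hedged sketch: converting class-conditional likelihoods into posterior class
# probabilities under an assumed (here uniform) prior.
import numpy as np

classes = ["gasoline", "petroleum distillate", "oxygenated", "substrate only"]
likelihoods = np.array([2.1e-3, 7.5e-4, 9.0e-5, 1.3e-3])    # p(data | class), invented
priors = np.full(len(classes), 1.0 / len(classes))           # uniform prior assumption

posterior = likelihoods * priors
posterior /= posterior.sum()
for c, p in zip(classes, posterior):
    print(f"{c:22s} posterior = {p:.3f}")
```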

  15. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    PubMed Central

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling methods. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels can be constructed repeatedly through the addition of sampling points, namely, extrema points of the metamodels and minimum points of the density function. More accurate metamodels are then constructed by this procedure. The validity and effectiveness of the proposed sampling method are examined by studying typical numerical examples. PMID:25133206
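
    A compact sketch of the general idea, assuming a radial-basis-function metamodel that is refined by re-sampling at its current extremum: the test function, candidate grid and stopping rule are illustrative choices, not the authors' exact algorithm (which also adds minimum points of the density function).

```python
# Hedged sketch: sequential refinement of an RBF metamodel of an "expensive" function.
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive(x):                                   # stand-in for a costly simulation
    return np.sin(3 * x) + 0.5 * x ** 2

grid = np.linspace(-2, 2, 401).reshape(-1, 1)       # cheap candidate grid
X = np.linspace(-2, 2, 5).reshape(-1, 1)            # initial space-filling design
y = expensive(X).ravel()

for _ in range(10):
    model = RBFInterpolator(X, y, kernel="thin_plate_spline")
    x_new = grid[np.argmin(model(grid))]            # candidate: metamodel minimum
    if np.min(np.abs(X - x_new)) < 1e-9:            # already sampled -> stop refining
        break
    X = np.vstack([X, x_new])
    y = np.append(y, expensive(x_new[0]))

print(f"approximate minimizer after sequential sampling: x = {X[np.argmin(y)][0]:.3f}")
```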

  16. Selective flotation-spectrophotometric determination of trace copper(II) in natural waters, human blood and drug samples using phenanthraquinone monophenylthiosemicarbazone.

    PubMed

    Khalifa, M E; Akl, M A; Ghazy, S E

    2001-06-01

    Copper(II) forms 1:1 and 1:2 intense red complexes with phenanthraquinone monophenylthiosemicarbazone (PPT) at pH 3-3.5 and > or =6.5, respectively. These complexes exhibit maximal absorbance at 545 and 517 nm, the molar absorptivities being 2.3 x 10(4) and 4.8 x 10(4) l mol(-1) cm(-1), respectively. However, the 1:1 complex was quantitatively floated with oleic acid (HOL) surfactant in the pH range 4.5-5.5, providing a highly selective and sensitive procedure for the spectrophotometric determination of Cu(II). The molar absorptivity of the floated Cu-PPT complex was 1.5 x 10(5) l mol(-1) cm(-1). Beer's law was obeyed over the range 3-400 ppb at 545 nm. The analytical parameters affecting the flotation process and hence the determination of copper traces are reported. Also, the structure of the isolated solid complex and the mechanism of flotation are suggested. Moreover, the procedure was successfully applied to the analysis of Cu(II) in natural waters, blood serum and some drug samples.
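
    For readers less familiar with the flotation-spectrophotometric numbers above, the Beer's-law arithmetic is spelled out below using the reported molar absorptivity of the floated Cu-PPT complex; the absorbance reading and the 1 cm path length are hypothetical.

```python
# Beer's law: A = epsilon * b * c, solved for the copper concentration c.
molar_absorptivity = 1.5e5        # L mol^-1 cm^-1, value reported for the floated complex
path_length_cm = 1.0              # assumed cuvette path length
absorbance = 0.30                 # hypothetical measurement at 545 nm

conc_mol_per_L = absorbance / (molar_absorptivity * path_length_cm)
conc_ppb = conc_mol_per_L * 63.55 * 1e9          # Cu molar mass ~63.55 g/mol
print(f"Cu(II) ~ {conc_mol_per_L:.2e} mol/L ~ {conc_ppb:.0f} ppb (within the 3-400 ppb range)")
```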

  17. Does the Hertz solution estimate pressures correctly in diamond indentor experiments?

    NASA Astrophysics Data System (ADS)

    Bruno, M. S.; Dunn, K. J.

    1986-05-01

    The Hertz solution has been widely used to estimate pressures in a spherical indentor against flat matrix type high pressure experiments. It is usually assumed that the pressure generated when compressing a sample between the indentor and substrate is the same as that generated when compressing an indentor against a flat surface with no sample present. A non-linear finite element analysis of this problem has shown that the situation is far more complex. The actual peak pressure in the sample is highly dependent on plastic deformation and the change in material properties due to hydrostatic pressure. An analysis with two material models is presented and compared with the Hertz solution.
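
    For reference, the Hertz solution discussed above reduces to two closed-form expressions for a sphere pressed against a flat; the short calculation below uses approximate diamond and steel elastic constants and an assumed load, not the materials or loads of the original experiments, and of course ignores the plasticity and pressure-dependent properties that the finite element analysis addresses.

```python
# Hertz contact of a sphere on a flat: contact radius a and peak pressure p0.
import math

F = 50.0                      # applied load, N (assumed)
R = 0.5e-3                    # indenter tip radius, m (assumed)
E1, nu1 = 1050e9, 0.20        # diamond indenter, approximate literature values
E2, nu2 = 200e9, 0.30         # steel flat, approximate literature values

E_star = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)   # effective contact modulus
a = (3 * F * R / (4 * E_star)) ** (1.0 / 3.0)            # contact radius
p0 = 3 * F / (2 * math.pi * a**2)                        # peak (not mean) contact pressure

print(f"contact radius a = {a*1e6:.1f} um, peak Hertz pressure p0 = {p0/1e9:.1f} GPa")
```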

  18. Procedures for Handling and Chemical Analysis of Sediment and Water Samples,

    DTIC Science & Technology

    1981-05-01

    silts. Particularly suitable for studies of the sediment/water interface and for studies on depositional sediment structures. Alpine gravity cores of 2 m...adverse water quality impacts would occur. Elemental partitioning or sedimentation fractionation studies are the most complex of the tests considered...water and blend the core or dredge sample. Place approximately 100 cc of the blended sample in an oxygen-free, polycarbonate centrifuge bottle

  19. A measuring tool for tree-rings analysis

    NASA Astrophysics Data System (ADS)

    Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena

    2013-04-01

    A special tool has been created for the measurement and analysis of annual tree-ring widths. It consists of a professional scanner, a computer system and software. In many respects this complex is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison to manual measurement systems it offers a number of advantages: productivity gain, the possibility of archiving the results of the measurements at any stage of the processing, and operator comfort. New software has been developed that allows processing of samples of different types (cores, saw cuts), including those that are difficult to process because of a complex wood structure (inhomogeneity of growth in different directions; missing, light and false rings; etc.). This software can analyze pictures made with optical scanners, analog or digital cameras. The software was written in C++ and is compatible with modern Windows operating systems. Annual ring widths are measured along paths traced interactively. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring width as a function of the year is displayed on screen during the analysis and can be used for visual and numerical cross-dating and comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The created complex is universal in application, which will allow its use for solving different problems in biology and ecology. With the help of this complex, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed on the basis of samples collected on the Kola Peninsula (northwestern Russia).

  20. Development of methodology for identification the nature of the polyphenolic extracts by FTIR associated with multivariate analysis.

    PubMed

    Grasel, Fábio dos Santos; Ferrão, Marco Flôres; Wolf, Carlos Rodolfo

    2016-01-15

    Tannins are polyphenolic compounds of complex structures formed by secondary metabolism in several plants. These polyphenolic compounds have different applications, such as drugs, anti-corrosion agents, flocculants, and tanning agents. This study analyses six different types of polyphenolic extracts by Fourier transform infrared spectroscopy (FTIR) combined with multivariate analysis. Through both principal component analysis (PCA) and hierarchical cluster analysis (HCA), we observed well-defined separation between condensed (quebracho and black wattle) and hydrolysable (valonea, chestnut, myrobalan, and tara) tannins. For hydrolysable tannins, it was also possible to observe the formation of two different subgroups between samples of chestnut and valonea and between samples of tara and myrobalan. Among all samples analysed, the chestnut and valonea showed the greatest similarity, indicating that these extracts contain equivalent chemical compositions and structures and, therefore, similar properties. Copyright © 2015 Elsevier B.V. All rights reserved.
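
    The clustering step mentioned above can be reproduced in outline with standard tools; the six "spectra" below are random surrogates constructed so that a condensed-tannin group and a hydrolysable-tannin group emerge, and the wavenumber grid, band positions and preprocessing are illustrative assumptions only.

```python
# Hedged sketch: hierarchical cluster analysis (Ward linkage) of fake FTIR spectra.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
wavenumbers = np.linspace(600, 4000, 200)
names = ["quebracho", "black wattle", "valonea", "chestnut", "myrobalan", "tara"]

def fake_spectrum(center):
    # One Gaussian "absorption band" plus noise, a stand-in for a real FTIR spectrum.
    return np.exp(-((wavenumbers - center) / 300.0) ** 2) + rng.normal(0, 0.01, wavenumbers.size)

# Condensed tannins near one band position, hydrolysable tannins near another (invented).
centers = [1520, 1530, 1720, 1715, 1700, 1705]
X = np.array([fake_spectrum(c) for c in centers])

Z = linkage(X, method="ward")
clusters = fcluster(Z, t=2, criterion="maxclust")
for name, c in zip(names, clusters):
    print(f"{name:12s} -> cluster {c}")
```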

  1. Microchip Immunoaffinity Electrophoresis of Antibody-Thymidine Kinase 1 Complex

    PubMed Central

    Pagaduan, Jayson V.; Ramsden, Madison; O’Neill, Kim; Woolley, Adam T.

    2015-01-01

    Thymidine kinase-1 (TK1) is an important cancer biomarker whose serum levels are elevated in early cancer development. We developed a microchip electrophoresis immunoaffinity assay to measure recombinant purified TK1 (pTK1) using an antibody that binds to human TK1. We fabricated poly(methyl methacrylate) microfluidic devices to test the feasibility of detecting antibody (Ab)-pTK1 immune complexes as a step towards TK1 analysis in clinical serum samples. We were able to separate immune complexes from unbound antibodies using 0.5X phosphate buffer saline (pH 7.4) containing 0.01% Tween-20, with 1% w/v methylcellulose that acts as a dynamic surface coating and sieving matrix. Separation of the antibody and Ab-pTK1 complex was observed within a 5 mm effective separation length. This method of detecting pTK1 is easy to perform, requires only a 10 μL sample volume, and takes just 1 minute for separation. PMID:25486911

  2. Hekate: Software Suite for the Mass Spectrometric Analysis and Three-Dimensional Visualization of Cross-Linked Protein Samples

    PubMed Central

    2013-01-01

    Chemical cross-linking of proteins combined with mass spectrometry provides an attractive and novel method for the analysis of native protein structures and protein complexes. Analysis of the data however is complex. Only a small number of cross-linked peptides are produced during sample preparation and must be identified against a background of more abundant native peptides. To facilitate the search and identification of cross-linked peptides, we have developed a novel software suite, named Hekate. Hekate is a suite of tools that address the challenges involved in analyzing protein cross-linking experiments when combined with mass spectrometry. The software is an integrated pipeline for the automation of the data analysis workflow and provides a novel scoring system based on principles of linear peptide analysis. In addition, it provides a tool for the visualization of identified cross-links using three-dimensional models, which is particularly useful when combining chemical cross-linking with other structural techniques. Hekate was validated by the comparative analysis of cytochrome c (bovine heart) against previously reported data.1 Further validation was carried out on known structural elements of DNA polymerase III, the catalytic α-subunit of the Escherichia coli DNA replisome along with new insight into the previously uncharacterized C-terminal domain of the protein. PMID:24010795

  3. Texture Analysis of Poly-Adenylated mRNA Staining Following Global Brain Ischemia and Reperfusion

    PubMed Central

    Szymanski, Jeffrey J.; Jamison, Jill T.; DeGracia, Donald J.

    2011-01-01

    Texture analysis provides a means to quantify complex changes in microscope images. We previously showed that cytoplasmic poly-adenylated mRNAs form mRNA granules in post-ischemic neurons and that these granules correlated with protein synthesis inhibition and hence cell death. Here we utilized the texture analysis software MaZda to quantify mRNA granules in photomicrographs of the pyramidal cell layer of rat hippocampal region CA3 around 1 hour of reperfusion after 10 min of normothermic global cerebral ischemia. At 1 hour reperfusion, we observed variations in the texture of mRNA granules amongst samples that were readily quantified by texture analysis. Individual sample variation was consistent with the interpretation that animal-to-animal variations in mRNA granules reflected the time-course of mRNA granule formation. We also used texture analysis to quantify the effect of cycloheximide, given either before or after brain ischemia, on mRNA granules. If administered before ischemia, cycloheximide inhibited mRNA granule formation, but if administered after ischemia did not prevent mRNA granulation, indicating mRNA granule formation is dependent on dissociation of polysomes. We conclude that texture analysis is an effective means for quantifying the complex morphological changes induced in neurons by brain ischemia and reperfusion. PMID:21477879
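
    As an illustration of the kind of quantification involved, the sketch below computes a few grey-level co-occurrence matrix (GLCM) statistics on two synthetic images standing in for "smooth" and "granular" staining patterns; it uses scikit-image rather than the MaZda package, and the images, distances and chosen properties are assumptions for demonstration only.

```python
# Hedged sketch: GLCM texture features separating a smooth from a granular test image.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19

rng = np.random.default_rng(5)

def texture_features(img):
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return {prop: float(graycoprops(glcm, prop)[0, 0])
            for prop in ("contrast", "homogeneity", "energy")}

smooth = rng.normal(128, 3, (128, 128)).clip(0, 255).astype(np.uint8)
granular = (rng.integers(0, 2, (128, 128)) * 120
            + rng.normal(60, 10, (128, 128))).clip(0, 255).astype(np.uint8)

print("smooth  :", texture_features(smooth))
print("granular:", texture_features(granular))
```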

  4. Laser-induced breakdown spectroscopy (LIBS) technique for the determination of the chemical composition of complex inorganic materials

    NASA Astrophysics Data System (ADS)

    Łazarek, Łukasz; Antończak, Arkadiusz J.; Wójcik, Michał R.; Kozioł, Paweł E.; Stepak, Bogusz; Abramski, Krzysztof M.

    2014-08-01

    Laser-induced breakdown spectroscopy (LIBS) is a fast, fully optical method that needs little or no sample preparation. In this technique, qualitative and quantitative analysis is based on comparison. The determination of composition is generally based on the construction of a calibration curve, namely the LIBS signal versus the concentration of the analyte. Typically, certified reference materials with known elemental composition are used to calibrate the system. Nevertheless, such samples, because of differences in overall composition with respect to the complex inorganic materials under study, can significantly degrade the accuracy. There are also intermediate factors that can cause imprecision in measurements, such as optical absorption, surface structure, thermal conductivity, etc. This paper presents a calibration procedure performed with specially prepared pellets of the tested materials, whose composition was previously defined. We also propose post-processing methods that allow mitigation of the matrix effects and a reliable and accurate analysis. This technique was implemented for the determination of trace elements in industrial copper concentrates standardized by conventional atomic absorption spectroscopy with a flame atomizer. A series of copper flotation concentrate samples was analyzed for the contents of three elements, namely silver, cobalt and vanadium. It has been shown that the described technique can be used for qualitative and quantitative analyses of complex inorganic materials, such as copper flotation concentrates.
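
    The calibration-curve step described above amounts to a straight-line fit and its inversion; the intensities and concentrations below are invented, and in practice matrix-matched pellets and background correction (as discussed in the abstract) would precede such a fit.

```python
# Hedged sketch: linear LIBS calibration and inversion for an unknown sample.
import numpy as np

conc_ppm = np.array([10, 25, 50, 100, 200])            # reference pellets, assumed values
intensity = np.array([120, 290, 610, 1180, 2400])       # background-corrected line intensity

slope, intercept = np.polyfit(conc_ppm, intensity, 1)
unknown_intensity = 850.0                                # hypothetical measurement
unknown_conc = (unknown_intensity - intercept) / slope
print(f"calibration: I = {slope:.2f}*c + {intercept:.1f}; unknown ~ {unknown_conc:.1f} ppm")
```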

  5. Surface modified capillary electrophoresis combined with in solution isoelectric focusing and MALDI-TOF/TOF MS: a gel-free multidimensional electrophoresis approach for proteomic profiling--exemplified on human follicular fluid.

    PubMed

    Hanrieder, Jörg; Zuberovic, Aida; Bergquist, Jonas

    2009-04-24

    Development of miniaturized analytical tools continues to be of great interest to face the challenges in proteomic analysis of complex biological samples such as human body fluids. In the light of these challenges, special emphasis is put on the speed and simplicity of newly designed technological approaches as well as the need for cost efficiency and low sample consumption. In this study, we present an alternative multidimensional bottom-up approach for proteomic profiling for fast, efficient and sensitive protein analysis in complex biological matrices. The presented setup was based on sample pre-fractionation using microscale in solution isoelectric focusing (IEF) followed by tryptic digestion and subsequent capillary electrophoresis (CE) coupled off-line to matrix assisted laser desorption/ionization time of flight tandem mass spectrometry (MALDI TOF MS/MS). For high performance CE-separation, PolyE-323 modified capillaries were applied to minimize analyte-wall interactions. The potential of the analytical setup was demonstrated on human follicular fluid (hFF) representing a typical complex human body fluid with clinical implication. The obtained results show significant identification of 73 unique proteins (identified at 95% significance level), including mostly acute phase proteins but also protein identities that are well known to be extensively involved in follicular development.

  6. MALDI Q-TOF CID MS for Diagnostic Ion Screening of Human Milk Oligosaccharide Samples

    PubMed Central

    Jovanović, Marko; Tyldesley-Worster, Richard; Pohlentz, Gottfried; Peter-Katalinić, Jasna

    2014-01-01

    Human milk oligosaccharides (HMO) represent the bioactive components of human milk, influencing the infant’s gastrointestinal microflora and immune system. Structurally, they represent a highly complex class of analyte, where the main core oligosaccharide structures are built from galactose and N-acetylglucosamine, linked by 1–3 or 1–4 glycosidic linkages and potentially modified with fucose and sialic acid residues. The core structures can be linear or branched. Additional structural complexity in samples can be induced by endogenous exoglycosidase activity or chemical procedures during the sample preparation. Here, we show that using matrix-assisted laser desorption/ionization (MALDI) quadrupole-time-of-flight (Q-TOF) collision-induced dissociation (CID) as a fast screening method, diagnostic structural information about single oligosaccharide components present in a complex mixture can be obtained. According to sequencing data on 14 out of 22 parent ions detected in a single high molecular weight oligosaccharide chromatographic fraction, 20 different oligosaccharide structure types, corresponding to over 30 isomeric oligosaccharide structures and over 100 possible HMO isomers when biosynthetic linkage variations were taken into account, were postulated. For MS/MS data analysis, we used the de novo sequencing approach using diagnostic ion analysis on reduced oligosaccharides by following known biosynthetic rules. Using this approach, de novo characterization has been achieved also for the structures, which could not have been predicted. PMID:24743894

  7. Chemical behavior of Cu, Zn, Cd, and Pb in a eutrophic reservoir: speciation and complexation capacity.

    PubMed

    Tonietto, Alessandra Emanuele; Lombardi, Ana Teresa; Choueri, Rodrigo Brasil; Vieira, Armando Augusto Henriques

    2015-10-01

    This research aimed at evaluating cadmium (Cd), copper (Cu), lead (Pb), and zinc (Zn) speciation in water samples as well as determining water quality parameters (alkalinity, chlorophyll a, chloride, conductivity, dissolved organic carbon, dissolved oxygen, inorganic carbon, nitrate, pH, total suspended solids, and water temperature) in a eutrophic reservoir. This was performed through calculation of free metal ions using the chemical equilibrium software MINEQL+ 4.61, determination of labile, dissolved, and total metal concentrations via differential pulse anodic stripping voltammetry, and determination of complexed metal by the difference between the total concentration of dissolved and labile metal. Additionally, ligand complexation capacities (CC), such as the strength of the association of metals-ligands (logK'ML) and ligand concentrations (CL), were calculated via Ruzic's linearization method. Water samples were taken in winter and summer, and the results showed that for total and dissolved metals, Zn > Cu > Pb > Cd concentration. In general, higher concentrations of Cu and Zn remained complexed with the dissolved fraction, while Pb was mostly complexed with particulate materials. Chemical equilibrium modeling (MINEQL+) showed that Zn(2+) and Cd(2+) dominated the labile species, while Cu and Pb were complexed with carbonates. Zinc was a unique metal for which a direct relation between dissolved species with labile and complexed forms was obtained. The CC for ligands indicated a higher CL for Cu, followed by Pb, Zn, and Cd in decreasing amounts. Nevertheless, the strength of the association of all metals and their respective ligands was similar. Factor analysis with principal component analysis as the extraction procedure confirmed seasonal effects on water quality parameters and metal speciation. Total, dissolved, and complexed Cu and total, dissolved, complexed, and labile Pb species were all higher in winter, whereas in summer, Zn was mostly present in the complexed form. A high degree of deterioration of the reservoir was confirmed by the results of this study.
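
    The Ruzic linearization used above for the complexation capacities can be sketched with a synthetic titration: for a single 1:1 ligand, plotting [M]/(C_M - [M]) against the labile concentration [M] gives CL as the inverse slope and K' as slope/intercept. The stability constant, ligand concentration and titration points below are invented for the illustration.

```python
# Hedged sketch: recovering CL and logK' from a simulated metal titration via a Ruzic plot.
import numpy as np

K_true, CL_true = 1e7, 2e-7            # "true" values used to simulate the titration
C_M = np.linspace(5e-8, 1e-6, 12)      # total metal added at each titration step (mol/L)

# Solve the 1:1 equilibrium  K = [ML]/([M][L])  for the free (labile) metal at each point.
b = 1 + K_true * (CL_true - C_M)
M_labile = (-b + np.sqrt(b**2 + 4 * K_true * C_M)) / (2 * K_true)

slope, intercept = np.polyfit(M_labile, M_labile / (C_M - M_labile), 1)
print(f"CL ~ {1/slope:.2e} mol/L (true {CL_true:.1e}), "
      f"logK' ~ {np.log10(slope/intercept):.2f} (true {np.log10(K_true):.2f})")
```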

  8. Analysis of the interactions between GMF and Arp2/3 complex in two binding sites by molecular dynamics simulation.

    PubMed

    Popinako, A; Antonov, M; Dibrova, D; Chemeris, A; Sokolova, O S

    2018-02-05

    The Arp2/3 complex plays a key role in nucleating actin filaments branching. The glia maturation factor (GMF) competes with activators for interacting with the Arp2/3 complex and initiates the debranching of actin filaments. In this study, we performed a comparative analysis of interactions between GMF and the Arp2/3 complex and identified new amino acid residues involved in GMF binding to the Arp2/3 complex at two separate sites, revealed by X-ray and single particle EM techniques. Using molecular dynamics simulations we demonstrated the quantitative and qualitative changes in hydrogen bonds upon binding with GMF. We identified the specific amino acid residues in GMF and Arp2/3 complex that stabilize the interactions and estimated the mean force profile for the GMF using umbrella sampling. Phylogenetic and structural analyses of the recently defined GMF binding site on the Arp3 subunit indicate a new mechanism for Arp2/3 complex inactivation that involves interactions between the Arp2/3 complex and GMF at two binding sites. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Measurement of complex permittivities of biological materials and human skin in vivo in the frequency band

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghodgaonkar, D.K.

    1987-01-01

    A new method, namely the modified infinite sample method, has been developed which is particularly suitable for millimeter-wave dielectric measurements of biological materials. In this method, an impedance transformer is used which reduces the reflectivity of the biological sample. Because of the effect of introducing the impedance transformer, the measured reflection coefficients are more sensitive to the complex permittivities of biological samples. For accurate measurement of reflection coefficients, two automated measurement systems were developed which cover the frequency range of 26.5-60 GHz. An uncertainty analysis was performed to obtain an estimate of the errors in the measured complex permittivities. The dielectric properties were measured for 10% saline solution, whole human blood, 200 mg/ml bovine serum albumin (BSA) solution and a suspension of Saccharomyces cerevisiae cells. The Maxwell-Fricke equation, which is derived from dielectric mixture theory, was used for determination of bound water in the BSA solution. The results for all biological samples were interpreted by fitting Debye relaxation and Cole-Cole models. It is observed that the dielectric data for the biological materials can be explained on the basis of Debye relaxation of water molecules.
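
    The Cole-Cole model referred to above is easy to evaluate once its parameters are fixed; the sketch below uses rough, water-like parameter values (not the fitted values for blood or BSA solutions from this work) over the 26.5-60 GHz band.

```python
# Cole-Cole relaxation: eps(w) = eps_inf + (eps_s - eps_inf) / (1 + (j*w*tau)^(1 - alpha)).
import numpy as np

eps_s, eps_inf = 78.0, 5.0       # static and high-frequency permittivity (assumed)
tau = 8.3e-12                    # relaxation time, s (roughly that of bulk water)
alpha = 0.02                     # broadening parameter; alpha = 0 recovers the Debye model

f = np.array([26.5e9, 40e9, 60e9])
omega = 2 * np.pi * f
eps = eps_inf + (eps_s - eps_inf) / (1 + (1j * omega * tau) ** (1 - alpha))

for fi, e in zip(f, eps):
    print(f"{fi/1e9:5.1f} GHz: eps' = {e.real:6.2f}, eps'' = {-e.imag:6.2f}")
```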

  10. Two stage algorithm vs commonly used approaches for the suspect screening of complex environmental samples analyzed via liquid chromatography high resolution time of flight mass spectroscopy: A test study.

    PubMed

    Samanipour, Saer; Baz-Lomba, Jose A; Alygizakis, Nikiforos A; Reid, Malcolm J; Thomaidis, Nikolaos S; Thomas, Kevin V

    2017-06-09

    LC-HR-QTOF-MS recently has become a commonly used approach for the analysis of complex samples. However, identification of small organic molecules in complex samples with the highest level of confidence is a challenging task. Here we report on the implementation of a two stage algorithm for LC-HR-QTOF-MS datasets. We compared the performances of the two stage algorithm, implemented via NIVA_MZ_Analyzer™, with two commonly used approaches (i.e. feature detection and XIC peak picking, implemented via UNIFI by Waters and TASQ by Bruker, respectively) for the suspect analysis of four influent wastewater samples. We first evaluated the cross platform compatibility of LC-HR-QTOF-MS datasets generated via instruments from two different manufacturers (i.e. Waters and Bruker). Our data showed that with an appropriate spectral weighting function the spectra recorded by the two tested instruments are comparable for our analytes. As a consequence, we were able to perform full spectral comparison between the data generated via the two studied instruments. Four extracts of wastewater influent were analyzed for 89 analytes, thus 356 detection cases. The analytes were divided into 158 detection cases of artificial suspect analytes (i.e. verified by target analysis) and 198 true suspects. The two stage algorithm resulted in a zero rate of false positive detection, based on the artificial suspect analytes while producing a rate of false negative detection of 0.12. For the conventional approaches, the rates of false positive detection varied between 0.06 for UNIFI and 0.15 for TASQ. The rates of false negative detection for these methods ranged between 0.07 for TASQ and 0.09 for UNIFI. The effect of background signal complexity on the two stage algorithm was evaluated through the generation of a synthetic signal. We further discuss the boundaries of applicability of the two stage algorithm. The importance of background knowledge and experience in evaluating the reliability of results during the suspect screening was evaluated. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Microfluidics for the analysis of membrane proteins: how do we get there?

    PubMed

    Battle, Katrina N; Uba, Franklin I; Soper, Steven A

    2014-08-01

    The development of fully automated and high-throughput systems for proteomics is now in demand because of the need to generate new protein-based disease biomarkers. Unfortunately, it is difficult to identify protein biomarkers that are low abundant when in the presence of highly abundant proteins, especially in complex biological samples such as serum, cell lysates, and other biological fluids. Membrane proteins, which are in many cases of low abundance compared to the cytosolic proteins, have various functions and can provide insight into the state of a disease and serve as targets for new drugs making them attractive biomarker candidates. Traditionally, proteins are identified through the use of gel electrophoretic techniques, which are not always suitable for particular protein samples such as membrane proteins. Microfluidics offers the potential as a fully automated platform for the efficient and high-throughput analysis of complex samples, such as membrane proteins, and do so with performance metrics that exceed their bench-top counterparts. In recent years, there have been various improvements to microfluidics and their use for proteomic analysis as reported in the literature. Consequently, this review presents an overview of the traditional proteomic-processing pipelines for membrane proteins and insights into new technological developments with a focus on the applicability of microfluidics for the analysis of membrane proteins. Sample preparation techniques will be discussed in detail and novel interfacing strategies as it relates to MS will be highlighted. Lastly, some general conclusions and future perspectives are presented. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Sequential derivatization of polar organic compounds in cloud water using O-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine hydrochloride, N,O-bis(trimethylsilyl)trifluoroacetamide, and gas-chromatography/mass spectrometry analysis.

    PubMed

    Sagona, Jessica A; Dukett, James E; Hawley, Harmonie A; Mazurek, Monica A

    2014-10-03

    Cloud water samples from Whiteface Mountain, NY were used to develop a combined sampling and gas chromatography-mass spectrometric (GCMS) protocol for evaluating the complex mixture of highly polar organic compounds (HPOC) present in this atmospheric medium. Specific HPOC of interest were mono- and di keto-acids which are thought to originate from photochemical reactions of volatile unsaturated hydrocarbons from biogenic and manmade emissions and be a major fraction of atmospheric carbon. To measure HPOC mixtures and the individual keto-acids in cloud water, samples first must be derivatized for clean elution and measurement, and second, have low overall background of the target species as validated by GCMS analysis of field and laboratory blanks. Here, we discuss a dual derivatization method with PFBHA and BSTFA which targets only organic compounds that contain functional groups reacting with both reagents. The method also reduced potential contamination by minimizing the amount of sample processing from the field through the GCMS analysis steps. Once derivatized, only gas chromatographic separation and selected ion monitoring (SIM) are needed to identify and quantify the polar organic compounds of interest. Concentrations of the detected total keto-acids in individual cloud water samples ranged from 27.8 to 329.3 ng mL(-1) (ppb). Method detection limits for the individual HPOC ranged from 0.17 to 4.99 ng mL(-1) and the quantification limits for the compounds ranged from 0.57 to 16.64 ng mL(-1). The keto-acids were compared to the total organic carbon (TOC) results for the cloud water samples with concentrations of 0.607-3.350 mg L(-1) (ppm). GCMS analysis of all samples and blanks indicated good control of the entire collection and analysis steps. Selected ion monitoring by GCMS of target keto-acids was essential for screening the complex organic carbon mixtures present at low ppb levels in cloud water. It was critical for ensuring high levels of quality assurance and quality control and for the correct identification and quantification of key marker compounds. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Letter Report: Stable Hydrogen and Oxygen Isotope Analysis of B-Complex Perched Water Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Brady D.; Moran, James J.; Nims, Megan K.

    Fine-grained sediments associated with the Cold Creek Unit at Hanford have caused the formation of a perched water aquifer in the deep vadose zone at the B Complex area, which includes waste sites in the 200-DV-1 Operable Unit and the single-shell tank farms in Waste Management Area B-BX-BY. High levels of contaminants, such as uranium, technetium-99, and nitrate, make this aquifer a continuing source of contamination for the groundwater located a few meters below the perched zone. Analysis of deuterium (2H) and oxygen-18 (18O) was performed on nine perched water samples from three different wells. Samples represent time points from hydraulic tests performed on the perched aquifer using the three wells. The isotope analyses showed that the perched water had δ2H and δ18O ratios consistent with the regional meteoric water line, indicating that local precipitation events at the Hanford site likely account for recharge of the perched water aquifer. Data from the isotope analysis can be used along with pumping and recovery data to help understand the perched water dynamics related to aquifer size and hydraulic control of the aquifer in the future.
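
    The interpretation step above can be illustrated with a deuterium-excess check against a meteoric water line; the global line (δ2H = 8·δ18O + 10) is used here as a stand-in for the regional line mentioned in the abstract, and the three well values are invented.

```python
# Hedged sketch: comparing (d18O, d2H) pairs against the global meteoric water line.
samples = {                       # well id -> (delta 18O, delta 2H), per mil (hypothetical)
    "well-1": (-16.2, -125.0),
    "well-2": (-15.8, -122.5),
    "well-3": (-16.5, -127.8),
}

for well, (d18o, d2h) in samples.items():
    d_excess = d2h - 8.0 * d18o   # deuterium excess; ~+10 per mil on the global line
    verdict = "consistent with meteoric recharge" if abs(d_excess - 10) < 10 else "offset from GMWL"
    print(f"{well}: d-excess = {d_excess:+.1f} per mil ({verdict})")
```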

  14. Report for the NGFA-5 project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaing, C; Jackson, P; Thissen, J

    The objective of this project is to provide DHS a comprehensive evaluation of current genomic technologies, including genotyping, TaqMan PCR, multiple locus variable tandem repeat analysis (MLVA), microarray, and high-throughput DNA sequencing, in the analysis of biothreat agents from complex environmental samples. To effectively compare the sensitivity and specificity of the different genomic technologies, we used SNP TaqMan PCR, MLVA, microarray, and high-throughput Illumina and 454 sequencing to test various strains of B. anthracis and B. thuringiensis, BioWatch aerosol filter extracts or soil samples that were spiked with B. anthracis, and samples that were previously collected during DHS and EPA environmental release exercises that were known to contain B. thuringiensis spores. The results for all the samples against the various assays are discussed in this report.

  15. Analysis of polycyclic aromatic hydrocarbons extracted from air particulate matter using a temperature programmable injector coupled to GC-C-IRMS.

    PubMed

    Mikolajczuk, Agnieszka; Przyk, Elzbieta Perez; Geypens, Benny; Berglund, Michael; Taylor, Philip

    2010-03-01

    Compound specific isotopic analysis (CSIA) can provide information about the origin of analysed compounds - in this case, polycyclic aromatic hydrocarbons (PAHs). In this study, PAHs were extracted from three dust samples: winter and summer filter dust and tunnel dust. The measurement was performed using the method validated in our laboratory using pure, solid compounds and the EPA 610 reference mixture. CSIA required an appropriate clean-up method to avoid the unresolved complex mixture usually found in the gas chromatography of PAHs. Extensive sample clean-up for this particular matrix was found to be necessary to obtain good gas chromatography-combustion-isotope ratio mass spectrometry analysis results. The sample purification method included two steps in which the sample is cleaned up and the aliphatic and aromatic hydrocarbons are separated. The concentration of PAHs in the measured samples was low, so a large volume injection technique (100 µl) was applied. The δ13C(VPDB) was measured with a final uncertainty smaller than 1 per thousand. Comparison of the δ13C(VPDB) signatures of PAHs extracted from different dust samples was feasible with this method and, in doing so, significant differences were observed.
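
    For completeness, the delta notation used above is δ13C = (R_sample/R_standard - 1) × 1000, with R the 13C/12C ratio; the sample ratio below is hypothetical and the VPDB reference ratio is a commonly quoted literature value, so the sketch only illustrates the per-mil arithmetic.

```python
# Delta-notation arithmetic for carbon isotope ratios.
R_VPDB = 0.0112372          # commonly quoted 13C/12C ratio of the VPDB standard
R_sample = 0.0108900        # hypothetical measured ratio for one PAH peak

delta13C = (R_sample / R_VPDB - 1.0) * 1000.0
print(f"d13C = {delta13C:.1f} per mil vs VPDB (method uncertainty ~1 per mil)")
```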

  16. Quantifying mineral abundances of complex mixtures by coupling spectral deconvolution of SWIR spectra (2.1-2.4 μm) and regression tree analysis

    USGS Publications Warehouse

    Mulder, V.L.; Plotze, Michael; de Bruin, Sytze; Schaepman, Michael E.; Mavris, C.; Kokaly, Raymond F.; Egli, Markus

    2013-01-01

    This paper presents a methodology for assessing mineral abundances of mixtures having more than two constituents using absorption features in the 2.1-2.4 μm wavelength region. In the first step, the absorption behaviour of mineral mixtures is parameterised by exponential Gaussian optimisation. Next, mineral abundances are predicted by regression tree analysis using these parameters as inputs. The approach is demonstrated on a range of prepared samples with known abundances of kaolinite, dioctahedral mica, smectite, calcite and quartz and on a set of field samples from Morocco. The latter contained varying quantities of other minerals, some of which did not have diagnostic absorption features in the 2.1-2.4 μm region. Cross validation showed that the prepared samples of kaolinite, dioctahedral mica, smectite and calcite were predicted with a root mean square error (RMSE) less than 9 wt.%. For the field samples, the RMSE was less than 8 wt.% for calcite, dioctahedral mica and kaolinite abundances. Smectite could not be well predicted, which was attributed to spectral variation of the cations within the dioctahedral layered smectites. Substitution of part of the quartz by chlorite at the prediction phase hardly affected the accuracy of the predicted mineral content; this suggests that the method is robust in handling the omission of minerals during the training phase. The degree of expression of absorption components was different between the field sample and the laboratory mixtures. This demonstrates that the method should be calibrated and trained on local samples. Our method allows the simultaneous quantification of more than two minerals within a complex mixture and thereby enhances the perspectives of spectral analysis for mineral abundances.
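
    The second step of the workflow above (regression trees mapping absorption-feature parameters to abundances) can be sketched as follows; the feature names, the synthetic depth/width/position values and the tree settings are illustrative assumptions, not the calibration fitted to the prepared or Moroccan samples.

```python
# Hedged sketch: a regression tree predicting a mineral abundance from fake band parameters.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(11)
n = 120

depth = rng.uniform(0.0, 0.4, n)               # absorption-band depth (invented)
width = rng.uniform(0.01, 0.06, n)             # band width, um (invented)
position = rng.normal(2.205, 0.005, n)         # band position, um (invented)

# Synthetic "true" abundance (wt.%) driven mostly by band depth, plus noise.
abundance = 100 * depth / 0.4 * (1 + rng.normal(0, 0.05, n))
X = np.column_stack([depth, width, position])

tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=5, random_state=0).fit(X, abundance)
rmse = np.sqrt(np.mean((tree.predict(X) - abundance) ** 2))
print(f"training RMSE ~ {rmse:.1f} wt.% (cross-validation would be used in practice)")
```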

  17. [Pretreatment of Aluminum-Lithium Alloy Sample and Determination of Argentum and Lithium by Spectral Analysis].

    PubMed

    Zhou, Hui; Tan, Qian; Gao, Ya-ling; Sang, Shi-hua; Chen, Wen

    2015-10-01

    Inductively coupled plasma optical emission spectrometry (ICP-OES), flame atomic absorption spectrometry (FAAS) and visible spectrometry (VS) were applied to the determination of Ag and Li in a lithium-aluminium alloy standard sample and a test sample, and their respective advantages and disadvantages were compared; the excellent selectivity of ICP-OES was confirmed by analysis of the certified standard sample. Three different sample digestion methods were compared and discussed in this study. Better accuracy was obtained by digesting the sample with chloroazotic acid (aqua regia) when the Li content was measured by FAAS, whereas digestion with hydrochloric acid and hydrogen peroxide was preferable when determining Ag and Li simultaneously by ICP-OES and when determining Ag by FAAS and VS. The interference of co-existing elements and methods for its elimination were discussed in detail. Ammonium hydroxide was added to make the sample solution weakly alkaline so that Al, Ti and Zr were precipitated as hydroxides, and under these conditions Mg and Cu formed complex precipitates with 8-hydroxyquinoline; the interference of the matrix elements with the FAAS determination of Ag was thereby eliminated. In addition, phosphate was used to precipitate Ti to eliminate its interference with the determination of Li by FAAS. The same treatment used for the FAAS determination of Ag was applied to eliminate matrix-element interference in the determination of Ag by VS; excess nitrate was added to the sample and heated to release Ag+ from the silver chloride complex, and the colour of 8-hydroxyquinoline was eliminated by thermal decomposition. The accuracy of the analytical results for the standard sample was markedly improved, which confirms the efficiency of the interference-elimination procedure. The optimal digestion and interference-elimination methods were applied to lithium-aluminium alloy samples. The recoveries were 100.39-103.01% for Ag and 100.42-103.01% for Li by ICP-OES, 95.91-99.98% for Ag and 98.04-99.98% for Li by FAAS, and 98.00-101.00% for Ag by VS; all results met the analytical requirements.

  18. Delayed matching to two-picture samples by individuals with and without disabilities: an analysis of the role of naming.

    PubMed

    Gutowski, Stanley J; Stromer, Robert

    2003-01-01

    Delayed matching to complex, two-picture samples (e.g., cat-dog) may be improved when the samples occasion differential verbal behavior. In Experiment 1, individuals with mental retardation matched picture comparisons to identical single-picture samples or to two-picture samples, one of which was identical to a comparison. Accuracy scores were typically high on single-picture trials under both simultaneous and delayed matching conditions. Scores on two-picture trials were also high during the simultaneous condition but were lower during the delay condition. However, scores improved on delayed two-picture trials when each of the sample pictures was named aloud before comparison responding. Experiment 2 replicated these results with preschoolers with typical development and a youth with mental retardation. Sample naming also improved the preschoolers' matching when the samples were pairs of spoken names and the correct comparison picture matched one of the names. Collectively, the participants could produce the verbal behavior that might have improved performance, but typically did not do so unless the procedure required it. The success of the naming intervention recommends it for improving the observing and remembering of multiple elements of complex instructional stimuli.

  19. FT-IR spectroscopy study on cutaneous neoplasie

    NASA Astrophysics Data System (ADS)

    Crupi, V.; De Domenico, D.; Interdonato, S.; Majolino, D.; Maisano, G.; Migliardo, P.; Venuti, V.

    2001-05-01

    In this work we report a preliminary Fourier transform infrared spectroscopy study of normal and neoplastic human skin samples affected by two kinds of cancer, namely epithelioma and basalioma. The analysed skin samples were obtained by biopsy from different parts of the human body. Because of the complexity of the tissue composition, a careful band deconvolution was performed; the analysis of the collected IR spectra within the considered frequency region (900-4000 cm-1) allowed us, first of all, to characterize the presence of the pathologies and to show clearly different spectral features in passing from normal to malignant tissue, in particular within the region (1500-2000 cm-1) typical of the lipid bands.

  20. Digital sorting of complex tissues for cell type-specific gene expression profiles.

    PubMed

    Zhong, Yi; Wan, Ying-Wooi; Pang, Kaifang; Chow, Lionel M L; Liu, Zhandong

    2013-03-07

    Cellular heterogeneity is present in almost all gene expression profiles. However, transcriptome analysis of tissue specimens often ignores the cellular heterogeneity present in these samples. Standard deconvolution algorithms require prior knowledge of the cell type frequencies within a tissue or their in vitro expression profiles. Furthermore, these algorithms tend to report biased estimations. Here, we describe a Digital Sorting Algorithm (DSA) for extracting cell-type specific gene expression profiles from mixed tissue samples that is unbiased and does not require prior knowledge of cell type frequencies. The results suggest that DSA is a specific and sensitive algorithm for gene expression profile deconvolution and will be useful in studying individual cell types of complex tissues.
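
    As a rough illustration of the linear-deconvolution idea behind digital sorting (not the published DSA itself), the Python sketch below first estimates mixing fractions from hypothetical marker genes and then recovers per-gene, cell-type specific expression by non-negative least squares; all data and marker choices are synthetic assumptions.

      # Schematic of linear deconvolution in the spirit of digital sorting:
      # mixture = fractions @ cell_type_profiles. Fractions are first estimated from
      # marker genes (assumed expressed in a single cell type), then per-gene profiles
      # are recovered with non-negative least squares. Data and markers are synthetic.
      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(1)
      n_samples, n_genes, n_types = 20, 100, 2
      true_profiles = rng.uniform(1, 10, size=(n_types, n_genes))
      true_profiles[0, :5], true_profiles[1, :5] = 10.0, 0.0      # genes 0-4: markers of type 0
      true_profiles[0, 5:10], true_profiles[1, 5:10] = 0.0, 10.0  # genes 5-9: markers of type 1
      fractions = rng.dirichlet(np.ones(n_types), size=n_samples)
      mixed = fractions @ true_profiles + rng.normal(0, 0.1, (n_samples, n_genes))

      # Step 1: estimate fractions from marker-gene averages (scaled to sum to 1).
      est_frac = np.column_stack([mixed[:, :5].mean(axis=1), mixed[:, 5:10].mean(axis=1)])
      est_frac /= est_frac.sum(axis=1, keepdims=True)

      # Step 2: recover each gene's cell-type specific expression by NNLS.
      est_profiles = np.array([nnls(est_frac, mixed[:, g])[0] for g in range(n_genes)]).T
      print("mean absolute error:", round(np.abs(est_profiles - true_profiles).mean(), 3))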

  1. DSC, X-ray and FTIR studies of a gemfibrozil/dimethyl-β-cyclodextrin inclusion complex produced by co-grinding.

    PubMed

    Aigner, Z; Berkesi, O; Farkas, G; Szabó-Révész, P

    2012-01-05

    The steps of formation of an inclusion complex produced by the co-grinding of gemfibrozil and dimethyl-β-cyclodextrin were investigated by differential scanning calorimetry (DSC), X-ray powder diffractometry (XRPD) and Fourier transform infrared (FTIR) spectroscopy with curve-fitting analysis. The endothermic peak at 59.25°C reflecting the melting of gemfibrozil progressively disappeared from the DSC curves of the products as the duration of co-grinding increased. The crystallinity of the samples also gradually decreased, and after 35 min of co-grinding the product was totally amorphous. Up to this co-grinding time, XRPD and FTIR investigations indicated a linear correlation between cyclodextrin complexation and co-grinding time. After co-grinding for 30 min, the ratio of complex formation did not increase further. These studies demonstrated that co-grinding is a suitable method for the complexation of gemfibrozil with dimethyl-β-cyclodextrin. XRPD analysis revealed the amorphous state of the gemfibrozil-dimethyl-β-cyclodextrin product. FTIR spectroscopy with curve-fitting analysis may be useful as a semiquantitative analytical method for discriminating the molecular and amorphous states of gemfibrozil. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Revisiting Mn and Fe removal in humic rich estuaries

    NASA Astrophysics Data System (ADS)

    Oldham, Véronique E.; Miller, Megan T.; Jensen, Laramie T.; Luther, George W.

    2017-07-01

    Metal removal by estuarine mixing has been studied for several decades, but few studies emphasize dissolved metal speciation and organic ligand complexation. Findings from the last decade indicate that metal-humic complexation can be significant for dissolved metals including Cu(II), Al(III) and Fe(III), but little consideration is given to the precipitation of these complexes with humic material at pH < 2. Given that total soluble metal analysis involves an acidification step for sample preservation, we show that Mn and other metal concentrations may have been underestimated in estuaries, especially when humic substance concentrations are high. A competitive ligand assay of selected samples from our study site, a coastal waterway bordered by wetlands (Broadkill River, DE), showed that Mn(III)-humic complexation is significant, and that some Mn(III)-L complexes precipitate during acidification. In the oxygenated surface waters of the Broadkill River, total dissolved Mn (dMnT) was up to 100% complexed to ambient ligands as Mn(III)-L, and we present evidence for humic-type Mn(III)-L complexes. The Mn(III) complexes were kinetically stabilized against Fe(II) reduction, even when [Fe(II)] was 17 times higher than [dMnT]. Unlike typical oceanic surface waters, [Fe(II)] > [Fe(III)-L] in surface waters, which may be attributed to high rates of photoreduction of Fe(III)-L complexes. Total [Mn(III)-L] ranged from 0.22 to 8.4 μM, in excess of solid MnOx (below 0.28 μM in all samples). Filtration of samples through 0.02 μm filters indicated that all Mn(III)-L complexes pass through the filters and were not colloidal species in contrast to dissolved Fe. Incubation experiments indicated that the reductive dissolution of solid MnOx by ambient ligands may be responsible for Mn(III) formation in this system. Unlike previous studies of estuarine mixing, which demonstrated metal removal during mixing, we show significant export of dMn and dissolved Fe (dFe) in the summer and fall of 2015. Thus, we propose that estuarine removal should be considered seasonal for dMn and dFe, with export in the summer and fall and removal during the winter.

  3. Chemometrics-assisted spectrophotometric method for simultaneous determination of Pb²⁺ and Cu²⁺ ions in different foodstuffs, soil and water samples using 2-benzylspiro [isoindoline-1,5'-oxazolidine]-2',3,4'-trione using continuous wavelet transformation and partial least squares - calculation of pKf of complexes with rank annihilation factor analysis.

    PubMed

    Abbasi Tarighat, Maryam; Nabavi, Masoume; Mohammadizadeh, Mohammad Reza

    2015-06-15

    A new multi-component analysis method based on zero-crossing point continuous wavelet transformation (CWT) was developed for the simultaneous spectrophotometric determination of Cu(2+) and Pb(2+) ions based on complex formation with 2-benzylspiro[isoindoline-1,5'-oxazolidine]-2',3,4'-trione (BSIIOT). The absorption spectra were evaluated with respect to synthetic ligand concentration, time of complexation and pH; according to the absorbance values, 0.015 mmol L(-1) BSIIOT, 10 min after mixing and pH 8.0 were used as the optimum conditions. The complex formation between the BSIIOT ligand and the cations Cu(2+) and Pb(2+) was investigated by application of rank annihilation factor analysis (RAFA). Among the wavelet families, the Daubechies-4 (db4), discrete Meyer (dmey), Morlet (morl) and Symlet-8 (sym8) transforms were found to be suitable for signal treatment. The new synthetic ligand and the selected mother wavelets allowed the simultaneous determination of strongly overlapping spectra of the species without any chemical pretreatment. CWT signals together with the zero-crossing technique were therefore applied directly to the overlapping absorption spectra of Cu(2+) and Pb(2+). The calibration graphs for estimation of Cu(2+) and Pb(2+) were obtained by measuring the CWT amplitudes at the zero-crossing points in the wavelet domain. The proposed method was validated by the simultaneous determination of Cu(2+) and Pb(2+) ions in red beans, walnut, rice, tea and soil samples. The results obtained with the proposed method were compared with those predicted by partial least squares (PLS) and flame atomic absorption spectrophotometry (FAAS). Copyright © 2015 Elsevier B.V. All rights reserved.
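
    The zero-crossing logic can be illustrated with a small Python sketch: because the wavelet transform is linear, the transformed spectrum of one analyte vanishes at its zero-crossing wavelength, so the mixture's transform amplitude at that point calibrates the other analyte alone. The Gaussian bands, wavelengths and scale below are invented stand-ins, not the Cu(2+)/Pb(2+) complex spectra of the study.

      # Sketch of the zero-crossing CWT idea for overlapped spectra: the transformed
      # spectrum of analyte B crosses zero at some wavelength, so the mixture's CWT
      # amplitude there depends only on analyte A. Spectra here are synthetic Gaussian bands.
      import numpy as np
      import pywt

      wl = np.linspace(400, 700, 600)                      # wavelength axis, nm
      band = lambda c, mu, s: c * np.exp(-0.5 * ((wl - mu) / s) ** 2)
      scale = 30                                            # single CWT scale (Morlet)

      cwt_at = lambda spectrum: pywt.cwt(spectrum, [scale], 'morl')[0][0]

      pure_B = cwt_at(band(1.0, 560, 40))
      zc = np.argmin(np.abs(pure_B[100:-100])) + 100        # zero-crossing index for B

      # Calibration for A: mixture CWT amplitude at B's zero-crossing vs. concentration of A.
      conc_A = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
      amps = [cwt_at(band(cA, 530, 35) + band(0.7, 560, 40))[zc] for cA in conc_A]
      slope, intercept = np.polyfit(conc_A, amps, 1)
      print("calibration slope:", slope, "intercept:", intercept)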

  4. Analysis of single nucleotide polymorphisms in case-control studies.

    PubMed

    Li, Yonghong; Shiffman, Dov; Oberbauer, Rainer

    2011-01-01

    Single nucleotide polymorphisms (SNPs) are the most common type of genetic variants in the human genome. SNPs are known to modify susceptibility to complex diseases. We describe and discuss methods used to identify SNPs associated with disease in case-control studies. An outline on study population selection, sample collection and genotyping platforms is presented, complemented by SNP selection, data preprocessing and analysis.

  5. A simplified Forest Inventory and Analysis database: FIADB-Lite

    Treesearch

    Patrick D. Miles

    2008-01-01

    This publication is a simplified version of the Forest Inventory and Analysis Data Base (FIADB) for users who do not need to compute sampling errors and may find the FIADB unnecessarily complex. Possible users include GIS specialists who may be interested only in identifying and retrieving geographic information and per acre values for the set of plots used in...

  6. Association between suicidal symptoms and repeat suicidal behaviour within a sample of hospital-treated suicide attempters.

    PubMed

    de Beurs, Derek P; van Borkulo, Claudia D; O'Connor, Rory C

    2017-05-01

    Suicidal behaviour is the end result of the complex relation between many factors which are biological, psychological and environmental in nature. Network analysis is a novel method that may help us better understand the complex association between different factors. To examine the relationship between suicidal symptoms, as assessed by the Beck Scale for Suicide Ideation, and future suicidal behaviour in patients admitted to hospital following a suicide attempt, using network analysis. Secondary analysis was conducted on previously collected data from a sample of 366 patients who were admitted to a Scottish hospital following a suicide attempt. Network models were estimated to visualise and test the association between the baseline symptom network structure and suicidal behaviour at 15-month follow-up. Network analysis showed that the desire for an active attempt was the most central, most strongly connected suicide symptom. Of the 19 suicide symptoms assessed at baseline, 10 were directly related to repeat suicidal behaviour. When the baseline network structure of repeaters (n = 94) was compared with that of non-repeaters (n = 272), no significant differences were found. Network analysis can help us better understand suicidal behaviour by visualising the complex relations between relevant symptoms and by indicating which symptoms are most central within the network. These insights have theoretical implications as well as informing the assessment and treatment of suicidal behaviour. None. © The Royal College of Psychiatrists 2017. This is an open access article distributed under the terms of the Creative Commons Non-Commercial, No Derivatives (CC BY-NC-ND) license.
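
    For readers unfamiliar with symptom networks, the Python sketch below shows the generic workflow on simulated data: estimate a sparse partial-correlation network (here with the graphical lasso, which is not necessarily the estimator used in the study) and rank symptoms by strength centrality. The data and dimensions are illustrative, not the patient data.

      # Generic sketch of a symptom network: estimate a sparse partial-correlation
      # network with the graphical lasso and rank nodes by strength centrality.
      # Simulated data only; the published study used its own estimator for the BSS items.
      import numpy as np
      from sklearn.covariance import GraphicalLassoCV

      rng = np.random.default_rng(2)
      n_patients, n_symptoms = 366, 19
      latent = rng.normal(size=(n_patients, 1))
      X = 0.6 * latent + rng.normal(size=(n_patients, n_symptoms))   # correlated symptoms

      model = GraphicalLassoCV().fit(X)
      prec = model.precision_
      d = np.sqrt(np.diag(prec))
      partial_corr = -prec / np.outer(d, d)                 # edge weights
      np.fill_diagonal(partial_corr, 0.0)

      strength = np.abs(partial_corr).sum(axis=0)           # strength centrality per symptom
      print("most central symptom index:", int(np.argmax(strength)))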

  7. Identification of Poly-N-Acetyllactosamine-Carrying Glycoproteins from HL-60 Human Promyelocytic Leukemia Cells Using a Site-Specific Glycome Analysis Method, Glyco-RIDGE

    NASA Astrophysics Data System (ADS)

    Togayachi, Akira; Tomioka, Azusa; Fujita, Mika; Sukegawa, Masako; Noro, Erika; Takakura, Daisuke; Miyazaki, Michiyo; Shikanai, Toshihide; Narimatsu, Hisashi; Kaji, Hiroyuki

    2018-04-01

    To elucidate the relationship between protein function and the diversity and heterogeneity of the glycans conjugated to a protein, the glycosylation sites, glycan variants, and glycan proportions at each site of the glycoprotein must be analyzed. Glycopeptide-based structural analysis technology using mass spectrometry has been developed; however, complicated analyses of the complex spectra obtained by multistage fragmentation are necessary, and the sensitivity and throughput of these analyses are low. Therefore, we developed a liquid chromatography/mass spectrometry (MS)-based glycopeptide analysis method to reveal the site-specific glycome (Glycan heterogeneity-based Relational IDentification of Glycopeptide signals on Elution profile, Glyco-RIDGE). This method uses accurate masses and retention times of glycopeptides, without requiring MS2, and can be applied to complex mixtures. To increase the number of identified peptides, fractionation of the sample glycopeptides is required to reduce sample complexity. Therefore, in this study, glycopeptides were fractionated into four fractions by hydrophilic interaction chromatography, and each fraction was analyzed using the Glyco-RIDGE method. As a result, many glycopeptides carrying long glycans were enriched in the most hydrophilic fraction. Based on the monosaccharide composition, these glycans were thought to be poly-N-acetyllactosamine (polylactosamine [pLN]), and 31 pLN-carrier proteins were identified in HL-60 cells. Gene ontology enrichment analysis revealed that pLN carriers included many molecules related to signal transduction, receptors, and cell adhesion. Thus, these findings provide important insights into the analysis of the glycoproteome using our novel Glyco-RIDGE method.

  8. Extractive Atmospheric Pressure Photoionization (EAPPI) Mass Spectrometry: Rapid Analysis of Chemicals in Complex Matrices.

    PubMed

    Liu, Chengyuan; Yang, Jiuzhong; Wang, Jian; Hu, Yonghua; Zhao, Wan; Zhou, Zhongyue; Qi, Fei; Pan, Yang

    2016-10-01

    Extractive atmospheric pressure photoionization (EAPPI) mass spectrometry was designed for rapid qualitative and quantitative analysis of chemicals in complex matrices. In this method, an ultrasonic nebulization system was applied to sample extraction, nebulization, and vaporization. Mixed with a gaseous dopant, vaporized analytes were ionized through ambient photon-induced ion-molecule reactions and were mass-analyzed by a high-resolution time-of-flight mass spectrometer (TOF-MS). After careful optimization and testing with pure sample solutions, EAPPI was successfully applied to the fast screening of capsules, soil, natural products, and viscous compounds. Analysis was completed within a few seconds without the need for preseparation. Moreover, the quantification capability of EAPPI for complex matrices was evaluated by analyzing six polycyclic aromatic hydrocarbons (PAHs) in soil. The correlation coefficients (R2) for the standard curves of all six PAHs were above 0.99, and the detection limits were in the range of 0.16-0.34 ng/mg. In addition, EAPPI could also be used to monitor organic chemical reactions in real time.

  9. A laboratory information management system for the analysis of tritium (3H) in environmental waters.

    PubMed

    Belachew, Dagnachew Legesse; Terzer-Wassmuth, Stefan; Wassenaar, Leonard I; Klaus, Philipp M; Copia, Lorenzo; Araguás, Luis J Araguás; Aggarwal, Pradeep

    2018-07-01

    Accurate and precise measurements of low levels of tritium (3H) in environmental waters are difficult to attain due to complex steps of sample preparation, electrolytic enrichment, liquid scintillation decay counting, and extensive data processing. We present a Microsoft Access™ relational database application, TRIMS (Tritium Information Management System), to assist with sample and data processing of tritium analysis by managing the processes from sample registration and analysis to reporting and archiving. A complete uncertainty propagation algorithm ensures tritium results are reported with robust uncertainty metrics. TRIMS will help to increase laboratory productivity and improve the accuracy and precision of 3H assays. The software supports several enrichment protocols and LSC counter types. TRIMS is available for download at no cost from the IAEA at www.iaea.org/water. Copyright © 2018 Elsevier Ltd. All rights reserved.
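
    As a hedged illustration of the kind of uncertainty propagation such a system automates, the Python sketch below combines independent relative uncertainties (counting statistics, counting efficiency, enrichment factor) in quadrature; the component values and the breakdown are assumptions for illustration, not the TRIMS algorithm itself.

      # Minimal sketch of uncertainty propagation for a tritium result of the kind a
      # LIMS might automate: independent relative uncertainties combined in quadrature.
      # The numbers are illustrative, not taken from TRIMS.
      import math

      def combined_uncertainty(value, rel_uncertainties):
          # Absolute combined standard uncertainty of `value`.
          return value * math.sqrt(sum(u ** 2 for u in rel_uncertainties))

      tritium_TU = 8.4                     # hypothetical result in tritium units
      u_rel = [0.030,                      # counting statistics
               0.010,                      # scintillation counting efficiency
               0.015]                      # electrolytic enrichment factor

      print(f"{tritium_TU:.1f} +/- {combined_uncertainty(tritium_TU, u_rel):.1f} TU")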

  10. Molecular signature of complex regional pain syndrome (CRPS) and its analysis.

    PubMed

    König, Simone; Schlereth, Tanja; Birklein, Frank

    2017-10-01

    Complex Regional Pain Syndrome (CRPS) is a rare, but often disabling pain disease. Biomarkers are lacking, but several inflammatory substances have been associated with the pathophysiology. This review outlines the current knowledge with respect to target biomolecules and the analytical tools available to measure them. Areas covered: Targets include cytokines, neuropeptides and resolvins; analysis strategies are thus needed for different classes of substances such as proteins, peptides, lipids and small molecules. Traditional methods like immunoassays are of importance next to state-of-the art high-resolution mass spectrometry techniques and 'omics' approaches. Expert commentary: Future biomarker studies need larger cohorts, which improve subgrouping of patients due to their presumed pathophysiology, and highly standardized workflows from sampling to analysis.

  11. Structure in the 3D Galaxy Distribution. III. Fourier Transforming the Universe: Phase and Power Spectra

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Way, M. J.; Gazis, P. G.

    2017-01-01

    We demonstrate the effectiveness of a relatively straightforward analysis of the complex 3D Fourier transform of galaxy coordinates derived from redshift surveys. Numerical demonstrations of this approach are carried out on a volume-limited sample of the Sloan Digital Sky Survey redshift survey. The direct unbinned transform yields a complex 3D data cube quite similar to that from the Fast Fourier Transform of finely binned galaxy positions. In both cases, deconvolution of the sampling window function yields estimates of the true transform. Simple power spectrum estimates from these transforms are roughly consistent with those using more elaborate methods. The complex Fourier transform characterizes spatial distributional properties beyond the power spectrum in a manner different from (and we argue is more easily interpreted than) the conventional multipoint hierarchy. We identify some threads of modern large-scale inference methodology that will presumably yield detections in new wider and deeper surveys.

  12. Halloysite Nanotubes as a New Adsorbent for Solid Phase Extraction and Spectrophotometric Determination of Iron in Water and Food Samples

    NASA Astrophysics Data System (ADS)

    Samadi, A.; Amjadi, M.

    2016-07-01

    Halloysite nanotubes (HNTs) have been introduced as a new solid phase extraction adsorbent for preconcentration of iron(II) as a complex with 2,2-bipyridine. The cationic complex is effectively adsorbed on the sorbent in the pH range of 3.5-6.0 and efficiently desorbed by trichloroacetic acid. The eluted complex has a strong absorption around 520 nm, which was used for determination of Fe(II). After optimizing extraction conditions, the linear range of the calibration graph was 5.0-500 μg/L with a detection limit of 1.3 μg/L. The proposed method was successfully applied for the determination of trace iron in various water and food samples, and the accuracy was assessed through the recovery experiments and analysis of a certified reference material (NIST 1643e).
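
    The calibration and detection-limit arithmetic underlying such a method can be sketched in a few lines of Python; the absorbance readings and blank standard deviation below are invented for illustration, with the LOD taken as three times the blank standard deviation divided by the calibration slope.

      # Sketch of the calibration and detection-limit arithmetic behind a
      # spectrophotometric SPE method: a linear fit of absorbance vs. concentration,
      # with LOD = 3 x blank standard deviation / slope. Values are made up.
      import numpy as np

      conc = np.array([5, 50, 100, 250, 500], dtype=float)   # ug/L Fe(II)
      absorbance = np.array([0.012, 0.105, 0.208, 0.515, 1.020])
      blank_sd = 0.0009                                       # std. dev. of blank readings

      slope, intercept = np.polyfit(conc, absorbance, 1)
      lod = 3 * blank_sd / slope
      print(f"slope = {slope:.5f} AU per ug/L, LOD = {lod:.2f} ug/L")

      # Quantify an unknown from its measured absorbance:
      unknown_abs = 0.300
      print("unknown conc:", round((unknown_abs - intercept) / slope, 1), "ug/L")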

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozkendir, Osman Murat, E-mail: ozkendir@gmail.com

    Highlights: • Crystal and electronic structure properties of NdxTi1-xBO2+d were investigated. • New crystal structures for Nd-Ti complexes are determined. • Distortions in the crystal structure were observed as a result of boron shortage. • Prominent changes in the electronic properties of the samples with increasing Nd amount. - Abstract: Neodymium-substituted TiBO3 samples were investigated with respect to their crystal, electrical and electronic properties. Studies were conducted by the X-ray absorption fine structure spectroscopy (XAFS) technique on samples with different substitution levels in the preparation process. To achieve better crystal structure results, the XRD pattern results were supported by extended XAFS (EXAFS) analysis. The electronic structure was studied by X-ray absorption near-edge structure spectroscopy (XANES) measurements at room temperature. Owing to the substituted Nd atoms, prominent changes in the crystal structure, new crystal geometries for Nd-Ti complexes and phase transitions were detected with increasing Nd substitution in the samples. At all stages of substitution, the Nd atoms were observed to govern the overall behaviour owing to their dominant character in the Ti geometries. In addition, a decrease in electrical resistivity was determined in the materials with increasing Nd substitution.

  14. FTIR Analysis of Functional Groups in Aerosol Particles

    NASA Astrophysics Data System (ADS)

    Shokri, S. M.; McKenzie, G.; Dransfield, T. J.

    2012-12-01

    Secondary organic aerosols (SOA) are suspensions of particulate matter composed of compounds formed from chemical reactions of organic species in the atmosphere. Atmospheric particulate matter can have impacts on climate, the environment and human health. Standardized techniques to analyze the characteristics and composition of complex secondary organic aerosols are necessary to further investigate the formation of SOA and provide a better understanding of the reaction pathways of organic species in the atmosphere. While Aerosol Mass Spectrometry (AMS) can provide detailed information about the elemental composition of a sample, it reveals little about the chemical moieties which make up the particles. This work probes aerosol particles deposited on Teflon filters using FTIR, based on the protocols of Russell, et al. (Journal of Geophysical Research - Atmospheres, 114, 2009) and the spectral fitting algorithm of Takahama, et al (submitted, 2012). To validate the necessary calibration curves for the analysis of complex samples, primary aerosols of key compounds (e.g., citric acid, ammonium sulfate, sodium benzoate) were generated, and the accumulated masses of the aerosol samples were related to their IR absorption intensity. These validated calibration curves were then used to classify and quantify functional groups in SOA samples generated in chamber studies by MIT's Kroll group. The fitting algorithm currently quantifies the following functionalities: alcohols, alkanes, alkenes, amines, aromatics, carbonyls and carboxylic acids.

  15. Lipidomic analysis of biological samples: Comparison of liquid chromatography, supercritical fluid chromatography and direct infusion mass spectrometry methods.

    PubMed

    Lísa, Miroslav; Cífková, Eva; Khalikova, Maria; Ovčačíková, Magdaléna; Holčapek, Michal

    2017-11-24

    Lipidomic analysis of biological samples in clinical research represents a challenging task for analytical methods, given the large number of samples and their extreme complexity. In this work, we compare direct infusion (DI) and chromatography-mass spectrometry (MS) lipidomic approaches, represented by three analytical methods, in terms of comprehensiveness, sample throughput, and validation results for the lipidomic analysis of biological samples represented by tumor tissue, surrounding normal tissue, plasma, and erythrocytes of kidney cancer patients. The methods are compared in one laboratory using an identical analytical protocol to ensure comparable conditions. An ultrahigh-performance liquid chromatography/MS (UHPLC/MS) method in hydrophilic interaction liquid chromatography mode and a DI-MS method are used for this comparison as the most widely used methods for lipidomic analysis, together with an ultrahigh-performance supercritical fluid chromatography/MS (UHPSFC/MS) method showing promising results in metabolomic analyses. The nontargeted analysis of pooled samples was performed using all tested methods, and 610 lipid species within 23 lipid classes were identified. The DI method provides the most comprehensive results owing to the identification of some polar lipid classes that are not identified by the UHPLC and UHPSFC methods. On the other hand, the UHPSFC method provides excellent sensitivity for less polar lipid classes and the highest sample throughput, with a 10 min method time. The sample consumption of the DI method is 125 times higher than that of the other methods, although only 40 μL of organic solvent is used for one sample analysis compared with 3.5 mL and 4.9 mL in the case of the UHPLC and UHPSFC methods, respectively. The methods were validated for the quantitative lipidomic analysis of plasma samples with one internal standard for each lipid class. The results show the applicability of all tested methods for the lipidomic analysis of biological samples, depending on the analysis requirements. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Are Complexity Metrics Reliable in Assessing HRV Control in Obese Patients During Sleep?

    PubMed

    Cabiddu, Ramona; Trimer, Renata; Borghi-Silva, Audrey; Migliorini, Matteo; Mendes, Renata G; Oliveira, Antonio D; Costa, Fernando S M; Bianchi, Anna M

    2015-01-01

    Obesity is associated with cardiovascular mortality. Linear methods, including time domain and frequency domain analysis, are normally applied on the heart rate variability (HRV) signal to investigate autonomic cardiovascular control, whose imbalance might promote cardiovascular disease in these patients. However, given the cardiac activity non-linearities, non-linear methods might provide better insight. HRV complexity was herein analyzed during wakefulness and different sleep stages in healthy and obese subjects. Given the short duration of each sleep stage, complexity measures, normally extracted from long-period signals, needed to be calculated on short-term signals. Sample entropy, Lempel-Ziv complexity and detrended fluctuation analysis were evaluated and results showed no significant differences among the values calculated over ten-minute signals and longer durations, confirming the reliability of such analysis when performed on short-term signals. Complexity parameters were extracted from ten-minute signal portions selected during wakefulness and different sleep stages on HRV signals obtained from eighteen obese patients and twenty controls. The obese group presented significantly reduced complexity during light and deep sleep, suggesting a deficiency in the control mechanisms integration during these sleep stages. To our knowledge, this study reports for the first time on how the HRV complexity changes in obesity during wakefulness and sleep. Further investigation is needed to quantify altered HRV impact on cardiovascular mortality in obesity.

  17. Are Complexity Metrics Reliable in Assessing HRV Control in Obese Patients During Sleep?

    PubMed Central

    Cabiddu, Ramona; Trimer, Renata; Borghi-Silva, Audrey; Migliorini, Matteo; Mendes, Renata G.; Oliveira Jr., Antonio D.; Costa, Fernando S. M.; Bianchi, Anna M.

    2015-01-01

    Obesity is associated with cardiovascular mortality. Linear methods, including time domain and frequency domain analysis, are normally applied on the heart rate variability (HRV) signal to investigate autonomic cardiovascular control, whose imbalance might promote cardiovascular disease in these patients. However, given the cardiac activity non-linearities, non-linear methods might provide better insight. HRV complexity was herein analyzed during wakefulness and different sleep stages in healthy and obese subjects. Given the short duration of each sleep stage, complexity measures, normally extracted from long-period signals, needed to be calculated on short-term signals. Sample entropy, Lempel-Ziv complexity and detrended fluctuation analysis were evaluated and results showed no significant differences among the values calculated over ten-minute signals and longer durations, confirming the reliability of such analysis when performed on short-term signals. Complexity parameters were extracted from ten-minute signal portions selected during wakefulness and different sleep stages on HRV signals obtained from eighteen obese patients and twenty controls. The obese group presented significantly reduced complexity during light and deep sleep, suggesting a deficiency in the control mechanisms integration during these sleep stages. To our knowledge, this study reports for the first time on how the HRV complexity changes in obesity during wakefulness and sleep. Further investigation is needed to quantify altered HRV impact on cardiovascular mortality in obesity. PMID:25893856

  18. An analytical platform for mass spectrometry-based identification and chemical analysis of RNA in ribonucleoprotein complexes.

    PubMed

    Taoka, Masato; Yamauchi, Yoshio; Nobe, Yuko; Masaki, Shunpei; Nakayama, Hiroshi; Ishikawa, Hideaki; Takahashi, Nobuhiro; Isobe, Toshiaki

    2009-11-01

    We describe here a mass spectrometry (MS)-based analytical platform for RNA, which combines direct nano-flow reversed-phase liquid chromatography (RPLC) on a spray tip column and a high-resolution LTQ-Orbitrap mass spectrometer. Operating RPLC at a very low flow rate with volatile solvents and MS in the negative mode, we could estimate highly accurate mass values sufficient to predict the nucleotide composition of an approximately 21-nucleotide small interfering RNA, detect post-transcriptional modifications in yeast tRNA, and perform collision-induced dissociation/tandem MS-based structural analysis of nucleolytic fragments of RNA at a sub-femtomole level. Importantly, the method allowed the identification and chemical analysis of small RNAs in ribonucleoprotein (RNP) complexes, such as the pre-spliceosomal RNP complex, which was pulled down from cultured cells with a tagged protein cofactor as bait. We have recently developed a unique genome-oriented database search engine, Ariadne, which allows tandem MS-based identification of RNAs in biological samples. Thus, the method presented here has broad potential for automated analysis of RNA; it complements conventional molecular biology-based techniques and is particularly suited for simultaneous analysis of the composition, structure, interaction, and dynamics of RNA and protein components in various cellular RNP complexes.

  19. Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs

    USGS Publications Warehouse

    Irvine, Kathryn M.; Rodhouse, Thomas J.

    2014-01-01

    As of 2013, Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data and Greater Yellowstone Network has three years of vegetation data, and monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data aimed at exploring correlations with climate and weather data is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between the ROMN and UCBN-GRYN network protocols presents a statistical challenge that has not been resolved yet. However, the UCBN and GRYN data are compatible as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in the future. When data collected by different networks are combined, the survey design describing the merged dataset is (likely) a complex survey design. A complex survey design is the result of combining datasets from different sampling designs. A complex survey design is characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010, Chapter 7, for a general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study on the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend using available packages within the R statistical computing package. We find that, as written, using lmer or lm for trend detection in a continuous response and clm and clmm for visually estimated cover classes with "raw" GRTS design weights specified for the weight argument leads to substantially different results and/or computational instability. However, when only fixed effects are of interest, the survey package (svyglm and svyolr) may be suitable for a model-assisted analysis of trend. We provide possible directions for future research into combined analysis for ordinal and continuous vital sign indicators.
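
    As a language-agnostic illustration of the weighting issue (the study itself works in R), the Python sketch below fits a design-weighted trend with weighted least squares on simulated data; this reproduces only the design-weighted point estimate, not the design-based standard errors that the R survey package (svyglm) provides, and all data and weights are invented.

      # Minimal sketch of including survey design weights in a trend model for a
      # continuous vital-sign response. WLS with design weights gives the weighted
      # point estimate of trend; it does NOT give design-based standard errors.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 300
      year = rng.integers(0, 6, size=n)                     # survey year 0..5
      cover = 40 + 1.5 * year + rng.normal(0, 8, size=n)    # % vegetation cover
      design_wt = rng.uniform(0.5, 3.0, size=n)             # unequal inclusion weights

      X = sm.add_constant(year.astype(float))
      fit = sm.WLS(cover, X, weights=design_wt).fit()
      print("weighted trend estimate (% cover per year):", round(fit.params[1], 3))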

  20. The Search Engine for Multi-Proteoform Complexes: An Online Tool for the Identification and Stoichiometry Determination of Protein Complexes.

    PubMed

    Skinner, Owen S; Schachner, Luis F; Kelleher, Neil L

    2016-12-08

    Recent advances in top-down mass spectrometry using native electrospray now enable the analysis of intact protein complexes with relatively small sample amounts in an untargeted mode. Here, we describe how to characterize both homo- and heteropolymeric complexes with high molecular specificity using input data produced by tandem mass spectrometry of whole protein assemblies. The tool described is a "search engine for multi-proteoform complexes," (SEMPC) and is available for free online. The output is a list of candidate multi-proteoform complexes and scoring metrics, which are used to define a distinct set of one or more unique protein subunits, their overall stoichiometry in the intact complex, and their pre- and post-translational modifications. Thus, we present an approach for the identification and characterization of intact protein complexes from native mass spectrometry data. © 2016 by John Wiley & Sons, Inc. Copyright © 2016 John Wiley & Sons, Inc.

  1. PIXE analysis of ancient Chinese Changsha porcelain

    NASA Astrophysics Data System (ADS)

    Lin, E. K.; Yu, Y. C.; Wang, C. W.; Liu, T. Y.; Wu, C. M.; Chen, K. M.; Lin, S. S.

    1999-04-01

    In this work, proton induced X-ray emission (PIXE) method was applied for the analysis of ancient Chinese Changsha porcelain produced in the Tang dynasty (AD 618-907). A collection of glazed potsherds was obtained in the complex of the famous kiln site at Tongguan, Changsha city, Hunan province. Studies of elemental composition were carried out on ten selected Changsha potsherds. Minor and trace elements such as Ti, Mn, Fe, Co, Cu, Rb, Sr, and Zr in the material of the porcelain glaze were determined. Variation of these elements from sample to sample was investigated. Details of results are presented and discussed.

  2. Vibrational Micro-Spectroscopy of Human Tissues Analysis: Review.

    PubMed

    Bunaciu, Andrei A; Hoang, Vu Dang; Aboul-Enein, Hassan Y

    2017-05-04

    Vibrational spectroscopy (Infrared (IR) and Raman) and, in particular, micro-spectroscopy and micro-spectroscopic imaging have been used to characterize developmental changes in tissues, to monitor these changes in cell cultures and to detect disease and drug-induced modifications. The conventional methods for biochemical and histopathological tissue characterization necessitate complex and "time-consuming" sample manipulations and the results are rarely quantifiable. The spectroscopy of molecular vibrations using mid-IR or Raman techniques has been applied to samples of human tissue. This article reviews the application of these vibrational spectroscopic techniques for the analysis of biological tissue published between 2005 and 2015.

  3. Protein Stains to Detect Antigen on Membranes.

    PubMed

    Dsouza, Anil; Scofield, R Hal

    2015-01-01

    Western blotting (protein blotting/electroblotting) is the gold standard in the analysis of complex protein mixtures. Electroblotting drives protein molecules from a polyacrylamide (or less commonly, of an agarose) gel to the surface of a binding membrane, thereby facilitating an increased availability of the sites with affinity for both general and specific protein reagents. The analysis of these complex protein mixtures is achieved by the detection of specific protein bands on a membrane, which in turn is made possible by the visualization of protein bands either by chemical staining or by reaction with an antibody of a conjugated ligand. Chemical methods employ staining with organic dyes, metal chelates, autoradiography, fluorescent dyes, complexing with silver, or prelabeling with fluorophores. All of these methods have differing sensitivities and quantitative determinations vary significantly. This review will describe the various protein staining methods applied to membranes after western blotting. "Detection" precedes and is a prerequisite to obtaining qualitative and quantitative data on the proteins in a sample, as much as to comparing the protein composition of different samples. "Detection" is often synonymous to staining, i.e., the reversible or irreversible binding by the proteins of a colored organic or inorganic chemical.

  4. Spectroscopic analysis of the powdery complex chitosan-iodine

    NASA Astrophysics Data System (ADS)

    Gegel, Natalia O.; Babicheva, Tatyana S.; Belyakova, Olga A.; Lugovitskaya, Tatyana N.; Shipovskaya, Anna B.

    2018-04-01

    A chitosan-iodine complex was obtained by modification of the polymer powder in the vapor of an iodine-containing sorbate and studied by electronic and IR spectroscopy and optical rotatory dispersion. It was found that the electronic spectra of an aqueous solution of the modified chitosan (the as-prepared sample and one stored for a year) showed intense absorption bands of triiodide and iodate ions, as well as of polyiodide ions bound to the macromolecule by exciton bonding with charge transfer. Analysis of the IR spectra shows destruction of the network of intramolecular and intermolecular hydrogen bonds in the iodinated chitosan powder in comparison with the original polymer and the formation of a new chemical substance. For example, the absorption band of the deformation vibrations of the hydroxyl group disappears in the modified sample, and that of the protonated amino group shifts toward shorter wavelengths. The intensity of the stretching vibration band of the glucopyranose ring atoms is significantly reduced. Heating of the modified sample at a temperature below the thermal degradation point of the polymer leads to stabilization of the chitosan-iodine complex. Based on our studies, the hydroxyl and amino groups of the aminopolysaccharide have been recognized as the centers of retention of the polyiodide chains in the chitosan matrix.

  5. Protein stains to detect antigen on membranes.

    PubMed

    D'souza, Anil; Scofield, R Hal

    2009-01-01

    Western blotting (protein blotting/electroblotting) is the gold standard in the analysis of complex protein mixtures. Electroblotting drives protein molecules from a polyacrylamide (or less commonly, of an agarose) gel to the surface of a binding membrane, thereby facilitating an increased availability of the sites with affinity for both general and specific protein reagents. The analysis of these complex protein mixtures is achieved by the detection of specific protein bands on a membrane, which in turn is made possible by the visualization of protein bands either by chemical staining or by reaction with an antibody of a conjugated ligand. Chemical methods employ staining with organic dyes, metal chelates, autoradiography, fluorescent dyes, complexing with silver, or prelabeling with fluorophores. All of these methods have differing sensitivities and quantitative determinations vary significantly. This review will describe the various protein staining methods applied to membranes after electrophoresis. "Detection" precedes and is a prerequisite to obtaining qualitative and quantitative data on the proteins in a sample, as much as to comparing the protein composition of different samples. Detection is often synonymous to staining, i.e., the reversible or irreversible binding by the proteins of a colored organic or inorganic chemical.

  6. Corrosion behaviours of the dental magnetic keeper complexes made by different alloys and methods.

    PubMed

    Wu, Min-Ke; Song, Ning; Liu, Fei; Kou, Liang; Lu, Xiao-Wen; Wang, Min; Wang, Hang; Shen, Jie-Fei

    2016-09-29

    The keeper and cast dowel-coping, as a primary component for a magnetic attachment, is easily subjected to corrosion in a wet environment, such as the oral cavity, which contains electrolyte-rich saliva, complex microflora and chewing behaviour and so on. The objective of this in vitro study was to examine the corrosion resistance of a dowel and coping-keeper complex fabricated by finish keeper and three alloys (cobalt-chromium, CoCr; silver-palladium-gold, PdAu; gold-platinum, AuPt) using a laser-welding process and a casting technique. The surface morphology characteristics and microstructures of the samples were examined by means of metallographic microscope and scanning electron microscope (SEM). Energy-dispersive spectroscopy (EDS) with SEM provided elements analysis information for the test samples after 10% oxalic acid solution etching test. Tafel polarization curve recordings demonstrated parameter values indicating corrosion of the samples when subjected to electrochemical testing. This study has suggested that massive oxides are attached to the surface of the CoCr-keeper complex but not to the AuPt-keeper complex. Only the keeper area of cast CoCr-keeper complex displayed obvious intergranular corrosion and changes in the Fe and Co elements. Both cast and laser-welded AuPt-keeper complexes had the highest free corrosion potential, followed by the PdAu-keeper complex. We concluded that although the corrosion resistance of the CoCr-keeper complex was worst, the keeper surface passive film was actually preserved to its maximum extent. The laser-welded CoCr- and PdAu-keeper complexes possessed superior corrosion resistance as compared with their cast specimens, but no significant difference was found between the cast and laser-welded AuPt-keeper complexes. The Fe-poor and Cr-rich band, appearing on the edge of the keeper when casting, has been proven to be a corrosion-prone area.

  7. Corrosion behaviours of the dental magnetic keeper complexes made by different alloys and methods

    PubMed Central

    Wu, Min-Ke; Song, Ning; Liu, Fei; Kou, Liang; Lu, Xiao-Wen; Wang, Min; Wang, Hang; Shen, Jie-Fei

    2016-01-01

    The keeper and cast dowel–coping, as a primary component for a magnetic attachment, is easily subjected to corrosion in a wet environment, such as the oral cavity, which contains electrolyte-rich saliva, complex microflora and chewing behaviour and so on. The objective of this in vitro study was to examine the corrosion resistance of a dowel and coping-keeper complex fabricated by finish keeper and three alloys (cobalt–chromium, CoCr; silver–palladium–gold, PdAu; gold–platinum, AuPt) using a laser-welding process and a casting technique. The surface morphology characteristics and microstructures of the samples were examined by means of metallographic microscope and scanning electron microscope (SEM). Energy-dispersive spectroscopy (EDS) with SEM provided elements analysis information for the test samples after 10% oxalic acid solution etching test. Tafel polarization curve recordings demonstrated parameter values indicating corrosion of the samples when subjected to electrochemical testing. This study has suggested that massive oxides are attached to the surface of the CoCr–keeper complex but not to the AuPt–keeper complex. Only the keeper area of cast CoCr–keeper complex displayed obvious intergranular corrosion and changes in the Fe and Co elements. Both cast and laser-welded AuPt–keeper complexes had the highest free corrosion potential, followed by the PdAu–keeper complex. We concluded that although the corrosion resistance of the CoCr–keeper complex was worst, the keeper surface passive film was actually preserved to its maximum extent. The laser-welded CoCr– and PdAu–keeper complexes possessed superior corrosion resistance as compared with their cast specimens, but no significant difference was found between the cast and laser-welded AuPt–keeper complexes. The Fe-poor and Cr-rich band, appearing on the edge of the keeper when casting, has been proven to be a corrosion-prone area. PMID:27388806

  8. [New design of the Health Survey of Catalonia (Spain, 2010-2014): a step forward in health planning and evaluation].

    PubMed

    Alcañiz-Zanón, Manuela; Mompart-Penina, Anna; Guillén-Estany, Montserrat; Medina-Bustos, Antonia; Aragay-Barbany, Josep M; Brugulat-Guiteras, Pilar; Tresserras-Gaju, Ricard

    2014-01-01

    This article presents the genesis of the Health Survey of Catalonia (Spain, 2010-2014) with its semiannual subsamples and explains the basic characteristics of its multistage sampling design. In comparison with previous surveys, the organizational advantages of this new statistical operation include rapid data availability and the ability to continuously monitor the population. The main benefits are timeliness in the production of indicators and the possibility of introducing new topics through the supplemental questionnaire as a function of needs. Limitations consist of the complexity of the sample design and the lack of longitudinal follow-up of the sample. Suitable sampling weights for each specific subsample are necessary for any statistical analysis of micro-data. Accuracy in the analysis of territorial disaggregation or population subgroups increases if annual samples are accumulated. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.

  9. Interference by the activated sludge matrix on the analysis of soluble microbial products in wastewater.

    PubMed

    Potvin, Christopher M; Zhou, Hongde

    2011-11-01

    The objective of this study was to demonstrate the complex matrix effects caused by chemical materials on the analysis of key soluble microbial products (SMP), including proteins, humics, carbohydrates, and polysaccharides, in activated sludge samples. Emphasis was placed on comparison of the commonly used standard-curve technique with standard addition (SA), a technique in which the analytical responses are measured for sample solutions spiked with known quantities of analytes. The results showed that SA greatly improved SMP recovery and thus measurement accuracy by correcting for matrix effects. Analyte recovery was found to be highly dependent on sample dilution, and changed with extraction technique, storage conditions and sample composition. Storage of sample extracts by freezing changed SMP concentrations dramatically, as did storage at 4°C for as little as 1 d. Copyright © 2011 Elsevier Ltd. All rights reserved.
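
    A worked sketch of the standard-addition calculation is given below in Python: a line is fitted to the responses of spiked aliquots, and the original concentration is read from the magnitude of the x-intercept. The response values are invented for illustration.

      # Worked sketch of standard addition: the analyte response is measured for the
      # sample spiked with known analyte amounts, a line is fitted, and the original
      # concentration is the magnitude of the x-intercept. Responses are invented.
      import numpy as np

      added = np.array([0.0, 5.0, 10.0, 20.0])       # mg/L of analyte spiked in
      response = np.array([0.21, 0.33, 0.46, 0.70])  # instrument response (e.g. absorbance)

      slope, intercept = np.polyfit(added, response, 1)
      original_conc = intercept / slope               # |x-intercept| = intercept / slope
      print(f"estimated concentration in sample: {original_conc:.2f} mg/L")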

  10. Interactions and reversal-field memory in complex magnetic nanowire arrays

    NASA Astrophysics Data System (ADS)

    Rotaru, Aurelian; Lim, Jin-Hee; Lenormand, Denny; Diaconu, Andrei; Wiley, John. B.; Postolache, Petronel; Stancu, Alexandru; Spinu, Leonard

    2011-10-01

    Interactions and magnetization reversal of Ni nanowire arrays have been investigated by the first-order reversal curve (FORC) method. Several series of samples with controlled spatial distribution were considered including simple wires of different lengths and diameters (70 and 110 nm) and complex wires with a single modulated diameter along their length. Subtle features of magnetic interactions are revealed through a quantitative analysis of the local interaction field profile distributions obtained from the FORC method. In addition, the FORC analysis indicates that the nanowire systems with a mean diameter of 70 nm appear to be organized in symmetric clusters indicative of a reversal-field memory effect.
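
    For orientation, the FORC distribution is conventionally the mixed second derivative rho(Hr, H) = -(1/2) d2M/dHr dH of the magnetization measured along the reversal curves; the Python sketch below computes it numerically for a smoothed single-hysteron magnetization surface, which is an illustrative stand-in for measured nanowire data.

      # Sketch of how a FORC diagram is obtained from a family of first-order reversal
      # curves M(Hr, H): rho = -(1/2) d2M/dHr dH. The magnetization surface below is a
      # smoothed synthetic single hysteron (coercivity ~0.3, no bias), not measured data.
      import numpy as np

      H  = np.linspace(-1.0, 1.0, 201)            # applied field sweep
      Hr = np.linspace(-1.0, 0.0, 101)            # reversal fields
      HH, RR = np.meshgrid(H, Hr)

      p_down = 0.5 * (1 - np.tanh((RR + 0.3) / 0.05))   # moment down at reversal field Hr
      p_up   = 0.5 * (1 + np.tanh((HH - 0.3) / 0.05))   # switched back up by field H
      M = 1 - 2 * p_down * (1 - p_up)

      dM_dH = np.gradient(M, H, axis=1)
      rho = -0.5 * np.gradient(dM_dH, Hr, axis=0)        # FORC distribution rho(Hr, H)

      i, j = np.unravel_index(np.argmax(rho), rho.shape)
      print(f"rho peaks at Hr = {Hr[i]:.2f}, H = {H[j]:.2f}")   # expect ~(-0.3, +0.3)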

  11. Noise and Complexity in Human Postural Control: Interpreting the Different Estimations of Entropy

    PubMed Central

    Rhea, Christopher K.; Silver, Tobin A.; Hong, S. Lee; Ryu, Joong Hyun; Studenka, Breanna E.; Hughes, Charmayne M. L.; Haddad, Jeffrey M.

    2011-01-01

    Background Over the last two decades, various measures of entropy have been used to examine the complexity of human postural control. In general, entropy measures provide information regarding the health, stability and adaptability of the postural system that is not captured when using more traditional analytical techniques. The purpose of this study was to examine how noise, sampling frequency and time series length influence various measures of entropy when applied to human center of pressure (CoP) data, as well as in synthetic signals with known properties. Such a comparison is necessary to interpret data between and within studies that use different entropy measures, equipment, sampling frequencies or data collection durations. Methods and Findings The complexity of synthetic signals with known properties and standing CoP data was calculated using Approximate Entropy (ApEn), Sample Entropy (SampEn) and Recurrence Quantification Analysis Entropy (RQAEn). All signals were examined at varying sampling frequencies and with varying amounts of added noise. Additionally, an increment time series of the original CoP data was examined to remove long-range correlations. Of the three measures examined, ApEn was the least robust to sampling frequency and noise manipulations. Additionally, increased noise led to an increase in SampEn, but a decrease in RQAEn. Thus, noise can yield inconsistent results between the various entropy measures. Finally, the differences between the entropy measures were minimized in the increment CoP data, suggesting that long-range correlations should be removed from CoP data prior to calculating entropy. Conclusions The various algorithms typically used to quantify the complexity (entropy) of CoP may yield very different results, particularly when sampling frequency and noise are different. The results of this study are discussed within the context of the neural noise and loss of complexity hypotheses. PMID:21437281
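
    For reference, a minimal Python implementation of one of the compared measures, sample entropy, is sketched below; the tolerance convention (r as a fraction of the signal standard deviation) and the test signals are illustrative choices, not the study's settings.

      # Minimal reference implementation of sample entropy (SampEn) for a short 1-D signal.
      # Tolerance r is expressed as a fraction of the signal's standard deviation.
      import numpy as np

      def sample_entropy(x, m=2, r=0.2):
          x = np.asarray(x, dtype=float)
          tol = r * x.std()

          def count_matches(length):
              # Build overlapping templates of the given length and count close pairs.
              templates = np.array([x[i:i + length] for i in range(len(x) - length)])
              count = 0
              for i in range(len(templates)):
                  dist = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev distance
                  count += np.sum(dist <= tol) - 1                         # drop self-match
              return count

          B, A = count_matches(m), count_matches(m + 1)
          return -np.log(A / B) if A > 0 and B > 0 else np.inf

      rng = np.random.default_rng(4)
      print("white noise SampEn:", round(sample_entropy(rng.normal(size=600)), 3))
      print("sine wave  SampEn:", round(sample_entropy(np.sin(np.linspace(0, 30, 600))), 3))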

  12. Meta-analysis of mismatch negativity to simple versus complex deviants in schizophrenia.

    PubMed

    Avissar, Michael; Xie, Shanghong; Vail, Blair; Lopez-Calderon, Javier; Wang, Yuanjia; Javitt, Daniel C

    2018-01-01

    Mismatch negativity (MMN) deficits in schizophrenia (SCZ) have been studied extensively since the early 1990s, with the vast majority of studies using simple auditory oddball task deviants that vary in a single acoustic dimension such as pitch or duration. There has been a growing interest in using more complex deviants that violate more abstract rules to probe higher order cognitive deficits. It is still unclear how sensory processing deficits compare to and contribute to higher order cognitive dysfunction, which can be investigated with later attention-dependent auditory event-related potential (ERP) components such as a subcomponent of P300, P3b. In this meta-analysis, we compared MMN deficits in SCZ using simple deviants to more complex deviants. We also pooled studies that measured MMN and P3b in the same study sample and examined the relationship between MMN and P3b deficits within study samples. Our analysis reveals that, to date, studies using simple deviants demonstrate larger deficits than those using complex deviants, with effect sizes in the range of moderate to large. The difference in effect sizes between deviant types was reduced significantly when accounting for magnitude of MMN measured in healthy controls. P3b deficits, while large, were only modestly greater than MMN deficits (d=0.21). Taken together, our findings suggest that MMN to simple deviants may still be optimal as a biomarker for SCZ and that sensory processing dysfunction contributes significantly to MMN deficit and disease pathophysiology. Copyright © 2017 Elsevier B.V. All rights reserved.
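
    The pooling step behind such comparisons can be sketched with a DerSimonian-Laird random-effects model; the effect sizes and sampling variances in the Python example below are invented, not the study data.

      # Sketch of DerSimonian-Laird random-effects pooling of standardized effect sizes.
      # The per-study values are invented for illustration.
      import numpy as np

      d = np.array([0.85, 0.60, 1.10, 0.72, 0.95])   # per-study effect sizes (Cohen's d)
      v = np.array([0.04, 0.06, 0.05, 0.03, 0.07])   # per-study sampling variances

      w = 1.0 / v                                     # fixed-effect weights
      d_fixed = np.sum(w * d) / np.sum(w)
      Q = np.sum(w * (d - d_fixed) ** 2)              # heterogeneity statistic
      k = len(d)
      tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

      w_re = 1.0 / (v + tau2)                         # random-effects weights
      d_pooled = np.sum(w_re * d) / np.sum(w_re)
      se = np.sqrt(1.0 / np.sum(w_re))
      print(f"pooled d = {d_pooled:.2f} "
            f"(95% CI {d_pooled - 1.96 * se:.2f} to {d_pooled + 1.96 * se:.2f})")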

  13. Graft Immunocomplex Capture Fluorescence Analysis to Detect Donor-Specific Antibodies and HLA Antigen Complexes in the Allograft.

    PubMed

    Nakamura, Tsukasa; Ushigome, Hidetaka; Watabe, Kiyoko; Imanishi, Yui; Masuda, Koji; Matsuyama, Takehisa; Harada, Shumpei; Koshino, Katsuhiro; Iida, Taku; Nobori, Shuji; Yoshimura, Norio

    2017-04-01

    Immunocomplex capture fluorescence analysis (ICFA) is an attractive method to detect donor-specific anti-HLA antibodies (DSA) and HLA antigen complexes. Currently, antibody-mediated rejection (AMR) due to DSA is usually diagnosed by C4d deposition and serological DSA detection; however, discrepancies between these findings are frequent. Our graft ICFA technique may therefore contribute to establishing the diagnosis of AMR. Graft samples were obtained by percutaneous needle biopsy, and the specimen was dissolved in PBS with lysis buffer. Subsequently, HLA antigens were captured by anti-HLA beads, and DSA-HLA complexes (where the DSA had already reacted with the allograft in vivo) were detected with PE-conjugated anti-human IgG antibodies and analyzed by a Luminex system. A ratio (sample MFI/blank-bead MFI) was calculated, with ≥ 1.0 considered positive. DSA-HLA complexes in the graft were successfully detected by graft ICFA in a patient with chronic active AMR, with ratios ranging from a weakly positive 1.03 up to 79.27. Next, positive graft ICFA predicted the early phase of AMR (MFI ratio: 1.38) even in patients with no serum DSA. Finally, appropriate therapies for AMR eliminated DSA deposition from the allografts (MFI ratios of 0.3 to 0.7). This novel application would detect early-phase or pathologically incomplete cases of AMR, which could lead to a correct diagnosis and the initiation of appropriate therapies. Moreover, graft ICFA might address a variety of long-standing questions regarding DSA. AMR: antibody-mediated rejection; DSA: donor-specific antibodies; ICFA: immunocomplex capture fluorescence analysis.

  14. Hollow porous ionic liquids composite polymers based solid phase extraction coupled online with high performance liquid chromatography for selective analysis of hydrophilic hydroxybenzoic acids from complex samples.

    PubMed

    Dai, Xingping; Wang, Dongsheng; Li, Hui; Chen, Yanyi; Gong, Zhicheng; Xiang, Haiyan; Shi, Shuyun; Chen, Xiaoqing

    2017-02-10

    The polar and hydrophilic character of hydroxybenzoic acids usually causes them to coelute with interferences in high performance liquid chromatography (HPLC) analysis, so selective analysis is necessary. Herein, a solid phase extraction (SPE) based on hollow porous ionic liquid composite polymers (PILs) was fabricated for the first time and coupled online with HPLC for selective analysis of hydroxybenzoic acids from complex matrices. The hollow porous PILs were synthesized using Mobil Composition of Matter No. 48 (MCM-48) spheres as a sacrificial support, 1-vinyl-3-methylimidazolium chloride (VMIM+Cl-) as monomer, and ethylene glycol dimethacrylate (EGDMA) as cross-linker. Various parameters affecting synthesis, adsorption, and desorption behavior were investigated and optimized. Steady-state adsorption studies showed that the resulting hollow porous PILs exhibited high adsorption capacity, fast adsorption kinetics, and excellent specific adsorption. The online SPE system was then applied to the selective analysis of protocatechuic acid (PCA), 4-hydroxybenzoic acid (4-HBA), and vanillic acid (VA) from Pollen Typha angustifolia. The limit of detection (LOD) varied from 0.002 to 0.01 μg/mL, the linear range (0.05-5.0 μg/mL) was wide with correlation coefficients (R) from 0.9982 to 0.9994, and the average recoveries at three spiking levels ranged from 82.7 to 102.4%, with column-to-column relative standard deviation (RSD) below 8.1%. The proposed online method showed good accuracy, precision, specificity and convenience, opening up a universal and efficient route for the selective analysis of hydroxybenzoic acids from complex samples. Copyright © 2017 Elsevier B.V. All rights reserved.
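
    The spike-recovery figures quoted above follow the usual convention of comparing the amount found after spiking with the amount added. A minimal sketch of that arithmetic is shown below; the concentrations are hypothetical, not the paper's data.

      # Hedged sketch of a spike-recovery calculation: recovery (%) at one spiking
      # level is the spiked-minus-native amount found divided by the amount added.
      # All concentrations below (ug/mL) are illustrative placeholders.
      def recovery_percent(found_spiked: float, found_native: float, added: float) -> float:
          return 100.0 * (found_spiked - found_native) / added

      # e.g. protocatechuic acid at one hypothetical spiking level
      print(round(recovery_percent(found_spiked=0.92, found_native=0.45, added=0.50), 1))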

  15. Trace elemental analysis of glass and paint samples of forensic interest by ICP-MS using laser ablation solid sample introduction

    NASA Astrophysics Data System (ADS)

    Almirall, Jose R.; Trejos, Tatiana; Hobbs, Andria; Furton, Kenneth G.

    2003-09-01

    The importance of small amounts of glass and paint evidence as a means to associate a crime event with a suspect, or a suspect with another individual, has been demonstrated in many cases. Glass is a fragile material that is often found at the scenes of crimes such as burglaries, hit-and-run accidents and violent offenses. Previous work has demonstrated the utility of elemental analysis by solution ICP-MS of small amounts of glass for comparing a fragment found at a crime scene with a possible source of the glass. The multi-element capability and sensitivity of ICP-MS, combined with the simplified sample introduction of laser ablation prior to ion detection, provide an excellent and relatively non-destructive technique for elemental analysis of glass fragments. The direct solid sample introduction technique of laser ablation (LA) is reported as an alternative to the solution method. Direct solid sampling provides several advantages over solution methods and shows great potential for a number of solid sample analyses in forensic science. The advantages of laser ablation include simplified sample preparation, which reduces the time and complexity of the analysis, elimination of the handling of acid dissolution reagents such as HF, and reduction of sources of interferences in the ionization plasma. Direct sampling is also essentially "non-destructive" because only very small amounts of sample are removed for analysis. The discrimination potential of LA-ICP-MS is compared with previously reported solution ICP-MS methods using external calibration with internal standardization and a newly reported solution isotope dilution (ID) method. A total of ninety-one different glass samples were used for the comparison study. One set consisted of forty-five headlamps taken from a variety of automobiles representing a range of twenty years of manufacturing dates. A second set, consisting of forty-six automotive glasses (side windows and windshields) representing casework glass from different vehicle manufacturers over several years, was also characterized by refractive index (RI) and elemental composition analysis. The solution sample introduction techniques (external calibration and isotope dilution) provide excellent sensitivity and precision but destroy the sample and involve complex sample preparation. The laser ablation method was simpler and faster and produced discrimination comparable to the EC-ICP-MS and ID-ICP-MS methods. LA-ICP-MS can therefore provide an excellent alternative to solution analysis of glass in forensic casework samples. Paints and coatings are frequently encountered as trace evidence samples submitted to forensic science laboratories. A LA-ICP-MS method has been developed to complement the techniques commonly used in forensic laboratories in order to better characterize these samples for forensic purposes. Time-resolved plots of each sample can be compared to associate samples with each other or to discriminate between them. Additionally, the concentration of lead and the ratios of other elements have been determined in various automotive paints by the reported method. A sample set of eighteen (18) survey automotive paint samples has been analyzed with the developed method in order to determine the utility of LA-ICP-MS and to compare the method with the more commonly used scanning electron microscopy (SEM) method for elemental characterization of paint layers in forensic casework.

  16. High-Throughput Effect-Directed Analysis Using Downscaled in Vitro Reporter Gene Assays To Identify Endocrine Disruptors in Surface Water

    PubMed Central

    2018-01-01

    Effect-directed analysis (EDA) is a commonly used approach for effect-based identification of endocrine disruptive chemicals in complex (environmental) mixtures. However, for routine toxicity assessment of, for example, water samples, current EDA approaches are considered time-consuming and laborious. We achieved faster EDA and identification by downscaling of sensitive cell-based hormone reporter gene assays and increasing fractionation resolution to allow testing of smaller fractions with reduced complexity. The high-resolution EDA approach is demonstrated by analysis of four environmental passive sampler extracts. Downscaling of the assays to a 384-well format allowed analysis of 64 fractions in triplicate (or 192 fractions without technical replicates) without affecting sensitivity compared to the standard 96-well format. Through a parallel exposure method, agonistic and antagonistic androgen and estrogen receptor activity could be measured in a single experiment following a single fractionation. From 16 selected candidate compounds, identified through nontargeted analysis, 13 could be confirmed chemically and 10 were found to be biologically active, of which the most potent nonsteroidal estrogens were identified as oxybenzone and piperine. The increased fractionation resolution and the higher throughput that downscaling provides allow for future application in routine high-resolution screening of large numbers of samples in order to accelerate identification of (emerging) endocrine disruptors. PMID:29547277

  17. Detrital zircon analysis of Mesoproterozoic and Neoproterozoic metasedimentary rocks of north-central Idaho: Implications for development of the Belt-Purcell basin

    USGS Publications Warehouse

    Lewis, R.S.; Vervoort, J.D.; Burmester, R.F.; Oswald, P.J.

    2010-01-01

    The authors analyzed detrital zircon grains from 10 metasedimentary rock samples of the Priest River complex and three other amphibolite-facies metamorphic sequences in north-central Idaho to test the previous assignment of these rocks to the Mesoproterozoic Belt-Purcell Supergroup. Zircon grains from two samples of the Prichard Formation (lower Belt) and one sample of Cambrian quartzite were also analyzed as controls with known depositional ages. U-Pb zircon analysis by laser ablation-inductively coupled plasma-mass spectrometry reveals that 6 of the 10 samples contain multiple age populations between 1900 and 1400 Ma and a scatter of older ages, similar to results reported from the Belt-Purcell Supergroup to the north and east. Results from the Priest River metamorphic complex confirm previous correlations with the Prichard Formation. Samples from the Golden and Elk City sequences have significant numbers of 1500-1380 Ma grains, which indicates that they do not predate the Belt. Rather, they are probably from a relatively young, southwestern part of the Belt Supergroup (Lemhi subbasin). Non-North American (1610-1490 Ma) grains are rare in these rocks. Three samples of quartzite from the Syringa metamorphic sequence northwest of the Idaho batholith contain zircon grains younger than the Belt Supergroup and support a Neoproterozoic age. A single Cambrian sample has abundant 1780 Ma grains and none younger than ~1750 Ma. These results indicate that the likely protoliths of many high-grade metamorphic rocks in northern Idaho were strata of the Belt-Purcell Supergroup or overlying rocks of the Neoproterozoic Windermere Supergroup and not basement rocks.

  18. Proteomics as a Quality Control Tool of Pharmaceutical Probiotic Bacterial Lysate Products

    PubMed Central

    Klein, Günter; Schanstra, Joost P.; Hoffmann, Janosch; Mischak, Harald; Siwy, Justyna; Zimmermann, Kurt

    2013-01-01

    Probiotic bacteria have a wide range of applications in veterinary and human therapeutics. Inactivated probiotics are complex samples and quality control (QC) should measure as many molecular features as possible. Capillary electrophoresis coupled to mass spectrometry (CE/MS) has been used as a multidimensional and high throughput method for the identification and validation of biomarkers of disease in complex biological samples such as biofluids. In this study we evaluate the suitability of CE/MS to measure the consistency of different lots of the probiotic formulation Pro-Symbioflor, which is a bacterial lysate of heat-inactivated Escherichia coli and Enterococcus faecalis. Over 5000 peptides were detected by CE/MS in 5 different lots of the bacterial lysate and in a sample of culture medium. 71 to 75% of the total peptide content was identical in all lots. This percentage increased to 87-89% when allowing the absence of a peptide in one of the 5 samples. These results, based on over 2000 peptides, suggest high similarity of the 5 different lots. Sequence analysis identified peptides of both E. coli and E. faecalis and peptides originating from the culture medium, thus confirming the presence of the strains in the formulation. Ontology analysis suggested that the majority of the peptides identified for E. coli originated from the cell membrane or the fimbrium, while peptides identified for E. faecalis were enriched for peptides originating from the cytoplasm. The bacterial lysate peptides as a whole are recognised by the innate immune system as highly conserved microbe-associated molecular patterns (MAMPs). Sequence analysis also identified the presence of soybean, yeast and casein protein fragments that are part of the formulation of the culture medium. In conclusion, CE/MS seems an appropriate QC tool for analyzing complex biological products such as inactivated probiotic formulations and allows the similarity between lots to be determined. PMID:23840518
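
    The lot-consistency figures above are essentially set overlaps over detected peptides. A minimal sketch of that comparison is given below; the peptide identifiers are hypothetical placeholders, not data from the study.

      # Hedged sketch of the lot-consistency metric: the fraction of detected peptides
      # present in all lots, and the fraction present in all but at most one lot.
      lots = {
          "lot1": {"pep_a", "pep_b", "pep_c", "pep_d"},
          "lot2": {"pep_a", "pep_b", "pep_c"},
          "lot3": {"pep_a", "pep_b", "pep_d"},
          "lot4": {"pep_a", "pep_b", "pep_c", "pep_d"},
          "lot5": {"pep_a", "pep_b", "pep_c", "pep_d"},
      }

      all_peptides = set().union(*lots.values())
      counts = {p: sum(p in s for s in lots.values()) for p in all_peptides}

      n_lots = len(lots)
      in_all = sum(c == n_lots for c in counts.values()) / len(all_peptides)
      in_all_but_one = sum(c >= n_lots - 1 for c in counts.values()) / len(all_peptides)
      print(f"shared by all lots: {in_all:.0%}; missing from at most one lot: {in_all_but_one:.0%}")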

  19. Evaluation of partial coherence correction in X-ray ptychography

    DOE PAGES

    Burdet, Nicolas; Shi, Xiaowen; Parks, Daniel; ...

    2015-02-23

    Coherent X-ray Diffraction Imaging (CDI) and X-ray ptychography both heavily rely on the high degree of spatial coherence of the X-ray illumination for sufficient experimental data quality for reconstruction convergence. Nevertheless, the majority of the available synchrotron undulator sources have a limited degree of partial coherence, leading to reduced data quality and a lower speckle contrast in the coherent diffraction patterns. It is still an open question whether experimentalists should compromise the coherence properties of an X-ray source in exchange for a higher flux density at a sample, especially when some materials of scientific interest are relatively weak scatterers. A previous study has suggested that in CDI, the best strategy for the study of strong phase objects is to maintain a high degree of coherence of the illuminating X-rays because of the broadening of solution space resulting from the strong phase structures. In this article, we demonstrate the first systematic analysis of the effectiveness of partial coherence correction in ptychography as a function of the coherence properties, degree of complexity of illumination (degree of phase diversity of the probe) and sample phase complexity. We have also performed analysis of how well ptychographic algorithms refine X-ray probe and complex coherence functions when those variables are unknown at the start of reconstructions, for noise-free simulated data, in the case of both real-valued and highly-complex objects.

  20. EEG based topography analysis in string recognition task

    NASA Astrophysics Data System (ADS)

    Ma, Xiaofei; Huang, Xiaolin; Shen, Yuxiaotong; Qin, Zike; Ge, Yun; Chen, Ying; Ning, Xinbao

    2017-03-01

    Vision perception and recognition is a complex process, during which different parts of brain are involved depending on the specific modality of the vision target, e.g. face, character, or word. In this study, brain activities in string recognition task compared with idle control state are analyzed through topographies based on multiple measurements, i.e. sample entropy, symbolic sample entropy and normalized rhythm power, extracted from simultaneously collected scalp EEG. Our analyses show that, for most subjects, both symbolic sample entropy and normalized gamma power in string recognition task are significantly higher than those in idle state, especially at locations of P4, O2, T6 and C4. It implies that these regions are highly involved in string recognition task. Since symbolic sample entropy measures complexity, from the perspective of new information generation, and normalized rhythm power reveals the power distributions in frequency domain, complementary information about the underlying dynamics can be provided through the two types of indices.
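
    The "normalized rhythm power" index mentioned above is, in essence, band power expressed as a fraction of total spectral power. A hedged sketch of that calculation on a synthetic signal is shown below; the sampling rate, the 30-45 Hz gamma band, and the toy signal are illustrative assumptions, not the paper's settings.

      # Hedged sketch: gamma-band power as a fraction of total power from a generic
      # Welch spectral estimate. The synthetic "EEG" below is for illustration only.
      import numpy as np
      from scipy.signal import welch

      fs = 250.0                                   # assumed sampling rate, Hz
      t = np.arange(0, 10, 1 / fs)
      rng = np.random.default_rng(0)
      eeg = np.sin(2 * np.pi * 40 * t) + rng.standard_normal(t.size)   # toy signal

      freqs, psd = welch(eeg, fs=fs, nperseg=512)

      def normalized_band_power(freqs, psd, band):
          lo, hi = band
          mask = (freqs >= lo) & (freqs <= hi)
          return psd[mask].sum() / psd.sum()       # bin width cancels in the ratio

      print(normalized_band_power(freqs, psd, (30.0, 45.0)))  # normalized gamma power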

  1. A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers

    DOE PAGES

    Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund; ...

    2018-03-28

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids, and seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1-PM4) ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data was collected by use of a range of analytical methods, and robust orthogonal partial least squares-discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved.
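
    To illustrate the kind of supervised discriminant modelling described above, the sketch below trains a plain PLS-DA classifier (scikit-learn's PLSRegression on one-hot class labels) on a simulated biomarker matrix. This is a rough stand-in for the OPLS-DA models used in the paper, not the authors' workflow; the 20-feature matrix, four classes, and component count are illustrative assumptions.

      # Hedged sketch: predicting sample-preparation method from a biomarker profile
      # with PLS-DA as a generic stand-in for OPLS-DA. Data are simulated.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      n_per_class, n_features, n_classes = 10, 20, 4

      # Simulated biomarker profiles with class-dependent mean shifts.
      X = np.vstack([rng.standard_normal((n_per_class, n_features)) + c
                     for c in range(n_classes)])
      y = np.repeat(np.arange(n_classes), n_per_class)
      Y = np.eye(n_classes)[y]                      # one-hot "dummy" responses

      pls = PLSRegression(n_components=3).fit(X, Y)
      pred = pls.predict(X).argmax(axis=1)          # assign the highest-scoring class
      print("training accuracy:", (pred == y).mean())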

  2. Platform construction and extraction mechanism study of magnetic mixed hemimicelles solid-phase extraction

    NASA Astrophysics Data System (ADS)

    Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua

    2016-12-01

    A simple, accurate and high-throughput pretreatment method would facilitate large-scale trace analysis in complex samples. Magnetic mixed hemimicelle solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism have limited the development of this promising method. Herein, this work aims to establish theory-based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelle solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces involved (hydrophobic interaction, π-π stacking, hydrogen bonding, electrostatic interaction) for the first time. Application guidelines for supporting materials, surfactants and sample matrices were also summarized. The extraction mechanism and platform established in this study make predictable and efficient, theory-guided pretreatment of trace analytes from environmental, biological and clinical samples a promising prospect.

  3. A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids, and seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1-PM4) ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data was collected by use of a range of analytical methods, and robust orthogonal partial least squares-discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved.

  4. On the relation between correlation dimension, approximate entropy and sample entropy parameters, and a fast algorithm for their calculation

    NASA Astrophysics Data System (ADS)

    Zurek, Sebastian; Guzik, Przemyslaw; Pawlak, Sebastian; Kosmider, Marcin; Piskorski, Jaroslaw

    2012-12-01

    We explore the relation between correlation dimension, approximate entropy and sample entropy parameters, which are commonly used in nonlinear systems analysis. Using theoretical considerations we identify the points which are shared by all these complexity algorithms and show explicitly that the above parameters are intimately connected and mutually interdependent. A new geometrical interpretation of sample entropy and correlation dimension is provided and the consequences for the interpretation of sample entropy, its relative consistency and some of the algorithms for parameter selection for this quantity are discussed. To get an exact algorithmic relation between the three parameters we construct a very fast algorithm for simultaneous calculations of the above, which uses the full time series as the source of templates, rather than the usual 10%. This algorithm can be used in medical applications of complexity theory, as it can calculate all three parameters for a realistic recording of 10^4 points within minutes with the use of an average notebook computer.
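
    For context on the template matching discussed above, a minimal sample entropy implementation is sketched below. It is a straightforward O(N²) reference version using all templates, not the authors' fast simultaneous algorithm, and the defaults m = 2 and r = 0.2·SD are conventional assumptions.

      import numpy as np

      def sample_entropy(x, m=2, r=None):
          # SampEn(m, r): negative log of the conditional probability that sequences
          # matching for m points (Chebyshev distance <= r, no self-matches) also
          # match for m + 1 points.
          x = np.asarray(x, dtype=float)
          if r is None:
              r = 0.2 * np.std(x)
          n = len(x)
          n_templates = n - m                      # same template count for m and m+1

          def pair_count(dim):
              templates = np.array([x[i:i + dim] for i in range(n_templates)])
              count = 0
              for i in range(n_templates - 1):
                  d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                  count += int(np.sum(d <= r))
              return count

          b, a = pair_count(m), pair_count(m + 1)
          return -np.log(a / b) if a > 0 and b > 0 else float("inf")

      rng = np.random.default_rng(0)
      print(sample_entropy(rng.standard_normal(1000)))   # white noise gives a high value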

  5. Impaired Muscle Mitochondrial Biogenesis and Myogenesis in Spinal Muscular Atrophy

    PubMed Central

    Ripolone, Michela; Ronchi, Dario; Violano, Raffaella; Vallejo, Dionis; Fagiolari, Gigliola; Barca, Emanuele; Lucchini, Valeria; Colombo, Irene; Villa, Luisa; Berardinelli, Angela; Balottin, Umberto; Morandi, Lucia; Mora, Marina; Bordoni, Andreina; Fortunato, Francesco; Corti, Stefania; Parisi, Daniela; Toscano, Antonio; Sciacco, Monica; DiMauro, Salvatore; Comi, Giacomo P.; Moggio, Maurizio

    2016-01-01

    IMPORTANCE The important depletion of mitochondrial DNA (mtDNA) and the general depression of mitochondrial respiratory chain complex levels (including complex II) have been confirmed, implying an increasing paucity of mitochondria in the muscle from patients with types I, II, and III spinal muscular atrophy (SMA-I, -II, and -III, respectively). OBJECTIVE To investigate mitochondrial dysfunction in a large series of muscle biopsy samples from patients with SMA. DESIGN, SETTING, AND PARTICIPANTS We studied quadriceps muscle samples from 24 patients with genetically documented SMA and paraspinal muscle samples from 3 patients with SMA-II undergoing surgery for scoliosis correction. Postmortem muscle samples were obtained from 1 additional patient. Age-matched controls consisted of muscle biopsy specimens from healthy children aged 1 to 3 years who had undergone analysis for suspected myopathy. Analyses were performed at the Neuromuscular Unit, Istituto di Ricovero e Cura a Carattere Scientifico Foundation Ca' Granda Ospedale Maggiore Policlinico-Milano, from April 2011 through January 2015. EXPOSURES We used histochemical, biochemical, and molecular techniques to examine the muscle samples. MAIN OUTCOMES AND MEASURES Respiratory chain activity and mitochondrial content. RESULTS Results of histochemical analysis revealed that cytochrome-c oxidase (COX) deficiency was more evident in muscle samples from patients with SMA-I and SMA-II. Residual activities for complexes I, II, and IV in muscles from patients with SMA-I were 41%, 27%, and 30%, respectively, compared with control samples (P < .005). Muscle mtDNA content and citrate synthase activity were also reduced in all 3 SMA types (P < .05). We linked these alterations to downregulation of peroxisome proliferator-activated receptor coactivator 1α, the transcriptional activators nuclear respiratory factor 1 and nuclear respiratory factor 2, mitochondrial transcription factor A, and their downstream targets, implying depression of the entire mitochondrial biogenesis program. Results of Western blot analysis confirmed the reduced levels of the respiratory chain subunits that included mitochondrially encoded COX1 (47.5%; P = .004), COX2 (32.4%; P < .001), COX4 (26.6%; P < .001), and succinate dehydrogenase complex subunit A (65.8%; P = .03) as well as the structural outer membrane mitochondrial porin (33.1%; P < .001). Conversely, the levels of expression of 3 myogenic regulatory factors (muscle-specific myogenic factor 5, myoblast determination 1, and myogenin) were higher in muscles from patients with SMA compared with muscles from age-matched controls (P < .05). CONCLUSIONS AND RELEVANCE Our results strongly support the conclusion that an altered regulation of myogenesis and a downregulated mitochondrial biogenesis contribute to pathologic change in the muscle of patients with SMA. Therapeutic strategies should aim at counteracting these changes. PMID:25844556

  6. Identification of active fluorescence stained bacteria by Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Krause, Mario; Beyer, Beatrice; Pietsch, Christian; Radt, Benno; Harz, Michaela; Rösch, Petra; Popp, Jürgen

    2008-04-01

    Microorganisms can be found everywhere, e.g. in food, either as useful ingredients or as harmful contaminants causing food spoilage. A fast and easy-to-handle analysis method is therefore needed to detect bacteria in different kinds of samples, such as meat, juice or air, and to decide whether a sample is contaminated by harmful microorganisms. Conventional identification methods in microbiology always require cultivation and are therefore time consuming. In this contribution we present an analysis approach to identify fluorescence-stained bacteria at the strain level by means of Raman spectroscopy. The stained bacteria are highlighted and can be localized more easily against a complex sample environment, e.g. in food. The use of Raman spectroscopy in combination with chemometric methods allows the identification of single bacteria within minutes.

  7. Second-order data obtained by beta-cyclodextrin complexes: a novel approach for multicomponent analysis with three-way multivariate calibration methods.

    PubMed

    Khani, Rouhollah; Ghasemi, Jahan B; Shemirani, Farzaneh

    2014-10-01

    This research reports the first application of β-cyclodextrin (β-CD) complexes as a new method for generating three-way data, combined with second-order calibration methods for quantification of a binary mixture of caffeic (CA) and vanillic (VA) acids as model compounds in fruit juice samples. First, the basic experimental parameters affecting the formation of inclusion complexes between the target analytes and β-CD were investigated and optimized. Then, under the optimum conditions, parallel factor analysis (PARAFAC) and bilinear least squares/residual bilinearization (BLLS/RBL) were applied for deconvolution of the trilinear data to obtain spectral and concentration profiles of CA and VA as a function of β-CD concentration. Due to severe concentration-profile overlapping between CA and VA in the β-CD concentration dimension, PARAFAC could not be successfully applied to the studied samples, so BLLS/RBL performed better than PARAFAC. The resolution of the model compounds was possible due to differences in the spectral absorbance changes of the β-CD complex signals of the investigated analytes, opening a new approach for second-order data generation. The proposed method was validated by comparison with a reference method based on high-performance liquid chromatography with photodiode array detection (HPLC-PDA), and no significant differences were found between the reference values and those obtained with the proposed method. Such a chemometrics-based protocol may be a very promising tool for further analytical applications in real-sample monitoring, due to its advantages of simplicity, rapidity, accuracy, sufficient spectral resolution and concentration prediction even in the presence of unknown interferents. Copyright © 2014 Elsevier B.V. All rights reserved.
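
    To illustrate the trilinear decomposition idea underlying second-order calibration, the sketch below runs a generic PARAFAC (CP) decomposition on a simulated samples × wavelengths × β-CD-level data cube, assuming the open-source tensorly package is available. It is not the authors' BLLS/RBL procedure, and all component profiles and dimensions are illustrative assumptions.

      # Hedged sketch: generic PARAFAC decomposition of a simulated second-order cube.
      import numpy as np
      import tensorly as tl
      from tensorly.decomposition import parafac

      rng = np.random.default_rng(1)
      n_samples, n_wl, n_cd = 6, 80, 5

      # Two hypothetical components: spectral profiles and beta-CD response profiles.
      wl = np.linspace(0, 1, n_wl)
      spectra = np.vstack([np.exp(-((wl - 0.3) / 0.08) ** 2),
                           np.exp(-((wl - 0.6) / 0.10) ** 2)])          # (2, n_wl)
      cd_profiles = np.vstack([np.linspace(0.2, 1.0, n_cd),
                               np.linspace(1.0, 0.4, n_cd)])            # (2, n_cd)
      conc = rng.uniform(0.1, 1.0, size=(n_samples, 2))                 # (n_samples, 2)

      # Trilinear cube X[i, j, k] = sum_f conc[i, f] * spectra[f, j] * cd[f, k] + noise.
      cube = np.einsum('if,fj,fk->ijk', conc, spectra, cd_profiles)
      cube += 0.01 * rng.standard_normal(cube.shape)

      weights, factors = parafac(tl.tensor(cube), rank=2)
      sample_scores = tl.to_numpy(factors[0])   # relative concentration profiles
      print(sample_scores.shape)                # (6, 2)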

  8. Column-coupling strategies for multidimensional electrophoretic separation techniques.

    PubMed

    Kler, Pablo A; Sydes, Daniel; Huhn, Carolin

    2015-01-01

    Multidimensional electrophoretic separations represent one of the most common strategies for dealing with the analysis of complex samples. In recent years we have been witnessing the explosive growth of separation techniques for the analysis of complex samples in applications ranging from life sciences to industry. In this sense, electrophoretic separations offer several strategic advantages such as excellent separation efficiency, different methods with a broad range of separation mechanisms, and low liquid consumption generating less waste effluents and lower costs per analysis, among others. Despite their impressive separation efficiency, multidimensional electrophoretic separations present some drawbacks that have delayed their extensive use: the volumes of the columns, and consequently of the injected sample, are significantly smaller compared to other analytical techniques, thus the coupling interfaces between two separations components must be very efficient in terms of providing geometrical precision with low dead volume. Likewise, very sensitive detection systems are required. Additionally, in electrophoretic separation techniques, the surface properties of the columns play a fundamental role for electroosmosis as well as the unwanted adsorption of proteins or other complex biomolecules. In this sense the requirements for an efficient coupling for electrophoretic separation techniques involve several aspects related to microfluidics and physicochemical interactions of the electrolyte solutions and the solid capillary walls. It is interesting to see how these multidimensional electrophoretic separation techniques have been used jointly with different detection techniques, for intermediate detection as well as for final identification and quantification, particularly important in the case of mass spectrometry. In this work we present a critical review about the different strategies for coupling two or more electrophoretic separation techniques and the different intermediate and final detection methods implemented for such separations.

  9. The mantle source of island arc magmatism during early subduction: Evidence from Hf isotopes in rutile from the Jijal Complex (Kohistan arc, Pakistan)

    NASA Astrophysics Data System (ADS)

    Ewing, Tanya A.; Müntener, Othmar

    2018-05-01

    The Cretaceous-Paleogene Kohistan arc complex, northern Pakistan, is renowned as one of the most complete sections through a preserved paleo-island arc. The Jijal Complex represents a fragment of the plutonic roots of the Kohistan arc, formed during its early intraoceanic history. We present the first Hf isotope determinations for the Jijal Complex, made on rutile from garnet gabbros. These lithologies are zircon-free, but contain rutile that formed as an early phase. Recent developments in analytical capabilities coupled with a careful analytical and data reduction protocol allow the accurate determination of Hf isotope composition for rutile with <30 ppm Hf for the first time. Rutile from the analysed samples contains 5-35 ppm Hf, with sample averages of 13-17 ppm. Rutile from five samples from the Jijal Complex mafic section, sampling 2 km of former crustal thickness, gave indistinguishable Hf isotope compositions with εHf(i) ranging from 11.4 ± 3.2 to 20.1 ± 5.7. These values are within error of or only slightly more enriched than modern depleted mantle. The analysed samples record variable degrees of interaction with late-stage melt segregations, which produced symplectitic overprints on the main mineral assemblage as well as pegmatitic segregations of hydrous minerals. The indistinguishable εHf(i) across this range of lithologies demonstrates the robust preservation of the Hf isotope composition of rutile. The Hf isotope data, combined with previously published Nd isotope data for the Jijal Complex garnet gabbros, favour derivation from an inherently enriched, Indian Ocean type mantle. This implies a smaller contribution from subducted sediments than if the source was a normal (Pacific-type) depleted mantle. The Jijal Complex thus had only a limited recycled continental crustal component in its source, and represents a largely juvenile addition of new continental crust during the early phases of intraoceanic magmatism. The ability to determine the Hf isotope composition of rutile with low Hf contents is an important development for zircon-free mafic lithologies. This study highlights the potential of Hf isotope analysis of rutile to characterise the most juvenile deep arc crust cumulates worldwide.
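
    For reference, the εHf(i) values quoted above follow the standard definition: the deviation of the sample's initial 176Hf/177Hf from the chondritic uniform reservoir (CHUR) at the crystallization age t, in parts per ten thousand:

      \varepsilon_{\mathrm{Hf}}(t) = \left[ \frac{\bigl(^{176}\mathrm{Hf}/^{177}\mathrm{Hf}\bigr)_{\mathrm{sample}}(t)}{\bigl(^{176}\mathrm{Hf}/^{177}\mathrm{Hf}\bigr)_{\mathrm{CHUR}}(t)} - 1 \right] \times 10^{4}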

  10. Analysis of food polyphenols by ultra high-performance liquid chromatography coupled to mass spectrometry: an overview.

    PubMed

    Motilva, Maria-José; Serra, Aida; Macià, Alba

    2013-05-31

    Phenolic compounds, which are widely distributed in plant-derived foods, have recently attracted much attention due to their health benefits, so their determination in food samples is a topic of increasing interest. In the last few years, the development of chromatographic columns packed with sub-2 μm particles and modern high-resolution mass spectrometry (MS) have opened up new possibilities for improving analytical methods for complex sample matrices, such as ingredients, foods and biological samples. They have also emerged as an ideal tool for profiling complex samples owing to their speed, efficiency, sensitivity and selectivity. The present review addresses the use of improved liquid chromatography (LC), namely ultra-high performance LC (UHPLC), coupled to MS or tandem MS (MS/MS) as the detection system for the determination of phenolic compounds in food samples. Additionally, the different strategies used to extract and quantify the phenolic compounds and to reduce the matrix effect (%ME) are reviewed. Finally, future trends in UHPLC-MS methods are briefly outlined. Copyright © 2013 Elsevier B.V. All rights reserved.
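
    The matrix effect (%ME) mentioned above is commonly expressed, in one convention among several (the post-extraction addition approach), as the analyte response in post-extraction spiked matrix relative to the response in a neat standard solution:

      \%\mathrm{ME} = \frac{\text{response in post-extraction spiked matrix}}{\text{response in neat standard solution}} \times 100

    with values below 100% indicating ion suppression and values above 100% indicating enhancement.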

  11. Attitude Toward Ambiguity: Empirically Robust Factors in Self-Report Personality Scales.

    PubMed

    Lauriola, Marco; Foschi, Renato; Mosca, Oriana; Weller, Joshua

    2016-06-01

    Two studies were conducted to examine the factor structure of attitude toward ambiguity, a broad personality construct that refers to personal reactions to perceived ambiguous stimuli in a variety of contexts and situations. Using samples from two countries, Study 1 mapped the hierarchical structure of 133 items from seven tolerance-intolerance of ambiguity scales (N = 360, Italy; N = 306, United States). Three major factors (Discomfort with Ambiguity, Moral Absolutism/Splitting, and Need for Complexity and Novelty) were recovered in each country with high replicability coefficients across samples. In Study 2 (N = 405, Italian community sample; N = 366, native English speakers sample), we carried out a confirmatory analysis on selected factor markers. A bifactor model had an acceptable fit for each sample and reached construct-level invariance for general and group factors. Convergent validity with related traits was assessed in both studies. We conclude that attitude toward ambiguity can best be represented as a multidimensional construct involving affective (Discomfort with Ambiguity), cognitive (Moral Absolutism/Splitting), and epistemic (Need for Complexity and Novelty) components. © The Author(s) 2015.

  12. Development of automated high throughput single molecular microfluidic detection platform for signal transduction analysis

    NASA Astrophysics Data System (ADS)

    Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun

    2016-03-01

    Signal transduction events, including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI), play critical roles in cell proliferation and differentiation and are directly relevant to cancer biology. Traditional methods, such as mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and long processing times. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) approach we proposed can reduce processing time and sample volume because the system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we present an automated mMAPS, including an integrated microfluidic device, an automated stage, and electrical relays, for high-throughput clinical screening. Based on these results, we estimate that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical means of analyzing tissue samples in a clinical setting.

  13. Analysis of polychlorinated n-alkanes in environmental samples.

    PubMed

    Santos, F J; Parera, J; Galceran, M T

    2006-10-01

    Polychlorinated n-alkanes (PCAs), also known as chlorinated paraffins (CPs), are highly complex technical mixtures that contain a huge number of structural isomers, theoretically more than 10,000 diastereomers and enantiomers. As a consequence of their persistence, tendency to bioaccumulation, and widespread and unrestricted use, PCAs have been found in aquatic and terrestrial food webs, even in rural and remote areas. Recently, these compounds have been included in regulatory programs of several international organizations, including the US Environmental Protection Agency and the European Union. Consequently, there is a growing demand for reliable methods with which to analyze PCAs in environmental samples. Here, we review current trends and recent developments in the analysis of PCAs in environmental samples such as air, water, sediment, and biota. Practical aspects of sample preparation, chromatographic separation, and detection are covered, with special emphasis placed on analysis of PCAs using gas chromatography-mass spectrometry. The advantages and limitations of these techniques as well as recent improvements in quantification procedures are discussed.

  14. Direct Analysis of Triterpenes from High-Salt Fermented Cucumbers Using Infrared Matrix-Assisted Laser Desorption Electrospray Ionization (IR-MALDESI)

    NASA Astrophysics Data System (ADS)

    Ekelöf, Måns; McMurtrie, Erin K.; Nazari, Milad; Johanningsmeier, Suzanne D.; Muddiman, David C.

    2017-02-01

    High-salt samples present a challenge to mass spectrometry (MS) analysis, particularly when electrospray ionization (ESI) is used, requiring extensive sample preparation steps such as desalting, extraction, and purification. In this study, infrared matrix-assisted laser desorption electrospray ionization (IR-MALDESI) coupled to a Q Exactive Plus mass spectrometer was used to directly analyze 50-μm thick slices of cucumber fermented and stored in 1 M sodium chloride brine. From the several hundred unique substances observed, three triterpenoid lipids produced by cucumbers, β-sitosterol, stigmasterol, and lupeol, were putatively identified based on exact mass and selected for structural analysis. The spatial distributions of the lipids were imaged, and the putative assignments were confirmed by tandem mass spectrometry performed directly on the same cucumber, demonstrating the capacity of the technique to deliver confident identifications from highly complex samples in molar concentrations of salt without the need for sample preparation.

  15. Impact of Oriented Clay Particles on X-Ray Spectroscopy Analysis

    NASA Astrophysics Data System (ADS)

    Lim, A. J. M. S.; Syazwani, R. N.; Wijeyesekera, D. C.

    2016-07-01

    Understanding the engineering properties, mineralogy and microfabric of clayey soils is very complex, which makes soil characterization difficult. Soil micromechanics recognizes that the microstructure and mineralogy of clay have a significant influence on its engineering behaviour. To achieve a more reliable quantitative evaluation of clay mineralogy, a proper sample preparation technique for quantitative clay mineral analysis is necessary. This paper presents a quantitative elemental analysis and chemical characterization of oriented and randomly oriented clay particles using X-ray spectroscopy. Three different types of clay, namely marine clay, bentonite and kaolin, were studied. The oriented samples were prepared by dispersing the clay in water and letting it settle on porous ceramic tiles while applying a relatively weak suction with a vacuum pump. Images from a scanning electron microscope (SEM) were also used to compare the orientation patterns produced by the two sample preparation techniques. The quantitative X-ray spectroscopy analysis showed that the oriented sampling method was more accurate in identifying mineral deposits, because it produced higher peak intensities in the spectrum and allowed more mineral content to be identified compared with randomly oriented samples.

  16. Metal speciation of environmental samples using SPE and SFC-AED analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, S.C.; Burford, M.D.; Robson, M.

    1995-12-31

    Due to growing public concern over heavy metals in the environment, soil, water and air particulate samples are now routinely screened for their metal content. Conventional metal analysis typically involves acid digestion extraction and generates large volumes of aqueous and organic solvent waste. This harsh extraction process is usually used to obtain the total metal content of the sample, the extract being analysed by atomic emission or absorption spectroscopy techniques. A more selective method of metal extraction has been investigated which uses a supercritical fluid modified with a complexing agent. The relatively mild extraction method enables both organometallic and inorganic metal species to be recovered intact. The various components of the supercritical fluid extract can be chromatographically separated using supercritical fluid chromatography (SFC), and positive identification of the metals achieved using atomic emission detection (AED). The aim of the study is to develop an analytical extraction procedure which enables rapid, sensitive and quantitative analysis of metals in environmental samples, using just one extraction (e.g. SFE) and one analysis (e.g. SFC-AED) procedure.

  17. Optofluidic analysis system for amplification-free, direct detection of Ebola infection

    NASA Astrophysics Data System (ADS)

    Cai, H.; Parks, J. W.; Wall, T. A.; Stott, M. A.; Stambaugh, A.; Alfson, K.; Griffiths, A.; Mathies, R. A.; Carrion, R.; Patterson, J. L.; Hawkins, A. R.; Schmidt, H.

    2015-09-01

    The massive outbreak of highly lethal Ebola hemorrhagic fever in West Africa illustrates the urgent need for diagnostic instruments that can identify and quantify infections rapidly, accurately, and with low complexity. Here, we report on-chip sample preparation, amplification-free detection and quantification of Ebola virus on clinical samples using hybrid optofluidic integration. Sample preparation and target preconcentration are implemented on a PDMS-based microfluidic chip (automaton), followed by single nucleic acid fluorescence detection in liquid-core optical waveguides on a silicon chip in under ten minutes. We demonstrate excellent specificity, a limit of detection of 0.2 pfu/mL and a dynamic range of thirteen orders of magnitude, far outperforming other amplification-free methods. This chip-scale approach and reduced complexity compared to gold standard RT-PCR methods is ideal for portable instruments that can provide immediate diagnosis and continued monitoring of infectious diseases at the point-of-care.

  18. POWER ANALYSIS FOR COMPLEX MEDIATIONAL DESIGNS USING MONTE CARLO METHODS

    PubMed Central

    Thoemmes, Felix; MacKinnon, David P.; Reiser, Mark R.

    2013-01-01

    Applied researchers often include mediation effects in applications of advanced methods such as latent variable models and linear growth curve models. Guidance on how to estimate statistical power to detect mediation for these models has not yet been addressed in the literature. We describe a general framework for power analyses for complex mediational models. The approach is based on the well known technique of generating a large number of samples in a Monte Carlo study, and estimating power as the percentage of cases in which an estimate of interest is significantly different from zero. Examples of power calculation for commonly used mediational models are provided. Power analyses for the single mediator, multiple mediators, three-path mediation, mediation with latent variables, moderated mediation, and mediation in longitudinal designs are described. Annotated sample syntax for Mplus is appended and tabled values of required sample sizes are shown for some models. PMID:23935262
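
    In the spirit of the Monte Carlo approach described above, the sketch below estimates power for a single-mediator model (X -> M -> Y) by repeatedly simulating data and counting significant results. The effect sizes, sample size, and use of a Sobel z test are illustrative assumptions; they are not the paper's Mplus-based workflow or tabled values.

      # Hedged sketch: Monte Carlo power estimation for a single-mediator model.
      import numpy as np
      from scipy.stats import norm

      def _ols_slope(X, y, which):
          """Return (coefficient, standard error) for column `which` of X (intercept added)."""
          Xd = np.column_stack([np.ones(len(y)), X])
          beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
          resid = y - Xd @ beta
          sigma2 = resid @ resid / (len(y) - Xd.shape[1])
          cov = sigma2 * np.linalg.inv(Xd.T @ Xd)
          return beta[which + 1], np.sqrt(cov[which + 1, which + 1])

      def simulate_power(n=200, a=0.3, b=0.3, c_prime=0.1, n_reps=2000, alpha=0.05, seed=0):
          rng = np.random.default_rng(seed)
          z_crit = norm.ppf(1 - alpha / 2)
          significant = 0
          for _ in range(n_reps):
              x = rng.standard_normal(n)
              m = a * x + rng.standard_normal(n)
              y = c_prime * x + b * m + rng.standard_normal(n)
              # Path a: regress M on X; path b: regress Y on M controlling for X.
              a_hat, se_a = _ols_slope(np.column_stack([x]), m, which=0)
              b_hat, se_b = _ols_slope(np.column_stack([m, x]), y, which=0)
              sobel_z = (a_hat * b_hat) / np.sqrt(a_hat**2 * se_b**2 + b_hat**2 * se_a**2)
              significant += abs(sobel_z) > z_crit
          return significant / n_reps

      print(simulate_power())   # estimated power for the assumed effect sizes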

  19. Chemical analysis of acoustically levitated drops by Raman spectroscopy.

    PubMed

    Tuckermann, Rudolf; Puskar, Ljiljana; Zavabeti, Mahta; Sekine, Ryo; McNaughton, Don

    2009-07-01

    An experimental apparatus combining Raman spectroscopy with acoustic levitation, Raman acoustic levitation spectroscopy (RALS), is investigated in the field of physical and chemical analytics. Whereas acoustic levitation enables the contactless handling of microsized samples, Raman spectroscopy offers the advantage of a noninvasive method without complex sample preparation. After carrying out some systematic tests to probe the sensitivity of the technique to drop size, shape, and position, RALS has been successfully applied in monitoring sample dilution and preconcentration, evaporation, crystallization, an acid-base reaction, and analytes in a surface-enhanced Raman spectroscopy colloidal suspension.

  20. Small sample estimation of the reliability function for technical products

    NASA Astrophysics Data System (ADS)

    Lyamets, L. L.; Yakimenko, I. V.; Kanishchev, O. A.; Bliznyuk, O. A.

    2017-12-01

    It is demonstrated that, when large statistical samples from failure testing of complex technical products are not available, statistical estimation of the reliability function of the initial elements can be made by the method of moments. A formal description of the method of moments is given and its advantages in the analysis of small censored samples are discussed. A modified algorithm is proposed for implementing the method of moments using only the moments at which failures of the initial elements occur.
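
    As a simple illustration of moment-based reliability estimation, the sketch below matches the first sample moment to an assumed exponential lifetime model and evaluates the resulting reliability function. The exponential assumption, the failure times, and the omission of censoring are illustrative simplifications, not the paper's actual procedure.

      # Hedged sketch of a method-of-moments reliability estimate under an
      # exponential lifetime model (censoring not handled here).
      import numpy as np

      failure_times = np.array([120., 340., 410., 530., 760.])   # hypothetical hours

      lam_hat = 1.0 / failure_times.mean()      # first-moment match: E[T] = 1/lambda

      def reliability(t):
          """Estimated probability of surviving beyond time t."""
          return np.exp(-lam_hat * t)

      print(reliability(200.0))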

  1. Ion mobility-mass spectrometry as a tool to investigate protein-ligand interactions.

    PubMed

    Göth, Melanie; Pagel, Kevin

    2017-07-01

    Ion mobility-mass spectrometry (IM-MS) is a powerful tool for the simultaneous analysis of mass, charge, size, and shape of ionic species. It allows the characterization of even low-abundant species in complex samples and is therefore particularly suitable for the analysis of proteins and their assemblies. In the last few years even complex and intractable species have been investigated successfully with IM-MS and the number of publications in this field is steadily growing. This trend article highlights recent advances in which IM-MS was used to study protein-ligand complexes and in particular focuses on the catch and release (CaR) strategy and collision-induced unfolding (CIU). Graphical Abstract Native mass spectrometry and ion mobility-mass spectrometry are versatile tools to follow the stoichiometry, energetics, and structural impact of protein-ligand binding.

  2. Multiplexing N-glycan analysis by DNA analyzer.

    PubMed

    Feng, Hua-Tao; Li, Pingjing; Rui, Guo; Stray, James; Khan, Shaheer; Chen, Shiaw-Min; Li, Sam F Y

    2017-07-01

    Analysis of N-glycan structures has been gaining attention over the years due to their critical importance for biopharmaceutical applications and their growing role in biological research. Glycan profiling is also critical to the development of biosimilar drugs. Detailed characterization of N-glycosylation is mandatory because it is a nontemplate-driven process that significantly influences critical properties such as bio-safety and bio-activity. Comprehensive characterization of highly complex mixtures of N-glycans has remained analytically challenging because of both their structural complexity and the time-consuming sample pretreatment procedures required. CE-LIF is one of the typical techniques for N-glycan analysis due to its high separation efficiency. In this paper, a 16-capillary DNA analyzer was coupled with a magnetic bead glycan purification method to accelerate sample preparation and therefore increase N-glycan assay throughput. Routinely, the labeling dye used for CE-LIF is 8-aminopyrene-1,3,6-trisulfonic acid, and the typical identification method involves matching migration times with database entries. Two new fluorescent dyes were used either to cross-validate and increase glycan identification precision or to simplify sample preparation steps. Exoglycosidase studies were carried out using neuraminidase, galactosidase, and fucosidase to confirm the results of the three-dye cross-validation. The optimized method combines the parallel capacity of multiple-capillary separation with three labeling dyes, magnetic-bead-assisted preparation, and exoglycosidase treatment to allow rapid and accurate analysis of N-glycans. These new methods provided enough structural information to permit N-glycan structure elucidation with only one sample injection. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Ruminant Rhombencephalitis-Associated Listeria monocytogenes Alleles Linked to a Multilocus Variable-Number Tandem-Repeat Analysis Complex

    PubMed Central

    Balandyté, Lina; Brodard, Isabelle; Frey, Joachim; Oevermann, Anna; Abril, Carlos

    2011-01-01

    Listeria monocytogenes is among the most important food-borne pathogens and is well adapted to persist in the environment. To gain insight into the genetic relatedness and potential virulence of L. monocytogenes strains causing central nervous system (CNS) infections, we used multilocus variable-number tandem-repeat analysis (MLVA) to subtype 183 L. monocytogenes isolates, most from ruminant rhombencephalitis and some from human patients, food, and the environment. Allelic-profile-based comparisons grouped L. monocytogenes strains mainly into three clonal complexes and linked single-locus variants (SLVs). Clonal complex A essentially consisted of isolates from human and ruminant brain samples. All but one rhombencephalitis isolate from cattle were located in clonal complex A. In contrast, food and environmental isolates mainly clustered into clonal complex C, and none was classified as clonal complex A. Isolates of the two main clonal complexes (A and C) obtained by MLVA were analyzed by PCR for the presence of 11 virulence-associated genes (prfA, actA, inlA, inlB, inlC, inlD, inlE, inlF, inlG, inlJ, and inlC2H). Virulence gene analysis revealed significant differences in the actA, inlF, inlG, and inlJ allelic profiles between clinical isolates (complex A) and nonclinical isolates (complex C). The association of particular alleles of actA, inlF, and newly described alleles of inlJ with isolates from CNS infections (particularly rhombencephalitis) suggests that these virulence genes participate in neurovirulence of L. monocytogenes. The overall absence of inlG in clinical complex A and its presence in complex C isolates suggests that the InlG protein is more relevant for the survival of L. monocytogenes in the environment. PMID:21984240

  4. Diagnostic value of succinate ubiquinone reductase activity in the identification of patients with mitochondrial DNA depletion.

    PubMed

    Hargreaves, P; Rahman, S; Guthrie, P; Taanman, J W; Leonard, J V; Land, J M; Heales, S J R

    2002-02-01

    Mitochondrial DNA (mtDNA) depletion syndrome (McKusick 251880) is characterized by a progressive quantitative loss of mtDNA resulting in severe mitochondrial dysfunction. A diagnosis of mtDNA depletion can only be confirmed after Southern blot analysis of affected tissue. Only a limited number of centres have the facilities to offer this service, and this is frequently on an irregular basis. There is therefore a need for a test that can refine sample selection as well as complement the molecular analysis. In this study we compared the activity of the nuclear-encoded succinate ubiquinone reductase (complex II) to the activities of the combined mitochondrial and nuclear-encoded mitochondrial electron transport chain (ETC) complexes: NADH:ubiquinone reductase (complex I), ubiquinol-cytochrome-c reductase (complex III), and cytochrome-c oxidase (complex IV), in skeletal muscle biopsies from 7 patients with confirmed mtDNA depletion. In one patient there was no evidence of an ETC defect. However, the remaining 6 patients exhibited reduced complex I and IV activities. Five of these patients also displayed reduced complex II-III (succinate:cytochrome-c reductase) activity. Individual measurement of complex II and complex III activities demonstrated normal levels of complex II activity compared with complex III, which was reduced in the 5 biopsies assayed. These findings suggest a possible diagnostic value for the detection of normal complex II activity in conjunction with reduced complex I, III and IV activity in the identification of likely candidates for mtDNA depletion syndrome.

  5. Development of liquid chromatography high resolution mass spectrometry strategies for the screening of complex organic matter: Application to astrophysical simulated materials.

    PubMed

    Eddhif, Balkis; Allavena, Audrey; Liu, Sylvie; Ribette, Thomas; Abou Mrad, Ninette; Chiavassa, Thierry; d'Hendecourt, Louis Le Sergeant; Sternberg, Robert; Danger, Gregoire; Geffroy-Rodier, Claude; Poinot, Pauline

    2018-03-01

    The present work aims at developing two LC-HRMS setups for the screening of organic matter in astrophysical samples. Their analytical development was demonstrated on a 100-µg residue from the photo-thermochemical processing of a cometary ice analog produced in the laboratory. The first, 1D-LC-HRMS setup combines a serially coupled column configuration with HRMS detection. It allowed different chemical families (amino acids, sugars, nucleobases and oligopeptides) to be discriminated in a single chromatographic run without prior acid hydrolysis or chemical derivatisation. The second setup is a dual-LC configuration that connects a series of trapping columns with analytical reverse-phase columns. By coupling these two distinct LC units on-line with HRMS detection, high-mass compounds (350

  6. Multiplex quantification of protein toxins in human biofluids and food matrices using immunoextraction and high-resolution targeted mass spectrometry.

    PubMed

    Dupré, Mathieu; Gilquin, Benoit; Fenaille, François; Feraudet-Tarisse, Cécile; Dano, Julie; Ferro, Myriam; Simon, Stéphanie; Junot, Christophe; Brun, Virginie; Becher, François

    2015-08-18

    The development of rapid methods for unambiguous identification and precise quantification of protein toxins in various matrices is essential for public health surveillance. Nowadays, analytical strategies classically rely on sensitive immunological assays, but mass spectrometry constitutes an attractive complementary approach thanks to direct measurement and protein characterization ability. We developed here an innovative multiplex immuno-LC-MS/MS method for the simultaneous and specific quantification of the three potential biological warfare agents, ricin, staphylococcal enterotoxin B, and epsilon toxin, in complex human biofluids and food matrices. At least 7 peptides were targeted for each toxin (43 peptides in total) with a quadrupole-Orbitrap high-resolution instrument for exquisite detection specificity. Quantification was performed using stable isotope-labeled toxin standards spiked early in the sample. Lower limits of quantification were determined at or close to 1 ng·mL⁻¹. The whole process was successfully applied to the quantitative analysis of toxins in complex samples such as milk, human urine, and plasma. Finally, we report new data on toxin stability with no evidence of toxin degradation in milk in a 48 h time frame, allowing relevant quantitative toxin analysis for samples collected in this time range.

  7. Touch Spray Mass Spectrometry for In Situ Analysis of Complex Samples

    PubMed Central

    Kerian, Kevin S.; Jarmusch, Alan K.; Cooks, R. Graham

    2014-01-01

    Touch spray, a spray-based ambient in-situ ionization method, uses a small probe, e.g. a teasing needle to pick up sample and the application of voltage and solvent to cause field-induced droplet emission. Compounds extracted from the microsample are incorporated into the sprayed micro droplets. Performance tests include disease state of tissue, microorganism identification, and therapeutic drug quantitation. Chemical derivatization is performed simultaneously with ionization. PMID:24756256

  8. Meta-analysis of gene-level tests for rare variant association.

    PubMed

    Liu, Dajiang J; Peloso, Gina M; Zhan, Xiaowei; Holmen, Oddgeir L; Zawistowski, Matthew; Feng, Shuang; Nikpay, Majid; Auer, Paul L; Goel, Anuj; Zhang, He; Peters, Ulrike; Farrall, Martin; Orho-Melander, Marju; Kooperberg, Charles; McPherson, Ruth; Watkins, Hugh; Willer, Cristen J; Hveem, Kristian; Melander, Olle; Kathiresan, Sekar; Abecasis, Gonçalo R

    2014-02-01

    The majority of reported complex disease associations for common genetic variants have been identified through meta-analysis, a powerful approach that enables the use of large sample sizes while protecting against common artifacts due to population structure and repeated small-sample analyses sharing individual-level data. As the focus of genetic association studies shifts to rare variants, genes and other functional units are becoming the focus of analysis. Here we propose and evaluate new approaches for performing meta-analysis of rare variant association tests, including burden tests, weighted burden tests, variable-threshold tests and tests that allow variants with opposite effects to be grouped together. We show that our approach retains useful features from single-variant meta-analysis approaches and demonstrate its use in a study of blood lipid levels in ∼18,500 individuals genotyped with exome arrays.
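
    As a rough illustration of the idea behind such meta-analyses, the sketch below combines per-study gene-level score statistics U and their variances V into a single burden-test Z statistic. This is a simplified, generic fixed-effects combination under assumed shared trait and variant definitions, not the authors' implementation; all numbers are made up.

```python
import math

# Sketch of a fixed-effects meta-analysis of gene-level burden score statistics.
# Each study contributes a score statistic U (sum over variant scores in the gene)
# and its variance V; combining them gives a single gene-level Z statistic.
# The numbers below are made up for illustration, not results from the paper.

def meta_burden_z(study_stats):
    """study_stats: list of (U, V) pairs, one per study."""
    U_total = sum(u for u, _ in study_stats)
    V_total = sum(v for _, v in study_stats)
    return U_total / math.sqrt(V_total)

studies = [(12.4, 30.0), (-3.1, 22.5), (8.7, 41.2)]  # hypothetical (U, V) per study
z = meta_burden_z(studies)
p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
print(f"meta-analysis burden Z = {z:.3f}, p = {p:.3g}")
```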

  9. Compound-Specific Isotope Analysis of Amino Acids for Stardust-Returned Samples

    NASA Technical Reports Server (NTRS)

    Cook, Jamie; Elsila, Jamie E.; Stern J. C.; Glavin, D. P.; Dworkin, J. P.

    2008-01-01

    Significant portions of the early Earth's prebiotic organic inventory, including amino acids, could have been delivered to the Earth's surface by comets and their fragments. Analysis of comets via spectroscopic observations has identified many organic molecules, including methane, ethane, ammonia, cyanic acid, formaldehyde, formamide, acetaldehyde, acetonitrile, and methanol. Reactions between these identified molecules could allow the formation of more complex organics such as amino acids. Isotopic analysis could reveal whether an extraterrestrial signature is present in the Stardust-exposed amines and amino acids. Although bulk isotopic analysis would be dominated by the EACA contaminant's terrestrial signature, compound-specific isotope analysis (CSIA) could determine the signature of each of the other individual amines. Here, we report on progress made towards CSIA of the amino acids glycine and EACA in Stardust-returned samples.

  10. The spectra program library: A PC based system for gamma-ray spectra analysis and INAA data reduction

    USGS Publications Warehouse

    Baedecker, P.A.; Grossman, J.N.

    1995-01-01

    A PC based system has been developed for the analysis of gamma-ray spectra and for the complete reduction of data from INAA experiments, including software to average the results from multiple lines and multiple countings and to produce a final report of analysis. Graphics algorithms may be called for the analysis of complex spectral features, to compare the data from alternate photopeaks, and to evaluate detector performance during a given counting cycle. A database of results for control samples can be used to prepare quality control charts to evaluate long-term precision and to search for systematic variations in data on reference samples as a function of time. The entire software library can be accessed through a user-friendly menu interface with internal help.

  11. The Value of Extended Pedigrees for Next-Generation Analysis of Complex Disease in the Rhesus Macaque

    PubMed Central

    Vinson, Amanda; Prongay, Kamm; Ferguson, Betsy

    2013-01-01

    Complex diseases (e.g., cardiovascular disease and type 2 diabetes, among many others) pose the biggest threat to human health worldwide and are among the most challenging to investigate. Susceptibility to complex disease may be caused by multiple genetic variants (GVs) and their interaction, by environmental factors, and by interaction between GVs and environment, and large study cohorts with substantial analytical power are typically required to elucidate these individual contributions. Here, we discuss the advantages of both power and feasibility afforded by the use of extended pedigrees of rhesus macaques (Macaca mulatta) for genetic studies of complex human disease based on next-generation sequence data. We present these advantages in the context of previous research conducted in rhesus macaques for several representative complex diseases. We also describe a single, multigeneration pedigree of Indian-origin rhesus macaques and a sample biobank we have developed for genetic analysis of complex disease, including power of this pedigree to detect causal GVs using either genetic linkage or association methods in a variance decomposition approach. Finally, we summarize findings of significant heritability for a number of quantitative traits that demonstrate that genetic contributions to risk factors for complex disease can be detected and measured in this pedigree. We conclude that the development and application of an extended pedigree to analysis of complex disease traits in the rhesus macaque have shown promising early success and that genome-wide genetic and higher order -omics studies in this pedigree are likely to yield useful insights into the architecture of complex human disease. PMID:24174435

  12. An inexpensive, temporally-integrated system for monitoring occurrence and biological effects of aquatic contaminants in the field

    EPA Science Inventory

    Assessment of potential ecological risks of complex contaminant mixtures in the environment requires integrated chemical and biological approaches. Instrumental analysis of environmental samples alone can identify contaminants, but provides only limited insights as to possible a...

  13. Strong selection at MHC in Mexicans since admixture

    USDA-ARS?s Scientific Manuscript database

    Mexicans are a recent admixture of Amerindians, Europeans, and Africans. We performed local ancestry analysis of Mexican samples from two genome-wide association studies obtained from dbGaP, and discovered that at the major histocompatibility complex (MHC) region Mexicans have excessive African ance...

  14. Next Generation Spectrometers for Rapid Analysis of Complex Mixtures

    DTIC Science & Technology

    2014-05-29

    digitizer by collecting 1 and 100 million averages of a molecular sample (nitromethane) and verifying the expected factor of 10 improvement in signal-to...spectra of nitromethane. The positive-going spectrum represents 100 million averages while the negative-going spectrum represents 1 million averages

  15. Knowledge, Learning, Analysis System (KLAS)

    USDA-ARS?s Scientific Manuscript database

    The goal of KLAS is to develop a new scientific approach that takes advantage of the data deluge, defined here as both legacy data and new data acquired from environmental and biotic sensors, complex simulation models, and improved technologies for probing biophysical samples. This approach can be i...

  16. Real-time analysis of dual-display phage immobilization and autoantibody screening using quartz crystal microbalance with dissipation monitoring.

    PubMed

    Rajaram, Kaushik; Losada-Pérez, Patricia; Vermeeren, Veronique; Hosseinkhani, Baharak; Wagner, Patrick; Somers, Veerle; Michiels, Luc

    2015-01-01

    Over the last three decades, phage display technology has been used for the display of target-specific biomarkers, peptides, antibodies, etc. Phage display-based assays are mostly limited to the phage ELISA, which is notorious for its high background signal and laborious methodology. These problems have recently been overcome by designing a dual-display phage with two different end functionalities, namely, streptavidin (STV)-binding protein at one end and a rheumatoid arthritis-specific autoantigenic target at the other end. Using this dual-display phage, much higher sensitivity in screening for autoantibody specificities in complex serum samples has been achieved compared to the single-display phage system in phage ELISA. Herein, we aimed to develop a novel, rapid, and sensitive dual-display phage approach to detect the presence of autoantibodies in serum samples using quartz crystal microbalance with dissipation monitoring as the sensing platform. The vertical functionalization of the phage over the STV-modified surfaces resulted in clear frequency and dissipation shifts revealing a well-defined viscoelastic signature. Screening for autoantibodies using antihuman IgG-modified surfaces and the dual-display phage with STV magnetic bead complexes made it possible to isolate the target entities from complex mixtures and to achieve a large response compared to negative control samples. This novel dual-display strategy is a potential alternative to time-consuming phage ELISA protocols for the qualitative analysis of serum autoantibodies and can be taken as a departure point toward a point-of-care diagnostic system.
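
    For context on how QCM frequency shifts are commonly translated into adsorbed mass, the sketch below applies the rigid-film Sauerbrey relation. Because the study reports a viscoelastic signature, Sauerbrey would only give a first-order (under)estimate there; the overtone and frequency shift used here are hypothetical.

```python
# First-order (Sauerbrey) estimate of adsorbed areal mass from a QCM-D frequency shift.
# Note: the study reports a viscoelastic signature, for which the Sauerbrey relation
# underestimates mass; it is shown here only as the standard rigid-film approximation.

SAUERBREY_C = 17.7  # ng cm^-2 Hz^-1 for a 5 MHz AT-cut quartz crystal

def sauerbrey_mass(delta_f_hz, overtone=3):
    """Areal mass (ng/cm^2) from the frequency shift of the given overtone."""
    return -SAUERBREY_C * delta_f_hz / overtone

# Hypothetical shift of -25 Hz at the 3rd overtone after phage immobilization:
print(f"adsorbed mass ~ {sauerbrey_mass(-25.0, overtone=3):.1f} ng/cm^2")
```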

  17. A Simple and Computationally Efficient Sampling Approach to Covariate Adjustment for Multifactor Dimensionality Reduction Analysis of Epistasis

    PubMed Central

    Gui, Jiang; Andrew, Angeline S.; Andrews, Peter; Nelson, Heather M.; Kelsey, Karl T.; Karagas, Margaret R.; Moore, Jason H.

    2010-01-01

    Epistasis or gene-gene interaction is a fundamental component of the genetic architecture of complex traits such as disease susceptibility. Multifactor dimensionality reduction (MDR) was developed as a nonparametric and model-free method to detect epistasis when there are no significant marginal genetic effects. However, in many studies of complex disease, other covariates like age of onset and smoking status could have a strong main effect and may potentially interfere with MDR's ability to achieve its goal. In this paper, we present a simple and computationally efficient sampling method to adjust for covariate effects in MDR. We use simulation to show that after adjustment, MDR has sufficient power to detect true gene-gene interactions. We also compare our method with the state-of-the-art technique in covariate adjustment. The results suggest that our proposed method performs similarly, but is more computationally efficient. We then apply this new method to an analysis of a population-based bladder cancer study in New Hampshire. PMID:20924193

  18. Used tire recycling to produce granulates: evaluation of occupational exposure to chemical agents.

    PubMed

    Savary, Barbara; Vincent, Raymond

    2011-10-01

    Exposure was assessed in four facilities where used tires are turned into rubber granulates. Particulate exposure levels were measured using filter samples and gravimetric analysis. In parallel, volatile organic compounds (VOCs) screening was carried out using samples taken on activated carbon supports, followed by an analysis using a gas chromatograph coupled to a spectrometric detector. The exposure level medians are between 0.58 and 3.95 mg m(-3). Clogging of the textile fiber separation systems can lead to worker exposure; in this case, the measured concentrations can reach 41 mg m(-3). However, in contrast to the data in the literature, VOC levels >1 p.p.m. were not detected. The particulate mixtures deposited on the installation surfaces are complex; some of the chemical agents are toxic to humans. The results of this study indicate significant exposure to complex mixtures of rubber dust. Optimizing exhaust ventilation systems inside the shredders, with a cyclone for example, is essential for reducing the exposure of workers in this rapidly developing sector.

  19. Comparative Analysis of Mass Spectral Similarity Measures on Peak Alignment for Comprehensive Two-Dimensional Gas Chromatography Mass Spectrometry

    PubMed Central

    2013-01-01

    Peak alignment is a critical procedure in mass spectrometry-based biomarker discovery in metabolomics. One of the peak alignment approaches for comprehensive two-dimensional gas chromatography mass spectrometry (GC×GC-MS) data is peak-matching-based alignment. A key to peak-matching-based alignment is the calculation of mass spectral similarity scores. Various mass spectral similarity measures have been developed mainly for compound identification, but the effect of these spectral similarity measures on the performance of peak-matching-based alignment still remains unknown. Therefore, we selected five mass spectral similarity measures, cosine correlation, Pearson's correlation, Spearman's correlation, partial correlation, and part correlation, and examined their effects on peak alignment using two sets of experimental GC×GC-MS data. The results show that the spectral similarity measure does not affect the alignment accuracy significantly in analysis of data from less complex samples, while the partial correlation performs much better than other spectral similarity measures when analyzing experimental data acquired from complex biological samples. PMID:24151524
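
    Two of the compared similarity measures, cosine and Pearson's correlation, are easy to state explicitly for intensity vectors aligned to a common m/z grid; the sketch below uses hypothetical spectra and omits the partial and part correlation variants.

```python
import numpy as np

# Two of the spectral similarity measures compared in the study -- cosine and
# Pearson's correlation -- computed on intensity vectors aligned to a common
# m/z grid. The spectra below are hypothetical.

def cosine_similarity(x, y):
    return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

def pearson_similarity(x, y):
    return float(np.corrcoef(x, y)[0, 1])

spectrum_a = np.array([0.0, 12.0, 340.0, 55.0, 999.0, 87.0])
spectrum_b = np.array([3.0, 10.0, 310.0, 60.0, 950.0, 95.0])

print("cosine :", round(cosine_similarity(spectrum_a, spectrum_b), 4))
print("pearson:", round(pearson_similarity(spectrum_a, spectrum_b), 4))
```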

  20. DTWscore: differential expression and cell clustering analysis for time-series single-cell RNA-seq data.

    PubMed

    Wang, Zhuo; Jin, Shuilin; Liu, Guiyou; Zhang, Xiurui; Wang, Nan; Wu, Deliang; Hu, Yang; Zhang, Chiping; Jiang, Qinghua; Xu, Li; Wang, Yadong

    2017-05-23

    The development of single-cell RNA sequencing has enabled profound discoveries in biology, ranging from the dissection of the composition of complex tissues to the identification of novel cell types and dynamics in some specialized cellular environments. However, the large-scale generation of single-cell RNA-seq (scRNA-seq) data collected at multiple time points remains a challenge for effective measurement of gene expression patterns in transcriptome analysis. We present an algorithm based on the Dynamic Time Warping score (DTWscore) combined with time-series data that enables the detection of gene expression changes across scRNA-seq samples and recovery of potential cell types from complex mixtures of multiple cell types. DTWscore successfully classifies cells of different types using the most highly variable genes from time-series scRNA-seq data. The study was confined to methods that are implemented and available within the R framework. Sample datasets and R packages are available at https://github.com/xiaoxiaoxier/DTWscore .
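
    A minimal dynamic-programming implementation of the underlying DTW distance between two expression time series is sketched below; the DTWscore gene-selection and clustering steps from the paper are not reproduced, and the example series are invented.

```python
import numpy as np

# Minimal dynamic-programming implementation of the Dynamic Time Warping (DTW)
# distance between two gene-expression time series. The DTWscore method builds
# on this distance; gene selection and clustering steps are not reproduced here.

def dtw_distance(a, b):
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])

# Hypothetical expression of one gene in two cells over 6 time points:
cell_1 = [0.1, 0.3, 1.2, 2.5, 2.4, 1.0]
cell_2 = [0.2, 0.2, 0.4, 1.3, 2.6, 2.3]   # similar pattern, shifted in time
print("DTW distance:", round(dtw_distance(cell_1, cell_2), 3))
```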

  1. Lab on a Chip

    NASA Astrophysics Data System (ADS)

    Puget, P.

    The reliable and fast detection of chemical or biological molecules, or the measurement of their concentrations in a sample, are key problems in many fields such as environmental analysis, medical diagnosis, or the food industry. There are traditionally two approaches to this problem. The first aims to carry out a measurement in situ in the sample using chemical and biological sensors. The constraints imposed by detection limits, specificity, and in some cases stability are entirely imputed to the sensor. The second approach uses so-called total analysis systems to process the sample according to a protocol made up of different steps, such as extractions, purifications, concentrations, and a final detection stage. The latter is made in better conditions than with the first approach, which may justify the greater complexity of the process. It is this approach that is implemented in most methods for identifying pathogens, whether they be in biological samples (especially for in vitro diagnosis) or samples taken from the environment. The instrumentation traditionally used to carry out these protocols comprises a set of bulky benchtop apparatus, which needs to be plugged into the mains in order to function. However, there are many specific applications (to be discussed in this chapter) for which analysis instruments with the following characteristics are needed: Possibility of use outside the laboratory, i.e., instruments as small as possible, consuming little energy, and largely insensitive to external conditions of temperature, humidity, vibrations, and so on. Possibility of use by non-specialised agents, or even unmanned operation. Possibility of handling a large number of samples in a limited time, typically for high-throughput screening applications. Possibility of handling small samples. At the same time, a high level of performance is required, in particular in terms of (1) the detection limit, which must be as low as possible, (2) specificity, i.e., the ability to detect a particular molecule in a complex mixture, and (3) speed.

  2. Modeling Complex Workflow in Molecular Diagnostics

    PubMed Central

    Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan

    2010-01-01

    One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844

  3. The moisture outgassing kinetics of a silica reinforced polydimethylsiloxane

    NASA Astrophysics Data System (ADS)

    Sharma, H. N.; McLean, W.; Maxwell, R. S.; Dinh, L. N.

    2016-09-01

    A silica-filled polydimethylsiloxane (PDMS) composite M9787 was investigated for potential outgassing in a vacuum/dry environment with the temperature programmed desorption/reaction method. The outgassing kinetics of 463 K vacuum heat-treated samples, vacuum heat-treated samples which were subsequently re-exposed to moisture, and untreated samples were extracted using the isoconversional and constrained iterative regression methods in a complementary fashion. Density functional theory (DFT) calculations of water interactions with a silica surface were also performed to provide insight into the structural motifs leading to the obtained kinetic parameters. Kinetic analysis/model revealed that no outgassing occurs from the vacuum heat-treated samples in subsequent vacuum/dry environment applications at room temperature (˜300 K). The main effect of re-exposure of the vacuum heat-treated samples to a glove box condition (˜30 ppm by volume of H2O) for even a couple of days was the formation, on the silica surface fillers, of ˜60 ppm by weight of physisorbed and loosely bonded moisture, which subsequently outgasses at room temperature in a vacuum/dry environment in a time span of 10 yr. However, without any vacuum heat treatment and even after 1 h of vacuum pump down, about 300 ppm by weight of H2O would be released from the PDMS in the next few hours. Thereafter the outgassing rate slows down substantially. The presented methodology of using the isoconversional kinetic analysis results and some appropriate nature of the reaction as the constraints for more accurate iterative regression analysis/deconvolution of complex kinetic spectra, and of checking the so-obtained results with first principle calculations such as DFT can serve as a template for treating other complex physical/chemical processes as well.
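
    The differential (Friedman) isoconversional method referred to above can be sketched compactly: at a fixed conversion level, ln(dα/dt) is regressed against 1/T across runs and the slope yields the apparent activation energy. The temperatures and rates below are hypothetical, not the M9787 data.

```python
import numpy as np

# Sketch of the differential (Friedman) isoconversional method: at a fixed
# conversion level, ln(d(alpha)/dt) is regressed against 1/T across several
# runs, and the slope gives -E/R. The rates and temperatures below are
# hypothetical, not the M9787 TPD data analyzed in the paper.

R = 8.314  # J mol^-1 K^-1

def friedman_activation_energy(temps_K, rates):
    """Apparent activation energy (kJ/mol) at one conversion level."""
    x = 1.0 / np.asarray(temps_K, dtype=float)
    y = np.log(np.asarray(rates, dtype=float))
    slope, _ = np.polyfit(x, y, 1)
    return -slope * R / 1000.0

# Temperatures (K) at which conversion alpha = 0.5 is reached in four runs,
# and the corresponding desorption rates (arbitrary units):
T_at_alpha = [420.0, 440.0, 460.0, 480.0]
rate_at_alpha = [1.1e-4, 2.9e-4, 7.0e-4, 1.6e-3]
print(f"E_a(alpha=0.5) ~ {friedman_activation_energy(T_at_alpha, rate_at_alpha):.1f} kJ/mol")
```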

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, H. N.; McLean, W.; Maxwell, R. S.

    We investigated a silica-filled polydimethylsiloxane (PDMS) composite M9787 for potential outgassing in a vacuum/dry environment with the temperature programmed desorption/reaction method. The outgassing kinetics of 463 K vacuum heat-treated samples, vacuum heat-treated samples which were subsequently re-exposed to moisture, and untreated samples were extracted using the isoconversional and constrained iterative regression methods in a complementary fashion. Density functional theory (DFT) calculations of water interactions with a silica surface were also performed to provide insight into the structural motifs leading to the obtained kinetic parameters. Kinetic analysis/model revealed that no outgassing occurs from the vacuum heat-treated samples in subsequent vacuum/dry environment applications at room temperature (~300 K). Moreover, the main effect of re-exposure of the vacuum heat-treated samples to a glove box condition (~30 ppm by volume of H2O) for even a couple of days was the formation, on the silica surface fillers, of ~60 ppm by weight of physisorbed and loosely bonded moisture, which subsequently outgasses at room temperature in a vacuum/dry environment in a time span of 10 yr. However, without any vacuum heat treatment and even after 1 h of vacuum pump down, about 300 ppm by weight of H2O would be released from the PDMS in the next few hours. Thereafter the outgassing rate slows down substantially. Our presented methodology of using the isoconversional kinetic analysis results and some appropriate nature of the reaction as the constraints for more accurate iterative regression analysis/deconvolution of complex kinetic spectra, and of checking the so-obtained results with first principle calculations such as DFT, can serve as a template for treating other complex physical/chemical processes as well.

  5. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    PubMed

    West, Brady T; Sakshaug, Joseph W; Aurelien, Guy Alain S

    2016-01-01

    Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.
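
    To make the design-based point concrete, the sketch below computes a weighted mean and a Taylor-linearized standard error that accounts for strata and primary sampling units (the usual with-replacement approximation). Variable names and the toy records are hypothetical, not SESTAT data.

```python
import numpy as np
from collections import defaultdict

# Sketch of a design-based estimate of a population mean and its standard error
# under stratified cluster sampling, using Taylor linearization with the usual
# "PSUs sampled with replacement within strata" approximation. Variable names
# and the toy data are hypothetical, not SESTAT records.

def weighted_mean_and_se(y, w, stratum, psu):
    y, w = np.asarray(y, float), np.asarray(w, float)
    W = w.sum()
    mean = float(np.sum(w * y) / W)

    # Linearized scores aggregated to PSU totals within each stratum.
    z = w * (y - mean) / W
    psu_totals = defaultdict(float)
    for zi, h, c in zip(z, stratum, psu):
        psu_totals[(h, c)] += zi

    by_stratum = defaultdict(list)
    for (h, _), total in psu_totals.items():
        by_stratum[h].append(total)

    var = 0.0
    for totals in by_stratum.values():
        t = np.asarray(totals)
        n_h = len(t)
        if n_h > 1:  # single-PSU strata contribute no variance in this sketch
            var += n_h / (n_h - 1) * np.sum((t - t.mean()) ** 2)
    return mean, float(np.sqrt(var))

# Toy data: 2 strata x 2 PSUs, unequal weights.
y       = [4.0, 5.0, 7.0, 6.0, 9.0, 8.0, 3.0, 4.0]
w       = [10,  10,  20,  20,  15,  15,  25,  25]
stratum = [1,   1,   1,   1,   2,   2,   2,   2]
psu     = [1,   1,   2,   2,   1,   1,   2,   2]
print(weighted_mean_and_se(y, w, stratum, psu))
```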

  6. Spatially-Resolved Proteomics: Rapid Quantitative Analysis of Laser Capture Microdissected Alveolar Tissue Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clair, Geremy; Piehowski, Paul D.; Nicola, Teodora

    Global proteomics approaches allow characterization of whole tissue lysates to an impressive depth. However, it is now increasingly recognized that to better understand the complexity of multicellular organisms, global protein profiling of specific spatially defined regions/substructures of tissues (i.e. spatially-resolved proteomics) is essential. Laser capture microdissection (LCM) enables microscopic isolation of defined regions of tissues preserving crucial spatial information. However, current proteomics workflows entail several manual sample preparation steps and are challenged by the microscopic, mass-limited samples generated by LCM, which impacts measurement robustness, quantification, and throughput. Here, we coupled LCM with a fully automated sample preparation workflow that, with a single manual step, allows protein extraction, tryptic digestion, peptide cleanup, and LC-MS/MS analysis of proteomes from microdissected tissues. Benchmarking against the current state of the art in ultrasensitive global proteomic analysis, our approach demonstrated significant improvements in quantification and throughput. Using our LCM-SNaPP proteomics approach, we characterized, to a depth of more than 3,400 proteins, the ontogeny of protein changes during normal lung development in laser capture microdissected alveolar tissue containing ~4,000 cells per sample. Importantly, the data revealed quantitative changes for 350 low abundance transcription factors and signaling molecules, confirming earlier transcript-level observations and defining seven modules of coordinated transcription factor/signaling molecule expression patterns, suggesting that a complex network of temporal regulatory control directs normal lung development, with epigenetic regulation fine-tuning pre-natal developmental processes. Our LCM-proteomics approach facilitates efficient, spatially-resolved, ultrasensitive global proteomics analyses in high throughput and will be enabling for several clinical and biological applications.

  7. The moisture outgassing kinetics of a silica reinforced polydimethylsiloxane

    DOE PAGES

    Sharma, H. N.; McLean, W.; Maxwell, R. S.; ...

    2016-09-21

    We investigated a silica-filled polydimethylsiloxane (PDMS) composite M9787 for potential outgassing in a vacuum/dry environment with the temperature programmed desorption/reaction method. The outgassing kinetics of 463 K vacuum heat-treated samples, vacuum heat-treated samples which were subsequently re-exposed to moisture, and untreated samples were extracted using the isoconversional and constrained iterative regression methods in a complementary fashion. Density functional theory (DFT) calculations of water interactions with a silica surface were also performed to provide insight into the structural motifs leading to the obtained kinetic parameters. Kinetic analysis/model revealed that no outgassing occurs from the vacuum heat-treated samples in subsequent vacuum/dry environment applications at room temperature (~300 K). Moreover, the main effect of re-exposure of the vacuum heat-treated samples to a glove box condition (~30 ppm by volume of H2O) for even a couple of days was the formation, on the silica surface fillers, of ~60 ppm by weight of physisorbed and loosely bonded moisture, which subsequently outgasses at room temperature in a vacuum/dry environment in a time span of 10 yr. However, without any vacuum heat treatment and even after 1 h of vacuum pump down, about 300 ppm by weight of H2O would be released from the PDMS in the next few hours. Thereafter the outgassing rate slows down substantially. Our presented methodology of using the isoconversional kinetic analysis results and some appropriate nature of the reaction as the constraints for more accurate iterative regression analysis/deconvolution of complex kinetic spectra, and of checking the so-obtained results with first principle calculations such as DFT, can serve as a template for treating other complex physical/chemical processes as well.

  8. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    PubMed Central

    West, Brady T.; Sakshaug, Joseph W.; Aurelien, Guy Alain S.

    2016-01-01

    Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data. PMID:27355817

  9. Impact of distributions on the archetypes and prototypes in heterogeneous nanoparticle ensembles.

    PubMed

    Fernandez, Michael; Wilson, Hugh F; Barnard, Amanda S

    2017-01-05

    The magnitude and complexity of the structural and functional data available on nanomaterials requires data analytics, statistical analysis and information technology to drive discovery. We demonstrate that multivariate statistical analysis can recognise the sets of truly significant nanostructures and their most relevant properties in heterogeneous ensembles with different probability distributions. The prototypical and archetypal nanostructures of five virtual ensembles of Si quantum dots (SiQDs) with Boltzmann, frequency, normal, Poisson and random distributions are identified using clustering and archetypal analysis, where we find that their diversity is defined by size and shape, regardless of the type of distribution. At the convex hull of the SiQD ensembles, simple configuration archetypes can efficiently describe a large number of SiQDs, whereas more complex shapes are needed to represent the average ordering of the ensembles. This approach provides a route towards the characterisation of computationally intractable virtual nanomaterial spaces, which can convert big data into smart data, and significantly reduce the workload to simulate experimentally relevant virtual samples.
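
    As a simplified stand-in for the clustering step (not the archetypal analysis of the ensemble hull, and not the authors' pipeline), the sketch below finds cluster prototypes with plain k-means on a synthetic size/shape feature matrix.

```python
import numpy as np
from sklearn.cluster import KMeans

# Generic prototype identification by k-means clustering on a nanostructure
# feature matrix (e.g., size and shape descriptors). This is a simplified
# stand-in for the clustering step of the study; the archetypal analysis of
# the ensemble hull is not reproduced here. Data are synthetic.

rng = np.random.default_rng(0)
# Synthetic "ensemble": columns = [diameter (nm), asphericity], three sub-populations.
features = np.vstack([
    rng.normal([2.0, 0.10], [0.2, 0.02], size=(100, 2)),
    rng.normal([3.5, 0.25], [0.3, 0.03], size=(100, 2)),
    rng.normal([5.0, 0.05], [0.4, 0.02], size=(100, 2)),
])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(features)
# The prototype of each cluster is the member closest to its centroid.
for k, centre in enumerate(km.cluster_centers_):
    members = features[km.labels_ == k]
    proto = members[np.argmin(np.linalg.norm(members - centre, axis=1))]
    print(f"cluster {k}: prototype diameter={proto[0]:.2f} nm, asphericity={proto[1]:.3f}")
```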

  10. Refined two-index entropy and multiscale analysis for complex system

    NASA Astrophysics Data System (ADS)

    Bian, Songhan; Shang, Pengjian

    2016-10-01

    As a fundamental concept for describing complex systems, entropy measures have been proposed in various forms, such as Boltzmann-Gibbs (BG) entropy, one-index entropy, two-index entropy, sample entropy, permutation entropy, etc. This paper proposes a new two-index entropy Sq,δ, which we find applicable for measuring the complexity of a wide range of systems in terms of randomness and fluctuation range. For more complex systems, the value of the two-index entropy is smaller and the correlation between the parameter δ and the entropy Sq,δ is weaker. By combining the refined two-index entropy Sq,δ with the scaling exponent h(δ), this paper analyzes the complexities of simulated series and effectively classifies several financial markets in various regions of the world.

  11. Determination of arsenic species in rice samples using CPE and ETAAS.

    PubMed

    Costa, Bruno Elias Dos Santos; Coelho, Nívia Maria Melo; Coelho, Luciana Melo

    2015-07-01

    A highly sensitive and selective procedure for the determination of arsenate and total arsenic in food by electrothermal atomic absorption spectrometry after cloud point extraction (ETAAS/CPE) was developed. The procedure is based on the formation of a complex of As(V) ions with molybdate in the presence of 50.0 mmol L(-1) sulfuric acid. The complex was extracted into the surfactant-rich phase of 0.06% (w/v) Triton X-114. The variables affecting the complex formation, extraction and phase separation were optimized using factorial designs. Under the optimal conditions, the calibration graph was linear in the range of 0.05-10.0 μg L(-1). The detection and quantification limits were 10 and 33 ng L(-1), respectively, and the corresponding relative standard deviation for 10 replicates was below 5%. Recovery values of between 90.8% and 113.1% were obtained for spiked samples. The accuracy of the method was evaluated by comparison with the results obtained for the analysis of a rice flour sample (certified material IRMM-804) and no significant difference at the 95% confidence level was observed. The method was successfully applied to the determination of As(V) and total arsenic in rice samples. Copyright © 2015 Elsevier Ltd. All rights reserved.
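
    The reported detection and quantification limits are consistent with the common 3.3σ/slope and 10σ/slope criteria applied to a linear calibration; the sketch below illustrates that calculation with hypothetical calibration points, not the study's data.

```python
import numpy as np

# Estimating LOD and LOQ from a linear calibration curve using the common
# 3.3*sigma/slope and 10*sigma/slope criteria (sigma = standard deviation of
# the regression residuals). Calibration points below are hypothetical, not
# the study's data; the reported 10 and 33 ng/L limits are consistent with
# this 3.3x/10x convention.

conc = np.array([0.05, 0.5, 1.0, 2.5, 5.0, 10.0])      # ug/L As(V) standards
absorbance = np.array([0.004, 0.041, 0.079, 0.198, 0.402, 0.801])

slope, intercept = np.polyfit(conc, absorbance, 1)
residuals = absorbance - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                           # 2 fitted parameters

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope={slope:.4f} per (ug/L), LOD={lod*1000:.1f} ng/L, LOQ={loq*1000:.1f} ng/L")
```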

  12. Figures of Merit for Lunar Simulants

    NASA Technical Reports Server (NTRS)

    Slane, Frederick A.; Rickman, Douglas L.

    2012-01-01

    At an earlier SRR the concept for an international standard on Lunar regolith simulants was presented. The international standard, ISO 10788, Lunar Simulants, has recently been published. This paper presents the final content of the standard. Therefore, we are presenting an update of the following: The collection and analysis of lunar samples from 1969 to present has yielded large amounts of data. Published analyses give some idea of the complex nature of the regolith at all scales, rocks, soils and the smaller particulates commonly referred to as dust. Data recently acquired in support of NASA's simulant effort has markedly increased our knowledge and quantitatively demonstrates that complexity. It is anticipated that future analyses will further add to the known complexity. In an effort to communicate among the diverse technical communities performing research on or research using regolith samples and simulants, a set of Figures of Merit (FoM) have been devised. The objective is to allow consistent and concise comparative communication between researchers from multiple organizations and nations engaged in lunar exploration. This paper describes Figures of Merit in a new international standard for Lunar Simulants. The FoM methodology uses scientific understanding of the lunar samples to formulate parameters which are reproducibly quantifiable. Contaminants and impurities in the samples are also addressed.

  13. Asymmetric flow field flow fractionation with light scattering detection - an orthogonal sensitivity analysis.

    PubMed

    Galyean, Anne A; Filliben, James J; Holbrook, R David; Vreeland, Wyatt N; Weinberg, Howard S

    2016-11-18

    Asymmetric flow field flow fractionation (AF4) has several instrumental factors that may have a direct effect on separation performance. A sensitivity analysis was applied to ascertain the relative importance of AF4 primary instrument factor settings for the separation of a complex environmental sample. The analysis evaluated the impact of instrumental factors, namely cross flow, ramp time, focus flow, injection volume, and run buffer concentration, on the multi-angle light scattering measurement of natural organic matter (NOM) molar mass (MM). A 2^(5-1) orthogonal fractional factorial design was used to minimize analysis time while preserving the accuracy and robustness in the determination of the main effects and interactions between any two instrumental factors. By assuming that separations resulting in smaller MM measurements would be more accurate, the analysis produced a ranked list of effects estimates for factors and interactions of factors based on their relative importance in minimizing the MM. The most important and statistically significant AF4 instrumental factors were buffer concentration and cross flow. The least important was ramp time. A parallel 2^(5-2) orthogonal fractional factorial design was also employed on five environmental factors for synthetic natural water samples containing silver nanoparticles (NPs), namely: NP concentration, NP size, NOM concentration, specific conductance, and pH. None of the water quality characteristic effects or interactions were found to be significant in minimizing the measured MM; however, the interaction between NP concentration and NP size was an important effect when considering NOM recovery. This work presents a structured approach for the rigorous assessment of AF4 instrument factors and optimal settings for the separation of complex samples utilizing an efficient orthogonal fractional factorial design and appropriate graphical analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
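
    A 2^(5-1) design of the kind described can be generated by crossing four factors fully and defining the fifth by the generator E = ABCD (defining relation I = ABCDE). The sketch below lists the 16 coded runs; the actual high/low settings used in the study are not reproduced, so only coded -1/+1 levels are shown.

```python
from itertools import product

# Generating a 16-run 2^(5-1) fractional factorial design with the defining
# relation I = ABCDE (i.e., the fifth factor is the product of the other four).
# Factor names follow the abstract; the specific high/low settings used in the
# study are not reproduced, so levels are shown only as coded -1/+1.

factors = ["cross flow", "ramp time", "focus flow", "injection volume", "buffer conc."]

runs = []
for a, b, c, d in product([-1, 1], repeat=4):
    e = a * b * c * d          # generator E = ABCD
    runs.append((a, b, c, d, e))

print("run  " + "  ".join(f"{name[:10]:>10}" for name in factors))
for i, run in enumerate(runs, 1):
    print(f"{i:>3}  " + "  ".join(f"{level:>10d}" for level in run))
```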

  14. Apollo Experiment Report: Lunar-Sample Processing in the Lunar Receiving Laboratory High-Vacuum Complex

    NASA Technical Reports Server (NTRS)

    White, D. R.

    1976-01-01

    A high-vacuum complex composed of an atmospheric decontamination system, sample-processing chambers, storage chambers, and a transfer system was built to process and examine lunar material while maintaining quarantine status. Problems identified, equipment modifications, and procedure changes made for Apollo 11 and 12 sample processing are presented. The sample processing experiences indicate that only a few operating personnel are required to process the sample efficiently, safely, and rapidly in the high-vacuum complex. The high-vacuum complex was designed to handle the many contingencies, both quarantine and scientific, associated with handling an unknown entity such as the lunar sample. Lunar sample handling necessitated a complex system that could not respond rapidly to changing scientific requirements as the characteristics of the lunar sample were better defined. Although the complex successfully handled the processing of Apollo 11 and 12 lunar samples, the scientific requirement for vacuum samples was deleted after the Apollo 12 mission just as the vacuum system was reaching its full potential.

  15. Over-expression and purification strategies for recombinant multi-protein oligomers: a case study of Mycobacterium tuberculosis σ/anti-σ factor protein complexes.

    PubMed

    Thakur, Krishan Gopal; Jaiswal, Ravi Kumar; Shukla, Jinal K; Praveena, T; Gopal, B

    2010-12-01

    The function of a protein in a cell often involves coordinated interactions with one or several regulatory partners. It is thus imperative to characterize a protein both in isolation as well as in the context of its complex with an interacting partner. High resolution structural information determined by X-ray crystallography and Nuclear Magnetic Resonance offer the best route to characterize protein complexes. These techniques, however, require highly purified and homogenous protein samples at high concentration. This requirement often presents a major hurdle for structural studies. Here we present a strategy based on co-expression and co-purification to obtain recombinant multi-protein complexes in the quantity and concentration range that can enable hitherto intractable structural projects. The feasibility of this strategy was examined using the σ factor/anti-σ factor protein complexes from Mycobacterium tuberculosis. The approach was successful across a wide range of σ factors and their cognate interacting partners. It thus appears likely that the analysis of these complexes based on variations in expression constructs and procedures for the purification and characterization of these recombinant protein samples would be widely applicable for other multi-protein systems. Copyright © 2010 Elsevier Inc. All rights reserved.

  16. Classroom learning and achievement: how the complexity of classroom interaction impacts students' learning

    NASA Astrophysics Data System (ADS)

    Podschuweit, Sören; Bernholt, Sascha; Brückmann, Maja

    2016-05-01

    Background: Complexity models have provided a suitable framework in various domains to assess students' educational achievement. Complexity is often used as the analytical focus when regarding learning outcomes, i.e. when analyzing written tests or problem-centered interviews. Numerous studies reveal negative correlations between the complexity of a task and the probability of a student solving it. Purpose: Thus far, few detailed investigations explore the importance of complexity in actual classroom lessons. Moreover, the few efforts made so far revealed inconsistencies. Hence, the present study sheds light on the influence the complexity of students' and teachers' class contributions have on students' learning outcomes. Sample: Videos of 10 German 8th grade physics courses covering three consecutive lessons on two topics each (electricity, mechanics) have been analyzed. The sample includes 10 teachers and 290 students. Design and methods: Students' and teachers' verbal contributions were coded manual-based according to the level of complexity. Additionally, pre-post testing of knowledge in electricity and mechanics was applied to assess the students' learning gain. ANOVA analysis was used to characterize the influence of the complexity on the learning gain. Results: Results indicate that the mean level of complexity in classroom contributions explains a large portion of variance in post-test results on class level. Despite this overarching trend, taking classroom activities into account as well reveals even more fine-grained patterns, leading to more specific relations between the complexity in the classroom and students' achievement. Conclusions: In conclusion, we argue for more reflected teaching approaches intended to gradually increase class complexity to foster students' level of competency.

  17. Quantitative Analysis of Mixed Halogen Dioxins and Furans in Fire Debris Utilizing Atmospheric Pressure Ionization Gas Chromatography-Triple Quadrupole Mass Spectrometry.

    PubMed

    Organtini, Kari L; Myers, Anne L; Jobst, Karl J; Reiner, Eric J; Ross, Brian; Ladak, Adam; Mullin, Lauren; Stevens, Douglas; Dorman, Frank L

    2015-10-20

    Residential and commercial fires generate a complex mixture of volatile, semivolatile, and nonvolatile compounds. This study focused on the semi/nonvolatile components of fire debris to better understand firefighter exposure risks. Using the enhanced sensitivity of gas chromatography coupled to atmospheric pressure ionization-tandem mass spectrometry (APGC-MS/MS), complex fire debris samples collected from simulation fires were analyzed for the presence of potentially toxic polyhalogenated dibenzo-p-dioxins and dibenzofurans (PXDD/Fs and PBDD/Fs). Extensive method development was performed to create multiple reaction monitoring (MRM) methods for a wide range of PXDD/Fs from dihalogenated through hexa-halogenated homologue groups. Higher halogenated compounds were not observed due to difficulty eluting them off the long column used for analysis. This methodology was able to identify both polyhalogenated (mixed bromo-/chloro- and polybromo-) dibenzo-p-dioxins and dibenzofurans in the simulated burn study samples collected, with the dibenzofuran species being the dominant compounds in the samples. Levels of these compounds were quantified as total homologue groups due to the limitations of commercial congener availability. Concentration ranges in household simulation debris were observed at 0.01-5.32 ppb (PXDFs) and 0.18-82.11 ppb (PBDFs). Concentration ranges in electronics simulation debris were observed at 0.10-175.26 ppb (PXDFs) and 0.33-9254.41 ppb (PBDFs). Samples taken from the particulate matter coating the firefighters' helmets contained some of the highest levels of dibenzofurans, ranging from 4.10 ppb to 2.35 ppm. The data suggest that firefighters and first responders at fire scenes are exposed to a complex mixture of potentially hundreds to thousands of different polyhalogenated dibenzo-p-dioxins and dibenzofurans that could negatively impact their health.

  18. Evaluation of targeted exome sequencing for 28 protein-based blood group systems, including the homologous gene systems, for blood group genotyping.

    PubMed

    Schoeman, Elizna M; Lopez, Genghis H; McGowan, Eunike C; Millard, Glenda M; O'Brien, Helen; Roulis, Eileen V; Liew, Yew-Wah; Martin, Jacqueline R; McGrath, Kelli A; Powley, Tanya; Flower, Robert L; Hyland, Catherine A

    2017-04-01

    Blood group single nucleotide polymorphism genotyping probes for a limited range of polymorphisms. This study investigated whether massively parallel sequencing (also known as next-generation sequencing), with a targeted exome strategy, provides an extended blood group genotype and the extent to which massively parallel sequencing correctly genotypes in homologous gene systems, such as RH and MNS. Donor samples (n = 28) that were extensively phenotyped and genotyped using single nucleotide polymorphism typing, were analyzed using the TruSight One Sequencing Panel and MiSeq platform. Genes for 28 protein-based blood group systems, GATA1, and KLF1 were analyzed. Copy number variation analysis was used to characterize complex structural variants in the GYPC and RH systems. The average sequencing depth per target region was 66.2 ± 39.8. Each sample harbored on average 43 ± 9 variants, of which 10 ± 3 were used for genotyping. For the 28 samples, massively parallel sequencing variant sequences correctly matched expected sequences based on single nucleotide polymorphism genotyping data. Copy number variation analysis defined the Rh C/c alleles and complex RHD hybrids. Hybrid RHD*D-CE-D variants were correctly identified, but copy number variation analysis did not confidently distinguish between D and CE exon deletion versus rearrangement. The targeted exome sequencing strategy employed extended the range of blood group genotypes detected compared with single nucleotide polymorphism typing. This single-test format included detection of complex MNS hybrid cases and, with copy number variation analysis, defined RH hybrid genes along with the RHCE*C allele hitherto difficult to resolve by variant detection. The approach is economical compared with whole-genome sequencing and is suitable for a red blood cell reference laboratory setting. © 2017 AABB.

  19. Model-based quality assessment and base-calling for second-generation sequencing data.

    PubMed

    Bravo, Héctor Corrada; Irizarry, Rafael A

    2010-09-01

    Second-generation sequencing (sec-gen) technology can sequence millions of short fragments of DNA in parallel, making it capable of assembling complex genomes for a small fraction of the price and time of previous technologies. In fact, a recently formed international consortium, the 1000 Genomes Project, plans to fully sequence the genomes of approximately 1200 people. The prospect of comparative analysis at the sequence level of a large number of samples across multiple populations may be achieved within the next five years. These data present unprecedented challenges in statistical analysis. For instance, analysis operates on millions of short nucleotide sequences, or reads (strings of A, C, G, or T's between 30 and 100 characters long), which are the result of complex processing of noisy continuous fluorescence intensity measurements known as base-calling. The complexity of the base-calling discretization process results in reads of widely varying quality within and across sequence samples. This variation in processing quality results in infrequent but systematic errors that we have found to mislead downstream analysis of the discretized sequence read data. For instance, a central goal of the 1000 Genomes Project is to quantify across-sample variation at the single nucleotide level. At this resolution, small error rates in sequencing prove significant, especially for rare variants. Sec-gen sequencing is a relatively new technology for which potential biases and sources of obscuring variation are not yet fully understood. Therefore, modeling and quantifying the uncertainty inherent in the generation of sequence reads is of utmost importance. In this article, we present a simple model to capture uncertainty arising in the base-calling procedure of the Illumina/Solexa GA platform. Model parameters have a straightforward interpretation in terms of the chemistry of base-calling allowing for informative and easily interpretable metrics that capture the variability in sequencing quality. Our model provides these informative estimates readily usable in quality assessment tools while significantly improving base-calling performance. © 2009, The International Biometric Society.
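
    For reference, per-base error probabilities in sequencing reads are conventionally reported on the Phred scale, Q = -10·log10(p); the sketch below shows that conversion. It illustrates the quantity the article's model estimates, not the model itself.

```python
import math

# Conventional Phred scaling of per-base error probabilities, Q = -10*log10(p),
# as used to report base-call quality in sequencing reads. This illustrates the
# quantity the article's model seeks to estimate; it is not the model itself.

def phred_quality(p_error):
    return -10.0 * math.log10(p_error)

def error_probability(q):
    return 10.0 ** (-q / 10.0)

for p in (0.1, 0.01, 0.001):
    print(f"p_error={p:<6} -> Q{phred_quality(p):.0f}")
print("Q30 -> p_error =", error_probability(30))   # 0.001
```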

  20. A tree-like Bayesian structure learning algorithm for small-sample datasets from complex biological model systems.

    PubMed

    Yin, Weiwei; Garimalla, Swetha; Moreno, Alberto; Galinski, Mary R; Styczynski, Mark P

    2015-08-28

    There are increasing efforts to bring high-throughput systems biology techniques to bear on complex animal model systems, often with a goal of learning about underlying regulatory network structures (e.g., gene regulatory networks). However, complex animal model systems typically have significant limitations on cohort sizes, number of samples, and the ability to perform follow-up and validation experiments. These constraints are particularly problematic for many current network learning approaches, which require large numbers of samples and may predict many more regulatory relationships than actually exist. Here, we test the idea that by leveraging the accuracy and efficiency of classifiers, we can construct high-quality networks that capture important interactions between variables in datasets with few samples. We start from a previously-developed tree-like Bayesian classifier and generalize its network learning approach to allow for arbitrary depth and complexity of tree-like networks. Using four diverse sample networks, we demonstrate that this approach performs consistently better at low sample sizes than the Sparse Candidate Algorithm, a representative approach for comparison because it is known to generate Bayesian networks with high positive predictive value. We develop and demonstrate a resampling-based approach to enable the identification of a viable root for the learned tree-like network, important for cases where the root of a network is not known a priori. We also develop and demonstrate an integrated resampling-based approach to the reduction of variable space for the learning of the network. Finally, we demonstrate the utility of this approach via the analysis of a transcriptional dataset of a malaria challenge in a non-human primate model system, Macaca mulatta, suggesting the potential to capture indicators of the earliest stages of cellular differentiation during leukopoiesis. We demonstrate that by starting from effective and efficient approaches for creating classifiers, we can identify interesting tree-like network structures with significant ability to capture the relationships in the training data. This approach represents a promising strategy for inferring networks with high positive predictive value under the constraint of small numbers of samples, meeting a need that will only continue to grow as more high-throughput studies are applied to complex model systems.
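
    In the same spirit of tree-structured network learning (though not the authors' classifier-based algorithm), the classic Chow-Liu construction connects variables with a maximum-weight spanning tree over pairwise mutual information. The sketch below does this on a small synthetic discrete dataset.

```python
import numpy as np

# A classic tree-structured network learner in the spirit of the paper's
# tree-like Bayesian approach (this is the Chow-Liu idea, not the authors'
# algorithm): estimate pairwise mutual information between discretized
# variables, then keep the maximum-weight spanning tree. Data are synthetic.

def mutual_information(x, y):
    """Empirical MI (nats) between two integer-coded discrete vectors."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / np.outer(px, py)[nz])))

def chow_liu_edges(data):
    """Prim's algorithm for the maximum-MI spanning tree; data is (samples, vars)."""
    n_vars = data.shape[1]
    mi = np.array([[mutual_information(data[:, i], data[:, j])
                    for j in range(n_vars)] for i in range(n_vars)])
    in_tree, edges = {0}, []
    while len(in_tree) < n_vars:
        i, j = max(((i, j) for i in in_tree for j in range(n_vars) if j not in in_tree),
                   key=lambda e: mi[e])
        edges.append((i, j, mi[i, j]))
        in_tree.add(j)
    return edges

# Synthetic small-sample dataset: X1 drives X2, X2 drives X3; X4 independent.
rng = np.random.default_rng(1)
x1 = rng.integers(0, 2, 40)
x2 = x1 ^ (rng.random(40) < 0.1).astype(int)
x3 = x2 ^ (rng.random(40) < 0.2).astype(int)
x4 = rng.integers(0, 2, 40)
for edge in chow_liu_edges(np.column_stack([x1, x2, x3, x4])):
    print(edge)
```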

  1. Entropy for the Complexity of Physiological Signal Dynamics.

    PubMed

    Zhang, Xiaohua Douglas

    2017-01-01

    Recently, the rapid development of large data storage technologies, mobile network technology, and portable medical devices makes it possible to measure, record, store, and track analysis of biological dynamics. Portable noninvasive medical devices are crucial to capture individual characteristics of biological dynamics. The wearable noninvasive medical devices and the analysis/management of related digital medical data will revolutionize the management and treatment of diseases, subsequently resulting in the establishment of a new healthcare system. One of the key features that can be extracted from the data obtained by wearable noninvasive medical device is the complexity of physiological signals, which can be represented by entropy of biological dynamics contained in the physiological signals measured by these continuous monitoring medical devices. Thus, in this chapter I present the major concepts of entropy that are commonly used to measure the complexity of biological dynamics. The concepts include Shannon entropy, Kolmogorov entropy, Renyi entropy, approximate entropy, sample entropy, and multiscale entropy. I also demonstrate an example of using entropy for the complexity of glucose dynamics.
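
    Of the listed measures, sample entropy is simple enough to state in a few lines: it is the negative log of the conditional probability that sequences matching for m points also match for m+1 points within tolerance r. The sketch below uses the common choices m = 2 and r = 0.2·SD on a synthetic signal.

```python
import numpy as np

# Compact implementation of sample entropy, one of the complexity measures
# listed in the chapter. Typical parameter choices are m = 2 and r = 0.2 times
# the signal's standard deviation; the test signals below are synthetic.

def sample_entropy(signal, m=2, r=None):
    x = np.asarray(signal, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def count_matches(length):
        n_templ = len(x) - m          # same template count for lengths m and m+1
        templates = np.array([x[i:i + length] for i in range(n_templ)])
        count = 0
        for i in range(n_templ):
            dist = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev distance
            count += int(np.sum(dist <= r)) - 1    # exclude the self-match
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return np.inf if A == 0 or B == 0 else -np.log(A / B)

rng = np.random.default_rng(42)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
noisy = regular + 0.5 * rng.standard_normal(500)
print("SampEn regular:", round(sample_entropy(regular), 3))
print("SampEn noisy  :", round(sample_entropy(noisy), 3))
```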

  2. Holistic Practice in Traumatic Brain Injury Rehabilitation: Perspectives of Health Practitioners

    PubMed Central

    Wright, Courtney J.; Zeeman, Heidi; Biezaitis, Valda

    2016-01-01

    Given that the literature suggests there are various (and often contradictory) interpretations of holistic practice in brain injury rehabilitation and multiple complexities in its implementation (including complex setting, discipline, and client-base factors), this study aimed to examine the experiences of practitioners in their conceptualization and delivery of holistic practice in their respective settings. Nineteen health practitioners purposively sampled from an extensive Brain Injury Network in Queensland, Australia participated in individual interviews. A systematic text analysis process using Leximancer qualitative analysis program was undertaken, followed by manual thematic analysis to develop overarching themes. The findings from this study have identified several items for future inter-professional development that will not only benefit the practitioners working in brain injury rehabilitation settings, but the patients and their families as well. PMID:27270604

  3. Holistic Practice in Traumatic Brain Injury Rehabilitation: Perspectives of Health Practitioners.

    PubMed

    Wright, Courtney J; Zeeman, Heidi; Biezaitis, Valda

    2016-01-01

    Given that the literature suggests there are various (and often contradictory) interpretations of holistic practice in brain injury rehabilitation and multiple complexities in its implementation (including complex setting, discipline, and client-base factors), this study aimed to examine the experiences of practitioners in their conceptualization and delivery of holistic practice in their respective settings. Nineteen health practitioners purposively sampled from an extensive Brain Injury Network in Queensland, Australia participated in individual interviews. A systematic text analysis process using Leximancer qualitative analysis program was undertaken, followed by manual thematic analysis to develop overarching themes. The findings from this study have identified several items for future inter-professional development that will not only benefit the practitioners working in brain injury rehabilitation settings, but the patients and their families as well.

  4. The Effects of Aging and Dual Tasking on Human Gait Complexity During Treadmill Walking: A Comparative Study Using Quantized Dynamical Entropy and Sample Entropy.

    PubMed

    Ahmadi, Samira; Wu, Christine; Sepehri, Nariman; Kantikar, Anuprita; Nankar, Mayur; Szturm, Tony

    2018-01-01

    Quantized dynamical entropy (QDE) has recently been proposed as a new measure to quantify the complexity of dynamical systems with the purpose of offering a better computational efficiency. This paper further investigates the viability of this method using five different human gait signals. These signals are recorded while normal walking and while performing secondary tasks among two age groups (young and older age groups). The results are compared with the outcomes of previously established sample entropy (SampEn) measure for the same signals. We also study how analyzing segmented and spatially and temporally normalized signal differs from analyzing whole data. Our findings show that human gait signals become more complex as people age and while they are cognitively loaded. Center of pressure (COP) displacement in mediolateral direction is the best signal for showing the gait changes. Moreover, the results suggest that by segmenting data, more information about intrastride dynamical features are obtained. Most importantly, QDE is shown to be a reliable measure for human gait complexity analysis.

  5. Burkholderia cepacia, cystic fibrosis and outcomes following lung transplantation: experiences from a single center in Brazil.

    PubMed

    de Souza Carraro, Danila; Carraro, Rafael Medeiros; Campos, Silvia Vidal; Iuamoto, Leandro Ryuchi; Braga, Karina Andrighetti de Oliveira; Oliveira, Lea Campos de; Sabino, Ester Cerdeira; Rossi, Flavia; Pêgo-Fernandes, Paulo Manuel

    2018-03-12

    To evaluate the impact of Burkholderia cepacia complex colonization in cystic fibrosis patients undergoing lung transplantation. We prospectively analyzed clinical data and respiratory tract samples (sputum and bronchoalveolar lavage) collected from suppurative lung disease patients between January 2008 and November 2013. We also subtyped different Burkholderia cepacia complex genotypes via DNA sequencing using primers against the recA gene in samples collected between January 2012 and November 2013. From 2008 to 2013, 34 lung transplants were performed on cystic fibrosis patients at our center. Burkholderia cepacia complex was detected in 13 of the 34 (38.2%) patients. Seven of the 13 (53%) strains were subjected to genotype analysis, from which three strains of B. metallica and four strains of B. cenocepacia were identified. The mortality rate was 1/13 (7.6%), and this death was not related to B. cepacia infection. The results of our study suggest that colonization by B. cepacia complex and even B. cenocepacia in patients with cystic fibrosis should not be considered an absolute contraindication to lung transplantation in Brazilian centers.

  6. Laser electrospray mass spectrometry of adsorbed molecules at atmospheric pressure

    NASA Astrophysics Data System (ADS)

    Brady, John J.; Judge, Elizabeth J.; Simon, Kuriakose; Levis, Robert J.

    2010-02-01

    Atmospheric pressure mass analysis of solid phase biomolecules is performed using laser electrospray mass spectrometry (LEMS). A non-resonant femtosecond duration laser pulse vaporizes native samples at atmospheric pressure for subsequent electrospray ionization and transfer into a mass spectrometer. LEMS was used to detect a complex molecule (irinotecan HCl), a complex mixture (cold medicine formulation with active ingredients: acetaminophen, dextromethorphan HBr and doxylamine succinate), and a biological building block (deoxyguanosine) deposited on steel surfaces without a matrix molecule.

  7. NMR studies of protein-nucleic acid interactions.

    PubMed

    Varani, Gabriele; Chen, Yu; Leeper, Thomas C

    2004-01-01

    Protein-DNA and protein-RNA complexes play key functional roles in every living organism. Therefore, the elucidation of their structure and dynamics is an important goal of structural and molecular biology. Nuclear magnetic resonance (NMR) studies of protein and nucleic acid complexes have common features with studies of protein-protein complexes: the interaction surfaces between the molecules must be carefully delineated, the relative orientation of the two species needs to be accurately and precisely determined, and close intermolecular contacts defined by nuclear Overhauser effects (NOEs) must be obtained. However, differences in NMR properties (e.g., chemical shifts) and in the biosynthetic pathways for sample production generate important differences. Chemical shift differences between the protein and nucleic acid resonances can aid the NMR structure determination process; however, the relatively limited dispersion of the RNA ribose resonances makes the process of assigning intermolecular NOEs more difficult. The analysis of the resulting structures requires computational tools unique to nucleic acid interactions. This chapter summarizes the most important elements of the structure determination by NMR of protein-nucleic acid complexes and their analysis. The main emphasis is on recent developments (e.g., residual dipolar couplings and new Web-based analysis tools) that have facilitated NMR studies of these complexes and expanded the type of biological problems to which NMR techniques of structural elucidation can now be applied.

  8. Association between suicidal symptoms and repeat suicidal behaviour within a sample of hospital-treated suicide attempters

    PubMed Central

    van Borkulo, Claudia D.; O’Connor, Rory C.

    2017-01-01

    Background Suicidal behaviour is the end result of the complex relation among many factors that are biological, psychological and environmental in nature. Network analysis is a novel method that may help us better understand the complex association between different factors. Aims To examine the relationship between suicidal symptoms as assessed by the Beck Scale for Suicide Ideation and future suicidal behaviour in patients admitted to hospital following a suicide attempt, using network analysis. Method Secondary analysis was conducted on previously collected data from a sample of 366 patients who were admitted to a Scottish hospital following a suicide attempt. Network models were estimated to visualise and test the association between baseline symptom network structure and suicidal behaviour at 15-month follow-up. Results Network analysis showed that the desire for an active attempt was the most central and most strongly connected suicide symptom. Of the 19 suicide symptoms that were assessed at baseline, 10 symptoms were directly related to repeat suicidal behaviour. When comparing the baseline network structure of repeaters (n=94) with that of non-repeaters (n=272), no significant differences were found. Conclusions Network analysis can help us better understand suicidal behaviour by visualising the complex relation between relevant symptoms and by indicating which symptoms are most central within the network. These insights have theoretical implications as well as informing the assessment and treatment of suicidal behaviour. Declaration of interest None. Copyright and usage © The Royal College of Psychiatrists 2017. This is an open access article distributed under the terms of the Creative Commons Non-Commercial, No Derivatives (CC BY-NC-ND) license. PMID:28507771
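
    The record does not state which network estimation method was used; purely as an illustration of the general workflow (hypothetical binary symptom data, an arbitrary correlation threshold, and simple strength centrality, none of which reflect the study's model), a symptom network could be sketched as follows.

        import numpy as np
        import networkx as nx

        # Hypothetical binary symptom data: rows are patients, columns are 19 BSS items.
        rng = np.random.default_rng(0)
        symptoms = (rng.random((366, 19)) > 0.6).astype(int)

        corr = np.corrcoef(symptoms, rowvar=False)          # pairwise symptom associations
        G = nx.Graph()
        G.add_nodes_from(range(19))
        for i in range(19):
            for j in range(i + 1, 19):
                if abs(corr[i, j]) > 0.05:                  # arbitrary threshold for the sketch
                    G.add_edge(i, j, weight=abs(corr[i, j]))

        # Strength centrality: sum of absolute edge weights incident to each symptom node.
        strength = {n: sum(d["weight"] for _, _, d in G.edges(n, data=True)) for n in G}
        print(max(strength, key=strength.get), strength)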

  9. Label-free proteomic analysis of intestinal mucosa proteins in common carp (Cyprinus carpio) infected with Aeromonas hydrophila.

    PubMed

    Di, Guilan; Li, Hui; Zhang, Chao; Zhao, Yanjing; Zhou, Chuanjiang; Naeem, Sajid; Li, Li; Kong, Xianghui

    2017-07-01

    Outbreaks of infectious diseases in common carp Cyprinus carpio, a major cultured fish in northern regions of China, constantly result in significant economic losses. Until now, proteomic information on immune defence has remained limited. In the present study, the intestinal mucosa immune response of Cyprinus carpio was investigated at 0, 12, 36 and 84 h after challenge with Aeromonas hydrophila at a concentration of 1.4 × 10(8) CFU/mL. Proteomic profiles of the different samples were compared using a label-free quantitative proteomic approach. Based on a MASCOT database search, 1149 proteins were identified in the samples after protein normalisation. Treated groups 1 (T1) and 2 (T2) first clustered together and then clustered with the control (C group). According to hierarchical cluster analysis, the distance between C and treated group 3 (T3) was the greatest; therefore, the comparative analysis between C and T3 was selected for the following analysis. A total of 115 proteins with differential abundance showed conspicuous expression variances: 52 up-regulated proteins and 63 down-regulated proteins were detected in T3. Gene ontology analysis showed that the identified up-regulated differentially expressed proteins in T3 were mainly localised in the hemoglobin complex, and the down-regulated proteins in T3 were mainly localised in the major histocompatibility complex II protein complex. Forty-six proteins of differential abundance (40% of 115) were involved in the immune response, with 17 up-regulated and 29 down-regulated proteins detected in T3. This study is the first to report the proteome response of carp intestinal mucosa to A. hydrophila infection; the information obtained contributes to understanding the defence mechanisms of the carp intestinal mucosa. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Field evaluation of personal sampling methods for multiple bioaerosols.

    PubMed

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  11. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    PubMed

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of the measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs with different shapes and sizes using four kinds of sampling methods. Gray correlation analysis was adopted to make the comprehensive evaluation by comparison with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinity, the relative accuracy was 99.50-99.89%, the reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method was easy to operate, and the selected samples were distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility and can be put into use in drug analysis.

  12. Recent advances of mesoporous materials in sample preparation.

    PubMed

    Zhao, Liang; Qin, Hongqiang; Wu, Ren'an; Zou, Hanfa

    2012-03-09

    Sample preparation has been playing an important role in the analysis of complex samples. Mesoporous materials as the promising adsorbents have gained increasing research interest in sample preparation due to their desirable characteristics of high surface area, large pore volume, tunable mesoporous channels with well defined pore-size distribution, controllable wall composition, as well as modifiable surface properties. The aim of this paper is to review the recent advances of mesoporous materials in sample preparation with emphases on extraction of metal ions, adsorption of organic compounds, size selective enrichment of peptides/proteins, specific capture of post-translational peptides/proteins and enzymatic reactor for protein digestion. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Environmentally relevant chemical mixtures of concern in waters of United States tributaries to the Great Lakes

    USGS Publications Warehouse

    Elliott, Sarah M.; Brigham, Mark E.; Kiesling, Richard L.; Schoenfuss, Heiko L.; Jorgenson, Zachary G.

    2018-01-01

    The North American Great Lakes are a vital natural resource that provide fish and wildlife habitat, as well as drinking water and waste assimilation services for millions of people. Tributaries to the Great Lakes receive chemical inputs from various point and nonpoint sources, and thus are expected to have complex mixtures of chemicals. However, our understanding of the co‐occurrence of specific chemicals in complex mixtures is limited. To better understand the occurrence of specific chemical mixtures in the US Great Lakes Basin, surface water from 24 US tributaries to the Laurentian Great Lakes was collected and analyzed for diverse suites of organic chemicals, primarily focused on chemicals of concern (e.g., pharmaceuticals, personal care products, fragrances). A total of 181 samples and 21 chemical classes were assessed for mixture compositions. Basin wide, 1664 mixtures occurred in at least 25% of sites. The most complex mixtures identified comprised 9 chemical classes and occurred in 58% of sampled tributaries. Pharmaceuticals typically occurred in complex mixtures, reflecting pharmaceutical‐use patterns and wastewater facility outfall influences. Fewer mixtures were identified at lake or lake‐influenced sites than at riverine sites. As mixture complexity increased, the probability of a specific mixture occurring more often than by chance greatly increased, highlighting the importance of understanding source contributions to the environment. This empirically based analysis of mixture composition and occurrence may be used to focus future sampling efforts or mixture toxicity assessments. 

  14. Contamination of the environment in road surroundings - impact of road salting on Norway spruce (Picea abies) and Scots pine (Pinus sylvestris)

    NASA Astrophysics Data System (ADS)

    Hegrová, Jitka; Steiner, Oliver; Goessler, Walter; Tanda, Stefan; Anděl, Petr

    2017-09-01

    A comprehensive overview of the influence of transport on the environment is presented in this study. The complex analysis of soil and needle samples provides an extensive set of data describing elemental contamination of the environment near roads. Traffic pollution (including winter road treatment) has a significant negative influence on the environment. Besides sodium and chlorine from winter maintenance, many other elements are emitted into the environment. Three possible sources of contamination are assumed in the evaluation of environmental contamination: car emissions, winter maintenance, and abrasion from brakes and clutches. The chemical analysis focused on the description of the samples from an inorganic point of view. The influence of the contamination potential on the sodium and chlorine content in samples of 1st-year and 2nd-year needles of Norway spruce (Picea abies) and Scots pine (Pinus sylvestris) is discussed. Additional soil samples were taken from each sampling site and analyzed to gain insight into the sodium and chlorine distribution. Statistical evaluation was used to interpret the complex interaction patterns between element concentrations in needles of different ages, based on the character of the localities, including distance from the road and element concentrations in the soils. These species were chosen because of their heightened sensitivity to salinization. The study was conducted in different parts of the Czech Republic. The resulting database is a source of valuable information about the influence of transport on the environment.

  15. Pair-barcode high-throughput sequencing for large-scale multiplexed sample analysis

    PubMed Central

    2012-01-01

    Background Multiplexing has become the major limitation of next-generation sequencing (NGS) when applied to low-complexity samples. Physical space segregation allows limited multiplexing, while the existing barcode approach only permits simultaneous analysis of up to several dozen samples. Results Here we introduce pair-barcode sequencing (PBS), an economic and flexible barcoding technique that permits parallel analysis of large-scale multiplexed samples. In two pilot runs using a SOLiD sequencer (Applied Biosystems Inc.), 32 independent pair-barcoded miRNA libraries were simultaneously sequenced using the combination of 4 unique forward barcodes and 8 unique reverse barcodes. Over 174,000,000 reads were generated, and about 64% of them were assigned to both of the barcodes. After mapping all reads to pre-miRNAs in miRBase, different miRNA expression patterns were captured from the two clinical groups. The strong correlation using different barcode pairs and the high consistency of miRNA expression in two independent runs demonstrate that the PBS approach is valid. Conclusions By employing the PBS approach in NGS, large-scale multiplexed pooled samples can be practically analyzed in parallel, so that high-throughput sequencing can economically meet the requirements of samples with low sequencing-throughput demands. PMID:22276739

  16. Pair-barcode high-throughput sequencing for large-scale multiplexed sample analysis.

    PubMed

    Tu, Jing; Ge, Qinyu; Wang, Shengqin; Wang, Lei; Sun, Beili; Yang, Qi; Bai, Yunfei; Lu, Zuhong

    2012-01-25

    Multiplexing has become the major limitation of next-generation sequencing (NGS) when applied to low-complexity samples. Physical space segregation allows limited multiplexing, while the existing barcode approach only permits simultaneous analysis of up to several dozen samples. Here we introduce pair-barcode sequencing (PBS), an economic and flexible barcoding technique that permits parallel analysis of large-scale multiplexed samples. In two pilot runs using a SOLiD sequencer (Applied Biosystems Inc.), 32 independent pair-barcoded miRNA libraries were simultaneously sequenced using the combination of 4 unique forward barcodes and 8 unique reverse barcodes. Over 174,000,000 reads were generated, and about 64% of them were assigned to both of the barcodes. After mapping all reads to pre-miRNAs in miRBase, different miRNA expression patterns were captured from the two clinical groups. The strong correlation using different barcode pairs and the high consistency of miRNA expression in two independent runs demonstrate that the PBS approach is valid. By employing the PBS approach in NGS, large-scale multiplexed pooled samples can be practically analyzed in parallel, so that high-throughput sequencing can economically meet the requirements of samples with low sequencing-throughput demands.
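
    The barcode sequences themselves are not given in this record; the sketch below only illustrates the combinatorial idea behind pair-barcoding, in which 4 forward and 8 reverse barcodes jointly index 32 libraries and a read is assigned only when both ends match (the 6-nt sequences and the exact-match rule are assumptions for illustration).

        from itertools import product

        # Hypothetical 6-nt barcodes (the real sequences are not reported in the abstract).
        forward = ["AACGTG", "CCATGA", "GGTACC", "TTGCAA"]                    # 4 forward
        reverse = ["ACACAC", "CGCGCG", "GTGTGT", "TATATA",
                   "AAGGCC", "CCTTGG", "GGAATT", "TTCCAA"]                    # 8 reverse

        # 4 x 8 = 32 unique barcode pairs, one per multiplexed library.
        pair_to_library = {pair: i for i, pair in enumerate(product(forward, reverse))}

        def assign_read(read):
            """Assign a read to a library if both its 5' and 3' barcodes match exactly;
            reads failing either match are left unassigned."""
            fwd, rev = read[:6], read[-6:]
            return pair_to_library.get((fwd, rev))

        print(len(pair_to_library), assign_read("AACGTG" + "ACGT" * 10 + "CGCGCG"))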

  17. Spectral refractive index assessment of turbid samples by combining spatial frequency near-infrared spectroscopy with Kramers-Kronig analysis

    NASA Astrophysics Data System (ADS)

    Meitav, Omri; Shaul, Oren; Abookasis, David

    2018-03-01

    A practical algorithm for estimating the wavelength-dependent refractive index (RI) of a turbid sample in the spatial frequency domain with the aid of Kramers-Kronig (KK) relations is presented. In it, phase-shifted sinusoidal patterns (structured illumination) are serially projected at a high spatial frequency onto the sample surface (mouse scalp) at different near-infrared wavelengths, while a camera mounted normal to the sample surface captures the reflected diffuse light. In the offline analysis pipeline, the recorded images at each wavelength are converted to spatial absorption maps by a logarithmic function, and once the absorption coefficient information is obtained, the imaginary part (k) of the complex RI (CRI), based on Maxwell's equations, can be calculated. Using the data represented by k, the real part of the CRI (n) is then resolved by KK analysis. The wavelength dependence of n(λ) is then fitted separately using four standard dispersion models: Cornu, Cauchy, Conrady, and Sellmeier. In addition, a three-dimensional surface-profile distribution of n is provided based on phase profilometry principles and a phase-unwrapping-based phase-derivative-variance algorithm. Experimental results demonstrate the capability of the proposed approach to determine a biological sample's RI value.
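
    As a rough numerical illustration of the Kramers-Kronig step (not the authors' pipeline; the frequency grid, the constant offset n_inf, and the simple principal-value handling are all assumptions), the real part of the refractive index can be recovered from the imaginary part as follows.

        import numpy as np

        def kk_real_index(omega, k, n_inf=1.0):
            """Estimate the real index n(omega) from the imaginary part k(omega) via a
            discretized Kramers-Kronig transform,
                n(w) = n_inf + (2/pi) * PV-integral of w' k(w') / (w'^2 - w^2) dw',
            approximating the principal value by skipping the singular sample."""
            omega = np.asarray(omega, dtype=float)
            k = np.asarray(k, dtype=float)
            n = np.full_like(omega, n_inf)
            for i, w in enumerate(omega):
                mask = np.arange(len(omega)) != i        # exclude the singularity at w' = w
                integrand = omega[mask] * k[mask] / (omega[mask] ** 2 - w ** 2)
                n[i] += (2.0 / np.pi) * np.trapz(integrand, omega[mask])
            return n

        # Toy absorption band on an arbitrary frequency grid.
        omega = np.linspace(1.0, 3.0, 200)
        k = np.exp(-((omega - 2.0) / 0.2) ** 2)
        print(kk_real_index(omega, k)[:3])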

  18. Relationship of Complexity Factor Ratings With Operational Errors

    DTIC Science & Technology

    2007-05-01

    losing information about their interrelationships. Prior to the analysis, the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy was examined to ... test whether partial correlations among the variables were small. KMO values of .6 and above are required for a good solution. A KMO of .87 was

  19. Application of high performance liquid chromatography for the profiling of complex chemical mixtures with the aid of chemometrics.

    PubMed

    Ni, Yongnian; Zhang, Liangsheng; Churchill, Jane; Kokot, Serge

    2007-06-15

    In this paper, chemometrics methods were applied to resolve the high performance liquid chromatography (HPLC) fingerprints of complex, many-component substances in order to compare samples from a batch from a given manufacturer, or from those of different producers. As an example of such complex substances, we used a common Chinese traditional medicine, Huoxiang Zhengqi Tincture (HZT), for this research. Twenty-one samples, each representing a separate HZT production batch from one of three manufacturers, were analyzed by HPLC with the aid of a diode array detector (DAD). An Agilent Zorbax Eclipse XDB-C18 column with an Agilent Zorbax high pressure reliance cartridge guard-column was used. The mobile phase consisted of water (A) and methanol (B) with a gradient program of 25-65% (v/v, B) during 0-30 min, 65-55% (v/v, B) during 30-35 min and 55-100% (v/v, B) during 35-60 min (flow rate, 1.0 ml min(-1); injection volume, 20 μl; column temperature, ambient). The detection wavelength was adjusted for maximum sensitivity at different time periods. A peak area matrix of 21 objects × 14 HPLC variables was obtained by sampling each chromatogram at 14 common retention times. Similarities were then calculated to discriminate the batch-to-batch samples, and a more informative multi-criteria decision making methodology (MCDM), PROMETHEE and GAIA, was applied to obtain more information from the chromatograms in order to rank and compare the complex HZT profiles. The results showed that, with the MCDM analysis, it was possible to match and correctly discriminate the batch samples from the three different manufacturers. Fourier transform infrared (FT-IR) spectra taken from samples from several batches were compared with the HPLC results using the common similarity method. It was found that the FT-IR spectra did not discriminate the samples from the different batches.

  20. Development of a sensitive and selective liquid chromatography-mass spectrometry method for high throughput analysis of paralytic shellfish toxins using graphitised carbon solid phase extraction.

    PubMed

    Boundy, Michael J; Selwood, Andrew I; Harwood, D Tim; McNabb, Paul S; Turner, Andrew D

    2015-03-27

    Routine regulatory monitoring of paralytic shellfish toxins (PST) commonly employs oxidative derivatisation and complex liquid chromatography fluorescence detection methods (LC-FL). The pre-column oxidation LC-FL method is currently implemented in New Zealand and the United Kingdom. When using this method, positive samples are fractionated and two different oxidations are required to confirm the identity and quantity of each PST analogue present. There is a need for alternative methods that are simpler, provide faster turnaround times and have improved detection limits. Hydrophilic interaction liquid chromatography (HILIC) HPLC-MS/MS analysis of PST has been used for research purposes, but high detection limits and substantial sample matrix issues have prevented it from becoming a viable alternative for routine monitoring purposes. We have developed a HILIC UPLC-MS/MS method for paralytic shellfish toxins with an optimised desalting clean-up procedure on inexpensive carbon solid phase extraction cartridges for the reduction of matrix interferences. This represents a major technical breakthrough and allows sensitive, selective and rapid analysis of paralytic shellfish toxins from a variety of sample types, including many commercially produced bivalve molluscan shellfish species. Additionally, this analytical approach avoids the need for complex calculations to determine sample toxicity because, unlike other methods, each PST analogue is quantified as a single resolved peak. This article presents the method development and optimisation information. A thorough single laboratory validation study has subsequently been performed and these data will be presented elsewhere. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Comparative analysis of taxonomic, functional, and metabolic patterns of microbiomes from 14 full-scale biogas reactors by metagenomic sequencing and radioisotopic analysis.

    PubMed

    Luo, Gang; Fotidis, Ioannis A; Angelidaki, Irini

    2016-01-01

    Biogas production is a very complex process due to the high diversity of, and interactions among, the microorganisms mediating it, and only limited and diffuse knowledge exists about the variation of taxonomic and functional patterns of microbiomes across different biogas reactors and their relationships with metabolic patterns. The present study used metagenomic sequencing and radioisotopic analysis to assess the taxonomic, functional, and metabolic patterns of microbiomes from 14 full-scale biogas reactors operated under various conditions and treating either sludge or manure. The results from the metagenomic analysis showed that the dominant methanogenic pathway revealed by radioisotopic analysis was not always correlated with the taxonomic and functional compositions: the radioisotopic experiments indicated that the aceticlastic methanogenic pathway was dominant, while the metagenomic analysis showed a higher relative abundance of hydrogenotrophic methanogens. Principal coordinates analysis showed that the sludge-based samples were clearly distinct from the manure-based samples for both taxonomic and functional patterns, and canonical correspondence analysis showed that both temperature and free ammonia were crucial environmental variables shaping the taxonomic and functional patterns. The study further showed that the overall patterns of functional genes were strongly correlated with the overall patterns of taxonomic composition across the different biogas reactors. A discrepancy between the metabolic patterns determined by metagenomic analysis and the metabolic pathways determined by radioisotopic analysis was thus found. In addition, a clear correlation between taxonomic and functional patterns was demonstrated for biogas reactors, and the environmental factors shaping both the taxonomic and functional gene patterns were identified.

  2. Estimating genetic effects and quantifying missing heritability explained by identified rare-variant associations.

    PubMed

    Liu, Dajiang J; Leal, Suzanne M

    2012-10-05

    Next-generation sequencing has led to many complex-trait rare-variant (RV) association studies. Although single-variant association analysis can be performed, it is grossly underpowered. Therefore, researchers have developed many RV association tests that aggregate multiple variant sites across a genetic region (e.g., gene), and test for the association between the trait and the aggregated genotype. After these aggregate tests detect an association, it is only possible to estimate the average genetic effect for a group of RVs. As a result of the "winner's curse," such an estimate can be biased. Although for common variants one can obtain unbiased estimates of genetic parameters by analyzing a replication sample, for RVs it is desirable to obtain unbiased genetic estimates for the study where the association is identified. This is because there can be substantial heterogeneity of RV sites and frequencies even among closely related populations. In order to obtain an unbiased estimate for aggregated RV analysis, we developed bootstrap-sample-split algorithms to reduce the bias of the winner's curse. The unbiased estimates are greatly important for understanding the population-specific contribution of RVs to the heritability of complex traits. We also demonstrate both theoretically and via simulations that for aggregate RV analysis the genetic variance for a gene or region will always be underestimated, sometimes substantially, because of the presence of noncausal variants or because of the presence of causal variants with effects of different magnitudes or directions. Therefore, even if RVs play a major role in the complex-trait etiologies, a portion of the heritability will remain missing, and the contribution of RVs to the complex-trait etiologies will be underestimated. Copyright © 2012 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
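
    The exact bootstrap-sample-split algorithm is not reproduced in this record. The sketch below conveys only the core idea under simplifying assumptions: a single aggregated burden score per subject, a correlation-based selection test, and a linear effect estimated on the held-out half, all of which are illustrative choices rather than the authors' method.

        import numpy as np

        def split_unbiased_effect(genotype_burden, phenotype, n_boot=200,
                                  select_frac=0.5, rng=None):
            """Bootstrap sample-split sketch to reduce winner's-curse bias: in each
            replicate, one random half of the sample decides whether the aggregated
            rare-variant burden is 'significant', and the effect size is then
            estimated only from the other half, free of the selection event."""
            rng = np.random.default_rng(rng)
            x = np.asarray(genotype_burden, dtype=float)
            y = np.asarray(phenotype, dtype=float)
            n = len(x)
            estimates = []
            for _ in range(n_boot):
                idx = rng.permutation(n)
                cut = int(n * select_frac)
                sel, est = idx[:cut], idx[cut:]
                # Selection half: Fisher-z test of the correlation as a stand-in association test.
                r = np.corrcoef(x[sel], y[sel])[0, 1]
                z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(len(sel) - 3)
                if abs(z) > 1.96:  # roughly alpha = 0.05
                    beta = np.cov(x[est], y[est])[0, 1] / np.var(x[est], ddof=1)
                    estimates.append(beta)
            return np.mean(estimates) if estimates else np.nan

        # Toy demonstration with a simulated burden score and quantitative trait.
        rng = np.random.default_rng(1)
        burden = rng.poisson(1.0, 2000).astype(float)
        trait = 0.2 * burden + rng.standard_normal(2000)
        print(split_unbiased_effect(burden, trait))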

  3. Improved Classification of Orthosiphon stamineus by Data Fusion of Electronic Nose and Tongue Sensors

    PubMed Central

    Zakaria, Ammar; Shakaff, Ali Yeon Md.; Adom, Abdul Hamid; Ahmad, Mohd Noor; Masnan, Maz Jamilah; Aziz, Abdul Hallis Abdul; Fikri, Nazifah Ahmad; Abdullah, Abu Hassan; Kamarudin, Latifah Munirah

    2010-01-01

    An improved classification of Orthosiphon stamineus using a data fusion technique is presented. Five different commercial sources along with freshly prepared samples were discriminated using an electronic nose (e-nose) and an electronic tongue (e-tongue). Samples from the different commercial brands were evaluated by the e-tongue and then followed by the e-nose. Applying Principal Component Analysis (PCA) separately on the respective e-tongue and e-nose data, only five distinct groups were projected. However, by employing a low level data fusion technique, six distinct groupings were achieved. Hence, this technique can enhance the ability of PCA to analyze the complex samples of Orthosiphon stamineus. Linear Discriminant Analysis (LDA) was then used to further validate and classify the samples. It was found that the LDA performance was also improved when the responses from the e-nose and e-tongue were fused together. PMID:22163381

  4. Improved classification of Orthosiphon stamineus by data fusion of electronic nose and tongue sensors.

    PubMed

    Zakaria, Ammar; Shakaff, Ali Yeon Md; Adom, Abdul Hamid; Ahmad, Mohd Noor; Masnan, Maz Jamilah; Aziz, Abdul Hallis Abdul; Fikri, Nazifah Ahmad; Abdullah, Abu Hassan; Kamarudin, Latifah Munirah

    2010-01-01

    An improved classification of Orthosiphon stamineus using a data fusion technique is presented. Five different commercial sources along with freshly prepared samples were discriminated using an electronic nose (e-nose) and an electronic tongue (e-tongue). Samples from the different commercial brands were evaluated by the e-tongue and then followed by the e-nose. Applying Principal Component Analysis (PCA) separately on the respective e-tongue and e-nose data, only five distinct groups were projected. However, by employing a low level data fusion technique, six distinct groupings were achieved. Hence, this technique can enhance the ability of PCA to analyze the complex samples of Orthosiphon stamineus. Linear Discriminant Analysis (LDA) was then used to further validate and classify the samples. It was found that the LDA performance was also improved when the responses from the e-nose and e-tongue were fused together.
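
    As a loose illustration of low-level data fusion (simulated matrices, arbitrary dimensions and class labels, not the study's sensor data), the block below concatenates scaled e-tongue and e-nose feature blocks before applying PCA and LDA.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.preprocessing import StandardScaler

        # Hypothetical feature matrices: rows are samples, columns are sensor channels.
        rng = np.random.default_rng(1)
        X_tongue = rng.normal(size=(60, 7))    # e-tongue responses (assumed shape)
        X_nose = rng.normal(size=(60, 12))     # e-nose responses (assumed shape)
        labels = np.repeat(np.arange(6), 10)   # six sample groups, ten samples each

        # Low-level fusion: scale each sensor block, then concatenate features before modeling.
        X_fused = np.hstack([StandardScaler().fit_transform(X_tongue),
                             StandardScaler().fit_transform(X_nose)])

        scores = PCA(n_components=2).fit_transform(X_fused)       # exploratory projection
        lda = LinearDiscriminantAnalysis().fit(X_fused, labels)   # supervised classification
        print(scores.shape, lda.score(X_fused, labels))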

  5. Classification of narcotics in solid mixtures using principal component analysis and Raman spectroscopy.

    PubMed

    Ryder, Alan G

    2002-03-01

    Eighty-five solid samples consisting of illegal narcotics diluted with several different materials were analyzed by near-infrared (785 nm excitation) Raman spectroscopy. Principal Component Analysis (PCA) was employed to classify the samples according to narcotic type. The best sample discrimination was obtained by using the first derivative of the Raman spectra. Furthermore, restricting the spectral variables for PCA to 2 or 3% of the original spectral data according to the most intense peaks in the Raman spectrum of the pure narcotic resulted in a rapid discrimination method for classifying samples according to narcotic type. This method allows for the easy discrimination between cocaine, heroin, and MDMA mixtures even when the Raman spectra are complex or very similar. This approach of restricting the spectral variables also decreases the computational time by a factor of 30 (compared to the complete spectrum), making the methodology attractive for rapid automatic classification and identification of suspect materials.
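
    A minimal sketch of the preprocessing strategy described above, assuming simulated spectra, a finite-difference first derivative, and an arbitrary 3% variable cut around the strongest reference peaks (the real wavenumber grid and peak positions are not given in this record):

        import numpy as np
        from sklearn.decomposition import PCA

        # Hypothetical data: rows are Raman spectra of mixtures, columns are wavenumber bins.
        rng = np.random.default_rng(2)
        spectra = rng.normal(size=(85, 1200))
        pure_narcotic = rng.normal(size=1200)   # reference spectrum of the pure narcotic

        # 1. First-derivative preprocessing (simple finite difference along the spectral axis).
        d_spectra = np.gradient(spectra, axis=1)

        # 2. Restrict variables to ~3% of channels at the most intense reference peaks.
        n_keep = int(0.03 * spectra.shape[1])
        keep = np.argsort(np.abs(pure_narcotic))[-n_keep:]

        # 3. PCA on the reduced, derivative spectra for class discrimination.
        scores = PCA(n_components=3).fit_transform(d_spectra[:, keep])
        print(scores.shape)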

  6. Multi-edge X-ray absorption spectroscopy study of road dust samples from a traffic area of Venice using stoichiometric and environmental references

    NASA Astrophysics Data System (ADS)

    Valotto, Gabrio; Cattaruzza, Elti; Bardelli, Fabrizio

    2017-02-01

    The appropriate selection of representative pure compounds to be used as reference is a crucial step for successful analysis of X-ray absorption near edge spectroscopy (XANES) data, and it is often not a trivial task. This is particularly true when complex environmental matrices are investigated, being their elemental speciation a priori unknown. In this paper, an investigation on the speciation of Cu, Zn, and Sb based on the use of conventional (stoichiometric compounds) and non-conventional (environmental samples or relevant certified materials) references is explored. This method can be useful in when the effectiveness of XANES analysis is limited because of the difficulty in obtaining a set of references sufficiently representative of the investigated samples. Road dust samples collected along the bridge connecting Venice to the mainland were used to show the potentialities and the limits of this approach.

  7. Regression and Data Mining Methods for Analyses of Multiple Rare Variants in the Genetic Analysis Workshop 17 Mini-Exome Data

    PubMed Central

    Bailey-Wilson, Joan E.; Brennan, Jennifer S.; Bull, Shelley B; Culverhouse, Robert; Kim, Yoonhee; Jiang, Yuan; Jung, Jeesun; Li, Qing; Lamina, Claudia; Liu, Ying; Mägi, Reedik; Niu, Yue S.; Simpson, Claire L.; Wang, Libo; Yilmaz, Yildiz E.; Zhang, Heping; Zhang, Zhaogong

    2012-01-01

    Group 14 of Genetic Analysis Workshop 17 examined several issues related to analysis of complex traits using DNA sequence data. These issues included novel methods for analyzing rare genetic variants in an aggregated manner (often termed collapsing rare variants), evaluation of various study designs to increase power to detect effects of rare variants, and the use of machine learning approaches to model highly complex heterogeneous traits. Various published and novel methods for analyzing traits with extreme locus and allelic heterogeneity were applied to the simulated quantitative and disease phenotypes. Overall, we conclude that power is (as expected) dependent on locus-specific heritability or contribution to disease risk, large samples will be required to detect rare causal variants with small effect sizes, extreme phenotype sampling designs may increase power for smaller laboratory costs, methods that allow joint analysis of multiple variants per gene or pathway are more powerful in general than analyses of individual rare variants, population-specific analyses can be optimal when different subpopulations harbor private causal mutations, and machine learning methods may be useful for selecting subsets of predictors for follow-up in the presence of extreme locus heterogeneity and large numbers of potential predictors. PMID:22128066

  8. Rapid Discrimination for Traditional Complex Herbal Medicines from Different Parts, Collection Time, and Origins Using High-Performance Liquid Chromatography and Near-Infrared Spectral Fingerprints with Aid of Pattern Recognition Methods

    PubMed Central

    Fu, Haiyan; Fan, Yao; Zhang, Xu; Lan, Hanyue; Yang, Tianming; Shao, Mei; Li, Sihan

    2015-01-01

    As an effective method, the fingerprint technique, which emphasizes the whole composition of samples, has already been used in various fields, especially in identifying and assessing the quality of herbal medicines. High-performance liquid chromatography (HPLC) and near-infrared (NIR) spectroscopy, with their characteristic reliability, versatility, precision, and simple measurement, play an important role among the fingerprint techniques. In this paper, a supervised pattern recognition method based on the PLSDA algorithm applied to HPLC and NIR data was established to identify Hibiscus mutabilis L. and Berberidis radix, two common kinds of herbal medicines. Comparing principal component analysis (PCA), linear discriminant analysis (LDA), and particularly partial least squares discriminant analysis (PLSDA) with different preprocessing of the NIR spectral variables, the PLSDA model performed best in the analysis of the samples as well as the chromatograms. Most importantly, this pattern recognition method based on HPLC and NIR can be used to identify different collection parts, collection times, and different origins or various species belonging to the same genera of herbal medicines, which proved to be a promising approach for the identification of the complex information of herbal medicines. PMID:26345990

  9. Fluorescence excitation-emission matrix (EEM) spectroscopy for rapid identification and quality evaluation of cell culture media components.

    PubMed

    Li, Boyan; Ryan, Paul W; Shanahan, Michael; Leister, Kirk J; Ryder, Alan G

    2011-11-01

    The application of fluorescence excitation-emission matrix (EEM) spectroscopy to the quantitative analysis of complex, aqueous solutions of cell culture media components was investigated. These components, yeastolate, phytone, recombinant human insulin, eRDF basal medium, and four different chemically defined (CD) media, are used for the formulation of basal and feed media employed in the production of recombinant proteins using a Chinese Hamster Ovary (CHO) cell based process. The comprehensive analysis (either identification or quality assessment) of these materials using chromatographic methods is time consuming and expensive and is not suitable for high-throughput quality control. The use of EEM in conjunction with multiway chemometric methods provided a rapid, nondestructive analytical method suitable for the screening of large numbers of samples. Here we used multiway robust principal component analysis (MROBPCA) in conjunction with n-way partial least squares discriminant analysis (NPLS-DA) to develop a robust routine for both the identification and quality evaluation of these important cell culture materials. These methods are applicable to a wide range of complex mixtures because they do not rely on any predetermined compositional or property information, thus making them potentially very useful for sample handling, tracking, and quality assessment in biopharmaceutical industries.

  10. A rapid analytical method to quantify complex organohalogen contaminant mixtures in large samples of high lipid mammalian tissues.

    PubMed

    Desforges, Jean-Pierre; Eulaers, Igor; Periard, Luke; Sonne, Christian; Dietz, Rune; Letcher, Robert J

    2017-06-01

    In vitro investigations of the health impact of individual chemical compounds have traditionally been used in risk assessments. However, humans and wildlife are exposed to a plethora of potentially harmful chemicals, including organohalogen contaminants (OHCs). An alternative exposure approach to individual or simple mixtures of synthetic OHCs is to isolate the complex mixture present in free-ranging wildlife, often non-destructively sampled from lipid-rich adipose. High-concentration stock volumes required for in vitro investigations do, however, pose a great analytical challenge to extract sufficient amounts of complex OHC cocktails. Here we describe a novel method to easily, rapidly and efficiently extract an environmentally accumulated and therefore relevant contaminant cocktail from large (10-50 g) marine mammal blubber samples. We demonstrate that lipid freeze-filtration with acetonitrile removes up to 97% of blubber lipids, with minimal effect on the efficiency of OHC recovery. Sample extracts after freeze-filtration were further processed to remove residual trace lipids via high-pressure gel permeation chromatography and solid phase extraction. Average recoveries of OHCs from triplicate analysis of killer whale (Orcinus orca), polar bear (Ursus maritimus) and pilot whale (Globicephala spp.) blubber standard reference material (NIST SRM-1945) ranged from 68 to 80%, 54-92% and 58-145%, respectively, for 13C-enriched internal standards of six polychlorinated biphenyl congeners, 16 organochlorine pesticides and four brominated flame retardants. This approach to rapidly generate OHC mixtures shows great potential for experimental exposures using complex contaminant mixtures, research- or monitoring-driven contaminant quantification in biological samples, as well as the untargeted identification of emerging contaminants. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Persistence of radiation-induced chromosome aberrations in a long-term cell culture.

    PubMed

    Duran, Assumpta; Barquinero, Joan Francesc; Caballín, María Rosa; Ribas, Montserrat; Barrios, Leonardo

    2009-04-01

    The aim of the present study was to evaluate the persistence of chromosome aberrations induced by X rays. FISH painting and mFISH techniques were applied to long-term cultures of irradiated cells. With painting, at 2 Gy the frequency of apparently simple translocations remained almost invariable during all the culture, whereas at 4 Gy a rapid decline was observed between the first and the second samples, followed by a slight decrease until the end of the culture. Apparently simple dicentrics and complex aberrations disappeared after the first sample at 2 and 4 Gy. By mFISH, at 2 Gy the frequency of complete plus one-way translocations remained invariable between the first and last sample, but at 4 Gy a 60% decline was observed. True incomplete simple translocations disappeared at 2 and 4 Gy, indicating that incompleteness could be a factor to consider when the persistence of translocations is analyzed. The analysis by mFISH showed that the frequency of complex aberrations and their complexity increased with dose and tended to disappear in the last sample. Our results indicate that the influence of dose on the decrease in the frequency of simple translocations with time postirradiation cannot be fully explained by the disappearance of true incomplete translocations and complex aberrations. The chromosome involvement was random for radiation-induced exchange aberrations and non-random for total aberrations. Chromosome 7 showed the highest deviations from expected, being less and more involved than expected in the first and last samples, respectively. Some preferential chromosome-chromosome associations were observed, including a coincidence with a cluster from radiogenic chromosome aberrations described in other studies.

  12. Feasibility of diffuse reflectance infrared Fourier spectroscopy (DRIFTS) to quantify iron-cyanide (Fe-CN) complexes in soil

    NASA Astrophysics Data System (ADS)

    Sut-Lohmann, Magdalena; Raab, Thomas

    2017-04-01

    Contaminated sites pose a significant risk to human health by poisoning drinking water, soil, air and, as a consequence, food. The continuous release of persistent iron-cyanide (Fe-CN) complexes from various industrial sources poses a high hazard to the environment and indicates the necessity of analyzing a considerable number of samples. At present, quantitative determination of the Fe-CN concentration in soil usually requires a time-consuming two-step process: digestion of the sample (e.g., in a micro-distillation system) and its analytical detection, performed, e.g., by automated spectrophotometric flow injection analysis (FIA). In order to determine the feasibility of diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS) for quantifying Fe-CN complexes in the soil matrix, 42 soil samples were collected (8 to 12.520 mg kg-1 CN), showing a single symmetrical CN band in the range 2092-2084 cm-1. A partial least squares (PLS) calibration-validation model revealed an IR response to CNtot exceeding 1268 mg kg-1 (limit of detection, LOD). Subsequently, leave-one-out cross-validation (LOO-CV) was performed on soil samples containing low CNtot (<900 mg kg-1), which improved the sensitivity of the model by reducing the LOD to 154 mg kg-1. Finally, LOO-CV conducted on the samples with CNtot >900 mg kg-1 resulted in an LOD equal to 3494 mg kg-1. Our results indicate that spectroscopic data in combination with PLS statistics can efficiently be used to predict Fe-CN concentrations in soil. We conclude that the protocol applied in this study can strongly reduce the time and costs essential for the spatial and vertical screening of sites affected by complexed Fe-CN.
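
    A minimal sketch of a PLS calibration with leave-one-out cross-validation in the spirit of the workflow above (simulated spectra, an arbitrary number of latent variables, and RMSECV as a toy figure of merit; none of these values come from the study):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        # Hypothetical data: rows are DRIFTS spectra, y is the total cyanide reference value.
        rng = np.random.default_rng(3)
        X = rng.normal(size=(42, 500))          # absorbance channels around the CN band
        y = rng.uniform(10, 12500, size=42)     # CN_tot (mg/kg) from the reference FIA method

        pls = PLSRegression(n_components=5)
        # Leave-one-out predictions: each sample is predicted by a model trained on the rest.
        y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()
        rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
        print(f"RMSECV = {rmsecv:.1f} mg/kg")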

  13. A broadband variable-temperature test system for complex permittivity measurements of solid and powder materials

    NASA Astrophysics Data System (ADS)

    Zhang, Yunpeng; Li, En; Zhang, Jing; Yu, Chengyong; Zheng, Hu; Guo, Gaofeng

    2018-02-01

    A microwave test system to measure the complex permittivity of solid and powder materials as a function of temperature has been developed. The system is based on a TM0n0 multi-mode cylindrical cavity with a slotting structure, which provides purer test modes compared to a traditional cavity. To ensure the safety, effectiveness, and longevity, heating and testing are carried out separately and the sample can move between two functional areas through an Alundum tube. Induction heating and a pneumatic platform are employed to, respectively, shorten the heating and cooling time of the sample. The single trigger function of the vector network analyzer is added to test software to suppress the drift of the resonance peak during testing. Complex permittivity is calculated by the rigorous field theoretical solution considering multilayer media loading. The variation of the cavity equivalent radius caused by the sample insertion holes is discussed in detail, and its influence to the test result is analyzed. The calibration method for the complex permittivity of the Alundum tube and quartz vial (for loading powder sample), which vary with the temperature, is given. The feasibility of the system has been verified by measuring different samples in a wide range of relative permittivity and loss tangent, and variable-temperature test results of fused quartz and SiO2 powder up to 1500 °C are compared with published data. The results indicate that the presented system is reliable and accurate. The stability of the system is verified by repeated and long-term tests, and error analysis is presented to estimate the error incurred due to the uncertainties in different error sources.

  14. Fast prediction of the fatigue behavior of short-fiber-reinforced thermoplastics based on heat build-up measurements: application to heterogeneous cases

    NASA Astrophysics Data System (ADS)

    Serrano, Leonell; Marco, Yann; Le Saux, Vincent; Robert, Gilles; Charrier, Pierre

    2017-09-01

    Short-fiber-reinforced thermoplastic components for structural applications are usually very complex parts, as stiffeners, ribs and thickness variations are used to compensate for the quite low intrinsic stiffness of the material. These complex geometries induce complex local mechanical fields but also complex microstructures due to the injection process. Accounting for these two aspects is crucial for the fatigue design of these parts, especially in the automotive industry. The aim of this paper is to challenge an energetic approach, defined to evaluate fatigue lifetime quickly, on three different heterogeneous cases: a classic dog-bone sample with a skin-core microstructure and two structural samples representative of the thickness variations observed in industrial components. First, a method to evaluate dissipated energy fields from thermal measurements is described and applied to the three samples in order to relate the cyclic loading amplitude to the fields of cyclic dissipated energy. Then, a local analysis is detailed in order to link the energy dissipated at the failure location to the fatigue lifetime and to predict the fatigue curve from the thermomechanical response of a single sample. The predictions obtained for the three cases compare successfully with the Wöhler curves obtained from classic fatigue tests. Finally, a discussion is proposed to compare the results for the three samples in terms of dissipation fields and fatigue lifetime. This comparison illustrates that, while the approach leads to a very relevant diagnosis in each case, the dissipated energy field does not give straightforward access to the lifetime cartography, as the relation between fatigue failure and dissipated energy seems to depend on the local mechanical and microstructural state.

  15. Tandem array of nanoelectronic readers embedded coplanar to a fluidic nanochannel for correlated single biopolymer analysis

    PubMed Central

    Lesser-Rojas, Leonardo; Sriram, K. K.; Liao, Kuo-Tang; Lai, Shui-Chin; Kuo, Pai-Chia; Chu, Ming-Lee; Chou, Chia-Fu

    2014-01-01

    We have developed a two-step electron-beam lithography process to fabricate a tandem array of three pairs of tip-like gold nanoelectronic detectors with electrode gap size as small as 9 nm, embedded in a coplanar fashion to 60 nm deep, 100 nm wide, and up to 150 μm long nanochannels coupled to a world-micro-nanofluidic interface for easy sample introduction. Experimental tests with a sealed device using DNA-protein complexes demonstrate the coplanarity of the nanoelectrodes to the nanochannel surface. Further, this device could improve transverse current detection by correlated time-of-flight measurements of translocating samples, and serve as an autocalibrated velocimeter and nanoscale tandem Coulter counters for single molecule analysis of heterogeneous samples. PMID:24753731

  16. a Chiral Tagging Strategy for Determining Absolute Configuration and Enantiomeric Excess by Molecular Rotational Spectroscopy

    NASA Astrophysics Data System (ADS)

    Evangelisti, Luca; Caminati, Walther; Patterson, David; Thomas, Javix; Xu, Yunjie; West, Channing; Pate, Brooks

    2017-06-01

    The introduction of three-wave mixing rotational spectroscopy by Patterson, Schnell, and Doyle [1,2] has expanded applications of molecular rotational spectroscopy into the field of chiral analysis. Chiral analysis of a molecule is the quantitative measurement of the relative abundances of all stereoisomers of the molecule, and these include both diastereomers (with distinct molecular rotational spectra) and enantiomers (with equivalent molecular rotational spectra). This work adapts a common strategy in the chiral analysis of enantiomers to molecular rotational spectroscopy. A "chiral tag" is attached to the molecule of interest by forming a weakly bound complex in a pulsed jet expansion. When this tag molecule is enantiopure, it creates diastereomeric complexes with the two enantiomers of the molecule being analyzed, and these can be differentiated by molecular rotational spectroscopy. Identifying the structure of this complex, with knowledge of the absolute configuration of the tag, establishes the absolute configuration of the molecule of interest. Furthermore, the diastereomer complex spectra can be used to determine the enantiomeric excess of the sample. The ability to perform chiral analysis will be illustrated by a study of solketal using propylene oxide as the tag. The possibility of using current methods of quantum chemistry to assign a specific structure to the chiral tag complex will be discussed. Finally, chiral tag rotational spectroscopy offers a "gold standard" method for determining the absolute configuration of the molecule through determination of the substitution structure of the complex. When this measurement is possible, rotational spectroscopy can deliver a quantitative three-dimensional structure of the molecule with correct stereochemistry as the analysis output. [1] David Patterson, Melanie Schnell, John M. Doyle, Nature 497, 475 (2013). [2] David Patterson, John M. Doyle, Phys. Rev. Lett. 111, 023008 (2013).

  17. Performance of laboratories analysing welding fume on filter samples: results from the WASP proficiency testing scheme.

    PubMed

    Stacey, Peter; Butler, Owen

    2008-06-01

    This paper emphasizes the need for occupational hygiene professionals to require evidence of the quality of welding fume data from analytical laboratories. The measurement of metals in welding fume using atomic spectrometric techniques is a complex analysis often requiring specialist digestion procedures. The results from a trial programme testing the proficiency of laboratories in the Workplace Analysis Scheme for Proficiency (WASP) to measure potentially harmful metals in several different types of welding fume showed that most laboratories underestimated the mass of analyte on the filters. The average recovery was 70-80% of the target value and >20% of reported recoveries for some of the more difficult welding fume matrices were <50%. This level of under-reporting has significant implications for any health or hygiene studies of the exposure of welders to toxic metals for the types of fumes included in this study. Good laboratories' performance measuring spiked WASP filter samples containing soluble metal salts did not guarantee good performance when measuring the more complex welding fume trial filter samples. Consistent rather than erratic error predominated, suggesting that the main analytical factor contributing to the differences between the target values and results was the effectiveness of the sample preparation procedures used by participating laboratories. It is concluded that, with practice and regular participation in WASP, performance can improve over time.

  18. Didelphis marsupialis (common opossum): a potential reservoir host for zoonotic leishmaniasis in the metropolitan region of Belo Horizonte (Minas Gerais, Brazil).

    PubMed

    Schallig, Henk D F H; da Silva, Eduardo S; van der Meide, Wendy F; Schoone, Gerard J; Gontijo, Celia M F

    2007-01-01

    Identification of the zoonotic reservoir is important for leishmaniasis control programs. A number of (wild) animal species may serve as reservoir hosts, including the opossum Didelphis marsupialis. A survey carried out in Didelphis specimens (n = 111) from the metropolitan region of Belo Horizonte, an important focus of human leishmaniasis in Brazil, is reported. All animals were serologically tested with the indirect fluorescence antibody test (IFAT) and direct agglutination tests (DAT) based on L. (L.) donovani or L. (V.) braziliensis antigen. A sub-population (n = 20) was analyzed with the polymerase chain reaction (PCR) for the presence of Leishmania-specific DNA. For species identification, PCR-positive samples were subjected to restriction fragment length polymorphism (RFLP) analysis. Depending on the sero-diagnostic test employed, the sero-prevalence varied between 8.1% (9/111 animals positive with the DAT based on L. braziliensis antigen) and 21.6% (24/111 animals positive with IFAT). Five out of 20 samples analyzed with PCR tested positive for the presence of Leishmania-specific DNA. RFLP analysis revealed that two samples contained L. braziliensis complex DNA, one contained L. donovani complex DNA, and two samples could not be typed with the methodology used. These data suggest a potential role for the opossum as a reservoir host for zoonotic leishmaniasis in the region.

  19. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

    The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles, using ICP-MS but also coulometry, are on their way to gaining a foothold. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Dalitz plot analysis of Ds+→K+K-π+

    NASA Astrophysics Data System (ADS)

    Del Amo Sanchez, P.; Lees, J. P.; Poireau, V.; Prencipe, E.; Tisserand, V.; Garra Tico, J.; Grauges, E.; Martinelli, M.; Milanes, D. A.; Palano, A.; Pappagallo, M.; Eigen, G.; Stugu, B.; Sun, L.; Brown, D. N.; Kerth, L. T.; Kolomensky, Yu. G.; Lynch, G.; Osipenkov, I. L.; Koch, H.; Schroeder, T.; Asgeirsson, D. J.; Hearty, C.; Mattison, T. S.; McKenna, J. A.; Khan, A.; Blinov, V. E.; Buzykaev, A. R.; Druzhinin, V. P.; Golubev, V. B.; Kravchenko, E. A.; Onuchin, A. P.; Serednyakov, S. I.; Skovpen, Yu. I.; Solodov, E. P.; Todyshev, K. Yu.; Yushkov, A. N.; Bondioli, M.; Curry, S.; Kirkby, D.; Lankford, A. J.; Mandelkern, M.; Martin, E. C.; Stoker, D. P.; Atmacan, H.; Gary, J. W.; Liu, F.; Long, O.; Vitug, G. M.; Campagnari, C.; Hong, T. M.; Kovalskyi, D.; Richman, J. D.; West, C.; Eisner, A. M.; Heusch, C. A.; Kroseberg, J.; Lockman, W. S.; Martinez, A. J.; Schalk, T.; Schumm, B. A.; Seiden, A.; Winstrom, L. O.; Cheng, C. H.; Doll, D. A.; Echenard, B.; Hitlin, D. G.; Ongmongkolkul, P.; Porter, F. C.; Rakitin, A. Y.; Andreassen, R.; Dubrovin, M. S.; Mancinelli, G.; Meadows, B. T.; Sokoloff, M. D.; Bloom, P. C.; Ford, W. T.; Gaz, A.; Nagel, M.; Nauenberg, U.; Smith, J. G.; Wagner, S. R.; Ayad, R.; Toki, W. H.; Jasper, H.; Karbach, T. M.; Petzold, A.; Spaan, B.; Kobel, M. J.; Schubert, K. R.; Schwierz, R.; Bernard, D.; Verderi, M.; Clark, P. J.; Playfer, S.; Watson, J. E.; Andreotti, M.; Bettoni, D.; Bozzi, C.; Calabrese, R.; Cecchi, A.; Cibinetto, G.; Fioravanti, E.; Franchini, P.; Garzia, I.; Luppi, E.; Munerato, M.; Negrini, M.; Petrella, A.; Piemontese, L.; Baldini-Ferroli, R.; Calcaterra, A.; de Sangro, R.; Finocchiaro, G.; Nicolaci, M.; Pacetti, S.; Patteri, P.; Peruzzi, I. M.; Piccolo, M.; Rama, M.; Zallo, A.; Contri, R.; Guido, E.; Lo Vetere, M.; Monge, M. R.; Passaggio, S.; Patrignani, C.; Robutti, E.; Tosi, S.; Bhuyan, B.; Prasad, V.; Lee, C. L.; Morii, M.; Edwards, A. J.; Adametz, A.; Marks, J.; Uwer, U.; Bernlochner, F. U.; Ebert, M.; Lacker, H. M.; Lueck, T.; Volk, A.; Dauncey, P. D.; Tibbetts, M.; Behera, P. K.; Mallik, U.; Chen, C.; Cochran, J.; Crawley, H. B.; Dong, L.; Meyer, W. T.; Prell, S.; Rosenberg, E. I.; Rubin, A. E.; Gritsan, A. V.; Guo, Z. J.; Arnaud, N.; Davier, M.; Derkach, D.; Firmino da Costa, J.; Grosdidier, G.; Le Diberder, F.; Lutz, A. M.; Malaescu, B.; Perez, A.; Roudeau, P.; Schune, M. H.; Serrano, J.; Sordini, V.; Stocchi, A.; Wang, L.; Wormser, G.; Lange, D. J.; Wright, D. M.; Bingham, I.; Chavez, C. A.; Coleman, J. P.; Fry, J. R.; Gabathuler, E.; Gamet, R.; Hutchcroft, D. E.; Payne, D. J.; Touramanis, C.; Bevan, A. J.; di Lodovico, F.; Sacco, R.; Sigamani, M.; Cowan, G.; Paramesvaran, S.; Wren, A. C.; Brown, D. N.; Davis, C. L.; Denig, A. G.; Fritsch, M.; Gradl, W.; Hafner, A.; Alwyn, K. E.; Bailey, D.; Barlow, R. J.; Jackson, G.; Lafferty, G. D.; Anderson, J.; Cenci, R.; Jawahery, A.; Roberts, D. A.; Simi, G.; Tuggle, J. M.; Dallapiccola, C.; Salvati, E.; Cowan, R.; Dujmic, D.; Sciolla, G.; Zhao, M.; Lindemann, D.; Patel, P. M.; Robertson, S. H.; Schram, M.; Biassoni, P.; Lazzaro, A.; Lombardo, V.; Palombo, F.; Stracka, S.; Cremaldi, L.; Godang, R.; Kroeger, R.; Sonnek, P.; Summers, D. J.; Nguyen, X.; Simard, M.; Taras, P.; de Nardo, G.; Monorchio, D.; Onorato, G.; Sciacca, C.; Raven, G.; Snoek, H. L.; Jessop, C. P.; Knoepfel, K. J.; Losecco, J. M.; Wang, W. F.; Corwin, L. A.; Honscheid, K.; Kass, R.; Morris, J. P.; Blount, N. L.; Brau, J.; Frey, R.; Igonkina, O.; Kolb, J. A.; Rahmat, R.; Sinev, N. 
B.; Strom, D.; Strube, J.; Torrence, E.; Castelli, G.; Feltresi, E.; Gagliardi, N.; Margoni, M.; Morandin, M.; Posocco, M.; Rotondo, M.; Simonetto, F.; Stroili, R.; Ben-Haim, E.; Bonneaud, G. R.; Briand, H.; Calderini, G.; Chauveau, J.; Hamon, O.; Leruste, Ph.; Marchiori, G.; Ocariz, J.; Prendki, J.; Sitt, S.; Biasini, M.; Manoni, E.; Rossi, A.; Angelini, C.; Batignani, G.; Bettarini, S.; Carpinelli, M.; Casarosa, G.; Cervelli, A.; Forti, F.; Giorgi, M. A.; Lusiani, A.; Neri, N.; Paoloni, E.; Rizzo, G.; Walsh, J. J.; Lopes Pegna, D.; Lu, C.; Olsen, J.; Smith, A. J. S.; Telnov, A. V.; Anulli, F.; Baracchini, E.; Cavoto, G.; Faccini, R.; Ferrarotto, F.; Ferroni, F.; Gaspero, M.; Li Gioi, L.; Mazzoni, M. A.; Piredda, G.; Renga, F.; Hartmann, T.; Leddig, T.; Schröder, H.; Waldi, R.; Adye, T.; Franek, B.; Olaiya, E. O.; Wilson, F. F.; Emery, S.; Hamel de Monchenault, G.; Vasseur, G.; Yèche, Ch.; Zito, M.; Allen, M. T.; Aston, D.; Bard, D. J.; Bartoldus, R.; Benitez, J. F.; Cartaro, C.; Convery, M. R.; Dorfan, J.; Dubois-Felsmann, G. P.; Dunwoodie, W.; Field, R. C.; Franco Sevilla, M.; Fulsom, B. G.; Gabareen, A. M.; Graham, M. T.; Grenier, P.; Hast, C.; Innes, W. R.; Kelsey, M. H.; Kim, H.; Kim, P.; Kocian, M. L.; Leith, D. W. G. S.; Li, S.; Lindquist, B.; Luitz, S.; Luth, V.; Lynch, H. L.; Macfarlane, D. B.; Marsiske, H.; Muller, D. R.; Neal, H.; Nelson, S.; O'Grady, C. P.; Ofte, I.; Perl, M.; Pulliam, T.; Ratcliff, B. N.; Roodman, A.; Salnikov, A. A.; Santoro, V.; Schindler, R. H.; Schwiening, J.; Snyder, A.; Su, D.; Sullivan, M. K.; Sun, S.; Suzuki, K.; Thompson, J. M.; Va'Vra, J.; Wagner, A. P.; Weaver, M.; Wisniewski, W. J.; Wittgen, M.; Wright, D. H.; Wulsin, H. W.; Yarritu, A. K.; Young, C. C.; Ziegler, V.; Chen, X. R.; Park, W.; Purohit, M. V.; White, R. M.; Wilson, J. R.; Randle-Conde, A.; Sekula, S. J.; Bellis, M.; Burchat, P. R.; Miyashita, T. S.; Ahmed, S.; Alam, M. S.; Ernst, J. A.; Pan, B.; Saeed, M. A.; Zain, S. B.; Guttman, N.; Soffer, A.; Lund, P.; Spanier, S. M.; Eckmann, R.; Ritchie, J. L.; Ruland, A. M.; Schilling, C. J.; Schwitters, R. F.; Wray, B. C.; Izen, J. M.; Lou, X. C.; Bianchi, F.; Gamba, D.; Pelliccioni, M.; Bomben, M.; Lanceri, L.; Vitale, L.; Lopez-March, N.; Martinez-Vidal, F.; Oyanguren, A.; Albert, J.; Banerjee, Sw.; Choi, H. H. F.; Hamano, K.; King, G. J.; Kowalewski, R.; Lewczuk, M. J.; Lindsay, C.; Nugent, I. M.; Roney, J. M.; Sobie, R. J.; Gershon, T. J.; Harrison, P. F.; Latham, T. E.; Pennington, M. R.; Puccio, E. M. T.; Band, H. R.; Dasu, S.; Flood, K. T.; Pan, Y.; Prepost, R.; Vuosalo, C. O.; Wu, S. L.

    2011-03-01

    We perform a Dalitz plot analysis of about 100 000 Ds+ decays to K+K-π+ and measure the complex amplitudes of the intermediate resonances which contribute to this decay mode. We also measure the relative branching fractions of Ds+→K+K-π+ and Ds+→K+K-K+. For this analysis we use a 384 fb-1 data sample, recorded by the BABAR detector at the PEP-II asymmetric-energy e+e- collider running at center-of-mass energies near 10.58 GeV.

  1. Integrative Analysis of Complex Cancer Genomics and Clinical Profiles Using the cBioPortal

    PubMed Central

    Gao, Jianjiong; Aksoy, Bülent Arman; Dogrusoz, Ugur; Dresdner, Gideon; Gross, Benjamin; Sumer, S. Onur; Sun, Yichao; Jacobsen, Anders; Sinha, Rileen; Larsson, Erik; Cerami, Ethan; Sander, Chris; Schultz, Nikolaus

    2014-01-01

    The cBioPortal for Cancer Genomics (http://cbioportal.org) provides a Web resource for exploring, visualizing, and analyzing multidimensional cancer genomics data. The portal reduces molecular profiling data from cancer tissues and cell lines into readily understandable genetic, epigenetic, gene expression, and proteomic events. The query interface combined with customized data storage enables researchers to interactively explore genetic alterations across samples, genes, and pathways and, when available in the underlying data, to link these to clinical outcomes. The portal provides graphical summaries of gene-level data from multiple platforms, network visualization and analysis, survival analysis, patient-centric queries, and software programmatic access. The intuitive Web interface of the portal makes complex cancer genomics profiles accessible to researchers and clinicians without requiring bioinformatics expertise, thus facilitating biological discoveries. Here, we provide a practical guide to the analysis and visualization features of the cBioPortal for Cancer Genomics. PMID:23550210
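
    The portal's programmatic access is exposed through a public REST API; the minimal sketch below assumes the https://www.cbioportal.org/api/studies endpoint and the studyId/name field names remain current, so the paths and fields should be checked against the live API documentation before use.

```python
# Minimal sketch of programmatic access to cBioPortal, assuming the public
# REST endpoint below is available; endpoint path and field names are
# assumptions to verify against https://www.cbioportal.org/api documentation.
import requests

BASE = "https://www.cbioportal.org/api"

# List available cancer studies (expected to return a JSON array of records).
resp = requests.get(f"{BASE}/studies", headers={"Accept": "application/json"})
resp.raise_for_status()
studies = resp.json()

for study in studies[:5]:
    # 'studyId' and 'name' are the field names expected here (assumption).
    print(study.get("studyId"), "-", study.get("name"))
```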

  2. A software suite for the generation and comparison of peptide arrays from sets of data collected by liquid chromatography-mass spectrometry.

    PubMed

    Li, Xiao-jun; Yi, Eugene C; Kemp, Christopher J; Zhang, Hui; Aebersold, Ruedi

    2005-09-01

    There is an increasing interest in the quantitative proteomic measurement of the protein contents of substantially similar biological samples, e.g. for the analysis of cellular response to perturbations over time or for the discovery of protein biomarkers from clinical samples. Technical limitations of current proteomic platforms such as limited reproducibility and low throughput make this a challenging task. A new LC-MS-based platform is able to generate complex peptide patterns from the analysis of proteolyzed protein samples at high throughput and represents a promising approach for quantitative proteomics. A crucial component of the LC-MS approach is the accurate evaluation of the abundance of detected peptides over many samples and the identification of peptide features that can stratify samples with respect to their genetic, physiological, or environmental origins. We present here a new software suite, SpecArray, that generates a peptide versus sample array from a set of LC-MS data. A peptide array stores the relative abundance of thousands of peptide features in many samples and is in a format identical to that of a gene expression microarray. A peptide array can be subjected to an unsupervised clustering analysis to stratify samples or to a discriminant analysis to identify discriminatory peptide features. We applied the SpecArray to analyze two sets of LC-MS data: one was from four repeat LC-MS analyses of the same glycopeptide sample, and another was from LC-MS analysis of serum samples of five male and five female mice. We demonstrate through these two study cases that the SpecArray software suite can serve as an effective software platform in the LC-MS approach for quantitative proteomics.
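
    To illustrate the kind of downstream analysis described above (not the SpecArray code itself), the sketch below clusters a hypothetical peptide-by-sample abundance matrix to stratify samples; the matrix, the injected group structure and the distance metric are all invented for the example.

```python
# Illustrative sketch (not the SpecArray implementation): unsupervised
# clustering of a hypothetical peptide-by-sample abundance matrix to
# stratify samples, as described for peptide arrays in general.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
# Rows = peptide features, columns = samples (e.g., 5 vs 5 mice).
peptide_array = rng.lognormal(mean=0.0, sigma=1.0, size=(2000, 10))
peptide_array[:50, 5:] *= 3.0  # a block of features elevated in one group

# Cluster samples using correlation distance between their abundance profiles.
sample_profiles = np.log2(peptide_array).T          # samples x peptides
dist = pdist(sample_profiles, metric="correlation")
tree = linkage(dist, method="average")
labels = fcluster(tree, t=2, criterion="maxclust")
print("sample cluster assignments:", labels)
```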

  3. Developing a Multiplexed Quantitative Cross-Linking Mass Spectrometry Platform for Comparative Structural Analysis of Protein Complexes.

    PubMed

    Yu, Clinton; Huszagh, Alexander; Viner, Rosa; Novitsky, Eric J; Rychnovsky, Scott D; Huang, Lan

    2016-10-18

    Cross-linking mass spectrometry (XL-MS) represents a recently popularized hybrid methodology for defining protein-protein interactions (PPIs) and analyzing structures of large protein assemblies. In particular, XL-MS strategies have been demonstrated to be effective in elucidating molecular details of PPIs at the peptide resolution, providing a complementary set of structural data that can be utilized to refine existing complex structures or direct de novo modeling of unknown protein structures. To study structural and interaction dynamics of protein complexes, quantitative cross-linking mass spectrometry (QXL-MS) strategies based on isotope-labeled cross-linkers have been developed. Although successful, these approaches are mostly limited to pairwise comparisons. In order to establish a robust workflow enabling comparative analysis of multiple cross-linked samples simultaneously, we have developed a multiplexed QXL-MS strategy, namely, QMIX (Quantitation of Multiplexed, Isobaric-labeled cross (X)-linked peptides) by integrating MS-cleavable cross-linkers with isobaric labeling reagents. This study has established a new analytical platform for quantitative analysis of cross-linked peptides, which can be directly applied for multiplexed comparisons of the conformational dynamics of protein complexes and PPIs at the proteome scale in future studies.

  4. Quick detection and quantification of iron-cyanide complexes using fourier transform infrared spectroscopy.

    PubMed

    Sut-Lohmann, Magdalena; Raab, Thomas

    2017-08-01

    The continuous release of persistent iron-cyanide (Fe-CN) complexes from various industrial sources poses a high hazard to the environment and indicates the necessity to analyze a considerable number of samples. Conventional flow injection analysis (FIA) is a time- and cost-consuming method for cyanide (CN) determination. Thus, a rapid and economic alternative needs to be developed to quantify the Fe-CN complexes. Fifty-two soil samples were collected at a former Manufactured Gas Plant (MGP) site in order to determine the feasibility of diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS). Soil analysis revealed CN concentrations in a range from 8 to 14,809 mg kg-1, where 97% was in the solid form (Fe4[Fe(CN)6]3), which is characterized by a single symmetrical CN band in the range 2092-2084 cm-1. The partial least squares (PLS) calibration-validation model revealed an IR response to CNtot exceeding 2306 mg kg-1 (limit of detection, LOD). Leave-one-out cross-validation (LOO-CV) was performed on soil samples that contained low CNtot (<900 mg kg-1). This improved the sensitivity of the model by reducing the LOD to 154 mg kg-1. Finally, the LOO-CV conducted on the samples with CNtot > 900 mg kg-1 resulted in an LOD equal to 3751 mg kg-1. It was found that FTIR spectroscopy provides information concerning the different CN species in the soil samples. Additionally, it is suitable for quantifying Fe-CN species in matrices with CNtot > 154 mg kg-1. Thus, FTIR spectroscopy, in combination with the statistical approach applied here, seems to be a feasible and quick method for screening of contaminated sites. Copyright © 2017 Elsevier Ltd. All rights reserved.
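
    For context, a PLS calibration with leave-one-out cross-validation of the sort described above can be sketched as follows; the spectra and reference CN concentrations are synthetic stand-ins, and the number of latent variables is an arbitrary choice rather than the authors' setting.

```python
# Sketch of a PLS calibration with leave-one-out cross-validation, in the
# spirit of the DRIFTS/PLS workflow described above; 'spectra' and 'cn_ref'
# are hypothetical stand-ins for measured IR spectra and reference CN
# concentrations (mg kg-1).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(1)
spectra = rng.normal(size=(52, 600))          # 52 samples x 600 wavenumbers
cn_ref = rng.uniform(8, 15000, size=52)       # reference CN_tot values

pls = PLSRegression(n_components=5)           # component count: illustrative
cn_pred = cross_val_predict(pls, spectra, cn_ref, cv=LeaveOneOut())

rmsecv = np.sqrt(np.mean((cn_pred.ravel() - cn_ref) ** 2))
print(f"RMSECV: {rmsecv:.0f} mg kg-1")
```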

  5. Molecularly imprinted polymer coupled with dispersive liquid-liquid microextraction and injector port silylation: a novel approach for the determination of 3-phenoxybenzoic acid in complex biological samples using gas chromatography-tandem mass spectrometry.

    PubMed

    Mudiam, Mohana Krishna Reddy; Chauhan, Abhishek; Jain, Rajeev; Dhuriya, Yogesh Kumar; Saxena, Prem Narain; Khanna, Vinay Kumar

    2014-01-15

    A novel analytical approach based on molecularly imprinted solid phase extraction (MISPE) coupled with dispersive liquid-liquid microextraction (DLLME) and injector port silylation (IPS) has been developed for the selective preconcentration, derivatization and analysis of 3-phenoxybenzoic acid (3-PBA) using gas chromatography-tandem mass spectrometry (GC-MS/MS) in complex biological samples such as rat blood and liver. Factors affecting the synthesis of the MIP were evaluated, and the best monomer and cross-linker were selected based on binding affinity studies. Various parameters of MISPE, DLLME and IPS were optimized for the selective preconcentration and derivatization of 3-PBA. The developed method offers good linearity over calibration ranges of 0.02-2.5 ng mg-1 and 7.5-2000 ng mL-1 for liver and blood, respectively. Under optimized conditions, the recovery of 3-PBA in liver and blood samples was found to be in the range of 83-91%. The detection limits were found to be 0.0045 ng mg-1 and 1.82 ng mL-1 in liver and blood, respectively. SRM transitions of m/z 271→227 and 271→197 were selected as the quantifier and qualifier transitions for the 3-PBA derivative. Intra- and inter-day precision for five replicates in a day and on five successive days was found to be less than 8%. The method developed was successfully applied to real samples, i.e. rat blood and tissue, for quantitative evaluation of 3-PBA. The analytical approach developed is rapid, economical, simple and eco-friendly, and is of immense utility for the analysis of analytes with polar functional groups in complex biological samples by GC-MS/MS. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Determination of effective complex refractive index of a turbid liquid with surface plasmon resonance phase detection.

    PubMed

    Yingying, Zhang; Jiancheng, Lai; Cheng, Yin; Zhenhua, Li

    2009-03-01

    The dependence of the surface plasmon resonance (SPR) phase difference curve on the complex refractive index of a sample in the Kretschmann configuration is discussed comprehensively, and on this basis a new method is proposed to measure the complex refractive index of a turbid liquid. A corresponding experimental setup was constructed to measure the SPR phase difference curve, and the complex refractive index of the turbid liquid was determined. Using the setup, the complex refractive indices of Intralipid solutions with concentrations of 5%, 10%, 15%, and 20% are obtained to be 1.3377+0.0005i, 1.3427+0.0028i, 1.3476+0.0034i, and 1.3496+0.0038i, respectively. Furthermore, the error analysis indicates that the root-mean-square errors of both the real and imaginary parts of the measured complex refractive index are less than 5×10(-5).

  7. Combining Capillary Electrophoresis with Mass Spectrometry for Applications in Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, David C.; Smith, Richard D.

    2005-04-01

    Throughout the field of global proteomics, ranging from simple organism studies to human medical applications, the high sample complexity creates demands for improved separations and analysis techniques. Furthermore, with increased organism complexity, the correlation between proteome and genome becomes less certain due to extensive mRNA processing prior to translation. In this way, the same DNA sequence can potentially code for regions in a number of distinct proteins; quantitative differences in expression (or abundance) between these often-related species are of significant interest. Well-established proteomics techniques, which use genomic information to identify peptides that originate from protease digestion, often cannot easily distinguish between such gene products; intact protein-level analyses are required to complete the picture, particularly for identifying post-translational modifications. While chromatographic techniques are currently better suited to peptide analysis, capillary electrophoresis (CE) in combination with mass spectrometry (MS) may become important for intact protein analysis. This review focuses on CE/MS instrumentation and techniques showing promise for such applications, highlighting those with greatest potential. Reference will also be made to developments relevant to peptide-level analyses for use in time- or sample-limited situations.

  8. Placental Proteomics: A Shortcut to Biological Insight

    PubMed Central

    Robinson, John M.; Vandré, Dale D.; Ackerman, William E.

    2012-01-01

    Proteomics analysis of biological samples has the potential to identify novel protein expression patterns and/or changes in protein expression patterns in different developmental or disease states. An important component of successful proteomics research, at least in its present form, is to reduce the complexity of the sample if it is derived from cells or tissues. One method to simplify complex tissues is to focus on a specific, highly purified sub-proteome. Using this approach we have developed methods to prepare highly enriched fractions of the apical plasma membrane of the syncytiotrophoblast. Through proteomics analysis of this fraction we have identified over five hundred proteins, several of which were previously not known to reside in the syncytiotrophoblast. Herein, we focus on two of these, dysferlin and myoferlin. These proteins, largely known from studies of skeletal muscle, may not have been found in the human placenta were it not for discovery-based proteomics analysis. This new knowledge, acquired through a discovery-driven approach, can now be applied to the generation of hypothesis-based experimentation. Thus discovery-based and hypothesis-based research are complementary approaches that, when coupled together, can hasten scientific discoveries. PMID:19070895

  9. A Versatile Integrated Ambient Ionization Source Platform.

    PubMed

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-30

    The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines a flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and a laser desorption (LD) technique. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption/ionization (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.

  10. Solid-contact potentiometric sensors and multisensors based on polyaniline and thiacalixarene receptors for the analysis of some beverages and alcoholic drinks

    NASA Astrophysics Data System (ADS)

    Sorvin, Michail; Belyakova, Svetlana; Stoikov, Ivan; Shamagsumova, Rezeda; Evtugyn, Gennady

    2018-04-01

    An electronic tongue is a sensor array that aims to discriminate and analyze complex media like food and beverages on the basis of chemometric approaches for data mining and pattern recognition. In this review, the concept of an electronic tongue comprising solid-contact potentiometric sensors with polyaniline and thiacalix[4]arene derivatives is described. The electrochemical reactions of polyaniline as a background of solid-contact sensors and the characteristics of thiacalixarenes and pillararenes as neutral ionophores are briefly considered. The electronic tongue systems described were successfully applied for the assessment of fruit juices, green tea, beer and alcoholic drinks, which were classified according to their origin, brand and style. Variation of the sensor response resulted from the reactions between added Fe(III) ions and sample components, i.e., antioxidants and complexing agents. The use of principal component analysis and discriminant analysis is shown for multisensor signal treatment and visualization. The discrimination conditions can be optimized by variation of the ionophores, the Fe(III) concentration and the sample dilution. The results obtained were compared with other electronic tongue systems reported for the same subjects.
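
    A generic version of the chemometric step mentioned above (PCA for visualization, discriminant analysis for classification) might look like the following sketch; the sensor-response matrix and class labels are fabricated for illustration and do not reproduce the authors' data treatment.

```python
# Generic chemometrics sketch (not the authors' code): principal component
# analysis followed by linear discriminant analysis on a hypothetical
# sensor-array response matrix, to visualize and classify beverage samples.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
# 60 beverage samples x 8 potentiometric sensors; three hypothetical brands.
responses = rng.normal(size=(60, 8))
brands = np.repeat([0, 1, 2], 20)
responses += brands[:, None] * 0.8            # inject brand-dependent offsets

scores = PCA(n_components=2).fit_transform(responses)   # for visualization
lda = LinearDiscriminantAnalysis().fit(responses, brands)
print("resubstitution accuracy:", lda.score(responses, brands))
print("first sample PCA scores:", scores[0])
```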

  11. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    PubMed

    Xu, Maoqi; Chen, Liang

    2018-01-01

    Individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice analysis power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Exploring hyperspectral imaging data sets with topological data analysis.

    PubMed

    Duponchel, Ludovic

    2018-02-13

    Analytical chemistry is rapidly changing. Indeed, we acquire ever more data in order to go further in the exploration of complex samples. Hyperspectral imaging has not escaped this trend. It quickly became a tool of choice for the molecular characterisation of complex samples in many scientific domains, mainly because it simultaneously provides spectral and spatial information. As a result, chemometrics provided many exploration tools (PCA, clustering, MCR-ALS, etc.) well suited to this data structure at an early stage. However, we now face a new challenge given the ever-increasing number of pixels in the data cubes we have to manage. The idea is therefore to introduce the new paradigm of Topological Data Analysis for exploring hyperspectral imaging data sets, highlighting its attractive properties and specific features. With this paper, we also point out that conventional chemometric methods are often based on variance analysis or simply impose a data model that implicitly defines the geometry of the data set, and we show that this is not always appropriate for the exploration of hyperspectral imaging data sets. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Fluorescence lifetime evaluation of whole soils from the Amazon rainforest.

    PubMed

    Nicolodelli, Gustavo; Tadini, Amanda Maria; Nogueira, Marcelo Saito; Pratavieira, Sebastião; Mounier, Stephane; Huaman, Jose Luis Clabel; Dos Santos, Cléber Hilário; Montes, Célia Regina; Milori, Débora Marcondes Bastos Pereira

    2017-08-20

    Time-resolved fluorescence spectroscopy (TRFS) is a new tool that can be used to investigate processes of interaction between metal ions and organic matter (OM) in soils, providing a specific analysis of the structure and dynamics of macromolecules. To the best of our knowledge, there are no studies in the literature reporting the use of this technique applied to whole/non-fractionated soil samples, making it a potential method for use in future studies. This work describes the use of TRFS to evaluate the fluorescence lifetimes of the OM of whole soils from the Amazon region. Pellets of soils from an oxisol-spodosol system, collected in São Gabriel da Cachoeira (Amazonas, Brazil), were analyzed. The fluorescence lifetimes in the oxisol-spodosol system were attributed to two different fluorophores. One was related to complexation of an OM fraction with metals, resulting in a shorter fluorophore lifetime. A short fluorescence lifetime (2-12 ns) could be associated with simpler structures of the OM, while a long lifetime (19-66 ns) was associated with more complex OM structures. This new TRFS technique for analysis of the fluorescence lifetime in whole soil samples complies with the principles of green chemistry.

  14. Solid-Contact Potentiometric Sensors and Multisensors Based on Polyaniline and Thiacalixarene Receptors for the Analysis of Some Beverages and Alcoholic Drinks.

    PubMed

    Sorvin, Michail; Belyakova, Svetlana; Stoikov, Ivan; Shamagsumova, Rezeda; Evtugyn, Gennady

    2018-01-01

    An electronic tongue is a sensor array that aims to discriminate and analyze complex media like food and beverages on the basis of chemometric approaches for data mining and pattern recognition. In this review, the concept of an electronic tongue comprising solid-contact potentiometric sensors with polyaniline and thiacalix[4]arene derivatives is described. The electrochemical reactions of polyaniline as a background of solid-contact sensors and the characteristics of thiacalixarenes and pillararenes as neutral ionophores are briefly considered. The electronic tongue systems described were successfully applied for the assessment of fruit juices, green tea, beer, and alcoholic drinks, which were classified according to their origin, brand, and style. Variation of the sensor response resulted from the reactions between added Fe(III) ions and sample components, i.e., antioxidants and complexing agents. The use of principal component analysis and discriminant analysis is shown for multisensor signal treatment and visualization. The discrimination conditions can be optimized by variation of the ionophores, the Fe(III) concentration, and the sample dilution. The results obtained were compared with other electronic tongue systems reported for the same subjects.

  15. Solid-Contact Potentiometric Sensors and Multisensors Based on Polyaniline and Thiacalixarene Receptors for the Analysis of Some Beverages and Alcoholic Drinks

    PubMed Central

    Sorvin, Michail; Belyakova, Svetlana; Stoikov, Ivan; Shamagsumova, Rezeda; Evtugyn, Gennady

    2018-01-01

    An electronic tongue is a sensor array that aims to discriminate and analyze complex media like food and beverages on the basis of chemometric approaches for data mining and pattern recognition. In this review, the concept of an electronic tongue comprising solid-contact potentiometric sensors with polyaniline and thiacalix[4]arene derivatives is described. The electrochemical reactions of polyaniline as a background of solid-contact sensors and the characteristics of thiacalixarenes and pillararenes as neutral ionophores are briefly considered. The electronic tongue systems described were successfully applied for the assessment of fruit juices, green tea, beer, and alcoholic drinks, which were classified according to their origin, brand, and style. Variation of the sensor response resulted from the reactions between added Fe(III) ions and sample components, i.e., antioxidants and complexing agents. The use of principal component analysis and discriminant analysis is shown for multisensor signal treatment and visualization. The discrimination conditions can be optimized by variation of the ionophores, the Fe(III) concentration, and the sample dilution. The results obtained were compared with other electronic tongue systems reported for the same subjects. PMID:29740577

  16. A Versatile Integrated Ambient Ionization Source Platform

    NASA Astrophysics Data System (ADS)

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-01

    The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines a flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and a laser desorption (LD) technique. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption/ionization (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.

  17. A spin-frustrated trinuclear copper complex based on triaminoguanidine with an energetically well-separated degenerate ground state.

    PubMed

    Spielberg, Eike T; Gilb, Aksana; Plaul, Daniel; Geibig, Daniel; Hornig, David; Schuch, Dirk; Buchholz, Axel; Ardavan, Arzhang; Plass, Winfried

    2015-04-06

    We present the synthesis and crystal structure of the trinuclear copper complex [Cu3(saltag)(bpy)3]ClO4·3DMF [H5saltag = tris(2-hydroxybenzylidene)triaminoguanidine; bpy = 2,2'-bipyridine]. The complex crystallizes in the trigonal space group R3̅, with all copper ions being crystallographically equivalent. Analysis of the temperature dependence of the magnetic susceptibility shows that the triaminoguanidine ligand mediates very strong antiferromagnetic interactions (JCuCu = -324 cm(-1)). Detailed analysis of the magnetic susceptibility and magnetization data as well as X-band electron spin resonance spectra, all recorded on both powdered samples and single crystals, show indications of neither antisymmetric exchange nor symmetry lowering, thus indicating only a very small splitting of the degenerate S = 1/2 ground state. These findings are corroborated by density functional theory calculations, which explain both the strong isotropic and negligible antisymmetric exchange interactions.
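
    For readers unfamiliar with spin-frustrated triangles, the level structure behind the degenerate S = 1/2 ground state follows from the isotropic exchange model sketched below; the sign and factor convention shown is one common choice and may differ from the definition of JCuCu used in the paper, so the numerical gap is only indicative.

```latex
% Equilateral triangle of three s = 1/2 ions, isotropic exchange only
% (one common convention; the paper's convention may differ by a factor or sign):
\begin{align}
  \mathcal{H} &= -J\left(\mathbf{S}_1\!\cdot\!\mathbf{S}_2
                 + \mathbf{S}_2\!\cdot\!\mathbf{S}_3
                 + \mathbf{S}_3\!\cdot\!\mathbf{S}_1\right),\\
  E(S) &= -\frac{J}{2}\left[S(S+1) - \tfrac{9}{4}\right],
\end{align}
% so the two degenerate S = 1/2 doublets lie E(3/2) - E(1/2) = -3J/2 below the
% S = 3/2 quartet for antiferromagnetic J < 0 (roughly 486 cm^-1 for J = -324 cm^-1
% under this convention).
```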

  18. Complex segregation analysis of craniomandibular osteopathy in Deutsch Drahthaar dogs.

    PubMed

    Vagt, J; Distl, O

    2018-01-01

    This study investigated familial relationships among Deutsch Drahthaar dogs with craniomandibular osteopathy and examined the most likely mode of inheritance. Sixteen Deutsch Drahthaar dogs with craniomandibular osteopathy were diagnosed using clinical findings, radiography or computed tomography. All 16 dogs with craniomandibular osteopathy had one common ancestor. Complex segregation analyses rejected models explaining the segregation of craniomandibular osteopathy through random environmental variation, monogenic inheritance or an additive sex effect. Polygenic and mixed major gene models sufficiently explained the segregation of craniomandibular osteopathy in the pedigree analysis and offered the most likely hypotheses. The SLC37A2:c.1332C>T variant was not found in a sample of Deutsch Drahthaar dogs with craniomandibular osteopathy, nor in healthy controls. Craniomandibular osteopathy is an inherited condition in Deutsch Drahthaar dogs and the inheritance seems to be more complex than a simple Mendelian model. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Distinguishing fiction from non-fiction with complex networks

    NASA Astrophysics Data System (ADS)

    Larue, David M.; Carr, Lincoln D.; Jones, Linnea K.; Stevanak, Joe T.

    2014-03-01

    Complex network measures are applied to networks constructed from English-language texts to demonstrate their initial viability for textual analysis. Texts from novels and short stories obtained from Project Gutenberg and news stories obtained from NPR are selected. The unique word stems in a text are used as the nodes of an associated unweighted, undirected network, with edges connecting word stems that occur within a certain number of words of each other somewhere in the text. Various combinations of complex network measures are computed for each text's network. Fisher's linear discriminant analysis is then used to build a parameter that optimizes the ability to separate the texts according to their genre. Success rates in the 70% range for correctly distinguishing fiction from non-fiction were obtained using edges defined within a four-word window and 400-word samples from 400 texts in each of the two genres, for some combinations of measures such as the power-law exponents of the degree distributions and the clustering coefficients.
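
    A minimal sketch of the network construction described above is given below; the stemming rule, window handling and example text are placeholders rather than the authors' procedure, and only two of the many possible network measures are computed.

```python
# Illustrative sketch of the network construction described above: unique
# word stems become nodes, with an edge whenever two stems occur within a
# short window anywhere in the text. The stemming here is a crude suffix
# strip (a placeholder, not the authors' procedure).
import re
import networkx as nx

def crude_stem(word: str) -> str:
    return re.sub(r"(ing|ed|ly|s)$", "", word.lower())

def text_network(text: str, window: int = 4) -> nx.Graph:
    words = [crude_stem(w) for w in re.findall(r"[A-Za-z]+", text)]
    g = nx.Graph()
    g.add_nodes_from(set(words))
    for i in range(len(words)):
        for j in range(i + 1, min(i + window, len(words))):
            if words[i] != words[j]:
                g.add_edge(words[i], words[j])
    return g

sample = "The ship sailed at dawn. By dawn the crew had sailed far from home."
g = text_network(sample)
# Two of the network measures that could feed a linear discriminant.
print("average clustering:", nx.average_clustering(g))
print("mean degree:", sum(d for _, d in g.degree()) / g.number_of_nodes())
```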

  20. Participation in Decision Making as a Property of Complex Adaptive Systems: Developing and Testing a Measure

    PubMed Central

    Anderson, Ruth A.; Hsieh, Pi-Ching; Su, Hui Fang; Landerman, Lawrence R.; McDaniel, Reuben R.

    2013-01-01

    Objectives. To (1) describe participation in decision-making as a systems-level property of complex adaptive systems and (2) present empirical evidence of the reliability and validity of a corresponding measure. Method. Study 1 was a mail survey of a single respondent (administrators or directors of nursing) in each of 197 nursing homes. Study 2 was a field study using a random, proportionally stratified sampling procedure that included 195 organizations with 3,968 respondents. Analysis. In Study 1, we analyzed the data to reduce the number of scale items and establish initial reliability and validity. In Study 2, we strengthened the psychometric test using a large sample. Results. The results demonstrated the validity and reliability of the participation in decision-making instrument (PDMI) while measuring participation of workers in two distinct job categories (RNs and CNAs). We established reliability at the organizational level using aggregated item scores. We established validity of the multidimensional properties using convergent and discriminant validity and confirmatory factor analysis. Conclusions. Participation in decision making, when modeled as a systems-level property of organizations, has multiple dimensions and is more complex than is traditionally measured. Managers can use this model to form decision teams that maximize the depth and breadth of expertise needed and to foster connection among them. PMID:24349771

  1. Top-down approach for the direct characterization of low molecular weight heparins using LC-FT-MS.

    PubMed

    Li, Lingyun; Zhang, Fuming; Zaia, Joseph; Linhardt, Robert J

    2012-10-16

    Low molecular weight heparins (LMWHs) are structurally complex, heterogeneous, polydisperse, and highly negatively charged mixtures of polysaccharides. The direct characterization of LMWHs is a major challenge for currently available analytical technologies. Electrospray ionization (ESI) liquid chromatography-mass spectrometry (LC-MS) is a powerful tool for the characterization of complex biological samples in the fields of proteomics, metabolomics, and glycomics. LC-MS has been applied to the analysis of heparin oligosaccharides separated by size exclusion, reversed-phase ion-pairing chromatography, and chip-based amide hydrophilic interaction chromatography (HILIC). However, there have been limited applications of ESI-LC-MS to the direct characterization of intact LMWHs (top-down analysis) due to their structural complexity, low ionization efficiency, and sulfate loss. Here we present a simple and reliable HILIC-Fourier transform (FT)-ESI-MS platform to characterize and compare two currently marketed LMWH products using the top-down approach, requiring no special sample preparation steps. This HILIC system relies on cross-linked diol rather than amide chemistry, affording highly resolved chromatographic separations using a relatively high percentage of acetonitrile in the mobile phase, resulting in stable and high-efficiency ionization. Bioinformatics software (GlycReSoft 1.0) was used to automatically assign structures within 5-ppm mass accuracy.
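
    The 5-ppm assignment tolerance quoted above corresponds to the usual relative mass error; a worked example with invented masses is shown below.

```python
# Worked example of a 5-ppm matching tolerance: the relative mass error
# between an observed and a theoretical monoisotopic mass, in parts per
# million (the masses below are illustrative only).
def ppm_error(observed_mz: float, theoretical_mz: float) -> float:
    return (observed_mz - theoretical_mz) / theoretical_mz * 1e6

obs, theo = 1153.4321, 1153.4275   # hypothetical oligosaccharide masses
err = ppm_error(obs, theo)
print(f"{err:.1f} ppm -> {'accept' if abs(err) <= 5.0 else 'reject'}")
```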

  2. Participation in decision making as a property of complex adaptive systems: developing and testing a measure.

    PubMed

    Anderson, Ruth A; Plowman, Donde; Corazzini, Kirsten; Hsieh, Pi-Ching; Su, Hui Fang; Landerman, Lawrence R; McDaniel, Reuben R

    2013-01-01

    Objectives. To (1) describe participation in decision-making as a systems-level property of complex adaptive systems and (2) present empirical evidence of reliability and validity of a corresponding measure. Method. Study 1 was a mail survey of a single respondent (administrators or directors of nursing) in each of 197 nursing homes. Study 2 was a field study using random, proportionally stratified sampling procedure that included 195 organizations with 3,968 respondents. Analysis. In Study 1, we analyzed the data to reduce the number of scale items and establish initial reliability and validity. In Study 2, we strengthened the psychometric test using a large sample. Results. Results demonstrated validity and reliability of the participation in decision-making instrument (PDMI) while measuring participation of workers in two distinct job categories (RNs and CNAs). We established reliability at the organizational level aggregated items scores. We established validity of the multidimensional properties using convergent and discriminant validity and confirmatory factor analysis. Conclusions. Participation in decision making, when modeled as a systems-level property of organization, has multiple dimensions and is more complex than is being traditionally measured. Managers can use this model to form decision teams that maximize the depth and breadth of expertise needed and to foster connection among them.

  3. Impact of Scedosporium apiospermum complex seroprevalence in patients with cystic fibrosis.

    PubMed

    Parize, Perrine; Billaud, Sandrine; Bienvenu, Anne L; Bourdy, Stéphanie; le Pogam, Marie A; Reix, Philippe; Picot, Stéphane; Robert, Raymond; Lortholary, Olivier; Bouchara, Jean-Philippe; Durieu, Isabelle

    2014-12-01

    Species of the Scedosporium apiospermum complex (S. a complex) are emerging fungi responsible for chronic airway colonization in cystic fibrosis (CF) patients. Recent studies performed on Aspergillus fumigatus suggest that colonization of the airways by filamentous fungi may contribute to the progressive deterioration of lung function. We studied S. a complex seroprevalence, as a marker of close contact between patient and fungus, in a large monocentric cohort of CF patients followed at a reference centre in Lyon, France. Serum samples from 373 CF patients were analysed. Antibodies against the S. a complex were detected in 35 patients (9.4%). In multivariate analysis, S. a complex seropositivity was only associated with seropositivity to A. fumigatus. This study does not suggest an association between sensitization against the S. a complex and poorer lung function in CF. Prospective studies are needed to evaluate the impact of both seropositivity and S. a complex colonization on the course of CF. Copyright © 2014 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.

  4. Minimized state complexity of quantum-encoded cryptic processes

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-05-01

    The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.

  5. LAB ANALYSIS OF EMERGENCY WATER SAMPLES CONTAINING UNKNOWN CONTAMINANTS: CONSIDERATIONS FROM THE USEPA RESPONSE PROTOCOL TOOLBOX

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  6. Limitations and potential of spectral subtractions in fourier-transform infrared (FTIR) spectroscopy of soil samples

    USDA-ARS?s Scientific Manuscript database

    Soil science research is increasingly applying Fourier transform infrared (FTIR) spectroscopy for analysis of soil organic matter (SOM). However, the compositional complexity of soils and the dominance of the mineral component can limit spectroscopic resolution of SOM and other minor components. The...

  7. Tandem Extraction/Liquid Chromatography-Mass Spectrometry Protocol for the Analysis of Acrylamide and Surfactant-Related Compounds in Complex Matrix Environmental Water Samples

    EPA Science Inventory

    Ethoxylated alcohols, alkylphenols, and acrylamide are emerging contaminants with many different routes into the environment. Ethoxylated alcohols are used ubiquitously as surfactants in both industrial and household products. The use of ethoxylated alcohols and alkylphenols as s...

  8. Mass Spectrometry on Future Mars Landers

    NASA Technical Reports Server (NTRS)

    Brinckerhoff, W. B.; Mahaffy, P. R.

    2011-01-01

    Mass spectrometry investigations on the 2011 Mars Science Laboratory (MSL) and the 2018 ExoMars missions will address core science objectives related to the potential habitability of their landing site environments and more generally the near-surface organic inventory of Mars. The analysis of complex solid samples by mass spectrometry is a well-known approach that can provide a broad and sensitive survey of organic and inorganic compounds as well as supportive data for mineralogical analysis. The science value of such compositional information is maximized when one appreciates the particular opportunities and limitations of in situ analysis with resource-constrained instrumentation in the context of a complete science payload and applied to materials found in a particular environment. The Sample Analysis at Mars (SAM) investigation on MSL and the Mars Organic Molecule Analyzer (MOMA) investigation on ExoMars will thus benefit from and inform broad-based analog field site work linked to the Mars environments where such analysis will occur.

  9. Fast identification of microplastics in complex environmental samples by a thermal degradation method.

    PubMed

    Dümichen, Erik; Eisentraut, Paul; Bannick, Claus Gerhard; Barthel, Anne-Kathrin; Senz, Rainer; Braun, Ulrike

    2017-05-01

    In order to determine the relevance of microplastic particles in various environmental media, comprehensive investigations are needed. However, no analytical method exists for their fast identification and quantification. At present, optical spectroscopy methods like IR and Raman imaging are used, but their time-consuming procedures and uncertain extrapolation make reliable monitoring difficult. For analyzing polymers, Py-GC-MS is a standard method; however, due to a limited sample amount of about 0.5 mg, it is not suited to the analysis of complex sample mixtures like environmental samples. We therefore developed a new thermoanalytical method as a first step towards identifying microplastics in environmental samples. A sample amount of about 20 mg, which assures the homogeneity of the sample, is subjected to complete thermal decomposition. The specific degradation products of the respective polymer are adsorbed on a solid-phase adsorber and subsequently analyzed by thermal desorption gas chromatography mass spectrometry. For unambiguous identification, the specific degradation products of each polymer were selected first. Afterwards, real environmental samples from aquatic (three different rivers) and terrestrial (biogas plant) systems were screened for microplastics. Mainly polypropylene (PP), polyethylene (PE) and polystyrene (PS) were identified in the samples from the biogas plant, and PE and PS in the rivers. This was only a first step, however, and quantification measurements will follow. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Learning from Past Classification Errors: Exploring Methods for Improving the Performance of a Deep Learning-based Building Extraction Model through Quantitative Analysis of Commission Errors for Optimal Sample Selection

    NASA Astrophysics Data System (ADS)

    Swan, B.; Laverdiere, M.; Yang, L.

    2017-12-01

    In the past five years, deep convolutional neural networks (CNNs) have been increasingly favored for computer vision applications due to their high accuracy and ability to generalize well in very complex problems; however, details of how they function, and in turn how they may be optimized, are still imperfectly understood. In particular, their complex and highly nonlinear network architecture, including many hidden layers and self-learned parameters, as well as its mathematical implications, presents open questions about how to effectively select training data. Without knowledge of the exact ways the model processes and transforms its inputs, intuition alone may fail as a guide to selecting highly relevant training samples. Working in the context of improving a CNN-based building extraction model used for the LandScan USA gridded population dataset, we have approached this problem by developing a semi-supervised, highly scalable approach to selecting training samples from a dataset of identified commission errors. Due to the large scope of this project, tens of thousands of potential samples could be derived from the identified commission errors. To efficiently trim those samples down to a manageable and effective set for creating additional training samples, we statistically summarized the spectral characteristics of areas with commission errors at the image-tile level and grouped these tiles using affinity propagation. Highly representative members of each commission-error cluster were then used to select sites for training sample creation. The model will be incrementally re-trained with the new training data to allow an assessment of how the addition of different types of samples affects model performance, such as precision and recall rates. By using quantitative analysis and data clustering techniques to select highly relevant training samples, we hope to improve model performance in a manner that is resource efficient, both in terms of the training process and sample creation.
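
    The clustering step described above can be sketched with scikit-learn's AffinityPropagation; the tile-level feature columns below are invented, and the exemplar of each cluster stands in for the "highly representative member" used to site new training samples.

```python
# Sketch of the tile-grouping step described above, using scikit-learn's
# AffinityPropagation on hypothetical tile-level spectral summaries; the
# exemplar (cluster center) of each cluster is taken as the representative
# tile for new training-sample creation.
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(3)
# One row per image tile with commission errors: e.g. mean and std of a few
# spectral bands plus an error-rate summary (columns are made up).
tile_summaries = rng.normal(size=(500, 9))

ap = AffinityPropagation(random_state=0).fit(tile_summaries)
exemplar_rows = ap.cluster_centers_indices_
print(f"{len(exemplar_rows)} clusters; exemplar tile indices: {exemplar_rows[:10]}")
```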

  11. Quantification of integrated HIV DNA by repetitive-sampling Alu-HIV PCR on the basis of poisson statistics.

    PubMed

    De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos

    2014-06-01

    Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool for monitoring the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method based on Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR, applied to dilutions of an integration standard and to samples from 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need for a standard dilution curve. Implementation of confidence interval (CI) estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
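
    The general limiting-dilution form of this kind of Poisson analysis, which may differ in detail from the authors' implementation, estimates the mean number of integration events per reaction from the fraction of negative replicates, as sketched below.

```python
# General limiting-dilution/Poisson estimator sketched for context (the
# authors' exact analysis may differ): if k of n replicate reactions are
# negative, the mean number of events per reaction is estimated as
# lambda = -ln(k/n), with a CI obtained by transforming a binomial
# (Clopper-Pearson) interval on the negative fraction. Assumes 0 < k < n.
import math
from scipy.stats import beta

def poisson_estimate(n_reactions: int, n_negative: int, alpha: float = 0.05):
    p_neg = n_negative / n_reactions
    lam = -math.log(p_neg)
    # Clopper-Pearson interval on the negative fraction, then transform.
    lo = beta.ppf(alpha / 2, n_negative, n_reactions - n_negative + 1)
    hi = beta.ppf(1 - alpha / 2, n_negative + 1, n_reactions - n_negative)
    return lam, -math.log(hi), -math.log(lo)

lam, lam_lo, lam_hi = poisson_estimate(42, 18)   # e.g. 18/42 replicates negative
print(f"events per reaction: {lam:.2f} (95% CI {lam_lo:.2f}-{lam_hi:.2f})")
```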

  12. SLEPR: A Sample-Level Enrichment-Based Pathway Ranking Method — Seeking Biological Themes through Pathway-Level Consistency

    PubMed Central

    Yi, Ming; Stephens, Robert M.

    2008-01-01

    Analysis of microarray and other high-throughput data often involves identification of genes consistently up- or down-regulated across samples as the first step in extraction of biological meaning. This gene-level paradigm can be limited as a result of valid sample fluctuations and biological complexities. In this report, we describe a novel method, SLEPR, which eliminates this limitation by relying on pathway-level consistencies. Our method first selects the sample-level differentiated genes from each individual sample, capturing genes missed by other analysis methods, then ascertains the enrichment levels of associated pathways from each of those lists, and finally ranks annotated pathways based on the consistency of enrichment levels of individual samples from both sample classes. As a proof of concept, we have used this method to analyze three public microarray datasets with a direct comparison to the GSEA method, one of the most popular pathway-level analysis methods in the field. We found that our method was able to reproduce the earlier observations with significant improvements in depth of coverage for validated or expected biological themes, but also produced additional insights that make biological sense. This new method extends existing analysis approaches and facilitates the integration of different types of high-throughput (HTP) data. PMID:18818771
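
    A much-simplified sketch of the pathway-level idea (per-sample enrichment followed by a cross-class consistency ranking) is given below; the hypergeometric score and the rank statistic are illustrative choices, not the SLEPR implementation, and all gene sets are fabricated.

```python
# Simplified sketch of per-sample pathway enrichment followed by a
# consistency ranking across two sample classes (not the SLEPR code).
import numpy as np
from scipy.stats import hypergeom, mannwhitneyu

def enrichment(sample_genes: set, pathway_genes: set, universe: int) -> float:
    overlap = len(sample_genes & pathway_genes)
    # P(overlap or more by chance); smaller p = stronger enrichment.
    p = hypergeom.sf(overlap - 1, universe, len(pathway_genes), len(sample_genes))
    return -np.log10(max(p, 1e-300))

def rank_pathway(scores_class_a, scores_class_b) -> float:
    # Consistency of the enrichment difference across samples (illustrative).
    return mannwhitneyu(scores_class_a, scores_class_b).pvalue

universe = 20000
pathway = {f"g{i}" for i in range(50)}          # a fabricated 50-gene pathway
rng = np.random.default_rng(4)
class_a = [set(rng.choice([f"g{i}" for i in range(200)], 100, replace=False))
           for _ in range(5)]                   # lists biased toward the pathway
class_b = [set(rng.choice([f"g{i}" for i in range(20000)], 100, replace=False))
           for _ in range(5)]                   # unbiased lists
a_scores = [enrichment(s, pathway, universe) for s in class_a]
b_scores = [enrichment(s, pathway, universe) for s in class_b]
print("pathway consistency p-value:", rank_pathway(a_scores, b_scores))
```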

  13. Quantitative Analysis of Therapeutic Drugs in Dried Blood Spot Samples by Paper Spray Mass Spectrometry: An Avenue to Therapeutic Drug Monitoring

    NASA Astrophysics Data System (ADS)

    Manicke, Nicholas Edward; Abu-Rabie, Paul; Spooner, Neil; Ouyang, Zheng; Cooks, R. Graham

    2011-09-01

    A method is presented for the direct quantitative analysis of therapeutic drugs from dried blood spot samples by mass spectrometry. The method, paper spray mass spectrometry, generates gas phase ions directly from the blood card paper used to store dried blood samples without the need for complex sample preparation and separation; the entire time for preparation and analysis of blood samples is around 30 s. Limits of detection were investigated for a chemically diverse set of some 15 therapeutic drugs; hydrophobic and weakly basic drugs, such as sunitinib, citalopram, and verapamil, were found to be routinely detectable at approximately 1 ng/mL. Samples were prepared by addition of the drug to whole blood. Drug concentrations were measured quantitatively over several orders of magnitude, with accuracies within 10% of the expected value and relative standard deviation (RSD) of around 10% by prespotting an internal standard solution onto the paper prior to application of the blood sample. We have demonstrated that paper spray mass spectrometry can be used to quantitatively measure drug concentrations over the entire therapeutic range for a wide variety of drugs. The high quality analytical data obtained indicate that the technique may be a viable option for therapeutic drug monitoring.
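
    The internal-standard calibration underlying such measurements reduces to fitting analyte-to-internal-standard response ratios against spiked concentrations and inverting the fit for an unknown; the sketch below illustrates that arithmetic with invented numbers (it is not the paper's workflow).

    ```python
    # Internal-standard calibration in miniature; all values are illustrative.
    import numpy as np

    conc  = np.array([1, 5, 10, 50, 100, 500], dtype=float)      # spiked ng/mL in blood
    ratio = np.array([0.02, 0.11, 0.21, 1.05, 2.02, 9.80])       # analyte/IS peak-area ratios

    slope, intercept = np.polyfit(conc, ratio, 1)                 # linear calibration fit
    unknown_ratio = 1.4                                           # measured ratio of an unknown
    print((unknown_ratio - intercept) / slope)                    # back-calculated ng/mL
    ```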

  14. Improvement of the tetrachloromercurate absorption technique for measuring low atmospheric SO2 mixing ratios

    NASA Astrophysics Data System (ADS)

    Jaeschke, W.; Beltz, N.; Haunold, W.; Krischke, U.

    1997-07-01

    During the Gas-Phase Sulfur Intercomparison Experiment (GASIE) in 1994 an analytical system for measuring sulfur dioxide mixing ratios at low parts per trillion (pptv) levels was employed. It is based on the absorption of SO2 on a tetrachloromercurate(II)-impregnated filter. The subsequent analysis uses a chemiluminescence reaction by treating the resulting disulfitomercurate(II) complex with an acidic cerium sulfate solution. An improved sampling device has been introduced that increases the maximum sampling volume from 200 L to 500 L. It is also possible to determine the blank value accurately for each sample. The absorption efficiency of the sampling system is 98.7±6.4% at a nominal flow rate of 10 L/min. The calculated (3σ) detection limit is 3±1 pptv SO2. The sample solution is stable for up to 30 days, which allows the samples to be safely stored or shipped before analysis. This permits the use of a sensitive, compact, and reliable sampling system in the field with subsequent analysis under optimal conditions in the laboratory. A continuous flow chemiluminescence (CFCL) analyzer for on-line measurements is also presented. The system is based on the same chemical principles as the described filter technique.

  15. Rapid DNA analysis for automated processing and interpretation of low DNA content samples.

    PubMed

    Turingan, Rosemary S; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F

    2016-01-01

    Casework samples with low DNA content suitable for short tandem repeat (STR) analysis include those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle or the brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or foe, criminal forensics to identify suspects and exonerate the innocent, and medical examiner and coroner offices to identify missing persons. Processing LDC samples requires experienced laboratory personnel, isolated workstations, and sophisticated equipment, involves complex procedures, and adds transport time. We present a rapid DNA analysis system designed specifically to generate STR profiles from LDC samples in field-forward settings by non-technical operators. By performing STR analysis in the field, close to the site of collection, rapid DNA analysis has the potential to increase throughput and to provide actionable information in real time. A Low DNA Content BioChipSet (LDC BCS) was developed and manufactured by injection molding. It was designed to function in the fully integrated Accelerated Nuclear DNA Equipment (ANDE) instrument previously designed for analysis of buccal swab and other high DNA content samples (Investigative Genet. 4(1):1-15, 2013). The LDC BCS performs efficient DNA purification followed by microfluidic ultrafiltration of the purified DNA, maximizing the quantity of DNA available for subsequent amplification and electrophoretic separation and detection of amplified fragments. The system demonstrates accuracy, precision, resolution, signal strength, and peak height ratios appropriate for casework analysis. The LDC rapid DNA analysis system is effective for the generation of STR profiles from a wide range of sample types. The technology broadens the range of sample types that can be processed and minimizes the time between sample collection, sample processing and analysis, and generation of actionable intelligence. The fully integrated Expert System is capable of interpreting a wide range of sample types and input DNA quantities, allowing samples to be processed and interpreted without a technical operator.

  16. Analysis of simulated high burnup nuclear fuel by laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Singh, Manjeet; Sarkar, Arnab; Banerjee, Joydipta; Bhagat, R. K.

    2017-06-01

    Advanced Heavy Water Reactor (AHWR) grade (Th-U)O2 fuel sample and Simulated High Burn-Up Nuclear Fuel (SIMFUEL) samples mimicking 28 and 43 GWd/Te irradiated burn-up fuel were studied using a laser-induced breakdown spectroscopy (LIBS) setup in a simulated hot-cell environment from a distance of > 1.5 m. A resolution of < 38 pm was used to record the complex spectra of the SIMFUEL samples. Using spectrum comparison and database matching, > 60 emission lines of fission products were identified. Among them, only a few emission lines were suitable for generating calibration curves. The study demonstrates the possibility of investigating impurities at concentrations around hundreds of ppm, rapidly, at atmospheric pressure and without any sample preparation. The results for Ba and Mo showed the advantage of LIBS analysis over traditional methods involving sample dissolution, which introduces possible elemental loss. Limits of detection (LOD) under Ar atmosphere show significant improvement, which is shown to be due to the formation of stable plasma.

  17. Risk-Based Sampling: I Don't Want to Weight in Vain.

    PubMed

    Powell, Mark R

    2015-12-01

    Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. © 2015 Society for Risk Analysis.
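
    To make the estimation-error point concrete, the toy sketch below contrasts mean-variance weights estimated from a short return history with an equal-allocation heuristic; the asset count, window length and return model are arbitrary assumptions, not the paper's simulation.

    ```python
    # Toy illustration of estimation error in mean-variance optimization versus
    # a simple equal-allocation heuristic; all parameters are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    n_assets, n_obs = 5, 24                         # few "assets", short estimation window
    true_mean = np.full(n_assets, 0.05)
    true_cov = 0.02 * np.eye(n_assets)              # truth: identical, uncorrelated assets

    returns = rng.multivariate_normal(true_mean, true_cov, size=n_obs)
    mu_hat = returns.mean(axis=0)
    cov_hat = np.cov(returns, rowvar=False)

    w_mv = np.linalg.solve(cov_hat, mu_hat)         # unconstrained mean-variance direction
    w_mv /= w_mv.sum()                              # normalize to weights summing to one
    w_eq = np.full(n_assets, 1.0 / n_assets)        # equal allocation

    print("estimated MV weights:", np.round(w_mv, 2))   # typically far from 0.2 each
    print("equal allocation    :", np.round(w_eq, 2))
    ```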

  18. Application of dual-cloud point extraction for the trace levels of copper in serum of different viral hepatitis patients by flame atomic absorption spectrometry: A multivariate study

    NASA Astrophysics Data System (ADS)

    Arain, Salma Aslam; Kazi, Tasneem G.; Afridi, Hassan Imran; Abbasi, Abdul Rasool; Panhwar, Abdul Haleem; Naeemullah; Shanker, Bhawani; Arain, Mohammad Balal

    2014-12-01

    An efficient, innovative preconcentration method, dual-cloud point extraction (d-CPE), has been developed for the extraction and preconcentration of copper (Cu2+) in serum samples of different viral hepatitis patients prior to determination by flame atomic absorption spectrometry (FAAS). The d-CPE procedure was based on forming complexes of elemental ions with the complexing reagent 1-(2-pyridylazo)-2-naphthol (PAN), and subsequently entrapping the complexes in nonionic surfactant (Triton X-114). The surfactant-rich phase containing the metal complexes was then treated with aqueous nitric acid solution, and the metal ions were back-extracted into the aqueous phase as the second cloud point extraction stage, and finally determined by flame atomic absorption spectrometry using conventional nebulization. A multivariate strategy was applied to estimate the optimum values of experimental variables for the recovery of Cu2+ using d-CPE. Under optimum experimental conditions, the limit of detection and the enrichment factor were 0.046 μg L-1 and 78, respectively. The validity and accuracy of the proposed method were checked by analysis of Cu2+ in a certified reference material (CRM) of serum by both the d-CPE and conventional CPE procedures on the same CRM. The proposed method was successfully applied to the determination of Cu2+ in serum samples of different viral hepatitis patients and healthy controls.

  19. Predictive Engineering Tools for Injection-Molded Long-Carbon-Fiber Thermoplastic Composites - Fourth FY 2015 Quarterly Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Fifield, Leonard S.; Wollan, Eric J.

    2015-11-13

    During the last quarter of FY 2015, the following technical progress has been made toward project milestones: 1) PlastiComp used the PlastiComp direct in-line (D-LFT) Pushtrusion system to injection mold 40 ribbed and 40 non-ribbed 30 wt% LCF/PP parts, and 10 ribbed and 35 non-ribbed 30 wt% LCF/PA66 parts. In addition, purge materials from the injection molding nozzle were obtained for fiber length analysis, and molding parameters were sent to PNNL for process modeling. 2) Magna cut samples at four selected locations (named A, B, C and D) from the non-ribbed Magna-molded parts based on a plan discussed with PNNL and the team and shipped these samples to Virginia Tech for fiber orientation and length measurements. 3) Virginia Tech started fiber orientation and length measurements for the samples taken from the complex parts using Virginia Tech's established procedure. 4) PNNL and Autodesk built ASMI models for the complex parts with and without ribs, reviewed process datasheets and performed preliminary analyses of these complex parts using the actual molding parameters received from Magna and PlastiComp to compare predicted to experimental mold filling patterns. 5) Autodesk assisted PNNL in developing the workflow to use Moldflow fiber orientation and length results in ABAQUS® simulations. 6) Autodesk advised the team on the practicality and difficulty of material viscosity characterization from the D-LFT process. 7) PNNL developed a procedure to import fiber orientation and length results from a 3D ASMI analysis to a 3D ABAQUS® model for structural analyses of the complex part for a later weight reduction study. 8) In discussion with PNNL and Magna, Toyota developed mechanical test setups and built fixtures for three-point bending and torsion tests of the complex parts. 9) Toyota built a finite element model for the complex parts subjected to torsion loading. 10) PNNL built the 3D ABAQUS® model of the complex ribbed part subjected to 3-point bending. 11) University of Illinois (Prof. C.L. Tucker) advised the team on fiber orientation and fiber length measurement options, modeling issues as well as interpretation of data.

  20. STRUCTURE IN THE 3D GALAXY DISTRIBUTION: III. FOURIER TRANSFORMING THE UNIVERSE: PHASE AND POWER SPECTRA.

    PubMed

    Scargle, Jeffrey D; Way, M J; Gazis, P R

    2017-04-10

    We demonstrate the effectiveness of a relatively straightforward analysis of the complex 3D Fourier transform of galaxy coordinates derived from redshift surveys. Numerical demonstrations of this approach are carried out on a volume-limited sample of the Sloan Digital Sky Survey redshift survey. The direct unbinned transform yields a complex 3D data cube quite similar to that from the Fast Fourier Transform (FFT) of finely binned galaxy positions. In both cases deconvolution of the sampling window function yields estimates of the true transform. Simple power spectrum estimates from these transforms are roughly consistent with those using more elaborate methods. The complex Fourier transform characterizes spatial distributional properties beyond the power spectrum in a manner different from (and we argue is more easily interpreted than) the conventional multi-point hierarchy. We identify some threads of modern large scale inference methodology that will presumably yield detections in new wider and deeper surveys.
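
    The two transforms being compared can be written down compactly. The sketch below evaluates both the FFT of binned positions and the direct unbinned sum over synthetic points in a periodic box; the box size, point count and grid are arbitrary, and the survey window deconvolution discussed in the abstract is omitted.

    ```python
    # Binned FFT versus direct unbinned Fourier transform of point positions.
    import numpy as np

    rng = np.random.default_rng(1)
    L, n_gal, n_grid = 100.0, 500, 16
    pos = rng.uniform(0.0, L, size=(n_gal, 3))               # synthetic "galaxy" coordinates

    # (a) FFT of finely binned positions
    grid, _ = np.histogramdd(pos, bins=n_grid, range=[(0.0, L)] * 3)
    delta_k_fft = np.fft.fftn(grid)

    # (b) direct unbinned transform, sum_j exp(-i k.x_j), on the same k grid
    k = 2.0 * np.pi * np.fft.fftfreq(n_grid, d=L / n_grid)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    kvec = np.stack([kx, ky, kz], axis=-1).reshape(-1, 3)
    delta_k_direct = np.exp(-1j * (kvec @ pos.T)).sum(axis=1).reshape(n_grid, n_grid, n_grid)

    power = np.abs(delta_k_direct) ** 2 / n_gal              # crude per-mode power estimate
    print(np.abs(delta_k_fft - delta_k_direct).max())        # how much binning shifts the transform
    ```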

  1. STRUCTURE IN THE 3D GALAXY DISTRIBUTION: III. FOURIER TRANSFORMING THE UNIVERSE: PHASE AND POWER SPECTRA

    PubMed Central

    Scargle, Jeffrey D.; Way, M. J.; Gazis, P. R.

    2017-01-01

    We demonstrate the effectiveness of a relatively straightforward analysis of the complex 3D Fourier transform of galaxy coordinates derived from redshift surveys. Numerical demonstrations of this approach are carried out on a volume-limited sample of the Sloan Digital Sky Survey redshift survey. The direct unbinned transform yields a complex 3D data cube quite similar to that from the Fast Fourier Transform (FFT) of finely binned galaxy positions. In both cases deconvolution of the sampling window function yields estimates of the true transform. Simple power spectrum estimates from these transforms are roughly consistent with those using more elaborate methods. The complex Fourier transform characterizes spatial distributional properties beyond the power spectrum in a manner different from (and we argue is more easily interpreted than) the conventional multi-point hierarchy. We identify some threads of modern large scale inference methodology that will presumably yield detections in new wider and deeper surveys. PMID:29628519

  2. Structure in the 3D Galaxy Distribution. III. Fourier Transforming the Universe: Phase and Power Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scargle, Jeffrey D.; Way, M. J.; Gazis, P. R., E-mail: Jeffrey.D.Scargle@nasa.gov, E-mail: Michael.J.Way@nasa.gov, E-mail: PGazis@sbcglobal.net

    We demonstrate the effectiveness of a relatively straightforward analysis of the complex 3D Fourier transform of galaxy coordinates derived from redshift surveys. Numerical demonstrations of this approach are carried out on a volume-limited sample of the Sloan Digital Sky Survey redshift survey. The direct unbinned transform yields a complex 3D data cube quite similar to that from the Fast Fourier Transform of finely binned galaxy positions. In both cases, deconvolution of the sampling window function yields estimates of the true transform. Simple power spectrum estimates from these transforms are roughly consistent with those using more elaborate methods. The complex Fourier transform characterizes spatial distributional properties beyond the power spectrum in a manner different from (and we argue is more easily interpreted than) the conventional multipoint hierarchy. We identify some threads of modern large-scale inference methodology that will presumably yield detections in new wider and deeper surveys.

  3. Structure in the 3D Galaxy Distribution: III. Fourier Transforming the Universe: Phase and Power Spectra

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Way, M. J.; Gazis, P. R.

    2017-01-01

    We demonstrate the effectiveness of a relatively straightforward analysis of the complex 3D Fourier transform of galaxy coordinates derived from redshift surveys. Numerical demonstrations of this approach are carried out on a volume-limited sample of the Sloan Digital Sky Survey redshift survey. The direct unbinned transform yields a complex 3D data cube quite similar to that from the Fast Fourier Transform (FFT) of finely binned galaxy positions. In both cases deconvolution of the sampling window function yields estimates of the true transform. Simple power spectrum estimates from these transforms are roughly consistent with those using more elaborate methods. The complex Fourier transform characterizes spatial distributional properties beyond the power spectrum in a manner different from (and we argue is more easily interpreted than) the conventional multi-point hierarchy. We identify some threads of modern large scale inference methodology that will presumably yield detections in new wider and deeper surveys.

  4. Neutron diffraction study of the in situ oxidation of UO(2).

    PubMed

    Desgranges, Lionel; Baldinozzi, Gianguido; Rousseau, Gurvan; Nièpce, Jean-Claude; Calvarin, Gilbert

    2009-08-17

    This paper discusses uranium oxide crystal structure modifications that are observed during the low-temperature oxidation which transforms UO(2) into U(3)O(8). The symmetries and the structural parameters of UO(2), beta-U(4)O(9), beta-U(3)O(7), and U(3)O(8) were determined by refining neutron diffraction patterns on pure single-phase samples. Neutron diffraction patterns were also collected during the in situ oxidation of powder samples at 483 K. The lattice parameters and relative ratios of the four pure phases were measured during the progression of the isothermal oxidation. The transformation of UO(2) into U(3)O(8) involves a complex modification of the oxygen sublattice and the onset of complex superstructures for U(4)O(9) and U(3)O(7), associated with regular stacks of complex defects known as cuboctahedra, which consist of 13 oxygen atoms. The kinetics of the oxidation process are discussed on the basis of the results of the structural analysis.

  5. Advancing Clinical Proteomics via Analysis Based on Biological Complexes: A Tale of Five Paradigms.

    PubMed

    Goh, Wilson Wen Bin; Wong, Limsoon

    2016-09-02

    Despite advances in proteomic technologies, idiosyncratic data issues, for example, incomplete coverage and inconsistency, resulting in large data holes, persist. Moreover, because of naïve reliance on statistical testing and its accompanying p values, differential protein signatures identified from such proteomics data have little diagnostic power. Thus, deploying conventional analytics on proteomics data is insufficient for identifying novel drug targets or precise yet sensitive biomarkers. Complex-based analysis is a new analytical approach that has potential to resolve these issues but requires formalization. We categorize complex-based analysis into five method classes or paradigms and propose an even-handed yet comprehensive evaluation rubric based on both simulated and real data. The first four paradigms are well represented in the literature. The fifth and newest paradigm, the network-paired (NP) paradigm, represented by a method called Extremely Small SubNET (ESSNET), dominates in precision-recall and reproducibility, maintains strong performance in small sample sizes, and sensitively detects low-abundance complexes. In contrast, the commonly used over-representation analysis (ORA) and direct-group (DG) test paradigms maintain good overall precision but have severe reproducibility issues. The other two paradigms considered here are the hit-rate and rank-based network analysis paradigms; both of these have good precision-recall and reproducibility, but they do not consider low-abundance complexes. Therefore, given its strong performance, NP/ESSNET may prove to be a useful approach for improving the analytical resolution of proteomics data. Additionally, given its stability, it may also be a powerful new approach toward functional enrichment tests, much like its ORA and DG counterparts.

  6. Miniaturizing and automation of free acidity measurements for uranium (VI)-HNO3 solutions: Development of a new sequential injection analysis for a sustainable radio-analytical chemistry.

    PubMed

    Néri-Quiroz, José; Canto, Fabrice; Guillerme, Laurent; Couston, Laurent; Magnaldo, Alastair; Dugas, Vincent

    2016-10-01

    A miniaturized and automated approach for the determination of free acidity in solutions containing uranium (VI) is presented. The measurement technique is based on the concept of sequential injection analysis with on-line spectroscopic detection. The proposed methodology relies on the complexation and alkalimetric titration of nitric acid using a pH 5.6 sodium oxalate solution. The titration process is followed by UV/VIS detection at 650 nm thanks to the addition of Congo red as a universal pH indicator. The mixing sequence as well as method validity was investigated by numerical simulation. This new analytical design allows fast (2.3 min), reliable and accurate free acidity determination of low-volume samples (10 µL) containing a uranium/[H(+)] mole ratio of 1:3, with a relative standard deviation of <7.0% (n=11). The linearity range of the free nitric acid measurement is excellent up to 2.77 mol L(-1), with a correlation coefficient (R(2)) of 0.995. The method is specific; the presence of actinide ions up to 0.54 mol L(-1) does not interfere with the determination of free nitric acid. In addition to automation, the developed sequential injection analysis method greatly improves on the standard off-line oxalate complexation and alkalimetric titration method by reducing the required sample volume a thousandfold, the nuclear waste per analysis fortyfold, and the analysis time eightfold. These analytical parameters are especially important in nuclear-related applications to improve laboratory safety, reduce personnel exposure to radioactive samples, and drastically reduce environmental impacts and analytical radioactive waste. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Microfluidics-to-Mass Spectrometry: A review of coupling methods and applications

    PubMed Central

    Wang, Xue; Yi, Lian; Mukhitov, Nikita; Schrell, Adrian M.; Dhumpa, Raghuram; Roper, Michael G.

    2014-01-01

    Microfluidic devices offer great advantages in integrating sample processes, minimizing sample and reagent volumes, and increasing analysis speed, while mass spectrometry detection provides high information content, is sensitive, and can be used in quantitative analyses. The coupling of microfluidic devices to mass spectrometers is becoming more common, with the strengths of both systems being combined to analyze precious and complex samples. This review summarizes select achievements published between 2010 and July 2014 in novel coupling between microfluidic devices and mass spectrometers. The review is subdivided by the types of ionization sources employed and the different microfluidic systems used. PMID:25458901

  8. Anopheles gambiae complex (Diptera:Culicidae) near Bissau City, Guinea Bissau, West Africa.

    PubMed

    Fonseca, L F; Di Deco, M A; Carrara, G C; Dabo, I; Do Rosario, V; Petrarca, V

    1996-11-01

    Cytogenetic studies on mosquitoes collected inside bednets near Bissau City confirmed the presence of Anopheles melas Theobald and An. gambiae Giles sensu stricto, the latter species prevailing in rainy season samples (approximately 80% on average) and the former in dry season samples (> 90%). Seasonal and ecogeographical variations in the frequency of species and chromosomal inversions were analyzed. The analysis of An. gambiae sensu stricto confirmed the existence of the Bissau chromosomal form. The deficiency of heterokaryotypes in most samples indicated the possible coexistence of another chromosomal form not completely panmictic (i.e., randomly mating) with the Bissau form.

  9. Metal oxide based multisensor array and portable database for field analysis of antioxidants

    PubMed Central

    Sharpe, Erica; Bradley, Ryan; Frasco, Thalia; Jayathilaka, Dilhani; Marsh, Amanda; Andreescu, Silvana

    2014-01-01

    We report a novel chemical sensing array based on metal oxide nanoparticles as a portable and inexpensive paper-based colorimetric method for polyphenol detection and field characterization of antioxidant containing samples. Multiple metal oxide nanoparticles with various polyphenol binding properties were used as active sensing materials to develop the sensor array and establish a database of polyphenol standards that include epigallocatechin gallate, gallic acid, resveratrol, and Trolox among others. Unique charge-transfer complexes are formed between each polyphenol and each metal oxide on the surface of individual sensors in the array, creating distinct optically detectable signals which have been quantified and logged into a reference database for polyphenol identification. The field-portable Pantone/X-Rite© CapSure® color reader was used to create this database and to facilitate rapid colorimetric analysis. The use of multiple metal-oxide sensors allows for cross-validation of results and increases accuracy of analysis. The database has enabled successful identification and quantification of antioxidant constituents within real botanical extractions including green tea. Formation of charge-transfer complexes is also correlated with antioxidant activity exhibiting electron transfer capabilities of each polyphenol. The antioxidant activity of each sample was calculated and validated against the oxygen radical absorbance capacity (ORAC) assay showing good comparability. The results indicate that this method can be successfully used for a more comprehensive analysis of antioxidant containing samples as compared to conventional methods. This technology can greatly simplify investigations into plant phenolics and make possible the on-site determination of antioxidant composition and activity in remote locations. PMID:24610993

  10. Band Excitation for Scanning Probe Microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jesse, Stephen

    2017-01-02

    The Band Excitation (BE) technique for scanning probe microscopy uses a precisely determined waveform that contains specific frequencies to excite the cantilever or sample in an atomic force microscope, in order to extract more information, and more reliable information, from a sample. There are a myriad of details and complexities associated with implementing the BE technique, so there is a need for a user-friendly interface that gives typical microscopists access to this methodology. This software enables users of atomic force microscopes to easily: build complex band-excitation waveforms, set up the microscope scanning conditions, configure the input and output electronics to generate the waveform as a voltage signal and capture the response of the system, perform analysis on the captured response, and display the results of the measurement.
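
    A band-excitation waveform of the kind described is typically synthesized by placing energy only in a chosen frequency band and inverse-transforming; the following sketch shows that construction with illustrative (not instrument-specific) sample rate, band edges and randomized phases.

    ```python
    # Construct a band-limited excitation waveform by inverse FFT; parameters are
    # illustrative, not instrument settings.
    import numpy as np

    fs, n = 4.0e6, 2 ** 14                    # sample rate (Hz), number of samples
    f_lo, f_hi = 250e3, 350e3                 # excitation band around a notional resonance
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)

    rng = np.random.default_rng(7)
    spectrum = np.zeros(freqs.size, dtype=complex)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    spectrum[band] = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, band.sum()))  # flat magnitude, random phase

    waveform = np.fft.irfft(spectrum, n)      # time-domain excitation signal
    waveform /= np.abs(waveform).max()        # scale to the output range of the DAC
    ```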

  11. Skylab M518 multipurpose furnace convection analysis

    NASA Technical Reports Server (NTRS)

    Bourgeois, S. V.; Spradley, L. W.

    1975-01-01

    An analysis was performed of the convection which existed during ground tests and during Skylab processing of two experiments: vapor growth of IV-VI compounds and growth of spherical crystals. A parallel analysis was also performed for the Skylab indium antimonide crystals experiment, because indium antimonide (InSb) was used and a free surface existed in the tellurium-doped Skylab III sample. In addition, brief analyses were also performed of the microsegregation in germanium experiment, because the Skylab crystals indicated turbulent convection effects. Simple dimensional analysis calculations and a more accurate, but complex, convection computer model were used in the analysis.

  12. Direct analysis of δ13C and concentration of dissolved organic carbon (DOC) in environmental samples by TOC-IRMS

    NASA Astrophysics Data System (ADS)

    Kirkels, Frédérique; Cerli, Chiara; Federherr, Eugen; Kalbitz, Karsten

    2014-05-01

    Dissolved organic carbon (DOC) plays an important role in carbon cycling in terrestrial and aquatic systems. Stable isotope analysis (δ13C) of DOC could provide valuable insights into its origin, fluxes and environmental fate. Precise and routine analysis of δ13C and DOC concentration is therefore highly desirable. A promising new system has been developed for this purpose, linking a high-temperature combustion TOC analyzer through an interface with a continuous flow isotope ratio mass spectrometer (Elementar group, Hanau, Germany). This TOC-IRMS system enables simultaneous stable isotope (bulk δ13C) and concentration analysis of DOC, with high oxidation efficiency by high-temperature combustion for complex mixtures such as natural DOC. To give δ13C analysis by TOC-IRMS the necessary impetus for broad-scale application, we present a detailed evaluation of its analytical performance under realistic and challenging conditions, including low DOC concentrations and environmental samples. High precision (standard deviation, SD, predominantly < 0.15 permil) and accuracy (R2 = 0.9997 for the comparison of TOC-IRMS and conventional EA-IRMS) were achieved by TOC-IRMS for a broad diversity of DOC solutions. This precision is comparable to or even slightly better than that typically reported for EA-IRMS systems, and improves on previous techniques for δ13C analysis of DOC. Simultaneously, very good precision was obtained for DOC concentration measurements. Assessment of natural-abundance and slightly 13C-enriched DOC, over a wide range of concentrations (0.2-150 mgC/L) and injection volumes (0.05-3 ml), demonstrated good analytical performance with negligible memory effects, no concentration/volume effects and wide linearity. Low DOC concentrations (< 2 mgC/L) were correctly analyzed without any pre-concentration. Moreover, TOC-IRMS was successfully applied to analyze DOC from diverse terrestrial, freshwater and marine environments (SD < 0.23 permil). In summary, TOC-IRMS performs fast and reliable analysis of DOC concentration and δ13C in aqueous samples, without any pre-concentration or freeze-drying. Flexible usage is highlighted by automated, online analysis, a variable injection volume, high throughput and no extensive maintenance. Sample analysis is simple, using small aliquots and minimal sample preparation. Further investigations should focus on complex, saline matrices and very low DOC concentrations, to achieve a potential lower limit of 0.2 mgC/L. High-resolution, routine δ13C analysis of DOC by TOC-IRMS offers opportunities for wide-scale application in terrestrial, freshwater and marine research to elucidate the role of DOC in biogeochemical processes and ecosystem functioning.
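
    For reference, the δ13C value reported by such instruments is simply the per mil deviation of the sample's 13C/12C ratio from the VPDB standard; the one-liner below shows the arithmetic, with an invented sample ratio and a commonly cited value for the reference ratio.

    ```python
    # delta 13C in per mil relative to the VPDB reference ratio.
    R_VPDB = 0.0111802                      # commonly cited 13C/12C ratio of VPDB

    def delta13C(r_sample: float, r_ref: float = R_VPDB) -> float:
        return (r_sample / r_ref - 1.0) * 1000.0

    print(delta13C(0.010870))               # hypothetical DOC sample, about -27.7 per mil
    ```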

  13. Glycan characterization of the NIST RM monoclonal antibody using a total analytical solution: From sample preparation to data analysis.

    PubMed

    Hilliard, Mark; Alley, William R; McManus, Ciara A; Yu, Ying Qing; Hallinan, Sinead; Gebler, John; Rudd, Pauline M

    Glycosylation is an important attribute of biopharmaceutical products to monitor from development through production. However, glycosylation analysis has traditionally been a time-consuming process with long sample preparation protocols and manual interpretation of the data. To address the challenges associated with glycan analysis, we developed a streamlined analytical solution that covers the entire process from sample preparation to data analysis. In this communication, we describe the complete analytical solution that begins with a simplified and fast N-linked glycan sample preparation protocol that can be completed in less than 1 hr. The sample preparation includes labelling with RapiFluor-MS tag to improve both fluorescence (FLR) and mass spectral (MS) sensitivities. Following HILIC-UPLC/FLR/MS analyses, the data are processed and a library search based on glucose units has been included to expedite the task of structural assignment. We then applied this total analytical solution to characterize the glycosylation of the NIST Reference Material mAb 8761. For this glycoprotein, we confidently identified 35 N-linked glycans and all three major classes, high mannose, complex, and hybrid, were present. The majority of the glycans were neutral and fucosylated; glycans featuring N-glycolylneuraminic acid and those with two galactoses connected via an α1,3-linkage were also identified.

  14. PyBioMed: a python library for various molecular representations of chemicals, proteins and DNAs and their interactions.

    PubMed

    Dong, Jie; Yao, Zhi-Jiang; Zhang, Lin; Luo, Feijun; Lin, Qinlu; Lu, Ai-Ping; Chen, Alex F; Cao, Dong-Sheng

    2018-03-20

    With the increasing development of biotechnology and informatics technology, publicly available data in chemistry and biology are undergoing explosive growth. The wealth of information in these data needs to be extracted and transformed into useful knowledge by various data mining methods. Considering the rate at which data are accumulated in chemistry and biology, new tools that process and interpret large and complex interaction data are increasingly important. So far, there are no suitable toolkits that can effectively link the chemical and biological space in terms of molecular representation. To further explore these complex data, an integrated toolkit for various molecular representations is urgently needed, one that could be easily integrated with data mining algorithms to start a full data analysis pipeline. Herein, the Python library PyBioMed is presented, which comprises functionalities for online download of various molecular objects by providing different IDs, the pretreatment of molecular structures, and the computation of various molecular descriptors for chemicals, proteins, DNAs and their interactions. PyBioMed is a feature-rich and highly customizable Python library for the characterization of various complex chemical and biological molecules and interaction samples. The current version of PyBioMed can calculate 775 chemical descriptors and 19 kinds of chemical fingerprints, 9920 protein descriptors based on protein sequences, more than 6000 DNA descriptors from nucleotide sequences, and interaction descriptors from pairwise samples using three different combining strategies. Several examples and five real-life applications are provided to clearly guide users in using PyBioMed as an integral part of data analysis projects. Using PyBioMed, users are able to carry out a full pipeline from obtaining molecular data, pretreating molecules and computing molecular representations to constructing machine learning models. PyBioMed provides various user-friendly and highly customizable APIs to calculate features of biological molecules and complex interaction samples conveniently, with the aim of building integrated analysis pipelines from data acquisition and data checking to descriptor calculation and modeling. PyBioMed is freely available at http://projects.scbdd.com/pybiomed.html .

  15. Monte Carlo isotopic inventory analysis for complex nuclear systems

    NASA Astrophysics Data System (ADS)

    Phruksarojanakun, Phiphat

    Monte Carlo Inventory Simulation Engine (MCise) is a newly developed method for calculating isotopic inventory of materials. It offers the promise of modeling materials with complex processes and irradiation histories, which pose challenges for current, deterministic tools, and has strong analogies to Monte Carlo (MC) neutral particle transport. The analog method, including considerations for simple, complex and loop flows, is fully developed. In addition, six variance reduction tools provide unique capabilities of MCise to improve statistical precision of MC simulations. Forced Reaction forces an atom to undergo a desired number of reactions in a given irradiation environment. Biased Reaction Branching primarily focuses on improving statistical results of the isotopes that are produced from rare reaction pathways. Biased Source Sampling aims at increasing frequencies of sampling rare initial isotopes as the starting particles. Reaction Path Splitting increases the population by splitting the atom at each reaction point, creating one new atom for each decay or transmutation product. Delta Tracking is recommended for high-frequency pulsing to reduce the computing time. Lastly, Weight Window is introduced as a strategy to decrease large deviations of weight due to the use of variance reduction techniques. A figure of merit is necessary to compare the efficiency of different variance reduction techniques. A number of possibilities for the figure of merit are explored, two of which are robust and subsequently used. One is based on the relative error of a known target isotope (1/R_T²) and the other on the overall detection limit corrected by the relative error (1/(D_k·R_T²)). An automated Adaptive Variance-reduction Adjustment (AVA) tool is developed to iteratively define parameters for some variance reduction techniques in a problem with a target isotope. Sample problems demonstrate that AVA improves both precision and accuracy of a target result in an efficient manner. Potential applications of MCise include molten salt fueled reactors and liquid breeders in fusion blankets. As an example, the inventory analysis of a liquid actinide fuel in the In-Zinerator, a sub-critical power reactor driven by a fusion source, is examined. The result reaffirms MCise as a reliable tool for inventory analysis of complex nuclear systems.
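
    The figure-of-merit idea is the standard Monte Carlo one: efficiency improves when the squared relative error times the computing time goes down. The snippet below computes FOM = 1/(R²·T) for a mock tally; it is a generic illustration with made-up scores, not MCise code.

    ```python
    # Generic Monte Carlo figure of merit: FOM = 1 / (R^2 * T).
    import time
    import numpy as np

    rng = np.random.default_rng(3)

    start = time.perf_counter()
    scores = rng.exponential(scale=1.0e-4, size=200_000)          # stand-in per-history tallies
    elapsed = time.perf_counter() - start                         # "computing time" T

    mean = scores.mean()
    rel_err = scores.std(ddof=1) / np.sqrt(scores.size) / mean    # relative error R of the tally
    fom = 1.0 / (rel_err ** 2 * elapsed)
    print(f"R = {rel_err:.3e}, T = {elapsed:.3f} s, FOM = {fom:.3e}")
    ```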

  16. A ricin forensic profiling approach based on a complex set of biomarkers.

    PubMed

    Fredriksson, Sten-Åke; Wunschel, David S; Lindström, Susanne Wiklund; Nilsson, Calle; Wahl, Karen; Åstot, Crister

    2018-08-15

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids and seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1-PM4), ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data were collected using a range of analytical methods, and robust orthogonal partial least squares discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Platform construction and extraction mechanism study of magnetic mixed hemimicelles solid-phase extraction

    PubMed Central

    Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua

    2016-01-01

    A simple, accurate and high-throughput pretreatment method would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism limit the development of this promising method. Herein, this work tries to establish theory-based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interaction, π-π stacking interactions, hydrogen-bonding interaction, electrostatic interaction) for the first time. Moreover, application guidelines for supporting materials, surfactants and sample matrix were also summarized. The extraction mechanism and platform established in this study make it promising for predictable and efficient pretreatment, under theory-based experimental design, of trace analytes from environmental, biological and clinical samples. PMID:27924944

  18. Global Sensitivity Analysis of Environmental Models: Convergence, Robustness and Validation

    NASA Astrophysics Data System (ADS)

    Sarrazin, Fanny; Pianosi, Francesca; Khorashadi Zadeh, Farkhondeh; Van Griensven, Ann; Wagener, Thorsten

    2015-04-01

    Global Sensitivity Analysis aims to characterize the impact that variations in model input factors (e.g. the parameters) have on the model output (e.g. simulated streamflow). In sampling-based Global Sensitivity Analysis, the sample size has to be chosen carefully in order to obtain reliable sensitivity estimates while spending computational resources efficiently. Furthermore, insensitive parameters are typically identified through the definition of a screening threshold: the theoretical value of their sensitivity index is zero, but in a sampling-based framework they regularly take non-zero values. However, little guidance is available for these two steps in environmental modelling. The objective of the present study is to support modellers in making appropriate choices, regarding both sample size and screening threshold, so that a robust sensitivity analysis can be implemented. We performed sensitivity analysis for the parameters of three hydrological models with increasing level of complexity (Hymod, HBV and SWAT), and tested three widely used sensitivity analysis methods (Elementary Effect Test or method of Morris, Regional Sensitivity Analysis, and Variance-Based Sensitivity Analysis). We defined criteria based on a bootstrap approach to assess three different types of convergence: the convergence of the value of the sensitivity indices, of the ranking (the ordering among the parameters) and of the screening (the identification of the insensitive parameters). We investigated the screening threshold through the definition of a validation procedure. The results showed that full convergence of the value of the sensitivity indices is not necessarily needed to rank or to screen the model input factors. Furthermore, typical values of the sample sizes that are reported in the literature can be well below the sample sizes that actually ensure convergence of ranking and screening.
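
    A bootstrap convergence check of the kind described can be prototyped quickly: resample the available model runs with replacement, recompute the sensitivity indices, and monitor the width of the resulting confidence intervals. The sketch below uses a toy three-parameter function and a crude correlation-based index, not the Morris/RSA/variance-based estimators or the hydrological models of the study.

    ```python
    # Bootstrap-based convergence criterion for sampling-based sensitivity indices.
    import numpy as np

    rng = np.random.default_rng(42)

    def model(x):                                   # cheap stand-in for a simulation model
        return 3.0 * x[:, 0] + x[:, 1] ** 2 + 0.1 * x[:, 2]

    def indices(x, y):                              # crude correlation-based sensitivity measure
        return np.array([abs(np.corrcoef(x[:, i], y)[0, 1]) for i in range(x.shape[1])])

    n, n_boot = 2000, 500
    x = rng.uniform(0.0, 1.0, size=(n, 3))
    y = model(x)

    boot = np.empty((n_boot, x.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, n)                 # resample the model runs with replacement
        boot[b] = indices(x[idx], y[idx])

    lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
    print("indices:      ", indices(x, y).round(2))
    print("95% CI widths:", (hi - lo).round(3))     # shrinking widths signal convergence
    ```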

  19. Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
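
    The 'reduce then sample' idea can be caricatured in a few lines: once a cheap surrogate of the posterior is available, an ordinary random-walk Metropolis chain can be run against it instead of the expensive forward model. The quadratic surrogate and tuning numbers below are purely illustrative.

    ```python
    # Random-walk Metropolis against a cheap surrogate log-posterior ("reduce then sample").
    import numpy as np

    rng = np.random.default_rng(13)

    def surrogate_log_post(theta):                 # stand-in for a reduced-order model
        return -0.5 * np.sum((theta - 1.0) ** 2) / 0.2 ** 2

    theta = np.zeros(3)
    chain = []
    for _ in range(5000):
        proposal = theta + 0.1 * rng.standard_normal(3)
        if np.log(rng.uniform()) < surrogate_log_post(proposal) - surrogate_log_post(theta):
            theta = proposal                       # accept the move
        chain.append(theta.copy())

    print(np.mean(chain, axis=0))                  # posterior mean under the surrogate (~[1, 1, 1])
    ```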

  20. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghattas, Omar

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.

  1. Analysis of the Magnetic Field Influence on the Rheological Properties of Healthy Persons Blood

    PubMed Central

    Nawrocka-Bogusz, Honorata

    2013-01-01

    The influence of magnetic fields on whole blood rheological properties remains a poorly understood phenomenon. An in vitro analysis of the magnetic field influence on the rheological properties of healthy persons' blood is presented in this work. The study was performed on blood samples taken from 25 healthy non-smoking persons and included a comparative analysis of the results of both the standard rotary method (flow curve measurement) and the oscillatory method, also known as mechanical dynamic analysis, performed before and after exposure of the blood samples to a magnetic field. The principle of the oscillatory technique lies in determining the amplitude and phase of the oscillations of the studied sample subjected to the action of a harmonic force of controlled amplitude and frequency. The flow curve measurement involved determining the shear rate dependence of blood viscosity. The viscoelastic properties of the blood samples were analyzed in terms of complex blood viscosity. All the measurements were performed by means of the Contraves LS40 rheometer. The data obtained from the flow curve measurements, complemented by hematocrit and plasma viscosity measurements, were analyzed using the rheological model of Quemada. No significant changes of the studied rheological parameters were found. PMID:24078918

  2. Wavelet-Based Interpolation and Representation of Non-Uniformly Sampled Spacecraft Mission Data

    NASA Technical Reports Server (NTRS)

    Bose, Tamal

    2000-01-01

    A well-documented problem in the analysis of data collected by spacecraft instruments is the need for an accurate, efficient representation of the data set. The data may suffer from several problems, including additive noise, data dropouts, an irregularly-spaced sampling grid, and time-delayed sampling. These data irregularities render most traditional signal processing techniques unusable, and thus the data must be interpolated onto an even grid before scientific analysis techniques can be applied. In addition, the extremely large volume of data collected by scientific instrumentation presents many challenging problems in the area of compression, visualization, and analysis. Therefore, a representation of the data is needed which provides a structure which is conducive to these applications. Wavelet representations of data have already been shown to possess excellent characteristics for compression, data analysis, and imaging. The main goal of this project is to develop a new adaptive filtering algorithm for image restoration and compression. The algorithm should have low computational complexity and a fast convergence rate. This will make the algorithm suitable for real-time applications. The algorithm should be able to remove additive noise and reconstruct lost data samples from images.
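
    The pre-processing chain the abstract describes, resampling onto an even grid and then moving to a wavelet representation, can be prototyped as below; the synthetic signal, the linear interpolation step and the use of the PyWavelets package are all assumptions for illustration.

    ```python
    # Interpolate irregular samples onto a uniform grid, then take a wavelet decomposition.
    import numpy as np
    import pywt   # PyWavelets, assumed available

    rng = np.random.default_rng(5)
    t_irregular = np.sort(rng.uniform(0.0, 10.0, 300))               # uneven sampling times
    signal = np.sin(2 * np.pi * 0.8 * t_irregular) + 0.1 * rng.standard_normal(300)

    t_uniform = np.linspace(0.0, 10.0, 512)                          # even analysis grid
    resampled = np.interp(t_uniform, t_irregular, signal)            # simple linear interpolation

    coeffs = pywt.wavedec(resampled, "db4", level=4)                 # multilevel DWT
    print([c.size for c in coeffs])                                  # coarse-to-fine coefficient counts
    ```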

  3. A Visual Evaluation Study of Graph Sampling Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Fangyan; Zhang, Song; Wong, Pak C.

    2017-01-29

    We evaluate a dozen prevailing graph-sampling techniques with the ultimate goal of better visualizing and understanding big and complex graphs that exhibit different properties and structures. The evaluation uses eight benchmark datasets with four different graph types collected from the Stanford Network Analysis Platform and NetworkX to give a comprehensive comparison of various types of graphs. The study provides a practical guideline for visualizing big graphs of different sizes and structures. The paper discusses results and important observations from the study.
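
    One of the simpler strategies such evaluations include, uniform random node sampling with the induced subgraph, takes only a few lines with NetworkX; the synthetic graph and sampling rate below are placeholders, not the benchmark datasets of the study.

    ```python
    # Uniform random node sampling of a graph, using NetworkX.
    import random
    import networkx as nx

    random.seed(0)
    G = nx.barabasi_albert_graph(10_000, 3)            # synthetic scale-free test graph

    rate = 0.1                                         # keep 10% of the nodes
    kept = random.sample(list(G.nodes()), int(rate * G.number_of_nodes()))
    S = G.subgraph(kept).copy()                        # induced subgraph on the sampled nodes

    print(G.number_of_nodes(), G.number_of_edges())
    print(S.number_of_nodes(), S.number_of_edges())    # the smaller graph to visualize/compare
    ```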

  4. Sensor Analytics: Radioactive gas Concentration Estimation and Error Propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dale N.; Fagan, Deborah K.; Suarez, Reynold

    2007-04-15

    This paper develops the mathematical statistics of a radioactive gas quantity measurement and the associated error propagation. The probabilistic development is a different approach to deriving attenuation equations and offers easy extensions to more complex gas analysis components through simulation. The mathematical development assumes a sequential process of three components: I) the collection of an environmental sample, II) component gas extraction from the sample through the application of gas separation chemistry, and III) the estimation of the radioactivity of the component gases.
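
    Error propagation through such a sequential chain is often easiest to see as a simulation: draw each component (collected volume, separation recovery, counting) from its assumed distribution and look at the spread of the final concentration. The numbers below are invented placeholders, not values from the paper.

    ```python
    # Monte Carlo error propagation through collection, separation and counting.
    import numpy as np

    rng = np.random.default_rng(11)
    n = 100_000

    volume   = rng.normal(500.0, 10.0, n)       # sampled air volume, L
    recovery = rng.normal(0.85, 0.03, n)        # gas-separation recovery fraction
    counts   = rng.poisson(2000, n)             # detector counts in the counting window
    efficiency, live_time = 0.30, 3600.0        # counting efficiency and live time (s)

    activity = counts / (efficiency * live_time)        # Bq of the separated gas
    concentration = activity / (recovery * volume)      # Bq per litre of sampled air

    print(f"mean = {concentration.mean():.3e} Bq/L, "
          f"relative std = {concentration.std() / concentration.mean():.1%}")
    ```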

  5. GLIMMPSE Lite: Calculating Power and Sample Size on Smartphone Devices

    PubMed Central

    Munjal, Aarti; Sakhadeo, Uttara R.; Muller, Keith E.; Glueck, Deborah H.; Kreidler, Sarah M.

    2014-01-01

    Researchers seeking to develop complex statistical applications for mobile devices face a common set of difficult implementation issues. In this work, we discuss general solutions to the design challenges. We demonstrate the utility of the solutions for a free mobile application designed to provide power and sample size calculations for univariate, one-way analysis of variance (ANOVA), GLIMMPSE Lite. Our design decisions provide a guide for other scientists seeking to produce statistical software for mobile platforms. PMID:25541688
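
    The underlying calculation (power or total N for a one-way ANOVA given an effect size) is also exposed by statsmodels; the call below is an alternative illustration with arbitrary inputs, not the GLIMMPSE Lite code.

    ```python
    # One-way ANOVA power and sample size with statsmodels.
    from statsmodels.stats.power import FTestAnovaPower

    calc = FTestAnovaPower()
    n_total = calc.solve_power(effect_size=0.25, k_groups=4, alpha=0.05, power=0.80)
    print(f"total N for 80% power: {n_total:.1f}")

    power = calc.solve_power(effect_size=0.25, k_groups=4, alpha=0.05, nobs=120)
    print(f"power at N = 120: {power:.2f}")
    ```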

  6. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

    The most important objectives frequently found in bio-analytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of the toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective and accurate analytical techniques for their measurement. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The survey revealed that implementation of molecularly imprinted polymers as solid-phase materials for sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates problems related to tedious separation techniques arising from protein binding and severe interferences from the complex matrices in real samples such as blood plasma and serum.

  7. Comparative proteomic analysis using samples obtained with laser microdissection and saturation dye labelling.

    PubMed

    Wilson, Kate E; Marouga, Rita; Prime, John E; Pashby, D Paul; Orange, Paul R; Crosier, Steven; Keith, Alexander B; Lathe, Richard; Mullins, John; Estibeiro, Peter; Bergling, Helene; Hawkins, Edward; Morris, Christopher M

    2005-10-01

    Comparative proteomic methods are rapidly being applied to many different biological systems including complex tissues. One pitfall of these methods is that in some cases, such as oncology and neuroscience, tissue complexity requires isolation of specific cell types and sample is limited. Laser microdissection (LMD) is commonly used for obtaining such samples for proteomic studies. We have combined LMD with sensitive thiol-reactive saturation dye labelling of protein samples and 2-D DIGE to identify protein changes in a test system, the isolated CA1 pyramidal neurone layer of a transgenic (Tg) rat carrying a human amyloid precursor protein transgene. Saturation dye labelling proved to be extremely sensitive, with a spot map of over 5,000 proteins being readily produced from 5 μg total protein, with over 100 proteins being significantly altered at p < 0.0005. Of the proteins identified, all showed coherent changes associated with transgene expression. It was, however, difficult to identify significantly different proteins using PMF and MALDI-TOF on gels containing less than 500 μg total protein. The use of saturation dye labelling of limiting samples will therefore require the use of highly sensitive MS techniques to identify the significantly altered proteins isolated using methods such as LMD.

  8. Genome-scale approaches to the epigenetics of common human disease

    PubMed Central

    2011-01-01

    Traditionally, the pathology of human disease has been focused on microscopic examination of affected tissues, chemical and biochemical analysis of biopsy samples, other available samples of convenience, such as blood, and noninvasive or invasive imaging of varying complexity, in order to classify disease and illuminate its mechanistic basis. The molecular age has complemented this armamentarium with gene expression arrays and selective analysis of individual genes. However, we are entering a new era of epigenomic profiling, i.e., genome-scale analysis of cell-heritable nonsequence genetic change, such as DNA methylation. The epigenome offers access to stable measurements of cellular state and to biobanked material for large-scale epidemiological studies. Some of these genome-scale technologies are beginning to be applied to create the new field of epigenetic epidemiology. PMID:19844740

  9. High energy PIXE: A tool to characterize multi-layer thick samples

    NASA Astrophysics Data System (ADS)

    Subercaze, A.; Koumeir, C.; Métivier, V.; Servagent, N.; Guertin, A.; Haddad, F.

    2018-02-01

    High energy PIXE is a useful and non-destructive tool to characterize multi-layer thick samples such as cultural heritage objects. In a previous work, we demonstrated the possibility to perform quantitative analysis of simple multi-layer samples using high energy PIXE, without any assumption on their composition. In this work, an in-depth study of the parameters involved in the previously published method is proposed. Its extension to more complex samples with a repeated layer is also presented. Experiments have been performed at the ARRONAX cyclotron using 68 MeV protons. The thicknesses and sequences of a multi-layer sample including two different layers of the same element have been determined. The performance and limits of this method are presented and discussed.

  10. Rapid discrimination of the causal agents of urinary tract infection using ToF-SIMS with chemometric cluster analysis

    NASA Astrophysics Data System (ADS)

    Fletcher, John S.; Henderson, Alexander; Jarvis, Roger M.; Lockyer, Nicholas P.; Vickerman, John C.; Goodacre, Royston

    2006-07-01

    Advances in time-of-flight secondary ion mass spectrometry (ToF-SIMS) have enabled this technique to become a powerful tool for the analysis of biological samples. Such samples are often very complex and as a result full interpretation of the acquired data can be extremely difficult. To simplify the interpretation of these information-rich data, the use of chemometric techniques is becoming widespread in the ToF-SIMS community. Here we discuss the application of principal components-discriminant function analysis (PC-DFA) to the separation and classification of a number of bacterial samples that are known to be major causal agents of urinary tract infection. A large data set was generated using three biological replicates of each isolate, with three machine replicates acquired from each biological replicate. Ordination plots generated using PC-DFA are presented, demonstrating strain-level discrimination of the bacteria. The results are discussed in terms of biological differences between certain species and with reference to FT-IR, Raman spectroscopy and pyrolysis mass spectrometric studies of similar samples.
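
    As a rough illustration of the chemometric step described above, the PC-DFA workflow (principal components analysis for dimensionality reduction, followed by a linear discriminant model on the retained scores) can be sketched with general-purpose Python tools. The spectra, class labels, replicate structure, and number of retained components below are placeholder assumptions, not values from the study.

      # Sketch of principal components-discriminant function analysis (PC-DFA)
      # on ToF-SIMS-like spectra: PCA scores feed a linear discriminant model.
      # Data shapes, labels, and component counts are illustrative assumptions.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n_spectra, n_channels = 54, 800          # e.g. 6 isolates x 3 biological x 3 machine replicates
      X = rng.random((n_spectra, n_channels))  # placeholder for peak-binned, normalised spectra
      y = np.repeat(np.arange(6), 9)           # placeholder strain labels

      pc_dfa = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
      print("cross-validated accuracy:", cross_val_score(pc_dfa, X, y, cv=3).mean())

      # Coordinates for an ordination plot: the first two discriminant functions.
      pc_dfa.fit(X, y)
      df_scores = pc_dfa.transform(X)[:, :2]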

  11. Evaluation of alternative model selection criteria in the analysis of unimodal response curves using CART

    USGS Publications Warehouse

    Ribic, C.A.; Miller, T.W.

    1998-01-01

    We investigated CART performance with a unimodal response curve for one continuous response and four continuous explanatory variables, where two variables were important (i.e., directly related to the response) and the other two were not. We explored performance under three relationship strengths and two explanatory variable conditions: equal importance and one variable four times as important as the other. We compared CART variable selection performance using three tree-selection rules ('minimum risk', 'minimum risk complexity', 'one standard error') to stepwise polynomial ordinary least squares (OLS) under four sample size conditions. The one-standard-error and minimum-risk-complexity methods performed about as well as stepwise OLS with large sample sizes when the relationship was strong. With weaker relationships, equally important explanatory variables and larger sample sizes, the one-standard-error and minimum-risk-complexity rules performed better than stepwise OLS. With weaker relationships and explanatory variables of unequal importance, tree-structured methods did not perform as well as stepwise OLS. Comparing performance within tree-structured methods, with a strong relationship and equally important explanatory variables, the one-standard-error rule was more likely to choose the correct model than were the other tree-selection rules 1) with weaker relationships and equally important explanatory variables; and 2) under all relationship strengths when explanatory variables were of unequal importance and sample sizes were lower.
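
    The one-standard-error tree-selection rule compared above can be illustrated with a short sketch, here implemented via scikit-learn's cost-complexity pruning rather than the CART software used in the study; the unimodal response data and cross-validation settings are invented for illustration.

      # Sketch of the one-standard-error tree-selection rule using the
      # cost-complexity pruning path; data are synthetic placeholders with two
      # informative and two noise explanatory variables and a unimodal response.
      import numpy as np
      from sklearn.tree import DecisionTreeRegressor
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      n = 200
      X = rng.uniform(-1, 1, size=(n, 4))
      y = np.exp(-4 * (X[:, 0] ** 2 + X[:, 1] ** 2)) + rng.normal(0, 0.1, n)

      path = DecisionTreeRegressor(random_state=0).cost_complexity_pruning_path(X, y)
      alphas = path.ccp_alphas[:-1]            # drop the alpha that prunes to the root

      risks, sds = [], []
      for a in alphas:
          scores = -cross_val_score(DecisionTreeRegressor(ccp_alpha=a, random_state=0),
                                    X, y, cv=10, scoring="neg_mean_squared_error")
          risks.append(scores.mean())
          sds.append(scores.std(ddof=1) / np.sqrt(len(scores)))

      risks, sds = np.array(risks), np.array(sds)
      best = risks.argmin()
      threshold = risks[best] + sds[best]
      # One-standard-error rule: the most heavily pruned tree still under the threshold.
      one_se_alpha = alphas[np.where(risks <= threshold)[0].max()]
      final_tree = DecisionTreeRegressor(ccp_alpha=one_se_alpha, random_state=0).fit(X, y)

    The rule trades a slightly higher estimated risk for a smaller, more stable tree, which is the behaviour being evaluated against stepwise OLS in the study.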

  12. Community structure of fish larvae in mangroves with different root types in Labuhan coastal area, Sepulu - Madura

    NASA Astrophysics Data System (ADS)

    Muzaki, Farid Kamal; Giffari, Aninditha; Saptarini, Dian

    2017-06-01

    Mangrove root complexity and shading are well known to correlate positively with the abundance of both juvenile and adult fishes. However, it remains unclear whether this complexity also affects the community of fish larvae (ichthyoplankton). This study aimed to address that question in the mangrove area on the Sepulu coast, Madura, which is projected as a mangrove protection area. Sampling was carried out from March to May 2016. Fish larvae were collected with plankton nets (mesh sizes 0.150 and 0.265 mm) at six locations representing different root types (stilt roots, pneumatophores, a combination of stilt roots and pneumatophores, and an unvegetated area). Six families were identified, namely Gobiidae, Blennidae, Pomacentridae, Carangidae, Engraulidae and Ambassidae. Gobiidae was the most abundant and most widely dispersed family in the area. Results of two-way ANOVA and Tukey HSD tests (both at p = 0.05) indicate significant differences in larval abundance with respect to location, sampling period and the interaction of the two factors. For the number of taxa, significant differences occurred only for location and sampling period, not for their interaction. The highest larval abundance and number of taxa occurred in Rhizophora spp. (stilt roots), indicating that root complexity affects the fish larval community. Ordination by canonical analysis shows that different larval taxa tend to be distributed at different locations.
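
    The two-way ANOVA and Tukey HSD comparisons reported above can be outlined with statsmodels; the factor levels and abundance values in the sketch below are fabricated placeholders, not the study's data.

      # Sketch of a two-way ANOVA (location x sampling period) with a Tukey HSD
      # follow-up on larval abundance; all values are fabricated placeholders.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf
      from statsmodels.stats.multicomp import pairwise_tukeyhsd

      rng = np.random.default_rng(2)
      locations = ["stilt", "pneumatophore", "mixed", "unvegetated"]
      periods = ["March", "April", "May"]
      rows = [{"location": loc, "period": per, "abundance": rng.poisson(20)}
              for loc in locations for per in periods for _ in range(3)]  # 3 replicate tows
      df = pd.DataFrame(rows)

      model = smf.ols("abundance ~ C(location) * C(period)", data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))   # main effects and their interaction

      # Tukey HSD on the location factor, at the same alpha used in the study.
      print(pairwise_tukeyhsd(df["abundance"], df["location"], alpha=0.05))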

  13. Characterization of complexity in the electroencephalograph activity of Alzheimer's disease based on fuzzy entropy.

    PubMed

    Cao, Yuzhen; Cai, Lihui; Wang, Jiang; Wang, Ruofan; Yu, Haitao; Cao, Yibin; Liu, Jing

    2015-08-01

    In this paper, experimental neurophysiologic recording and statistical analysis are combined to investigate the nonlinear characteristics and the cognitive function of the brain. Fuzzy approximate entropy and fuzzy sample entropy are applied to characterize the model-based simulated series and electroencephalograph (EEG) series of Alzheimer's disease (AD). The effectiveness and advantages of these two kinds of fuzzy entropy are first verified through the simulated EEG series generated by the alpha rhythm model, including stronger relative consistency and robustness. Furthermore, in order to detect the abnormality of irregularity and chaotic behavior in the AD brain, the complexity features based on these two fuzzy entropies are extracted in the delta, theta, alpha, and beta bands. It is demonstrated that, due to the introduction of fuzzy set theory, the fuzzy entropies could better distinguish EEG signals of AD from those of normal controls than the approximate entropy and sample entropy. Moreover, the entropy values of AD are significantly decreased in the alpha band, particularly in the temporal brain region, at electrodes such as T3 and T4. In addition, fuzzy sample entropy could achieve higher group differences in different brain regions and a higher average classification accuracy of 88.1% with a support vector machine classifier. The obtained results prove that fuzzy sample entropy may be a powerful tool to characterize the complexity abnormalities of AD, which could be helpful in further understanding of the disease.
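
    Fuzzy sample entropy replaces the hard template-matching count of ordinary sample entropy with a smooth membership function of the template distance. The sketch below uses one common formulation (an exponential membership of the Chebyshev distance, with m = 2 and r = 0.2 times the signal standard deviation); the exact membership function and parameters used in the paper are not assumed here.

      # Sketch of fuzzy sample entropy for a 1-D signal. Templates have their
      # local mean removed; similarity is exp(-(d/r)^n) of the Chebyshev
      # distance d. Parameters follow common defaults, not the paper's choices.
      import numpy as np

      def fuzzy_sample_entropy(x, m=2, r=None, n=2):
          x = np.asarray(x, dtype=float)
          if r is None:
              r = 0.2 * x.std()

          def phi(length):
              templates = np.array([x[i:i + length] for i in range(len(x) - length)])
              templates = templates - templates.mean(axis=1, keepdims=True)
              d = np.abs(templates[:, None, :] - templates[None, :, :]).max(axis=2)
              mask = ~np.eye(len(templates), dtype=bool)   # exclude self-matches
              return np.exp(-(d[mask] / r) ** n).mean()

          return -np.log(phi(m + 1) / phi(m))

      rng = np.random.default_rng(3)
      eeg_like = np.cumsum(rng.normal(size=1000))          # placeholder series, not real EEG
      print(fuzzy_sample_entropy(eeg_like))

    Averaging graded membership values instead of counting threshold crossings is what gives the fuzzy entropies the stronger relative consistency on short, noisy series noted above.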

  14. Evaluation of the photoprotective effect of β-cyclodextrin on the emission of volatile degradation products of ranitidine.

    PubMed

    Jamrógiewicz, Marzena; Wielgomas, Bartosz; Strankowski, Michał

    2014-09-01

    The photo-excitation of ranitidine hydrochloride (RAN) in the solid state visibly changes its colour and generates an unpleasant odour. The purpose of the present study was to assess the protective effect of β-cyclodextrin (CD) complexation, as well as the effect of mixing the two stoichiometries 1:1 and 1:2 (RAN:CD, IC), on the photostability of samples in the solid state. Samples of inclusion complexes (IC) and physical mixtures (PM) were prepared and irradiated for 48 h in a Suntest CPS+ chamber. Irradiated samples were analyzed using nuclear magnetic resonance (¹H NMR), infrared spectroscopy (FT-IR), differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). Volatiles were monitored by headspace solid-phase microextraction-gas chromatography-mass spectrometry (HS-SPME-GC-MS). A protective effect of CD was observed for IC and also for PM. The photostabilization of complexed RAN against photodegradation could be explained either by the inclusion of the furan part of RAN into the CD cavity, as shown by the ¹H NMR ROESY (rotating-frame nuclear Overhauser effect spectroscopy) spectrum, or by the screening effect of CD. FT-IR spectra, DSC curves and microscope images of irradiated samples of protected RAN did not indicate any physical changes, such as phase transfer.

  16. A newly validated and characterized spectrophotometric method for determination of a three water pollutants metal ions

    NASA Astrophysics Data System (ADS)

    Mohamed, Marwa E.; Frag, Eman Y. Z.; Mohamed, Mona A.

    2018-01-01

    A simple, fast and accurate spectrophotometric method has been developed to determine lead(II), chromium(III) and barium(II) ions in pure form and in spiked water samples using thoron (THO) as a reagent forming colored complexes. The formed complexes absorb maximally at 539, 540 and 538 nm for the Pb(II)-THO, Cr(III)-THO and Ba(II)-THO complexes, respectively. The optimum experimental conditions for these complexes were studied carefully. Beer's law was obeyed in the ranges 1-35, 1-70, and 1-45 μg mL⁻¹ for Pb(II), Cr(III) and Ba(II) ions with the THO reagent, respectively. Different parameters such as linearity, selectivity, recovery, limits of quantification and detection, precision and accuracy were also evaluated in order to validate the proposed method. The results showed that THO was effective in the simultaneous determination of Pb(II), Cr(III) and Ba(II) ions in pure form and in spiked water samples. The results of the proposed method were also compared with those obtained by atomic absorption spectrometry. The isolated solid complexes were characterized using elemental analysis, X-ray powder diffraction (XRD), IR, mass spectrometry and TD-DFT calculations. Their biological activities were investigated against different types of bacteria and fungi.
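
    Validation figures such as linearity, LOD, and LOQ for a Beer's-law method of this kind follow from an ordinary least-squares calibration line. The sketch below uses invented absorbance readings and the common 3.3*sigma/slope and 10*sigma/slope conventions, which may differ from the criteria actually applied in the paper.

      # Sketch of a Beer's-law calibration for a metal-thoron complex: linear fit
      # of absorbance vs. concentration, then LOD/LOQ from the residual standard
      # deviation. Absorbance values are invented placeholders.
      import numpy as np

      conc = np.array([1, 5, 10, 15, 20, 25, 30, 35], dtype=float)   # ug/mL standards
      absorbance = np.array([0.021, 0.102, 0.205, 0.311, 0.408, 0.512, 0.609, 0.715])

      slope, intercept = np.polyfit(conc, absorbance, 1)
      residuals = absorbance - (slope * conc + intercept)
      sigma = residuals.std(ddof=2)               # residual SD, two fitted parameters
      lod = 3.3 * sigma / slope
      loq = 10.0 * sigma / slope
      r2 = np.corrcoef(conc, absorbance)[0, 1] ** 2
      print(f"slope={slope:.4f}  R2={r2:.4f}  LOD={lod:.2f} ug/mL  LOQ={loq:.2f} ug/mL")

      # A spiked-water sample is then read off the calibration line.
      unknown_abs = 0.350
      print("estimated concentration (ug/mL):", (unknown_abs - intercept) / slope)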

  17. Analyses of polycyclic aromatic hydrocarbon (PAH) and chiral-PAH analogues-methyl-β-cyclodextrin guest-host inclusion complexes by fluorescence spectrophotometry and multivariate regression analysis.

    PubMed

    Greene, LaVana; Elzey, Brianda; Franklin, Mariah; Fakayode, Sayo O

    2017-03-05

    The negative health impact of polycyclic aromatic hydrocarbons (PAHs) and the differences in pharmacological activity between enantiomers of chiral molecules in humans highlight the need for analysis of PAHs and their chiral analogue molecules in humans. Herein, the first use of cyclodextrin guest-host inclusion complexation, fluorescence spectrophotometry, and a chemometric approach for the analysis of a PAH (anthracene) and a chiral-PAH analogue derivative (1-(9-anthryl)-2,2,2-trifluoroethanol (TFE)) is reported. The binding constants (Kb), stoichiometry (n), and thermodynamic properties (Gibbs free energy (ΔG), enthalpy (ΔH), and entropy (ΔS)) of anthracene and TFE enantiomer complexes with methyl-β-cyclodextrin (Me-β-CD) were determined. Chemometric partial-least-squares (PLS) regression analysis of the emission spectra of the Me-β-CD guest-host inclusion complexes was used to determine anthracene and TFE enantiomer concentrations in inclusion complex samples. The calculated Kb values and negative ΔG indicate that the anthracene-Me-β-CD and enantiomeric TFE-Me-β-CD inclusion complexation reactions are thermodynamically favorable. However, the anthracene-Me-β-CD and TFE-Me-β-CD inclusion complexations showed notable differences in binding affinity and thermodynamic properties. The PLS regression analysis yielded squared correlation coefficients of 0.997530 or better and low LODs of 3.81×10⁻⁷ M for anthracene and 3.48×10⁻⁸ M for the TFE enantiomers under physiological conditions. Most importantly, PLS regression accurately determined the anthracene and TFE enantiomer concentrations with low average errors of 2.31% for anthracene, 4.44% for R-TFE and 3.60% for S-TFE. The results are highly significant because the method analyzes PAHs and chiral PAH analogue derivatives with high sensitivity and accuracy without the need for an expensive chiral column, enantiomeric resolution, or polarized light.
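
    The chemometric step, partial-least-squares regression of emission spectra onto analyte concentration, can be sketched with scikit-learn's PLSRegression. The synthetic spectra, emission window, component count, and concentration range below are placeholder assumptions rather than the paper's data.

      # Sketch of PLS regression mapping fluorescence emission spectra of
      # guest-host inclusion complexes to analyte concentration. Spectra are
      # synthetic (a Gaussian band scaled by concentration plus noise).
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(4)
      wavelengths = np.linspace(380, 480, 200)             # nm, assumed emission window
      conc = rng.uniform(1e-7, 1e-5, size=60)              # mol/L, assumed range
      band = np.exp(-((wavelengths - 420) / 15) ** 2)
      spectra = conc[:, None] * band + rng.normal(0, 1e-8, (60, 200))

      X_train, X_test, y_train, y_test = train_test_split(spectra, conc, random_state=0)
      pls = PLSRegression(n_components=3).fit(X_train, y_train)
      y_pred = pls.predict(X_test).ravel()

      print("R2 on held-out spectra:", r2_score(y_test, y_pred))
      print("mean absolute relative error (%):",
            100 * np.mean(np.abs(y_pred - y_test) / y_test))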

  18. Global loss of a nuclear lamina component, lamin A/C, and LINC complex components SUN1, SUN2, and nesprin-2 in breast cancer.

    PubMed

    Matsumoto, Ayaka; Hieda, Miki; Yokoyama, Yuhki; Nishioka, Yu; Yoshidome, Katsuhide; Tsujimoto, Masahiko; Matsuura, Nariaki

    2015-10-01

    Cancer cells exhibit a variety of features indicative of atypical nuclei. However, the molecular mechanisms underlying these phenomena remain to be elucidated. The linker of nucleoskeleton and cytoskeleton (LINC) complex, a nuclear envelope protein complex consisting mainly of the SUN and nesprin proteins, connects nuclear lamina and cytoskeletal filaments and helps to regulate the size and shape of the nucleus. Using immunohistology, we found that a nuclear lamina component, lamin A/C, and all of the investigated LINC complex components, SUN1, SUN2, and nesprin-2, were downregulated in human breast cancer tissues. In the majority of cases, we observed lower expression levels of these analytes in samples' cancerous regions as compared to their cancer-associated noncancerous regions (in cancerous regions, percentage of tissue samples exhibiting low protein expression: lamin A/C, 85% [n = 73]; SUN1, 88% [n = 43]; SUN2, 74% [n = 43]; and nesprin-2, 79% [n = 53]). Statistical analysis showed that the frequencies of recurrence and HER2 expression were negatively correlated with lamin A/C expression (P < 0.05), and intrinsic subtype and Ki-67 level were associated with nesprin-2 expression (P < 0.05). In addition, combinatorial analysis using the above four parameters showed that all patients exhibited reduced expression of at least one of the four components, regardless of the tumor's pathological classification. Furthermore, several cultured breast cancer cell lines expressed lower levels of SUN1, SUN2, and nesprin-2 mRNA and of lamin A/C than noncancerous mammary gland cells. Together, these results suggest that the strongly reduced expression of LINC complex and nuclear lamina components may play fundamental pathological roles in breast cancer progression.

  19. Dual-color Proteomic Profiling of Complex Samples with a Microarray of 810 Cancer-related Antibodies*

    PubMed Central

    Schröder, Christoph; Jacob, Anette; Tonack, Sarah; Radon, Tomasz P.; Sill, Martin; Zucknick, Manuela; Rüffer, Sven; Costello, Eithne; Neoptolemos, John P.; Crnogorac-Jurcevic, Tatjana; Bauer, Andrea; Fellenberg, Kurt; Hoheisel, Jörg D.

    2010-01-01

    Antibody microarrays have the potential to enable comprehensive proteomic analysis of small amounts of sample material. Here, protocols are presented for the production, quality assessment, and reproducible application of antibody microarrays in a two-color mode with an array of 1,800 features, representing 810 antibodies that were directed at 741 cancer-related proteins. In addition to measures of array quality, we implemented indicators for the accuracy and significance of dual-color detection. Dual-color measurements outperform a single-color approach concerning assay reproducibility and discriminative power. In the analysis of serum samples, depletion of high-abundance proteins did not improve technical assay quality. On the contrary, depletion introduced a strong bias in protein representation. In an initial study, we demonstrated the applicability of the protocols to proteins derived from urine samples. We identified differences between urine samples from pancreatic cancer patients and healthy subjects and between sexes. This study demonstrates that biomedically relevant data can be produced. As demonstrated by the thorough quality analysis, the dual-color antibody array approach proved to be competitive with other proteomic techniques and comparable in performance to transcriptional microarray analyses. PMID:20164060
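
    A dual-color read-out rests on per-spot ratios between a sample channel and a common reference channel. The sketch below shows only that basic normalisation step and a simple two-group comparison on simulated intensities; it is an illustrative assumption, not the authors' actual processing pipeline or statistics.

      # Sketch of dual-color antibody-array normalisation: per-spot log2 ratio of
      # sample vs. common-reference channel, median-centred per array, followed
      # by a simple two-group test. Intensities are simulated placeholders.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      n_arrays, n_features = 12, 1800
      reference = rng.lognormal(8, 1, (n_arrays, n_features))    # common reference channel
      sample = reference * rng.lognormal(0, 0.2, (n_arrays, n_features))
      sample[:6, :50] *= 1.8                                     # pretend 50 features differ in group A

      log_ratio = np.log2(sample / reference)
      log_ratio -= np.median(log_ratio, axis=1, keepdims=True)   # per-array median centring

      group = np.array([0] * 6 + [1] * 6)
      t, p = stats.ttest_ind(log_ratio[group == 0], log_ratio[group == 1], axis=0)
      print("features with p < 0.01:", int((p < 0.01).sum()))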

  20. Contemporary Impact Analysis Methodology for Planetary Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Perino, Scott V.; Bayandor, Javid; Samareh, Jamshid A.; Armand, Sasan C.

    2015-01-01

    The development of an Earth entry vehicle and the methodology created to evaluate the vehicle's impact landing response when returning to Earth are reported. NASA's future Mars Sample Return Mission requires a robust vehicle to return Martian samples to Earth for analysis. The Earth entry vehicle is a proposed solution to this Mars mission requirement. During Earth reentry, the vehicle slows within the atmosphere and then impacts the ground at its terminal velocity. To protect the Martian samples, a spherical energy absorber called an impact sphere is under development. The impact sphere is composed of hybrid composite and crushable foam elements that endure large plastic deformations during impact and cause a highly nonlinear vehicle response. The developed analysis methodology captures a range of complex structural interactions and much of the failure physics that occurs during impact. Numerical models were created and benchmarked against experimental tests conducted at NASA Langley Research Center. The postimpact structural damage assessment showed close correlation between simulation predictions and experimental results. Acceleration, velocity, displacement, damage modes, and failure mechanisms were all effectively captured. These investigations demonstrate that the Earth entry vehicle has great potential in facilitating future sample return missions.
