Sample records for large-scale quantitative study

  1. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2010-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer...research, we designed a pilot study utilizing large-scale parallel Grid computing, harnessing nationwide infrastructure for medical image analysis. Also...

  2. The Relevancy of Large-Scale, Quantitative Methodologies in Middle Grades Education Research

    ERIC Educational Resources Information Center

    Mertens, Steven B.

    2006-01-01

    This article examines the relevancy of large-scale, quantitative methodologies in middle grades education research. Based on recommendations from national advocacy organizations, the need for more large-scale, quantitative research, combined with the application of more rigorous methodologies, is presented. Subsequent sections describe and discuss…

  3. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2009-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer...pilot study to utilize large-scale parallel Grid computing to harness the nationwide cluster infrastructure for optimization of medical image ... analysis parameters. Additionally, we investigated the use of cutting-edge data analysis/mining techniques as applied to Ultrasound, FFDM, and DCE-MRI Breast...

  4. Quantitative Serum Nuclear Magnetic Resonance Metabolomics in Large-Scale Epidemiology: A Primer on -Omic Technologies

    PubMed Central

    Kangas, Antti J; Soininen, Pasi; Lawlor, Debbie A; Davey Smith, George; Ala-Korpela, Mika

    2017-01-01

    Detailed metabolic profiling in large-scale epidemiologic studies has uncovered novel biomarkers for cardiometabolic diseases and clarified the molecular associations of established risk factors. A quantitative metabolomics platform based on nuclear magnetic resonance spectroscopy has found widespread use, already profiling over 400,000 blood samples. Over 200 metabolic measures are quantified per sample; in addition to many biomarkers routinely used in epidemiology, the method simultaneously provides fine-grained lipoprotein subclass profiling and quantification of circulating fatty acids, amino acids, gluconeogenesis-related metabolites, and many other molecules from multiple metabolic pathways. Here we focus on applications of magnetic resonance metabolomics for quantifying circulating biomarkers in large-scale epidemiology. We highlight the molecular characterization of risk factors, use of Mendelian randomization, and the key issues of study design and analyses of metabolic profiling for epidemiology. We also detail how integration of metabolic profiling data with genetics can enhance drug development. We discuss why quantitative metabolic profiling is becoming widespread in epidemiology and biobanking. Although large-scale applications of metabolic profiling are still novel, it seems likely that comprehensive biomarker data will contribute to etiologic understanding of various diseases and abilities to predict disease risks, with the potential to translate into multiple clinical settings. PMID:29106475

  5. Boarding School, Academic Motivation and Engagement, and Psychological Well-Being: A Large-Scale Investigation

    ERIC Educational Resources Information Center

    Martin, Andrew J.; Papworth, Brad; Ginns, Paul; Liem, Gregory Arief D.

    2014-01-01

    Boarding school has been a feature of education systems for centuries. Minimal large-scale quantitative data have been collected to examine its association with important educational and other outcomes. The present study represents one of the largest studies into boarding school conducted to date. It investigates boarding school and students'…

  6. Parallel and serial computing tools for testing single-locus and epistatic SNP effects of quantitative traits in genome-wide association studies

    PubMed Central

    Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang

    2008-01-01

    Background Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large scale GWAS. Results The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large scale GWAS and achieved excellent scalability for large scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large scale GWAS, and the epiSNP serial computing programs are convenient tools for epistasis analysis in small scale GWAS using commonly available computer hardware. PMID:18644146
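
    The single-locus tests described above are straightforward to prototype. Below is a minimal sketch (not the authors' EPISNP code) of a genotypic-effect ANOVA plus additive/dominance regression for one SNP on a simulated quantitative trait; the genotype coding and all data are hypothetical.

    ```python
    # Minimal sketch of a single-locus SNP test on a quantitative trait, in the
    # spirit of EPISNP (not the authors' code). Genotypes are coded 0/1/2 copies
    # of the minor allele; the trait is a numeric phenotype.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    genotype = rng.integers(0, 3, size=500)          # hypothetical SNP
    trait = 0.3 * genotype + rng.normal(size=500)    # hypothetical phenotype

    # Genotypic effect: one-way ANOVA across the three genotype classes.
    groups = [trait[genotype == g] for g in (0, 1, 2)]
    f_stat, p_genotypic = stats.f_oneway(*groups)

    # Additive and dominance effects via regression: additive = allele count,
    # dominance = heterozygote indicator.
    X = np.column_stack([np.ones_like(trait), genotype, (genotype == 1).astype(float)])
    beta, *_ = np.linalg.lstsq(X, trait, rcond=None)
    print(f"genotypic p={p_genotypic:.3g}, additive={beta[1]:.3f}, dominance={beta[2]:.3f}")
    ```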

  7. "Invisible" Bilingualism--"Invisible" Language Ideologies: Greek Teachers' Attitudes Towards Immigrant Pupils' Heritage Languages

    ERIC Educational Resources Information Center

    Gkaintartzi, Anastasia; Kiliari, Angeliki; Tsokalidou, Roula

    2015-01-01

    This paper presents data from two studies--a nationwide quantitative research and an ethnographic study--on Greek schoolteachers' attitudes towards immigrant pupils' bilingualism. The quantitative data come from a large-scale questionnaire survey, which aimed at the investigation of the needs and requirements for the implementation of a pilot…

  8. Quantitative analysis of voids in percolating structures in two-dimensional N-body simulations

    NASA Technical Reports Server (NTRS)

    Harrington, Patrick M.; Melott, Adrian L.; Shandarin, Sergei F.

    1993-01-01

    We present in this paper a quantitative method for defining void size in large-scale structure based on percolation threshold density. Beginning with two-dimensional gravitational clustering simulations smoothed to the threshold of nonlinearity, we perform percolation analysis to determine the large scale structure. The resulting objective definition of voids has a natural scaling property, is topologically interesting, and can be applied immediately to redshift surveys.
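
    As an illustration of the idea, a percolation-style void catalog can be sketched in a few lines: threshold a smoothed 2D density field and take connected underdense regions as voids. This is a toy stand-in for the authors' analysis; the mock field and the 30th-percentile cut are assumptions.

    ```python
    # Hedged sketch of percolation-based void identification on a 2D density
    # field (illustrative, not the authors' pipeline). Cells below a threshold
    # density form "empty" regions; connected components of those cells are voids.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(1)
    density = ndimage.gaussian_filter(rng.random((256, 256)), sigma=4)  # mock smoothed field

    threshold = np.percentile(density, 30)        # assumed percolation-level cut
    voids, n_voids = ndimage.label(density < threshold)
    sizes = np.bincount(voids.ravel())[1:]        # cell counts per void; label 0 = non-void
    print(f"{n_voids} voids; largest spans {sizes.max()} cells")
    ```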

  9. A high throughput geocomputing system for remote sensing quantitative retrieval and a case study

    NASA Astrophysics Data System (ADS)

    Xue, Yong; Chen, Ziqiang; Xu, Hui; Ai, Jianwen; Jiang, Shuzheng; Li, Yingjie; Wang, Ying; Guang, Jie; Mei, Linlu; Jiao, Xijuan; He, Xingwei; Hou, Tingting

    2011-12-01

    The quality and accuracy of remote sensing instruments have improved significantly; however, rapid processing of large-scale remote sensing data has become the bottleneck for remote sensing quantitative retrieval applications. Remote sensing quantitative retrieval is a data-intensive computation application and one of the research issues of high-throughput computation. The remote sensing quantitative retrieval Grid workflow is a high-level core component of the remote sensing Grid, used to support the modeling, reconstruction and implementation of large-scale complex applications of remote sensing science. In this paper, we study middleware components of the remote sensing Grid: the dynamic Grid workflow based on the remote sensing quantitative retrieval application on a Grid platform. We designed a novel architecture for the remote sensing Grid workflow. According to this architecture, we constructed the Remote Sensing Information Service Grid Node (RSSN) with Condor. We developed graphical user interface (GUI) tools to compose remote sensing processing Grid workflows, taking aerosol optical depth (AOD) retrieval as an example. The case study showed that significant improvement in system performance could be achieved with this implementation. The results also give a perspective on the potential of applying Grid workflow practices to remote sensing quantitative retrieval problems using commodity-class PCs.
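
    The RSSN/Condor middleware itself is not public. As a generic illustration of workflow composition, the sketch below represents a retrieval chain as a small DAG and executes its stages in dependency order; the stage names and the topological-sort approach are assumptions for illustration only.

    ```python
    # Toy DAG execution in the spirit of a retrieval workflow (illustrative only;
    # not the RSSN/Condor implementation). Each stage lists its prerequisites.
    from graphlib import TopologicalSorter

    stages = {
        "calibrate": [],
        "cloud_mask": ["calibrate"],
        "aod_retrieval": ["calibrate", "cloud_mask"],   # hypothetical AOD step
        "mosaic": ["aod_retrieval"],
    }

    def run(stage: str) -> None:
        print(f"running {stage}")                       # stand-in for a Grid job submission

    for stage in TopologicalSorter(stages).static_order():
        run(stage)
    ```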

  10. Heritage Language Maintenance and Education in the Greek Sociolinguistic Context: Albanian Immigrant Parents' Views

    ERIC Educational Resources Information Center

    Gkaintartzi, Anastasia; Kiliari, Angeliki; Tsokalidou, Roula

    2016-01-01

    This paper presents data from two studies--a nationwide quantitative research and an ethnographic study--on immigrant parents' perspectives about heritage language maintenance and education in Greek state schools. The quantitative data come from a large-scale questionnaire survey, which aimed at the investigation of the needs and requirements for…

  11. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gritsenko, Marina A.; Xu, Zhe; Liu, Tao

    Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  12. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS.

    PubMed

    Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D

    2016-01-01

    Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
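
    To make the isobaric quantification step concrete, the sketch below computes median-normalized reporter-ion log-ratios for a mock 4-plex experiment. It is a generic illustration, not the published workflow; the intensities, channel count, and choice of reference channel are assumptions.

    ```python
    # Illustrative reporter-ion quantification for an isobaric-labeling design
    # (e.g., a 4-plex iTRAQ-style experiment); not the published workflow.
    # Rows are PSMs, columns are label channels; intensities are hypothetical.
    import numpy as np

    reporters = np.abs(np.random.default_rng(2).normal(1e6, 2e5, size=(1000, 4)))

    # Median-normalize each channel to correct for unequal sample loading.
    normalized = reporters / np.median(reporters, axis=0)

    # Log2 ratio of each channel against a designated reference channel (column 0).
    log2_ratio = np.log2(normalized[:, 1:] / normalized[:, :1])
    print("median log2 ratios vs reference:", np.median(log2_ratio, axis=0))
    ```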

  13. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16 month period.
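
    The confidence-score filtering step lends itself to a short sketch: drop low-scoring PSM quantifications, then summarize the survivors per protein with a robust statistic. The score, cutoff, and column names below are hypothetical stand-ins for the paper's PSM-quality-based score.

    ```python
    # Sketch of confidence-score filtering and protein roll-up of PSM-level
    # quantifications (illustrative; the paper's actual score and cutoff differ).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    psms = pd.DataFrame({
        "protein": rng.choice([f"P{i}" for i in range(50)], size=2000),
        "log2_ratio": rng.normal(0, 0.5, size=2000),
        "confidence": rng.random(size=2000),   # stand-in for a PSM quality score
    })

    filtered = psms[psms["confidence"] >= 0.5]                     # drop low-quality quantifications
    proteins = filtered.groupby("protein")["log2_ratio"].median()  # robust protein-level summary
    print(proteins.head())
    ```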

  14. Experimental feasibility study of the application of magnetic suspension techniques to large-scale aerodynamic test facilities

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1974-01-01

    Based on the premises that (1) magnetic suspension techniques can play a useful role in large-scale aerodynamic testing and (2) superconductor technology offers the only practical hope for building large-scale magnetic suspensions, an all-superconductor three-component magnetic suspension and balance facility was built as a prototype and was tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities have been made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.

  15. Experimental feasibility study of the application of magnetic suspension techniques to large-scale aerodynamic test facilities. [cryogenic transonic wind tunnel]

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1975-01-01

    Based on the premises that magnetic suspension techniques can play a useful role in large-scale aerodynamic testing, and that superconductor technology offers the only practical hope for building large-scale magnetic suspensions, an all-superconductor 3-component magnetic suspension and balance facility was built as a prototype and tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities at Langley Research Center were made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.

  16. Statistical Model to Analyze Quantitative Proteomics Data Obtained by 18O/16O Labeling and Linear Ion Trap Mass Spectrometry

    PubMed Central

    Jorge, Inmaculada; Navarro, Pedro; Martínez-Acedo, Pablo; Núñez, Estefanía; Serrano, Horacio; Alfranca, Arántzazu; Redondo, Juan Miguel; Vázquez, Jesús

    2009-01-01

    Statistical models for the analysis of protein expression changes by stable isotope labeling are still poorly developed, particularly for data obtained by 16O/18O labeling. Besides, large-scale test experiments to validate the null hypothesis are lacking. Although the study of mechanisms underlying biological actions promoted by vascular endothelial growth factor (VEGF) on endothelial cells is of considerable interest, quantitative proteomics studies on this subject are scarce and have been performed after exposing cells to the factor for long periods of time. In this work we present the largest quantitative proteomics study to date on the short-term effects of VEGF on human umbilical vein endothelial cells by 18O/16O labeling. Current statistical models based on normality and variance homogeneity were found unsuitable to describe the null hypothesis in a large-scale test experiment performed on these cells, producing false expression changes. A random effects model was developed including four different sources of variance at the spectrum-fitting, scan, peptide, and protein levels. With the new model the number of outliers at scan and peptide levels was negligible in three large-scale experiments, and only one false protein expression change was observed in the test experiment among more than 1000 proteins. The new model allowed the detection of significant protein expression changes upon VEGF stimulation for 4 and 8 h. The consistency of the changes observed at 4 h was confirmed by a replica at a smaller scale and further validated by Western blot analysis of some proteins. Most of the observed changes have not been described previously and are consistent with a pattern of protein expression that dynamically changes over time following the evolution of the angiogenic response. With this statistical model the 18O labeling approach emerges as a very promising and robust alternative for performing quantitative proteomics studies at a depth of several thousand proteins. PMID:19181660
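
    A nested random-effects decomposition of this kind can be prototyped with statsmodels, as in the hedged sketch below: random intercepts for proteins and for peptides nested within them, with scan-level noise absorbed by the residual. The simulated data and variance magnitudes are assumptions, not the paper's model specification.

    ```python
    # Hedged sketch of a nested random-effects variance decomposition
    # (protein and peptide random intercepts; scan-level noise in the residual).
    # The data are simulated; this is not the authors' model code.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    rows = []
    for prot in range(40):
        p_eff = rng.normal(0, 0.4)                      # protein-level variance
        for pep in range(3):
            pep_eff = rng.normal(0, 0.3)                # peptide-level variance
            for _ in range(4):                          # repeated scans per peptide
                rows.append({"protein": f"pr{prot}",
                             "peptide": f"pr{prot}_pep{pep}",
                             "log_ratio": p_eff + pep_eff + rng.normal(0, 0.2)})
    df = pd.DataFrame(rows)

    # Random intercepts for proteins (groups) and peptides nested within them.
    model = smf.mixedlm("log_ratio ~ 1", df, groups=df["protein"],
                        vc_formula={"peptide": "0 + C(peptide)"})
    fit = model.fit()
    print(fit.summary())
    ```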

  17. [Development and application of morphological analysis method in Aspergillus niger fermentation].

    PubMed

    Tang, Wenjun; Xia, Jianye; Chu, Ju; Zhuang, Yingping; Zhang, Siliang

    2015-02-01

    Filamentous fungi are widely used in industrial fermentation. A particular fungal morphology acts as a critical index of a successful fermentation. To break the bottleneck of morphological analysis, we have developed a reliable method for fungal morphological analysis. With this method, hundreds of pellet samples can be prepared simultaneously and quantitative morphological information obtained quickly at large scale. This greatly increases the accuracy and reliability of morphological analysis results. On this basis, studies of Aspergillus niger morphology under different oxygen supply and shear rate conditions were carried out. The morphological response patterns of A. niger to these conditions were quantitatively demonstrated, laying a solid foundation for further scale-up.

  18. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow abstracts away low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflows is important in order to support large-scale, sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To verify the semantic correctness of user-defined workflows, in this paper we propose a workflow validation method based on tacit-knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting domain tacit knowledge and expressing the knowledge with an ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data-source consistency error validation and parameter-matching error validation.
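
    As a minimal illustration of the two checks named at the end of the abstract, the sketch below validates a declared workflow against an ontology-like table: each step's input type must match its upstream output, and required parameters must be present. The type table and step definitions are hypothetical, not the paper's Protégé ontology.

    ```python
    # Toy semantic validation of a workflow (illustrative only). Each step
    # declares input/output types and parameters; "required" stands in for
    # constraints that the paper encodes in a domain ontology.
    steps = [
        {"name": "read_modis", "in": None, "out": "radiance", "params": {"band": 3}},
        {"name": "aod_model", "in": "radiance", "out": "aod", "params": {}},  # missing param
    ]
    required = {"read_modis": {"band"}, "aod_model": {"lut_path"}}  # hypothetical constraints

    for prev, step in zip([None] + steps[:-1], steps):
        if prev and step["in"] != prev["out"]:
            print(f"{step['name']}: data-source mismatch ({step['in']} != {prev['out']})")
        missing = required[step["name"]] - set(step["params"])
        if missing:
            print(f"{step['name']}: missing parameters {missing}")
    ```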

  19. Quantitative Large-Scale Three-Dimensional Imaging of Human Kidney Biopsies: A Bridge to Precision Medicine in Kidney Disease.

    PubMed

    Winfree, Seth; Dagher, Pierre C; Dunn, Kenneth W; Eadon, Michael T; Ferkowicz, Michael; Barwinska, Daria; Kelly, Katherine J; Sutton, Timothy A; El-Achkar, Tarek M

    2018-06-05

    Kidney biopsy remains the gold standard for uncovering the pathogenesis of acute and chronic kidney diseases. However, the ability to perform high resolution, quantitative, molecular and cellular interrogation of this precious tissue is still at a developing stage compared to other fields such as oncology. Here, we discuss recent advances in performing large-scale, three-dimensional (3D), multi-fluorescence imaging of kidney biopsies and quantitative analysis referred to as 3D tissue cytometry. This approach allows the accurate measurement of specific cell types and their spatial distribution in a thick section spanning the entire length of the biopsy. By uncovering specific disease signatures, including rare occurrences, and linking them to the biology in situ, this approach will enhance our understanding of disease pathogenesis. Furthermore, by providing accurate quantitation of cellular events, 3D cytometry may improve the accuracy of prognosticating the clinical course and response to therapy. Therefore, large-scale 3D imaging and cytometry of kidney biopsy is poised to become a bridge towards personalized medicine for patients with kidney disease. © 2018 S. Karger AG, Basel.

  20. The Timing of Teacher Hires and Teacher Qualifications: Is There an Association?

    ERIC Educational Resources Information Center

    Engel, Mimi

    2012-01-01

    Background: Case studies suggest that late hiring timelines are common in large urban school districts and result in the loss of qualified teachers to surrounding suburbs. To date, however, there has been no large-scale quantitative investigation of the relationship between the timing of teacher hires and teacher qualifications. Purpose: This…

  1. Development and Evaluation of a Parallel Reaction Monitoring Strategy for Large-Scale Targeted Metabolomics Quantification.

    PubMed

    Zhou, Juntuo; Liu, Huiying; Liu, Yang; Liu, Jia; Zhao, Xuyang; Yin, Yuxin

    2016-04-19

    Recent advances in mass spectrometers, which have yielded higher resolution and faster scanning speeds, have expanded their application in metabolomics of diverse diseases. Using a quadrupole-Orbitrap LC-MS system, we developed an efficient large-scale quantitative method targeting 237 metabolites involved in various metabolic pathways using scheduled parallel reaction monitoring (PRM). We assessed the dynamic range, linearity, reproducibility, and system suitability of the PRM assay by measuring concentration curves, biological samples, and clinical serum samples. The quantification performance of the PRM assay was compared with that of an MS1-based assay on the Q-Exactive and with an MRM assay on a QTRAP 6500. The PRM assay monitoring 237 polar metabolites showed greater reproducibility and quantitative accuracy than MS1-based quantification, and greater flexibility in post-acquisition assay refinement than the MRM assay on the QTRAP 6500. We present a workflow for convenient PRM data processing using Skyline software, which is free of charge. In this study we have established a reliable PRM methodology on a quadrupole-Orbitrap platform for large-scale targeted metabolomics, which provides a new option for basic and clinical metabolomics studies.
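
    A routine piece of the assay assessment described above is checking linearity over the dynamic range. The sketch below fits a log-log calibration line for one metabolite and reports the slope and R²; the concentrations, response model, and any acceptance thresholds are assumed for illustration.

    ```python
    # Simple linearity check for a targeted-quantification calibration curve
    # (illustrative; the study's actual acceptance criteria are not given here).
    import numpy as np

    conc = np.array([0.1, 0.5, 1, 5, 10, 50, 100])          # spiked concentrations (a.u.)
    response = 2.0e4 * conc * (1 + np.random.default_rng(5).normal(0, 0.05, conc.size))

    slope, intercept = np.polyfit(np.log10(conc), np.log10(response), 1)
    pred = slope * np.log10(conc) + intercept
    r2 = 1 - np.sum((np.log10(response) - pred) ** 2) / np.sum(
        (np.log10(response) - np.log10(response).mean()) ** 2)
    print(f"log-log slope={slope:.3f} (close to 1 means linear), R2={r2:.4f}")
    ```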

  2. Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.

    PubMed

    Hardt, Marah J; Flett, Keith; Howell, Colleen J

    2017-08-01

    Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.

  3. Complexity-aware simple modeling.

    PubMed

    Gómez-Schiavon, Mariana; El-Samad, Hana

    2018-02-26

    Mathematical models continue to be essential for deepening our understanding of biology. At one extreme, simple or small-scale models help delineate general biological principles. However, the parsimony of detail in these models, as well as their assumption of modularity and insulation, makes them inaccurate for describing quantitative features. At the other extreme, large-scale and detailed models can quantitatively recapitulate a phenotype of interest, but have to rely on many unknown parameters, making them often difficult to parse mechanistically and to use for extracting general principles. We discuss some examples of a new approach, complexity-aware simple modeling, that can bridge the gap between the small-scale and large-scale approaches. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Modal Analysis of an Aircraft Fuselage Panel using Experimental and Finite-Element Techniques

    NASA Technical Reports Server (NTRS)

    Fleming, Gary A.; Buehrle, Ralph D.; Storaasli, Olaf L.

    1998-01-01

    The application of Electro-Optic Holography (EOH) for measuring the center-bay vibration modes of an aircraft fuselage panel under forced excitation is presented. The requirement of free-free panel boundary conditions made the acquisition of quantitative EOH data challenging, since large-scale rigid-body motions corrupted measurements of the high-frequency vibrations of interest. Image processing routines designed to minimize the effects of large-scale motions were applied to successfully resurrect quantitative EOH vibrational amplitude measurements.
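
    The paper's image processing routines are not reproduced here, but one generic way to suppress rigid-body corruption in a phase map is to fit and subtract a best-fit plane (piston plus tilt), as in this assumed sketch; the synthetic phase field is hypothetical.

    ```python
    # Assumed, generic rigid-body suppression for a fringe/phase map: fit a
    # plane by least squares and subtract it, leaving the higher-frequency
    # vibration pattern (not the paper's actual routines).
    import numpy as np

    y, x = np.mgrid[0:128, 0:128]
    phase = 0.02 * x + 0.01 * y + np.sin(x / 5.0)          # tilt plus a mode-like pattern

    A = np.column_stack([np.ones(x.size), x.ravel(), y.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, phase.ravel(), rcond=None)
    detrended = phase - (A @ coeffs).reshape(phase.shape)  # plane (piston + tilt) removed
    print("residual std after plane removal:", detrended.std())
    ```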

  5. Tools for understanding landscapes: combining large-scale surveys to characterize change. Chapter 9.

    Treesearch

    W. Keith Moser; Janine Bolliger; Don C. Bragg; Mark H. Hansen; Mark A. Hatfield; Timothy A. Nigh; Lisa A. Schulte

    2008-01-01

    All landscapes change continuously. Since change is perceived and interpreted through measures of scale, any quantitative analysis of landscapes must identify and describe the spatiotemporal mosaics shaped by large-scale structures and processes. This process is controlled by core influences, or "drivers," that shape the change and affect the outcome...

  6. Motivation to Read among Rural Adolescents

    ERIC Educational Resources Information Center

    Belken, Gloria

    2013-01-01

    This study used quantitative methods to investigate motivation to read among high school students in a tenth-grade English course at a rural high school in the Midwestern USA. Data were collected and analyzed to replicate previous studies. In this study, when compared to large-scale surveys, respondents showed more positive attitudes toward…

  7. Harnessing quantitative genetics and genomics for understanding and improving complex traits in crops

    USDA-ARS?s Scientific Manuscript database

    Classical quantitative genetics aids crop improvement by providing the means to estimate heritability, genetic correlations, and predicted responses to various selection schemes. Genomics has the potential to aid quantitative genetics and applied crop improvement programs via large-scale, high-thro...

  8. Multifractal spectrum and lacunarity as measures of complexity of osseointegration.

    PubMed

    de Souza Santos, Daniel; Dos Santos, Leonardo Cavalcanti Bezerra; de Albuquerque Tavares Carvalho, Alessandra; Leão, Jair Carneiro; Delrieux, Claudio; Stosic, Tatijana; Stosic, Borko

    2016-07-01

    The goal of this study is to contribute to a better quantitative description of the early stages of osseointegration by application of fractal, multifractal, and lacunarity analysis. Fractal, multifractal, and lacunarity analyses are performed on scanning electron microscopy (SEM) images of titanium implants that were first subjected to different treatment combinations of (i) sand blasting, (ii) acid etching, and (iii) exposure to calcium phosphate, and were then submersed in a simulated body fluid (SBF) for 30 days. All three numerical techniques are applied to the implant SEM images before and after SBF immersion, in order to provide a comprehensive set of common quantitative descriptors. It is found that implants subjected to different physicochemical treatments before submersion in SBF exhibit a rather similar level of complexity, while the great variety of crystal forms after SBF submersion reveals rather different quantitative measures (reflecting complexity) for different treatments. In particular, it is found that acid treatment, in most combinations with the other considered treatments, leads to a higher fractal dimension (more uniform distribution of crystals), lower lacunarity (less variation in gap sizes), and narrowing of the multifractal spectrum (smaller fluctuations on different scales). The current quantitative description has shown the capacity to capture the main features of complex images of implant surfaces for several different treatments. Such quantitative description should provide a fundamental tool for future large-scale systematic studies, considering the large variety of possible implant treatments and their combinations. Quantitative description of the early stages of osseointegration on titanium implants with different treatments should help develop a better understanding of this phenomenon in general, and provide a basis for further systematic experimental studies. Clinical practice should benefit from such studies in the long term through readier access to implants of higher quality.
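
    For readers unfamiliar with these measures, the sketch below estimates a box-counting fractal dimension and a gliding-box lacunarity for a binary image. It is a minimal illustration with assumed box sizes and a random mock image, not the authors' exact protocol.

    ```python
    # Minimal box-counting fractal dimension plus gliding-box lacunarity for a
    # binary image (illustrative parameters; not the authors' protocol).
    import numpy as np
    from scipy.ndimage import uniform_filter

    rng = np.random.default_rng(7)
    img = rng.random((256, 256)) < 0.2                 # mock binary crystal map

    # Box counting: count occupied boxes at dyadic scales, fit the log-log slope.
    scales, counts = [], []
    for s in (2, 4, 8, 16, 32):
        reduced = img.reshape(256 // s, s, 256 // s, s).any(axis=(1, 3))
        scales.append(s)
        counts.append(reduced.sum())
    dim = -np.polyfit(np.log(scales), np.log(counts), 1)[0]

    # Lacunarity at box size r: Var(mass) / Mean(mass)^2 + 1 over gliding boxes.
    r = 8
    mass = uniform_filter(img.astype(float), size=r) * r * r
    lac = mass.var() / mass.mean() ** 2 + 1
    print(f"box-counting dimension ~ {dim:.2f}, lacunarity(r={r}) ~ {lac:.2f}")
    ```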

  9. Quantitative Missense Variant Effect Prediction Using Large-Scale Mutagenesis Data.

    PubMed

    Gray, Vanessa E; Hause, Ronald J; Luebeck, Jens; Shendure, Jay; Fowler, Douglas M

    2018-01-24

    Large datasets describing the quantitative effects of mutations on protein function are becoming increasingly available. Here, we leverage these datasets to develop Envision, which predicts the magnitude of a missense variant's molecular effect. Envision combines 21,026 variant effect measurements from nine large-scale experimental mutagenesis datasets, a hitherto untapped training resource, with a supervised, stochastic gradient boosting learning algorithm. Envision outperforms other missense variant effect predictors both on large-scale mutagenesis data and on an independent test dataset comprising 2,312 TP53 variants whose effects were measured using a low-throughput approach. This dataset was never used for hyperparameter tuning or model training and thus serves as an independent validation set. Envision prediction accuracy is also more consistent across amino acids than other predictors. Finally, we demonstrate that Envision's performance improves as more large-scale mutagenesis data are incorporated. We precompute Envision predictions for every possible single amino acid variant in human, mouse, frog, zebrafish, fruit fly, worm, and yeast proteomes (https://envision.gs.washington.edu/). Copyright © 2017 Elsevier Inc. All rights reserved.
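
    The modeling idea is conventional supervised regression, sketched below with scikit-learn's gradient boosting on synthetic variant features. Envision's actual feature set, hyperparameters, and training data are not reproduced; everything here is an assumed stand-in.

    ```python
    # Hedged sketch of the modeling idea behind Envision: supervised gradient
    # boosting from variant features to a quantitative effect score. All data
    # and features here are synthetic stand-ins, not the published ones.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(8)
    X = rng.normal(size=(2000, 10))                   # e.g., substitution/structure features
    y = X[:, 0] * 0.8 - X[:, 3] * 0.5 + rng.normal(0, 0.3, 2000)  # mock effect scores

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05).fit(X_tr, y_tr)
    print("held-out R2:", model.score(X_te, y_te))
    ```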

  10. Impact of Chromosome 4p- Syndrome on Communication and Expressive Language Skills: A Preliminary Investigation

    ERIC Educational Resources Information Center

    Marshall, Althea T.

    2010-01-01

    Purpose: The purpose of this investigation was to examine the impact of Chromosome 4p- syndrome on the communication and expressive language phenotype of a large cross-cultural population of children, adolescents, and adults. Method: A large-scale survey study was conducted and a descriptive research design was used to analyze quantitative and…

  11. Secondary Students' Stable and Unstable Optics Conceptions Using Contextualized Questions

    ERIC Educational Resources Information Center

    Chu, Hye-Eun; Treagust, David F.

    2014-01-01

    This study focuses on elucidating and explaining reasons for the stability of and interrelationships between students' conceptions about "Light Propagation" and "Visibility of Objects" using contextualized questions across 3 years of secondary schooling from Years 7 to 9. In a large-scale quantitative study involving 1,233…

  12. Embarking on large-scale qualitative research: reaping the benefits of mixed methods in studying youth, clubs and drugs

    PubMed Central

    Hunt, Geoffrey; Moloney, Molly; Fazio, Adam

    2012-01-01

    Qualitative research is often conceptualized as inherently small-scale research, primarily conducted by a lone researcher enmeshed in extensive and long-term fieldwork or involving in-depth interviews with a small sample of 20 to 30 participants. In the study of illicit drugs, traditionally this has often been in the form of ethnographies of drug-using subcultures. Such small-scale projects have produced important interpretive scholarship that focuses on the culture and meaning of drug use in situated, embodied contexts. Larger-scale projects are often assumed to be solely the domain of quantitative researchers, using formalistic survey methods and descriptive or explanatory models. In this paper, however, we will discuss qualitative research done on a comparatively larger scale—with in-depth qualitative interviews with hundreds of young drug users. Although this work incorporates some quantitative elements into the design, data collection, and analysis, the qualitative dimension and approach has nevertheless remained central. Larger-scale qualitative research shares some of the challenges and promises of smaller-scale qualitative work including understanding drug consumption from an emic perspective, locating hard-to-reach populations, developing rapport with respondents, generating thick descriptions and a rich analysis, and examining the wider socio-cultural context as a central feature. However, there are additional challenges specific to the scale of qualitative research, which include data management, data overload and problems of handling large-scale data sets, time constraints in coding and analyzing data, and personnel issues including training, organizing and mentoring large research teams. Yet large samples can prove to be essential for enabling researchers to conduct comparative research, whether that be cross-national research within a wider European perspective undertaken by different teams or cross-cultural research looking at internal divisions and differences within diverse communities and cultures. PMID:22308079

  13. Climate Change and Macro-Economic Cycles in Pre-Industrial Europe

    PubMed Central

    Pei, Qing; Zhang, David D.; Lee, Harry F.; Li, Guodong

    2014-01-01

    Climate change has been proven to be the ultimate cause of social crisis in pre-industrial Europe at a large scale. However, detailed analyses of climate change and macro-economic cycles in the pre-industrial era remain lacking, especially within different temporal scales. Therefore, fine-grained paleo-climate and economic data were employed with statistical methods to quantitatively assess the relations between climate change and the agrarian economy in Europe during AD 1500 to 1800. In the study, the Butterworth filter was adopted to filter the data series into a long-term trend (low-frequency) and short-term fluctuations (high-frequency). Granger Causality Analysis was conducted to scrutinize the associations between climate change and the macro-economic cycle at different frequency bands. Based on the quantitative results, climate change shows significant effects on the macro-economic cycle only in the long term. In the short term, society can relieve the influences of climate variations through social adaptation and self-adjustment mechanisms. On a large spatial scale, temperature holds higher importance for the European agrarian economy than precipitation. Examining the supply-demand mechanism in the grain market shows that the population during the study period acted as the producer in the long term but as the consumer in the short term. These findings reflect only the general interactions between climate change and macro-economic cycles over a large spatial region and a long study period; they neither illustrate individual incidents that could temporarily distort the agrarian economy nor explain specific cases. In the study, scale thinking in the analysis is raised for the first time as an essential methodological issue for interpreting the associations between climatic impact and the macro-economy in past agrarian society within different temporal scales. PMID:24516601
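
    The two analysis steps named in the abstract, Butterworth decomposition and Granger causality, are sketched below on synthetic annual series. The filter order, cutoff, lag order, and data are all assumptions chosen for illustration only.

    ```python
    # Sketch of the two named steps: Butterworth filtering into a low-frequency
    # trend, then Granger causality between the trends (synthetic series; the
    # lag order and cutoff are arbitrary choices).
    import numpy as np
    from scipy.signal import butter, filtfilt
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(9)
    n = 300                                        # e.g., annual values, AD 1500-1800
    temp = np.cumsum(rng.normal(0, 0.1, n))        # mock temperature series
    econ = 0.5 * np.roll(temp, 5) + rng.normal(0, 0.5, n)  # economy lags climate

    b, a = butter(4, 0.05)                         # 4th-order low-pass filter
    temp_trend, econ_trend = filtfilt(b, a, temp), filtfilt(b, a, econ)

    # Test whether temperature (second column) Granger-causes the economy (first).
    res = grangercausalitytests(np.column_stack([econ_trend, temp_trend]), maxlag=5)
    ```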

  14. Climate change and macro-economic cycles in pre-industrial europe.

    PubMed

    Pei, Qing; Zhang, David D; Lee, Harry F; Li, Guodong

    2014-01-01

    Climate change has been proven to be the ultimate cause of social crisis in pre-industrial Europe at a large scale. However, detailed analyses of climate change and macro-economic cycles in the pre-industrial era remain lacking, especially within different temporal scales. Therefore, fine-grained paleo-climate and economic data were employed with statistical methods to quantitatively assess the relations between climate change and the agrarian economy in Europe during AD 1500 to 1800. In the study, the Butterworth filter was adopted to filter the data series into a long-term trend (low-frequency) and short-term fluctuations (high-frequency). Granger Causality Analysis was conducted to scrutinize the associations between climate change and the macro-economic cycle at different frequency bands. Based on the quantitative results, climate change shows significant effects on the macro-economic cycle only in the long term. In the short term, society can relieve the influences of climate variations through social adaptation and self-adjustment mechanisms. On a large spatial scale, temperature holds higher importance for the European agrarian economy than precipitation. Examining the supply-demand mechanism in the grain market shows that the population during the study period acted as the producer in the long term but as the consumer in the short term. These findings reflect only the general interactions between climate change and macro-economic cycles over a large spatial region and a long study period; they neither illustrate individual incidents that could temporarily distort the agrarian economy nor explain specific cases. In the study, scale thinking in the analysis is raised for the first time as an essential methodological issue for interpreting the associations between climatic impact and the macro-economy in past agrarian society within different temporal scales.

  15. Environmental impacts of large-scale CSP plants in northwestern China.

    PubMed

    Wu, Zhiyong; Hou, Anping; Chang, Chun; Huang, Xiang; Shi, Duoqi; Wang, Zhifeng

    2014-01-01

    Several concentrated solar power demonstration plants are being constructed, and a few commercial plants have been announced in northwestern China. However, the mutual impacts between concentrated solar power plants and their surrounding environments have not yet been addressed comprehensively in the literature by the parties involved in these projects. In China, these projects are especially important as an increasing amount of low-carbon electricity needs to be generated in order to maintain current economic growth while simultaneously lessening pollution. In this study, the authors assess the potential environmental impacts of large-scale concentrated solar power plants. Specifically, the water use intensity, soil erosion and soil temperature are quantitatively examined. Some of the impacts were found to be favorable relative to traditional power generation techniques, some negative, and some in need of further research before they can be reasonably appraised. In quantitative terms, concentrated solar power plants consume about 4000 L per MWh of water if wet cooling technology is used, and the collectors lead to soil temperature changes of between 0.5 and 4 °C; soil erosion, however, is dramatically alleviated. The results of this study are helpful to decision-makers in concentrated solar power site selection and regional planning. Some conclusions of this study are also valid for large-scale photovoltaic plants.
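
    The reported water-use intensity translates directly into plant-level demand. The arithmetic below assumes a hypothetical 100 MW wet-cooled plant at a 25% capacity factor; only the 4000 L per MWh figure comes from the study.

    ```python
    # Back-of-the-envelope use of the reported intensity: annual cooling-water
    # demand of a hypothetical 100 MW wet-cooled CSP plant (plant size and
    # capacity factor are assumptions, not values from the study).
    capacity_mw = 100
    capacity_factor = 0.25
    litres_per_mwh = 4000                     # figure reported in the study

    annual_mwh = capacity_mw * 8760 * capacity_factor
    annual_litres = annual_mwh * litres_per_mwh
    print(f"{annual_litres:.2e} L/year, i.e. about {annual_litres / 1e9:.2f} GL/year")
    ```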

  16. The Role of Forests in Regulating the River Flow Regime of Large Basins of the World

    NASA Astrophysics Data System (ADS)

    Salazar, J. F.; Villegas, J. C.; Mercado-Bettin, D. A.; Rodríguez, E.

    2016-12-01

    Many natural and social phenomena depend on river flow regimes that are being altered by global change. Understanding the mechanisms behind such alterations is crucial for predicting river flow regimes in a changing environment. Here we explore potential linkages between the presence of forests and the capacity of river basins for regulating river flows. Regulation is defined here as the capacity of river basins to attenuate the amplitude of the river flow regime, that is, to reduce the difference between high and low flows. We first use scaling theory to show how scaling properties of observed river flows can be used to classify river basins as regulated or unregulated. This parsimonious classification is based on a physical interpretation of the scaling properties (particularly the scaling exponents) that is novel (most previous studies have focused on interpreting the scaling exponents for floods only) and widely applicable to different basins (the only assumption is that river flows in a given basin exhibit scaling properties through well-known power laws). Then we show how this scaling framework can be used to explore global-change-induced temporal variations in the regulation capacity of river basins. Finally, we propose a conceptual hypothesis (the "Forest reservoir concept") to explain how large-scale forests can exert important effects on the long-term water balance partitioning and regulation capacity of large basins of the world. Our quantitative results are based on data analysis (river flows and land cover features) from 22 large basins of the world, with emphasis on the Amazon River and its main tributaries. Collectively, our findings support the hypothesis that forest cover enhances the capacity of large river basins to maintain relatively high mean river flows, as well as to regulate (ameliorate) extreme river flows. Advancing towards this quantitative understanding of the relation between forest cover and river flow regimes is crucial for water management- and land cover-related decisions.
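
    A minimal version of the scaling step is a log-log regression of a flow statistic against basin area, Q ~ c * A^theta, as sketched below with synthetic basins; the paper's classification compares exponents for high and low flows, which this toy example does not reproduce in full.

    ```python
    # Sketch of estimating a scaling exponent theta in Q ~ c * A**theta by
    # log-log regression (synthetic basins; illustrative only).
    import numpy as np

    rng = np.random.default_rng(10)
    area = 10 ** rng.uniform(2, 6, 40)                      # basin areas, km^2
    qmax = 0.05 * area ** 0.85 * rng.lognormal(0, 0.2, 40)  # annual-maximum flows

    theta, log_c = np.polyfit(np.log10(area), np.log10(qmax), 1)
    print(f"estimated scaling exponent theta ~ {theta:.2f}")
    ```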

  17. The Role of Forests in Regulating the River Flow Regime of Large Basins of the World

    NASA Astrophysics Data System (ADS)

    Salazar, J. F.; Villegas, J. C.; Mercado-Bettin, D. A.; Rodríguez, E.

    2017-12-01

    Many natural and social phenomena depend on river flow regimes that are being altered by global change. Understanding the mechanisms behind such alterations is crucial for predicting river flow regimes in a changing environment. Here we explore potential linkages between the presence of forests and the capacity of river basins for regulating river flows. Regulation is defined here as the capacity of river basins to attenuate the amplitude of the river flow regime, that is, to reduce the difference between high and low flows. We first use scaling theory to show how scaling properties of observed river flows can be used to classify river basins as regulated or unregulated. This parsimonious classification is based on a physical interpretation of the scaling properties (particularly the scaling exponents) that is novel (most previous studies have focused on interpreting the scaling exponents for floods only) and widely applicable to different basins (the only assumption is that river flows in a given basin exhibit scaling properties through well-known power laws). Then we show how this scaling framework can be used to explore global-change-induced temporal variations in the regulation capacity of river basins. Finally, we propose a conceptual hypothesis (the "Forest reservoir concept") to explain how large-scale forests can exert important effects on the long-term water balance partitioning and regulation capacity of large basins of the world. Our quantitative results are based on data analysis (river flows and land cover features) from 22 large basins of the world, with emphasis on the Amazon River and its main tributaries. Collectively, our findings support the hypothesis that forest cover enhances the capacity of large river basins to maintain relatively high mean river flows, as well as to regulate (ameliorate) extreme river flows. Advancing towards this quantitative understanding of the relation between forest cover and river flow regimes is crucial for water management- and land cover-related decisions.

  18. Advantages of Social Network Analysis in Educational Research

    ERIC Educational Resources Information Center

    Ushakov, K. M.; Kukso, K. N.

    2015-01-01

    Currently, one of the main tools for large-scale studies of schools is statistical analysis. Although it is the most common method and offers the greatest opportunities for analysis, there are other quantitative methods for studying schools, such as network analysis. We discuss the potential advantages that network analysis has for educational…

  19. An Account of Studies of Organizational Development in Schools.

    ERIC Educational Resources Information Center

    Runkel, Philip J.; Schmuck, Richard A.

    Most organizational development (OD) projects in schools are never reported in the literature. This paper discusses benefits, outcomes, and success factors disclosed by the first large-scale quantitative survey of OD in schools, conducted by Fullan, Miles, and Taylor in 1978. The paper also explores other relevant studies published through early…

  20. Predicting Southern Appalachian overstory vegetation with digital terrain data

    Treesearch

    Paul V. Bolstad; Wayne Swank; James Vose

    1998-01-01

    Vegetation in mountainous regions responds to small-scale variation in terrain, largely due to effects on both temperature and soil moisture. However, there are few studies of quantitative, terrain-based methods for predicting vegetation composition. This study investigated relationships between forest composition, elevation, and a derived index of terrain shape, and...

  1. Reconciling Rigour and Impact by Collaborative Research Design: Study of Teacher Agency

    ERIC Educational Resources Information Center

    Pantic, Nataša

    2017-01-01

    This paper illustrates a new way of working collaboratively on the development of a methodology for studying teacher agency for social justice. Increasing emphasis of impact on change as a purpose of social research raises questions about appropriate research designs. Large-scale quantitative research framed within externally set parameters has…

  2. Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.

    PubMed

    Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner

    2016-01-01

    Sampling limitations in electron microscopy raise the question of whether an analysis is representative of the bulk material, especially when analyzing hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images with sufficient resolution, subsequently stitched together to generate a large-area map using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large-scale maps provides representative information that can be related to the synthesis and process conditions of hierarchical materials, which moves electron microscopy analysis towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.
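
    The in-house TU/e tools are not public, but the core pairwise-registration step in such stitching can be illustrated with scikit-image's phase correlation, as in this assumed sketch on a synthetically shifted tile pair.

    ```python
    # Illustrative pairwise registration for stitching overlapping tiles via
    # phase correlation (scikit-image used as a stand-in; not the TU/e tools).
    import numpy as np
    from skimage.registration import phase_cross_correlation

    rng = np.random.default_rng(11)
    tile_a = rng.random((512, 512))
    tile_b = np.roll(tile_a, shift=(7, -12), axis=(0, 1))   # simulated overlap offset

    shift, error, _ = phase_cross_correlation(tile_a, tile_b)
    print("estimated (row, col) offset:", shift)            # expected near (-7, 12)
    ```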

  3. Quantitative Analysis of Tissue Samples by Combining iTRAQ Isobaric Labeling with Selected/Multiple Reaction Monitoring (SRM/MRM).

    PubMed

    Narumi, Ryohei; Tomonaga, Takeshi

    2016-01-01

    Mass spectrometry-based phosphoproteomics is an indispensable technique for the discovery and quantification of phosphorylation events on proteins in biological samples. The application of this technique to tissue samples is especially useful for the discovery of biomarkers as well as for biological studies. We herein describe the application of large-scale phosphoproteome analysis and SRM/MRM-based quantitation to develop a strategy for the systematic discovery and validation of biomarkers using tissue samples.

  4. Reframing Approaches to Narrating Young People's Conceptualisations of Citizenship in Education Research

    ERIC Educational Resources Information Center

    Akar, Bassel

    2018-01-01

    Large-scale quantitative studies on citizenship and citizenship education research have advanced an international and comparative field of democratic citizenship education. Their instruments, however, informed by theoretical variables constructed in Western Europe and North America mostly measure young people's understandings of a predefined…

  5. A comparison of working in small-scale and large-scale nursing homes: A systematic review of quantitative and qualitative evidence.

    PubMed

    Vermeerbergen, Lander; Van Hootegem, Geert; Benders, Jos

    2017-02-01

    Ongoing shortages of care workers, together with an ageing population, make it of utmost importance to increase the quality of working life in nursing homes. Since the 1970s, normalised and small-scale nursing homes have been increasingly introduced to provide care in a family and homelike environment, potentially providing a richer work life for care workers as well as improved living conditions for residents. 'Normalised' refers to the opportunities given to residents to live in a manner as close as possible to the everyday life of persons not needing care. The study purpose is to provide a synthesis and overview of empirical research comparing the quality of working life - together with related work and health outcomes - of professional care workers in normalised small-scale nursing homes as compared to conventional large-scale ones. A systematic review of qualitative and quantitative studies. A systematic literature search (April 2015) was performed using the electronic databases Pubmed, Embase, PsycInfo, CINAHL and Web of Science. References and citations were tracked to identify additional, relevant studies. We identified 825 studies in the selected databases. After checking the inclusion and exclusion criteria, nine studies were selected for review. Two additional studies were selected after reference and citation tracking. Three studies were excluded after requesting more information on the research setting. The findings from the individual studies suggest that levels of job control and job demands (all but "time pressure") are higher in normalised small-scale homes than in conventional large-scale nursing homes. Additionally, some studies suggested that social support and work motivation are higher, while risks of burnout and mental strain are lower, in normalised small-scale nursing homes. Other studies found no differences or even opposing findings. The studies reviewed showed that these inconclusive findings can be attributed to care workers in some normalised small-scale homes experiencing isolation and too high job demands in their work roles. This systematic review suggests that normalised small-scale homes are a good starting point for creating a higher quality of working life in the nursing home sector. Higher job control enables care workers to manage higher job demands in normalised small-scale homes. However, some jobs would benefit from interventions to address care workers' perceptions of too low social support and of too high job demands. More research is needed to examine strategies to enhance these working life issues in normalised small-scale settings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Representation matters: quantitative behavioral variation in wild worm strains

    NASA Astrophysics Data System (ADS)

    Brown, Andre

    Natural genetic variation in populations is the basis of genome-wide association studies, an approach that has been applied in large studies of humans to study the genetic architecture of complex traits including disease risk. Of course, the traits you choose to measure determine which associated genes you discover (or miss). In large-scale human studies, the measured traits are usually taken as a given during the association step because they are expensive to collect and standardize. Working with the nematode worm C. elegans, we do not have the same constraints. In this talk I will describe how large-scale imaging of worm behavior allows us to develop alternative representations of behavior that vary differently across wild populations. The alternative representations yield novel traits that can be used for genome-wide association studies and may reveal basic properties of the genotype-phenotype map that are obscured if only a small set of fixed traits are used.

  7. Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis

    NASA Astrophysics Data System (ADS)

    Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi

    2017-03-01

    Digital histopathology images, often exceeding 1 gigapixel, are drawing increasing attention in the clinical, biomedical research, and computer vision fields. Among the many observable features spanning multiple scales in pathology images, nuclear morphology is one of the central criteria for diagnosis and grading. As a result, it is also the most studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction has yet to be formulated rigorously and systematically. Some studies report human-verified segmentations with thousands of nuclei, whereas a single whole-slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large-scale image synthesis. This could facilitate more quantitatively validated studies for the current and future histopathology image analysis field.
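
    Synthesis-based evaluation works because the ground truth is known by construction. The sketch below scores a mock algorithm output against a synthetic mask with a Dice coefficient; the masks and error rate are placeholders, and the paper's evaluation is more elaborate.

    ```python
    # Sketch of the evaluation step such synthesis enables: Dice overlap between
    # a synthetic ground-truth mask and an algorithm's output (random stand-ins).
    import numpy as np

    rng = np.random.default_rng(12)
    truth = rng.random((256, 256)) < 0.1            # synthesized nuclei mask
    pred = truth ^ (rng.random((256, 256)) < 0.01)  # algorithm output with some errors

    dice = 2 * np.logical_and(truth, pred).sum() / (truth.sum() + pred.sum())
    print(f"Dice = {dice:.3f}")
    ```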

  8. Quantifying streamflow change caused by forest disturbance at a large spatial scale: A single watershed study

    NASA Astrophysics Data System (ADS)

    Wei, Xiaohua; Zhang, Mingfang

    2010-12-01

    Climatic variability and forest disturbance are commonly recognized as the two major drivers influencing streamflow change in large-scale forested watersheds. The greatest challenge in evaluating the quantitative hydrological effects of forest disturbance is removing the climatic effect on hydrology. In this paper, a method was designed to quantify the respective contributions of large-scale forest disturbance and climatic variability to streamflow using the Willow River watershed (2860 km2) located in the central part of British Columbia, Canada. Long-term (>50 years) data on hydrology, climate, and timber harvesting history, represented by equivalent clear-cutting area (ECA), were available to discern climatic and forestry influences on streamflow in three steps. First, effective precipitation, an integrated climatic index, was generated by subtracting evapotranspiration from precipitation. Second, modified double mass curves were developed by plotting accumulated annual streamflow against accumulated annual effective precipitation, which presented a much clearer picture of the cumulative effects of forest disturbance on streamflow following removal of the climatic influence. The average annual streamflow changes attributed to forest disturbance and climatic variability were then estimated to be +58.7 and -72.4 mm, respectively. The positive (increasing) and negative (decreasing) changes in streamflow indicate opposite directions of change, suggesting an offsetting effect between forest disturbance and climatic variability in the study watershed. Finally, a multivariate Autoregressive Integrated Moving Average (ARIMA) model was generated to establish quantitative relationships between the accumulated annual streamflow deviation attributed to forest disturbance and annual ECA. The model was then used to project streamflow change under various timber harvesting scenarios. The methodology can be effectively applied to any large-scale single watershed where long-term data (>50 years) are available.
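
    The modified double mass curve described above can be sketched in a few lines: accumulate annual streamflow against accumulated effective precipitation, fit the pre-disturbance relationship, and read the disturbance-attributed change from the post-disturbance deviation. All numbers below are synthetic, and the 30-year calibration split is an assumption.

    ```python
    # Sketch of a modified double mass curve (synthetic data; not the paper's
    # numbers): cumulative streamflow vs cumulative effective precipitation,
    # with the pre-disturbance line defining the climate-only expectation.
    import numpy as np

    rng = np.random.default_rng(13)
    eff_precip = rng.normal(700, 80, 50)                   # annual P - ET, mm
    flow = 0.5 * eff_precip + rng.normal(0, 20, 50)
    flow[30:] += 40                                        # harvesting-era increase

    cum_p, cum_q = np.cumsum(eff_precip), np.cumsum(flow)
    slope, intercept = np.polyfit(cum_p[:30], cum_q[:30], 1)  # calibration period
    deviation = cum_q - (slope * cum_p + intercept)           # attributed to disturbance
    print("mean annual deviation after year 30:", np.diff(deviation)[30:].mean())
    ```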

  9. Complex Genetics of Behavior: BXDs in the Automated Home-Cage.

    PubMed

    Loos, Maarten; Verhage, Matthijs; Spijker, Sabine; Smit, August B

    2017-01-01

    This chapter describes a use case for the genetic dissection and automated analysis of complex behavioral traits using the genetically diverse panel of BXD mouse recombinant inbred strains. Strains of the BXD resource differ widely in terms of gene and protein expression in the brain, as well as in their behavioral repertoire. A large mouse resource opens the possibility for gene-finding studies underlying distinct behavioral phenotypes; however, such a resource poses a challenge in behavioral phenotyping. To address the specifics of large-scale screening we describe: (1) how to assess mouse behavior systematically across a large genetic cohort, (2) how to dissect automation-derived longitudinal mouse behavior into quantitative parameters, and (3) how to map these quantitative traits to the genome, deriving loci underlying aspects of behavior.

  10. Image segmentation evaluation for very-large datasets

    NASA Astrophysics Data System (ADS)

    Reeves, Anthony P.; Liu, Shuang; Xie, Yiting

    2016-03-01

    With the advent of modern machine learning methods and fully automated image analysis there is a need for very large image datasets having documented segmentations for both computer algorithm training and evaluation. Current approaches of visual inspection and manual markings do not scale well to big data. We present a new approach that depends on fully automated algorithm outcomes for segmentation documentation, requires no manual marking, and provides quantitative evaluation for computer algorithms. The documentation of new image segmentations and new algorithm outcomes is achieved by visual inspection. The burden of visual inspection on large datasets is minimized by (a) customized visualizations for rapid review and (b) reducing the number of cases to be reviewed through analysis of quantitative segmentation evaluation. This method has been applied to a dataset of 7,440 whole-lung CT images for 6 different segmentation algorithms designed to facilitate the fully automated measurement of a number of important quantitative image biomarkers. The results indicate that we could achieve 93% to 99% successful segmentation for these algorithms on this relatively large image database. The presented evaluation method may be scaled to much larger image databases.
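
    The review-burden reduction hinges on a quantitative agreement score that sends only discordant cases to visual inspection. A minimal sketch of that triage idea, with illustrative mask lists and an arbitrary agreement threshold (not taken from the paper):

    ```python
    import numpy as np

    def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
        """Dice overlap between two binary segmentation masks."""
        inter = np.logical_and(mask_a, mask_b).sum()
        total = mask_a.sum() + mask_b.sum()
        return 2.0 * inter / total if total else 1.0

    def cases_needing_review(new_masks, documented_masks, threshold=0.95):
        """Indices whose new segmentation disagrees with the documented one.

        Only these cases go to visual inspection; concordant cases are
        accepted automatically, shrinking the manual workload.
        """
        return [i for i, (a, b) in enumerate(zip(new_masks, documented_masks))
                if dice(a, b) < threshold]
    ```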

  11. Science, marketing and wishful thinking in quantitative proteomics.

    PubMed

    Hackett, Murray

    2008-11-01

    In a recent editorial (J. Proteome Res. 2007, 6, 1633) and elsewhere questions have been raised regarding the lack of attention paid to good analytical practice with respect to the reporting of quantitative results in proteomics. Using those comments as a starting point, several issues are discussed that relate to the challenges involved in achieving adequate sampling with MS-based methods in order to generate valid data for large-scale studies. The discussion touches on the relationships that connect sampling depth and the power to detect protein abundance change, conflict of interest, and strategies to overcome bureaucratic obstacles that impede the use of peer-to-peer technologies for transfer and storage of large data files generated in such experiments.

  12. Post-16 Physics and Chemistry Uptake: Combining Large-Scale Secondary Analysis with In-Depth Qualitative Methods

    ERIC Educational Resources Information Center

    Hampden-Thompson, Gillian; Lubben, Fred; Bennett, Judith

    2011-01-01

    Quantitative secondary analysis of large-scale data can be combined with in-depth qualitative methods. In this paper, we discuss the role of this combined methods approach in examining the uptake of physics and chemistry in post compulsory schooling for students in England. The secondary data analysis of the National Pupil Database (NPD) served…

  13. Advances in the Quantitative Characterization of the Shape of Ash-Sized Pyroclast Populations: Fractal Analyses Coupled to Micro- and Nano-Computed Tomography Techniques

    NASA Astrophysics Data System (ADS)

    Rausch, J.; Vonlanthen, P.; Grobety, B. H.

    2014-12-01

    The quantification of shape parameters in pyroclasts is fundamental to infer the dominant type of magma fragmentation (magmatic vs. phreatomagmatic), as well as the behavior of volcanic plumes and clouds in the atmosphere. In a case study aiming at reconstructing the fragmentation mechanisms triggering maar eruptions in two geologically and compositionally distinct volcanic fields (West and East Eifel, Germany), the shapes of a large number of ash particle contours obtained from SEM images were analyzed by a dilation-based fractal method. Volcanic particle contours are pseudo-fractals showing mostly two distinct slopes in Richardson plots, related to the fractal dimensions D1 (small-scale "textural" dimension) and D2 (large-scale "morphological" dimension). The validity of the data obtained from 2D sections was tested by analysing SEM micro-CT slices of one particle cut in different orientations and positions. Results for West Eifel maar particles yield large D1 values (> 1.023), resembling typical values of magmatic particles, which are characterized by a complex shape, especially at small scales. In contrast, the D1 values of ash particles from one East Eifel maar deposit are much smaller, coinciding with the fractal dimensions obtained from phreatomagmatic end-member particles. These quantitative morphological analyses suggest that the studied maar eruptions were triggered by two different fragmentation processes: phreatomagmatic in the East Eifel and magmatic in the West Eifel. The application of fractal analysis to quantitatively characterize the shape of pyroclasts, and the linking of fractal dimensions to specific fragmentation processes, has turned out to be a very promising tool for studying the fragmentation history of any volcanic eruption. The next step is to extend the morphological analysis of volcanic particles to three dimensions. SEM micro-CT, already applied in this study, offers the required resolution but is not suitable for the analysis of a large number of particles. Newly released nano-CT scanners, however, allow the simultaneous analysis of a statistically relevant number of particles (in the hundreds range). Preliminary results of a first trial will be presented.
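
    For readers unfamiliar with the dilation-based method, the idea fits in a few lines: dilate the binary contour with disks of growing radius r, estimate the boundary length as the dilated area divided by 2r, and read the pseudo-fractal dimensions D1 and D2 from the two slopes of the resulting Richardson plot. A schematic implementation; the radii, break point, and demo contour are arbitrary choices, not the authors' settings:

    ```python
    import numpy as np
    from scipy import ndimage

    def richardson(contour: np.ndarray, radii=range(1, 33)):
        """Minkowski-sausage length estimates L(r) for a binary contour image."""
        lengths = []
        for r in radii:
            disk = np.hypot(*np.mgrid[-r:r + 1, -r:r + 1]) <= r  # disk element
            area = ndimage.binary_dilation(contour, structure=disk).sum()
            lengths.append(area / (2.0 * r))                     # area -> length
        return np.log(list(radii)), np.log(lengths)

    def fractal_dimension(log_r, log_l):
        """D = 1 - slope of the Richardson plot over the chosen scale range."""
        return 1.0 - np.polyfit(log_r, log_l, 1)[0]

    # Fit small and large scales separately to obtain the 'textural' D1 and
    # the 'morphological' D2; the break point is chosen by inspection.
    yy, xx = np.mgrid[:256, :256]
    ring = np.abs(np.hypot(yy - 128, xx - 128) - 80) < 1.5       # smooth circle
    log_r, log_l = richardson(ring)
    print("D (all scales, should be near 1):",
          round(fractal_dimension(log_r, log_l), 3))
    ```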

  14. Doing Disability Research in a Southern Context: Challenges and Possibilities

    ERIC Educational Resources Information Center

    Singal, Nidhi

    2010-01-01

    Research on disability issues in countries of the South is primarily dominated by a focus on generating large scale quantitative data sets. This paper discusses the many challenges, opportunities and dilemmas faced in designing and undertaking a qualitative research study in one district in India. The Disability, Education and Poverty Project…

  15. Intra- and Inter-Individual Variation in Self-Reported Code-Switching Patterns of Adult Multilinguals

    ERIC Educational Resources Information Center

    Dewaele, Jean-Marc; Li, Wei

    2014-01-01

    The present study is a large-scale quantitative analysis of intra-individual variation (linked to type of interlocutor) and inter-individual variation (linked to multilingualism, sociobiographical variables and three personality traits) in self-reported frequency of code-switching (CS) among 2116 multilinguals. We found a significant effect of…

  16. "I Was Just so Different": The Experiences of Women Diagnosed with an Autism Spectrum Disorder in Adulthood in Relation to Gender and Social Relationships

    ERIC Educational Resources Information Center

    Kanfiszer, Lucie; Davies, Fran; Collins, Suzanne

    2017-01-01

    Existing literature exploring autism spectrum disorders within female populations predominantly utilises quantitative methodology. A limited number of small-scale, qualitative studies have explored the experiences of adolescent girls with autism spectrum disorder, but adult women have remained largely unheard. This study aims to broaden the…

  17. Establishing the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS): Operationalizing Community-based Research in a Large National Quantitative Study.

    PubMed

    Loutfy, Mona; Greene, Saara; Kennedy, V Logan; Lewis, Johanna; Thomas-Pavanel, Jamie; Conway, Tracey; de Pokomandy, Alexandra; O'Brien, Nadia; Carter, Allison; Tharao, Wangari; Nicholson, Valerie; Beaver, Kerrigan; Dubuc, Danièle; Gahagan, Jacqueline; Proulx-Boucher, Karène; Hogg, Robert S; Kaida, Angela

    2016-08-19

    Community-based research has gained increasing recognition in health research over the last two decades. Such participatory research approaches are lauded for their ability to anchor research in lived experiences, ensure cultural appropriateness, access local knowledge, reach marginalized communities, build capacity, and facilitate research-to-action. Despite these positive attributes, the community-based health research literature is predominantly composed of small projects, using qualitative methods, and set within geographically limited communities; its use in larger health studies, including clinical trials and cohorts, is limited. We present the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS), a large-scale, multi-site, national, longitudinal quantitative study that has operationalized community-based research in all steps of the research process. Successes, challenges and further considerations are offered. Through the integration of community-based research principles, we have been successful in: facilitating a two-year formative phase for this study; developing a novel survey instrument with national involvement; training 39 Peer Research Associates (PRAs); offering ongoing comprehensive support to PRAs; and engaging in an ongoing iterative community-based research process. Our community-based research approach within CHIWOS demanded that we be cognizant of the challenges of managing a large national team; inherent power imbalances and challenges with communication; compensation and volunteering considerations; and extensive delays in institutional processes. It is important to consider the iterative nature of community-based research and to work through tensions that emerge given the diverse perspectives of numerous team members. Community-based research, as an approach to large-scale quantitative health research projects, is an increasingly viable methodological option. Community-based research has several advantages that go hand-in-hand with its obstacles. We offer guidance on implementing this approach so that the process can be better planned and result in success.

  18. Three Advantages of Cross-National Comparative Ethnography--Methodological Reflections from a Study of Migrants and Minority Ethnic Youth in English and Spanish Schools

    ERIC Educational Resources Information Center

    Jørgensen, Clara Rübner

    2015-01-01

    This paper discusses the strengths of using ethnographic research methods in cross-national comparative research. It focuses particularly on the potential of applying such methods to the study of migrants and minority ethnic youth in education, where large-scale quantitative studies or single-sited ethnographies are currently dominant. By linking…

  19. A quantitative approach to the topology of large-scale structure. [for galactic clustering computation

    NASA Technical Reports Server (NTRS)

    Gott, J. Richard, III; Weinberg, David H.; Melott, Adrian L.

    1987-01-01

    A quantitative measure of the topology of large-scale structure, the genus of density contours in a smoothed density distribution, is described and applied. For random phase (Gaussian) density fields, the mean genus per unit volume exhibits a universal dependence on threshold density, with a normalizing factor that can be calculated from the power spectrum. If large-scale structure formed from the gravitational instability of small-amplitude density fluctuations, the topology observed today on suitable scales should follow the topology in the initial conditions. The technique is illustrated by applying it to simulations of galaxy clustering in a flat universe dominated by cold dark matter. The technique is also applied to a volume-limited sample of the CfA redshift survey and to a model in which galaxies reside on the surfaces of polyhedral 'bubbles'. The topology of the evolved mass distribution and 'biased' galaxy distribution in the cold dark matter models closely matches the topology of the density fluctuations in the initial conditions. The topology of the observational sample is consistent with the random phase, cold dark matter model.
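
    The genus statistic can be sketched numerically: smooth a Gaussian random field, threshold it at varying density levels, and convert the Euler characteristic of each excursion set into a genus. The sketch below assumes the convention genus = -chi for the solid high-density region (an assumption about conventions; scikit-image is assumed available, and all parameters are illustrative):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter
    from skimage.measure import euler_number

    rng = np.random.default_rng(1)
    # Smoothed Gaussian (random phase) field, normalized to unit variance.
    field = gaussian_filter(rng.normal(size=(64, 64, 64)), sigma=4)
    field = (field - field.mean()) / field.std()

    # Genus curve: threshold nu in units of sigma; G = -chi of the excursion
    # set, so isolated blobs give negative genus and sponges positive genus.
    for nu in np.linspace(-2, 2, 9):
        excursion = field >= nu
        genus = -euler_number(excursion, connectivity=3)
        print(f"nu = {nu:+.1f}  genus = {genus:+d}")

    # For a random-phase field the curve traces the universal
    # (1 - nu**2) * exp(-nu**2 / 2) shape described above.
    ```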

  20. Large scale anomalies in the microwave background: causation and correlation.

    PubMed

    Aslanyan, Grigor; Easther, Richard

    2013-12-27

    Most treatments of large scale anomalies in the microwave sky are a posteriori, with unquantified look-elsewhere effects. We contrast these with physical models of specific inhomogeneities in the early Universe which can generate these apparent anomalies. Physical models predict correlations between candidate anomalies and the corresponding signals in polarization and large scale structure, reducing the impact of cosmic variance. We compute the apparent spatial curvature associated with large-scale inhomogeneities and show that it is typically small, allowing for a self-consistent analysis. As an illustrative example we show that a single large plane wave inhomogeneity can contribute to low-l mode alignment and odd-even asymmetry in the power spectra and the best-fit model accounts for a significant part of the claimed odd-even asymmetry. We argue that this approach can be generalized to provide a more quantitative assessment of potential large scale anomalies in the Universe.

  1. Quantitative Resistance: More Than Just Perception of a Pathogen

    PubMed Central

    2017-01-01

    Molecular plant pathology has focused on studying large-effect qualitative resistance loci that predominantly function in detecting pathogens and/or transmitting signals resulting from pathogen detection. By contrast, less is known about quantitative resistance loci, particularly the molecular mechanisms controlling variation in quantitative resistance. Recent studies have provided insight into these mechanisms, showing that genetic variation at hundreds of causal genes may underpin quantitative resistance. Loci controlling quantitative resistance contain some of the same causal genes that mediate qualitative resistance, but the predominant mechanisms of quantitative resistance extend beyond pathogen recognition. Indeed, most causal genes for quantitative resistance encode specific defense-related outputs such as strengthening of the cell wall or defense compound biosynthesis. Extending previous work on qualitative resistance to focus on the mechanisms of quantitative resistance, such as the link between perception of microbe-associated molecular patterns and growth, has shown that the mechanisms underlying these defense outputs are also highly polygenic. Studies that include genetic variation in the pathogen have begun to highlight a potential need to rethink how the field considers broad-spectrum resistance and how it is affected by genetic variation within pathogen species and between pathogen species. These studies are broadening our understanding of quantitative resistance and highlighting the potentially vast scale of the genetic basis of quantitative resistance. PMID:28302676

  2. Comparisons of ionospheric electron density distributions reconstructed by GPS computerized tomography, backscatter ionograms, and vertical ionograms

    NASA Astrophysics Data System (ADS)

    Zhou, Chen; Lei, Yong; Li, Bofeng; An, Jiachun; Zhu, Peng; Jiang, Chunhua; Zhao, Zhengyu; Zhang, Yuannong; Ni, Binbin; Wang, Zemin; Zhou, Xuhua

    2015-12-01

    Global Positioning System (GPS) computerized ionosphere tomography (CIT) and ionospheric sky wave ground backscatter radar are both capable of measuring the large-scale, two-dimensional (2-D) distributions of ionospheric electron density (IED). Here we report the spatial and temporal electron density results obtained by GPS CIT and backscatter ionogram (BSI) inversion for three individual experiments. Both the GPS CIT and BSI inversion techniques demonstrate the capability and the consistency of reconstructing large-scale IED distributions. To validate the results, electron density profiles obtained from GPS CIT and BSI inversion are quantitatively compared to vertical ionosonde data, which clearly demonstrates that both methods provide accurate information on ionospheric electron density and thereby offer reliable approaches to ionospheric sounding. Our study can improve current understanding of the capabilities and limitations of these two methods for large-scale IED reconstruction.

  3. Research-Based Recommendations for the Use of Accommodations in Large-Scale Assessments: 2012 Update. Practical Guidelines for the Education of English Language Learners. Book 4

    ERIC Educational Resources Information Center

    Kieffer, Michael J.; Rivera, Mabel; Francis, David J.

    2012-01-01

    This report presents results from a new quantitative synthesis of research on the effectiveness and validity of test accommodations for English language learners (ELLs) taking large-scale assessments. In 2006, the Center on Instruction published a review of the literature on test accommodations for ELLs titled "Practical Guidelines for the…

  4. Emergence of Multiscaling in a Random-Force Stirred Fluid

    NASA Astrophysics Data System (ADS)

    Yakhot, Victor; Donzis, Diego

    2017-07-01

    We consider the transition to strong turbulence in an infinite fluid stirred by a Gaussian random force. The transition is defined as a first appearance of anomalous scaling of normalized moments of velocity derivatives (dissipation rates) emerging from the low-Reynolds-number Gaussian background. It is shown that, due to multiscaling, strongly intermittent rare events can be quantitatively described in terms of an infinite number of different "Reynolds numbers" reflecting a multitude of anomalous scaling exponents. The theoretically predicted transition disappears at Rλ≤3 . The developed theory is in quantitative agreement with the outcome of large-scale numerical simulations.

  5. Conducting pilot and feasibility studies.

    PubMed

    Cope, Diane G

    2015-03-01

    Planning a well-designed research study can be tedious and laborious work. However, this process is critical and ultimately can produce valid, reliable study findings. Designing a large-scale randomized, controlled trial (RCT), the gold standard in quantitative research, can be even more challenging. Even the most well-planned study can potentially result in issues with research procedures and design, such as recruitment, retention, or methodology. One strategy that may facilitate sound study design is the completion of a pilot or feasibility study prior to the initiation of a larger-scale trial. This article will discuss pilot and feasibility studies, their advantages and disadvantages, and implications for oncology nursing research.

  6. Response of deep and shallow tropical maritime cumuli to large-scale processes

    NASA Technical Reports Server (NTRS)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the radiative cooling is compensated by the heating due to large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  7. Impact of spatially correlated pore-scale heterogeneity on drying porous media

    NASA Astrophysics Data System (ADS)

    Borgman, Oshri; Fantinel, Paolo; Lühder, Wieland; Goehring, Lucas; Holtzman, Ran

    2017-07-01

    We study the effect of spatially correlated heterogeneity on the isothermal drying of porous media. We combine a minimal pore-scale model with microfluidic experiments sharing the same pore geometry. Our simulated drying behavior compares favorably with experiments, considering the large sensitivity of the emergent behavior to the uncertainty associated with even small manufacturing errors. We show that increasing the correlation length in particle sizes promotes preferential drying of clusters of large pores, prolonging liquid connectivity and surface wetness and thus sustaining higher drying rates for longer periods. Our findings improve our quantitative understanding of how pore-scale heterogeneity impacts drying, which plays a role in a wide range of processes, from fuel cells to the curing of paints and cements to global budgets of energy, water, and solutes in soils.

  8. An Exploratory Analysis of the Longitudinal Impact of Principal Change on Elementary School Achievement

    ERIC Educational Resources Information Center

    Hochbein, Craig; Cunningham, Brittany C.

    2013-01-01

    Recent reform initiatives, such as the Title I School Improvement Grants and Race to the Top, recommended a principal change to jump-start school turnaround. Yet, few educational researchers have examined principal change as way to improve schools in a state of systematic reform; furthermore, no large-scale quantitative study has determined the…

  9. Implementation of School Choice Policy: Interpretation and Response by Parents of Students with Special Educational Needs.

    ERIC Educational Resources Information Center

    Bagley, Carl; Woods, Philip A.; Woods, Glenys

    2001-01-01

    Provides empirically based insights into preferences, perceptions, and responses of parents of students with special education needs to the 1990s restructured school system in England. Uses analyses of quantitative/qualitative data generated by a large-scale research study on school choice. Reveals depth and range of problems encountered by these…

  10. Evaluating Comprehensive School Reform Models at Scale: Focus on Implementation

    ERIC Educational Resources Information Center

    Vernez, Georges; Karam, Rita; Mariano, Louis T.; DeMartini, Christine

    2006-01-01

    This study was designed to fill the "implementation measurement" gap. A methodology to quantitatively measure the level of Comprehensive School Reform (CSR) implementation that can be used across a variety of CSR models was developed, and then applied to measure actual implementation of four different CSR models in a large number of schools. The…

  11. Void probability as a function of the void's shape and scale-invariant models. [in studies of spacial galactic distribution

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1992-01-01

    The dependence of counts in cells on the shape of the cell for the large-scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is higher for some elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.
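
    The shape dependence of counts in cells can be probed directly: drop many randomly placed cells of fixed volume but varying axis ratio into a point catalog and record the fraction that are empty. A schematic Monte Carlo version on a synthetic clustered catalog (all parameters illustrative, not from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    # Toy clustered 'galaxy' catalog in the unit box: clumps around seeds.
    seeds = rng.random((50, 3))
    galaxies = (seeds[rng.integers(0, 50, 2000)]
                + rng.normal(0, 0.02, (2000, 3))) % 1.0

    def void_probability(points, cell_shape, trials=5000):
        """P0: fraction of randomly placed boxes with sides (a, b, c) empty."""
        a = np.asarray(cell_shape)
        corners = rng.random((trials, 3)) * (1.0 - a)  # cells inside the box
        empty = np.ones(trials, dtype=bool)
        for p in points:                  # mark cells containing any point
            inside = np.all((p >= corners) & (p < corners + a), axis=1)
            empty &= ~inside
        return empty.mean()

    vol = 1e-3                            # equal volumes, different shapes
    cube = [vol ** (1 / 3)] * 3
    rod = [0.4, (vol / 0.4) ** 0.5, (vol / 0.4) ** 0.5]
    print("P0 cubic cell:    ", void_probability(galaxies, cube))
    print("P0 elongated cell:", void_probability(galaxies, rod))
    ```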

  12. MsViz: A Graphical Software Tool for In-Depth Manual Validation and Quantitation of Post-translational Modifications.

    PubMed

    Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo

    2017-08-04

    Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.

  13. Solar Wind Turbulent Cascade from MHD to Sub-ion Scales: Large-size 3D Hybrid Particle-in-cell Simulations

    NASA Astrophysics Data System (ADS)

    Franci, Luca; Landi, Simone; Verdini, Andrea; Matteini, Lorenzo; Hellinger, Petr

    2018-01-01

    Properties of the turbulent cascade from fluid to kinetic scales in collisionless plasmas are investigated by means of large-size 3D hybrid (fluid electrons, kinetic protons) particle-in-cell simulations. Initially isotropic Alfvénic fluctuations rapidly develop a strongly anisotropic turbulent cascade, mainly in the direction perpendicular to the ambient magnetic field. The omnidirectional magnetic field spectrum shows a double power-law behavior over almost two decades in wavenumber, with a Kolmogorov-like index at large scales, a spectral break around ion scales, and a steepening at sub-ion scales. Power laws are also observed in the spectra of the ion bulk velocity, density, and electric field, at both magnetohydrodynamic (MHD) and kinetic scales. Despite the complex structure, the omnidirectional spectra of all fields at ion and sub-ion scales are in remarkable quantitative agreement with those of a 2D simulation with similar physical parameters. This provides a partial, a posteriori validation of the 2D approximation at kinetic scales. Conversely, at MHD scales, the spectra of the density and of the velocity (and, consequently, of the electric field) exhibit differences between the 2D and 3D cases. Although they can be partly ascribed to the lower spatial resolution, the main reason is likely the larger importance of compressible effects in the full 3D geometry. Our findings are also in remarkable quantitative agreement with solar wind observations.
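
    The omnidirectional spectra referred to above come from binning 3D Fourier power into spherical shells of |k|. A generic sketch of that reduction (not the authors' code; a cubic periodic box is assumed):

    ```python
    import numpy as np

    def omnidirectional_spectrum(field):
        """Shell-averaged power spectrum P(|k|) of a periodic 3D scalar field."""
        n = field.shape[0]                       # assumes a cubic box
        power = np.abs(np.fft.fftn(field)) ** 2 / field.size
        k1d = np.fft.fftfreq(n) * n              # integer wavenumbers per axis
        kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
        kmag = np.rint(np.sqrt(kx**2 + ky**2 + kz**2)).astype(int).ravel()
        keep = kmag <= n // 2                    # shells up to Nyquist
        spec = np.bincount(kmag[keep], weights=power.ravel()[keep])
        return np.arange(spec.size), spec

    # Example: a white-noise field should give a roughly flat spectrum.
    k, p = omnidirectional_spectrum(
        np.random.default_rng(2).normal(size=(32, 32, 32)))
    ```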

  14. Quantitative study of the violation of k⊥ factorization in hadroproduction of quarks at collider energies.

    PubMed

    Fujii, Hirotsugu; Gelis, François; Venugopalan, Raju

    2005-10-14

    We demonstrate the violation of k⊥ factorization for quark production in high energy hadronic collisions. This violation is quantified in the color glass condensate framework and studied as a function of the quark mass, the quark transverse momentum, and the saturation scale Q_s, which is a measure of large parton densities. At x values where parton densities are large but leading twist shadowing effects are still small, violations of k⊥ factorization can be significant--especially for lighter quarks. At very small x, where leading twist shadowing is large, we show that violations of k⊥ factorization are relatively weaker.

  15. The Untapped Promise of Secondary Data Sets in International and Comparative Education Policy Research

    ERIC Educational Resources Information Center

    Chudgar, Amita; Luschei, Thomas F.

    2016-01-01

    The objective of this commentary is to call attention to the feasibility and importance of large-scale, systematic, quantitative analysis in international and comparative education research. We contend that although many existing databases are under- or unutilized in quantitative international-comparative research, these resources present the…

  16. Confirmatory Factor Analytic Structure and Measurement Invariance of Quantitative Autistic Traits Measured by the Social Responsiveness Scale-2

    ERIC Educational Resources Information Center

    Frazier, Thomas W.; Ratliff, Kristin R.; Gruber, Chris; Zhang, Yi; Law, Paul A.; Constantino, John N.

    2014-01-01

    Understanding the factor structure of autistic symptomatology is critical to the discovery and interpretation of causal mechanisms in autism spectrum disorder. We applied confirmatory factor analysis and assessment of measurement invariance to a large ("N" = 9635) accumulated collection of reports on quantitative autistic traits using…

  17. Segmentation and Quantitative Analysis of Epithelial Tissues.

    PubMed

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis have largely prevented tissue-scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  18. A leap forward in geographic scale for forest ectomycorrhizal fungi

    Treesearch

    Filipa Cox; Nadia Barsoum; Martin I. Bidartondo; Isabella Børja; Erik Lilleskov; Lars O. Nilsson; Pasi Rautio; Kath Tubby; Lars Vesterdal

    2010-01-01

    In this letter we propose a first large-scale assessment of mycorrhizas with a European-wide network of intensively monitored forest plots as a research platform. This effort would create a qualitative and quantitative shift in mycorrhizal research by delivering the first continental-scale map of mycorrhizal fungi. Readers may note that several excellent detailed...

  19. Large-scale quantitative analysis of painting arts.

    PubMed

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.
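
    A rough sketch of two of the three measures, under assumed definitions (quantized-color entropy for variety, and a Hurst-like exponent from brightness increments for roughness) that may differ in detail from the paper's:

    ```python
    import numpy as np

    def color_variety(rgb: np.ndarray, levels: int = 16) -> float:
        """Shannon entropy of a quantized color histogram (higher = more variety)."""
        q = (rgb.astype(int) // (256 // levels)).reshape(-1, 3)
        codes = q[:, 0] * levels**2 + q[:, 1] * levels + q[:, 2]
        p = np.bincount(codes, minlength=levels**3).astype(float)
        p = p[p > 0] / p.sum()
        return float(-(p * np.log2(p)).sum())

    def roughness_exponent(brightness: np.ndarray, max_lag: int = 32) -> float:
        """Scaling exponent of height-height differences of the brightness field."""
        lags = np.arange(1, max_lag)
        w = [np.sqrt(np.mean((brightness[:, lag:] - brightness[:, :-lag]) ** 2))
             for lag in lags]
        return float(np.polyfit(np.log(lags), np.log(w), 1)[0])

    # Demo on a random 'canvas'; real use would load a painting's RGB array.
    img = np.random.default_rng(7).integers(0, 256, (200, 300, 3))
    print(round(color_variety(img), 2),
          round(roughness_exponent(img.mean(axis=2)), 2))
    ```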

  20. Quantitation of 87 Proteins by nLC-MRM/MS in Human Plasma: Workflow for Large-Scale Analysis of Biobank Samples.

    PubMed

    Rezeli, Melinda; Sjödin, Karin; Lindberg, Henrik; Gidlöf, Olof; Lindahl, Bertil; Jernberg, Tomas; Spaak, Jonas; Erlinge, David; Marko-Varga, György

    2017-09-01

    A multiple reaction monitoring (MRM) assay was developed for precise quantitation of 87 plasma proteins, including the three isoforms of apolipoprotein E (APOE) associated with cardiovascular diseases, using nanoscale liquid chromatography separation and a stable isotope dilution strategy. The analytical performance of the assay was evaluated, and we found an average technical variation of 4.7% over a dynamic range of 4-5 orders of magnitude (≈0.2 mg/L to 4.5 g/L) from whole plasma digest. Here, we report a complete workflow, including sample processing adapted to a 96-well plate format and a normalization strategy for large-scale studies. To further investigate the MS-based quantitation, the amounts of six selected proteins were also measured by routinely used clinical chemistry assays, and the two methods showed excellent correlation with high significance (p-value < 10e-5) for the six proteins, as well as for the cardiovascular predictor, the APOB:APOA1 ratio (r = 0.969, p-value < 10e-5). Moreover, we utilized the developed assay for screening of biobank samples from patients with myocardial infarction and performed a comparative analysis of patient groups with STEMI (ST-segment elevation myocardial infarction), NSTEMI (non-ST-segment elevation myocardial infarction) and type-2 AMI (type-2 myocardial infarction).
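
    The quantitation principle behind the stable isotope dilution strategy reduces to a ratio: the endogenous ('light') peptide concentration is the light-to-heavy peak-area ratio times the known amount of spiked, isotope-labelled standard. A toy sketch of that calculation with a simple plate-median normalization (column names and numbers are invented):

    ```python
    import pandas as pd

    # Invented example data: MRM peak areas per sample for one peptide.
    runs = pd.DataFrame({
        "sample": ["s1", "s2", "s3"],
        "area_light": [1.8e6, 2.4e6, 9.0e5],   # endogenous peptide
        "area_heavy": [1.0e6, 1.1e6, 9.5e5],   # spiked labelled standard
    })
    SPIKE_NMOL_PER_L = 50.0                    # known heavy-standard amount

    # Stable isotope dilution: concentration scales with the light/heavy ratio.
    runs["conc_nmol_per_l"] = (runs["area_light"] / runs["area_heavy"]
                               * SPIKE_NMOL_PER_L)

    # Simple batch-level normalization for large studies: plate-median scaling.
    runs["conc_normalized"] = (runs["conc_nmol_per_l"]
                               / runs["conc_nmol_per_l"].median())
    print(runs)
    ```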

  1. Investigation on the Size Effect in Large-Scale Beta-Processed Ti-17 Disks Based on Quantitative Metallography

    NASA Astrophysics Data System (ADS)

    Zhang, Saifei; Zeng, Weidong; Gao, Xiongxiong; Zhao, Xingdong; Li, Siqing

    2017-10-01

    The present study investigates the mechanical properties of large-scale beta-processed Ti-17 forgings, motivated by the increasing interest in the beta thermal-mechanical processing method for fabricating compressor disks or blisks in aero-engines, owing to its advantage in damage-tolerance performance. Three Ti-17 disks with weights of 57, 250 and 400 kg were first prepared by beta processing techniques for a comparative study. The results reveal a significant 'size effect' in beta-processed Ti-17 disks, i.e., a dependence of the high cycle fatigue, tensile properties and fracture toughness on disk size (or weight). With increasing disk weight from 57 to 400 kg, the fatigue limit (fatigue strength at 10^7 cycles, R = -1) was reduced from 583 to 495 MPa and the tensile yield strength dropped from 1073 to 1030 MPa, while the fracture toughness (K_IC) rose from 70.9 to 95.5 MPa·m^(1/2). Quantitative metallography analysis shows that the 'size effect' on mechanical properties can be attributed to evident differences between the microstructures of the three disk forgings. With increasing disk size, nearly all microstructural components in the basket-weave microstructure, including the prior β grains, the α layers at β grain boundaries (GB-α) and the α lamellae in the interior of the grains, are coarsened to different degrees. Further, the microstructural difference between the beta-processed disks is shown to be the consequence of the longer pre-forging soaking times and lower post-forging cooling rates of large disks compared with small ones. Finally, suggestions are made from the perspective of microstructural control on how to improve the mechanical properties of large-scale beta-processed Ti-17 forgings.

  2. Quantitative Resistance: More Than Just Perception of a Pathogen.

    PubMed

    Corwin, Jason A; Kliebenstein, Daniel J

    2017-04-01

    Molecular plant pathology has focused on studying large-effect qualitative resistance loci that predominantly function in detecting pathogens and/or transmitting signals resulting from pathogen detection. By contrast, less is known about quantitative resistance loci, particularly the molecular mechanisms controlling variation in quantitative resistance. Recent studies have provided insight into these mechanisms, showing that genetic variation at hundreds of causal genes may underpin quantitative resistance. Loci controlling quantitative resistance contain some of the same causal genes that mediate qualitative resistance, but the predominant mechanisms of quantitative resistance extend beyond pathogen recognition. Indeed, most causal genes for quantitative resistance encode specific defense-related outputs such as strengthening of the cell wall or defense compound biosynthesis. Extending previous work on qualitative resistance to focus on the mechanisms of quantitative resistance, such as the link between perception of microbe-associated molecular patterns and growth, has shown that the mechanisms underlying these defense outputs are also highly polygenic. Studies that include genetic variation in the pathogen have begun to highlight a potential need to rethink how the field considers broad-spectrum resistance and how it is affected by genetic variation within pathogen species and between pathogen species. These studies are broadening our understanding of quantitative resistance and highlighting the potentially vast scale of the genetic basis of quantitative resistance. © 2017 American Society of Plant Biologists. All rights reserved.

  3. Large perceptual distortions of locomotor action space occur in ground-based coordinates: Angular expansion and the large-scale horizontal-vertical illusion.

    PubMed

    Klein, Brennan J; Li, Zhi; Durgin, Frank H

    2016-04-01

    What is the natural reference frame for seeing large-scale spatial scenes in locomotor action space? Prior studies indicate an asymmetric angular expansion in perceived direction in large-scale environments: Angular elevation relative to the horizon is perceptually exaggerated by a factor of 1.5, whereas azimuthal direction is exaggerated by a factor of about 1.25. Here participants made angular and spatial judgments when upright or on their sides to dissociate egocentric from allocentric reference frames. In Experiment 1, it was found that body orientation did not affect the magnitude of the up-down exaggeration of direction, suggesting that the relevant orientation reference frame for this directional bias is allocentric rather than egocentric. In Experiment 2, the comparison of large-scale horizontal and vertical extents was somewhat affected by viewer orientation, but only to the extent necessitated by the classic (5%) horizontal-vertical illusion (HVI) that is known to be retinotopic. Large-scale vertical extents continued to appear much larger than horizontal ground extents when observers lay sideways. When the visual world was reoriented in Experiment 3, the bias remained tied to the ground-based allocentric reference frame. The allocentric HVI is quantitatively consistent with differential angular exaggerations previously measured for elevation and azimuth in locomotor space. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  4. Large perceptual distortions of locomotor action space occur in ground-based coordinates: Angular expansion and the large-scale horizontal-vertical illusion

    PubMed Central

    Klein, Brennan J.; Li, Zhi; Durgin, Frank H.

    2015-01-01

    What is the natural reference frame for seeing large-scale spatial scenes in locomotor action space? Prior studies indicate an asymmetric angular expansion in perceived direction in large-scale environments: Angular elevation relative to the horizon is perceptually exaggerated by a factor of 1.5, whereas azimuthal direction is exaggerated by a factor of about 1.25. Here participants made angular and spatial judgments when upright or on their sides in order to dissociate egocentric from allocentric reference frames. In Experiment 1 it was found that body orientation did not affect the magnitude of the up-down exaggeration of direction, suggesting that the relevant orientation reference frame for this directional bias is allocentric rather than egocentric. In Experiment 2, the comparison of large-scale horizontal and vertical extents was somewhat affected by viewer orientation, but only to the extent necessitated by the classic (5%) horizontal-vertical illusion (HVI) that is known to be retinotopic. Large-scale vertical extents continued to appear much larger than horizontal ground extents when observers lay sideways. When the visual world was reoriented in Experiment 3, the bias remained tied to the ground-based allocentric reference frame. The allocentric HVI is quantitatively consistent with differential angular exaggerations previously measured for elevation and azimuth in locomotor space. PMID:26594884

  5. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    PubMed

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  6. Collection of quantitative chemical release field data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demirgian, J.; Macha, S.; Loyola Univ.

    1999-01-01

    Detection and quantitation of chemicals in the environment requires Fourier-transform infrared (FTIR) instruments that are properly calibrated and tested. This calibration and testing requires field testing using matrices that are representative of actual instrument use conditions. Three methods commonly used for developing calibration files and training sets in the field are a closed optical cell or chamber, a large-scale chemical release, and a small-scale chemical release. There is no best method. The advantages and limitations of each method should be considered in evaluating field results. Proper calibration characterizes the sensitivity of an instrument, its ability to detect a component in different matrices, and the quantitative accuracy and precision of the results.

  7. Void probability as a function of the void's shape and scale-invariant models

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1991-01-01

    The dependence of counts in cells on the shape of the cell for the large-scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is higher for some elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.

  8. Understanding Loan Aversion in Education: Evidence from High School Seniors, Community College Students, and Adults. CEPA Working Paper No. 16-15

    ERIC Educational Resources Information Center

    Boatman, Angela; Evans, Brent; Soliz, Adela

    2016-01-01

    Student loans are a crucial aspect of financing a college education for millions of Americans, yet we have surprisingly little empirical evidence concerning individuals' unwillingness to borrow money for educational purposes. This study provides the first large-scale quantitative evidence of levels of loan aversion in the United States. Using…

  9. The role of the airline transportation network in the prediction and predictability of global epidemics.

    PubMed

    Colizza, Vittoria; Barrat, Alain; Barthélemy, Marc; Vespignani, Alessandro

    2006-02-14

    The systematic study of large-scale networks has unveiled the ubiquitous presence of connectivity patterns characterized by large-scale heterogeneities and unbounded statistical fluctuations. These features affect dramatically the behavior of the diffusion processes occurring on networks, determining the ensuing statistical properties of their evolution pattern and dynamics. In this article, we present a stochastic computational framework for the forecast of global epidemics that considers the complete worldwide air travel infrastructure complemented with census population data. We address two basic issues in global epidemic modeling: (i) we study the role of the large scale properties of the airline transportation network in determining the global diffusion pattern of emerging diseases; and (ii) we evaluate the reliability of forecasts and outbreak scenarios with respect to the intrinsic stochasticity of disease transmission and traffic flows. To address these issues we define a set of quantitative measures able to characterize the level of heterogeneity and predictability of the epidemic pattern. These measures may be used for the analysis of containment policies and epidemic risk assessment.
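
    The framework described couples local disease dynamics with stochastic travel along the weighted airline network. A minimal sketch of one update step of such a metapopulation model, with a synthetic travel matrix and invented rates (not the authors' parameters; Poisson travel draws assume rates small enough that overdraws are negligible):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 5                                        # number of cities
    pop = rng.integers(100_000, 5_000_000, n).astype(float)
    travel = rng.uniform(0, 2e-4, (n, n))        # daily per-capita travel rates
    np.fill_diagonal(travel, 0.0)

    S, I, R = pop - 10, np.full(n, 10.0), np.zeros(n)
    beta, gamma = 0.3, 0.1                       # transmission, recovery rates

    for day in range(100):
        # Local stochastic SIR step (binomial chain).
        p_inf = 1.0 - np.exp(-beta * I / pop)
        new_inf = rng.binomial(S.astype(int), p_inf)
        new_rec = rng.binomial(I.astype(int), 1.0 - np.exp(-gamma))
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        # Stochastic travel of infectious individuals along the network.
        moves = rng.poisson(travel * I[:, None])  # infecteds moving i -> j
        I = I - moves.sum(axis=1) + moves.sum(axis=0)

    print("final attack rates:", (R / pop).round(3))
    ```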

  10. Commentary: The Observed Association between Autistic Severity Measured by the Social Responsiveness Scale (SRS) and General Psychopathology--A Response to Hus et al.

    ERIC Educational Resources Information Center

    Constantino, John N.; Frazier, Thomas W.

    2013-01-01

    In their analysis of the accumulated data from the clinically ascertained Simons Simplex Collection (SSC), Hus et al. (2013) provide a large-scale clinical replication of previously reported associations (see Constantino, Hudziak & Todd, 2003) between quantitative autistic traits [as measured by the Social Responsiveness Scale (SRS)] and…

  11. Asynchronous adaptive time step in quantitative cellular automata modeling

    PubMed Central

    Zhu, Hao; Pang, Peter YH; Sun, Yan; Dhar, Pawan

    2004-01-01

    Background: The behaviors of cells in metazoans are context dependent, thus large-scale multi-cellular modeling is often necessary, for which cellular automata are natural candidates. Two related issues are involved in cellular automata based multi-cellular modeling: how to introduce differential equation based quantitative computing to precisely describe cellular activity, and, upon it, how to solve the heavy time consumption issue in simulation. Results: Based on a modified, language-based cellular automata system that we extended to allow ordinary differential equations in models, we introduce a method implementing asynchronous adaptive time steps in simulation that can considerably improve efficiency without a significant sacrifice of accuracy. An average speedup rate of 4–5 is achieved in the given example. Conclusions: Strategies for reducing time consumption in simulation are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. Distributed and adaptive time stepping is a practical solution in a cellular automata environment. PMID:15222901
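
    The core of the speedup is that each cell integrates its own ODEs with a locally chosen step, so quiescent cells advance in large jumps while active cells take small ones. A schematic version of the idea using step-doubling error control (an illustration of asynchronous adaptive stepping, not the authors' implementation):

    ```python
    import heapq

    def integrate_cells(cells, rhs, t_end, tol=1e-4):
        """Asynchronously advance cells, each with its own adaptive Euler step.

        cells: dict cell_id -> state (float); rhs(cell_id, state) -> derivative.
        """
        queue = [(0.0, cid) for cid in cells]   # (next event time, cell)
        heapq.heapify(queue)
        dt = {cid: 1e-2 for cid in cells}
        while queue:
            t, cid = heapq.heappop(queue)
            if t >= t_end:
                continue
            y, h = cells[cid], dt[cid]
            full = y + h * rhs(cid, y)                       # one full step
            half = y + 0.5 * h * rhs(cid, y)
            two_half = half + 0.5 * h * rhs(cid, half)       # two half steps
            err = abs(two_half - full)                       # local error estimate
            if err > tol:
                dt[cid] = h / 2                 # active cell: refine locally
                heapq.heappush(queue, (t, cid))
            else:
                cells[cid] = two_half
                if err < tol / 4:
                    dt[cid] = h * 2             # quiescent cell: stride further
                heapq.heappush(queue, (t + h, cid))
        return cells

    # Two cells with very different rates finish in very different step counts.
    out = integrate_cells({0: 1.0, 1: 1.0},
                          lambda cid, y: (-0.1 if cid == 0 else -50.0) * y,
                          t_end=1.0)
    print(out)
    ```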

  12. An Alternative to the Search for Single Polymorphisms: Toward Molecular Personality Scales for the Five-Factor Model

    PubMed Central

    McCrae, Robert R.; Scally, Matthew; Terracciano, Antonio; Abecasis, Gonçalo R.; Costa, Paul T.

    2011-01-01

    There is growing evidence that personality traits are affected by many genes, all of which have very small effects. As an alternative to the largely unsuccessful search for individual polymorphisms associated with personality traits, we identified large sets of potentially related single nucleotide polymorphisms (SNPs) and summed them to form molecular personality scales (MPSs) with from 4 to 2,497 SNPs. Scales were derived from two-thirds of a large (N = 3,972) sample of individuals from Sardinia who completed the Revised NEO Personality Inventory and were assessed in a genome-wide association scan. When MPSs were correlated with the phenotype in the remaining third of the sample, very small but significant associations were found for four of the five personality factors when the longest scales were examined. These data suggest that MPSs for Neuroticism, Openness to Experience, Agreeableness, and Conscientiousness (but not Extraversion) contain genetic information that can be refined in future studies, and the procedures described here should be applicable to other quantitative traits. PMID:21114353
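
    The scale construction can be illustrated in a few lines: in a training split, rank SNPs by the strength of their association with the trait; sum allele counts signed by the direction of effect to form the scale; then correlate the resulting score with the phenotype in the held-out split. A toy sketch on simulated genotypes (split proportions mirror the study; the effect model and thresholds are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n, m = 3000, 5000                                 # individuals, SNPs
    geno = rng.binomial(2, 0.3, (n, m)).astype(float) # allele counts 0/1/2
    beta = np.where(rng.random(m) < 0.01, rng.normal(0, 0.05, m), 0.0)
    trait = geno @ beta + rng.normal(0, 1, n)         # polygenic trait

    train, test = slice(0, 2 * n // 3), slice(2 * n // 3, n)

    # Per-SNP association in the training two-thirds (correlation statistic).
    g = geno[train] - geno[train].mean(0)
    p = trait[train] - trait[train].mean()
    r = g.T @ p / (np.linalg.norm(g, axis=0) * np.linalg.norm(p))

    # Molecular personality scale: signed sum over the top-k associated SNPs,
    # validated by correlation with the phenotype in the held-out third.
    for k in (4, 100, 2500):
        top = np.argsort(-np.abs(r))[:k]
        score = geno[test][:, top] @ np.sign(r[top])
        print(k, np.corrcoef(score, trait[test])[0, 1].round(3))
    ```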

  13. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

    Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single, rather than trade-off, design methodology and the incompatibility of large-scale optimization with single program, single computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop. Full analysis is then performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique, and then embody the definitions of design variables, objective function and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.

  14. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877

  15. Identifying Coherent Structures in a 3-Stream Supersonic Jet Flow using Time-Resolved Schlieren Imaging

    NASA Astrophysics Data System (ADS)

    Tenney, Andrew; Coleman, Thomas; Berry, Matthew; Magstadt, Andy; Gogineni, Sivaram; Kiel, Barry

    2015-11-01

    Shock cells and large-scale structures present in a three-stream non-axisymmetric jet are studied both qualitatively and quantitatively. Large Eddy Simulation is utilized first to gain an understanding of the underlying physics of the flow and to direct the focus of the physical experiment. The flow in the experiment is visualized using long-exposure Schlieren photography, with time-resolved Schlieren photography also a possibility. Velocity derivative diagnostics calculated from the grey-scale Schlieren images are analyzed using continuous wavelet transforms. Pressure signals are also captured in the near field of the jet to correlate with the velocity derivative diagnostics and assist in unraveling this complex flow. We acknowledge the support of AFRL through an SBIR grant.

  16. Ascertaining Validity in the Abstract Realm of PMESII Simulation Models: An Analysis of the Peace Support Operations Model (PSOM)

    DTIC Science & Technology

    2009-06-01

    simulation is the campaign-level Peace Support Operations Model (PSOM). This thesis provides a quantitative analysis of PSOM. The results are based ...multiple potential outcomes, further development and analysis is required before the model is used for large-scale analysis.

  17. Annual runoff and evapotranspiration of forestlands and non-forestlands in selected basins of the Loess Plateau of China.

    Treesearch

    Yanhui Wang; Pengtao Yu; Karl-Heinz Feger; Xiaohua Wei; Ge Sun; et al

    2011-01-01

    Large-scale forestation has been undertaken over decades principally to control the serious soil erosion in the Loess Plateau of China. A quantitative assessment of the hydrological effects of forestation, especially on basin water yield, is critical for the sustainable forestry development within this dry region. In this study, we constructed the multi-annual water...

  18. Transcriptome sequencing and annotation of the halophytic microalga Dunaliella salina

    PubMed Central

    Hong, Ling; Liu, Jun-li; Midoun, Samira Z.; Miller, Philip C.

    2017-01-01

    The unicellular green alga Dunaliella salina is well adapted to salt stress and contains compounds (including β-carotene and vitamins) with potential commercial value. A large transcriptome database of D. salina during the adjustment, exponential and stationary growth phases was generated using a high throughput sequencing platform. We characterized the metabolic processes in D. salina with a focus on valuable metabolites, with the aim of manipulating D. salina to achieve greater economic value in large-scale production through a bioengineering strategy. Gene expression profiles under salt stress verified using quantitative polymerase chain reaction (qPCR) implied that salt can regulate the expression of key genes. This study generated a substantial fraction of D. salina transcriptional sequences for the entire growth cycle, providing a basis for the discovery of novel genes. This first full-scale transcriptome study of D. salina establishes a foundation for further comparative genomic studies. PMID:28990374

  19. Transcriptional analysis of the Arabidopsis ovule by massively parallel signature sequencing

    PubMed Central

    Sánchez-León, Nidia; Arteaga-Vázquez, Mario; Alvarez-Mejía, César; Mendiola-Soto, Javier; Durán-Figueroa, Noé; Rodríguez-Leal, Daniel; Rodríguez-Arévalo, Isaac; García-Campayo, Vicenta; García-Aguilar, Marcelina; Olmedo-Monfil, Vianey; Arteaga-Sánchez, Mario; Martínez de la Vega, Octavio; Nobuta, Kan; Vemaraju, Kalyan; Meyers, Blake C.; Vielle-Calzada, Jean-Philippe

    2012-01-01

    The life cycle of flowering plants alternates between a predominant sporophytic (diploid) and an ephemeral gametophytic (haploid) generation that only occurs in reproductive organs. In Arabidopsis thaliana, the female gametophyte is deeply embedded within the ovule, complicating the study of the genetic and molecular interactions involved in the sporophytic to gametophytic transition. Massively parallel signature sequencing (MPSS) was used to conduct a quantitative large-scale transcriptional analysis of the fully differentiated Arabidopsis ovule prior to fertilization. The expression of 9775 genes was quantified in wild-type ovules, additionally detecting >2200 new transcripts mapping to antisense or intergenic regions. A quantitative comparison of global expression in wild-type and sporocyteless (spl) individuals resulted in 1301 genes showing 25-fold reduced or null activity in ovules lacking a female gametophyte, including those encoding 92 signalling proteins, 75 transcription factors, and 72 RNA-binding proteins not reported in previous studies based on microarray profiling. A combination of independent genetic and molecular strategies confirmed the differential expression of 28 of them, showing that they are either preferentially active in the female gametophyte, or dependent on the presence of a female gametophyte to be expressed in sporophytic cells of the ovule. Among 18 genes encoding pentatricopeptide-repeat proteins (PPRs) that show transcriptional activity in wild-type but not spl ovules, CIHUATEOTL (At4g38150) is specifically expressed in the female gametophyte and necessary for female gametogenesis. These results expand the nature of the transcriptional universe present in the ovule of Arabidopsis, and offer a large-scale quantitative reference of global expression for future genomic and developmental studies. PMID:22442422

  20. Transcriptional analysis of the Arabidopsis ovule by massively parallel signature sequencing.

    PubMed

    Sánchez-León, Nidia; Arteaga-Vázquez, Mario; Alvarez-Mejía, César; Mendiola-Soto, Javier; Durán-Figueroa, Noé; Rodríguez-Leal, Daniel; Rodríguez-Arévalo, Isaac; García-Campayo, Vicenta; García-Aguilar, Marcelina; Olmedo-Monfil, Vianey; Arteaga-Sánchez, Mario; de la Vega, Octavio Martínez; Nobuta, Kan; Vemaraju, Kalyan; Meyers, Blake C; Vielle-Calzada, Jean-Philippe

    2012-06-01

    The life cycle of flowering plants alternates between a predominant sporophytic (diploid) and an ephemeral gametophytic (haploid) generation that only occurs in reproductive organs. In Arabidopsis thaliana, the female gametophyte is deeply embedded within the ovule, complicating the study of the genetic and molecular interactions involved in the sporophytic to gametophytic transition. Massively parallel signature sequencing (MPSS) was used to conduct a quantitative large-scale transcriptional analysis of the fully differentiated Arabidopsis ovule prior to fertilization. The expression of 9775 genes was quantified in wild-type ovules, additionally detecting >2200 new transcripts mapping to antisense or intergenic regions. A quantitative comparison of global expression in wild-type and sporocyteless (spl) individuals resulted in 1301 genes showing 25-fold reduced or null activity in ovules lacking a female gametophyte, including those encoding 92 signalling proteins, 75 transcription factors, and 72 RNA-binding proteins not reported in previous studies based on microarray profiling. A combination of independent genetic and molecular strategies confirmed the differential expression of 28 of them, showing that they are either preferentially active in the female gametophyte, or dependent on the presence of a female gametophyte to be expressed in sporophytic cells of the ovule. Among 18 genes encoding pentatricopeptide-repeat proteins (PPRs) that show transcriptional activity in wild-type but not spl ovules, CIHUATEOTL (At4g38150) is specifically expressed in the female gametophyte and necessary for female gametogenesis. These results expand the nature of the transcriptional universe present in the ovule of Arabidopsis, and offer a large-scale quantitative reference of global expression for future genomic and developmental studies.

  1. Climate, Water, and Human Health: Large Scale Hydroclimatic Controls in Forecasting Cholera Epidemics

    NASA Astrophysics Data System (ADS)

    Akanda, A. S.; Jutla, A. S.; Islam, S.

    2009-12-01

    Despite ravaging the continents through seven global pandemics in past centuries, the seasonal and interannual variability of cholera outbreaks remains a mystery. Previous studies have focused on the role of various environmental and climatic factors, but provided little or no predictive capability. Recent findings suggest a more prominent role of large-scale hydroclimatic extremes - droughts and floods - and attempt to explain the seasonality and the unique dual cholera peaks in the Bengal Delta region of South Asia. We investigate the seasonal and interannual nature of cholera epidemiology in three geographically distinct locations within the region to identify the larger-scale hydroclimatic controls that can set the ecological and environmental ‘stage’ for outbreaks and have significant memory on a seasonal scale. Here we show that two distinctly different, pre- and post-monsoon, cholera transmission mechanisms related to large-scale climatic controls prevail in the region. An implication of our findings is that extreme climatic events such as prolonged droughts, record floods, and major cyclones may cause major disruption in the ecosystem and trigger large epidemics. We postulate that a quantitative understanding of the large-scale hydroclimatic controls and dominant processes with significant system memory will form the basis for forecasting such epidemic outbreaks. A multivariate regression method using these predictor variables to develop probabilistic forecasts of cholera outbreaks will be explored. Forecasts from such a system with a seasonal lead-time are likely to have measurable impact on early cholera detection and prevention efforts in endemic regions.
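
    The forecasting approach sketched in the last sentences can be illustrated with a toy probabilistic model. The sketch below, with hypothetical predictor names and synthetic data, fits a logistic regression to hydroclimatic anomalies and emits an outbreak probability at seasonal lead; it illustrates the general idea, not the authors' model:

      # Illustrative sketch only: a probabilistic outbreak forecast from
      # hydroclimatic predictors. Predictor names and data are hypothetical.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 200
      drought_index = rng.normal(size=n)    # pre-monsoon dryness anomaly
      flood_index = rng.normal(size=n)      # post-monsoon flooding anomaly
      # Synthetic "truth": either extreme raises outbreak odds.
      logit = 0.9 * drought_index + 1.1 * flood_index - 0.5
      outbreak = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

      X = np.column_stack([drought_index, flood_index])
      model = LogisticRegression().fit(X, outbreak)

      # Seasonal-lead forecast for a new season's observed anomalies:
      p = model.predict_proba([[1.5, 0.2]])[0, 1]
      print(f"forecast outbreak probability: {p:.2f}")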

  2. The topology of large-scale structure. III - Analysis of observations

    NASA Astrophysics Data System (ADS)

    Gott, J. Richard, III; Miller, John; Thuan, Trinh X.; Schneider, Stephen E.; Weinberg, David H.; Gammie, Charles; Polk, Kevin; Vogeley, Michael; Jeffrey, Scott; Bhavsar, Suketu P.; Melott, Adrian L.; Giovanelli, Riccardo; Haynes, Martha P.; Tully, R. Brent; Hamilton, Andrew J. S.

    1989-05-01

    A recently developed algorithm for quantitatively measuring the topology of large-scale structures in the universe was applied to a number of important observational data sets. The data sets included an Abell (1958) cluster sample out to Vmax = 22,600 km/sec, the Giovanelli and Haynes (1985) sample out to Vmax = 11,800 km/sec, the CfA sample out to Vmax = 5000 km/sec, the Thuan and Schneider (1988) dwarf sample out to Vmax = 3000 km/sec, and the Tully (1987) sample out to Vmax = 3000 km/sec. It was found that, when the topology is studied on smoothing scales significantly larger than the correlation length (i.e., smoothing length, lambda, not below 1200 km/sec), the topology is spongelike and is consistent with the standard model in which the structure seen today has grown from small fluctuations caused by random noise in the early universe. When the topology is studied on the scale of lambda of about 600 km/sec, a small shift is observed in the genus curve in the direction of a 'meatball' topology.
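
    The genus measurement described above can be mimicked on synthetic data: smooth a random field, threshold it at a range of levels, and track the topology of the excursion set. A minimal sketch follows (the Euler characteristic computed here is closely related to the genus statistic; the grid size and smoothing scale are arbitrary choices):

      # Toy version of the genus-curve measurement: smooth a random density
      # field, threshold it at varying levels, and track the Euler
      # characteristic of the excursion set. Illustrative only.
      import numpy as np
      from scipy.ndimage import gaussian_filter
      from skimage.measure import euler_number

      rng = np.random.default_rng(42)
      field = gaussian_filter(rng.normal(size=(64, 64, 64)), sigma=4.0)
      field = (field - field.mean()) / field.std()

      for nu in (-1.0, 0.0, 1.0):          # threshold in units of sigma
          excursion = field > nu
          chi = euler_number(excursion, connectivity=1)
          print(f"nu = {nu:+.1f}  Euler characteristic = {chi}")
      # A Gaussian random field gives a symmetric, sponge-like curve:
      # chi is dominated by tunnels (negative) near nu = 0 and by
      # isolated clusters (positive) at high thresholds.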

  3. The topology of large-scale structure. III - Analysis of observations. [in universe

    NASA Technical Reports Server (NTRS)

    Gott, J. Richard, III; Weinberg, David H.; Miller, John; Thuan, Trinh X.; Schneider, Stephen E.

    1989-01-01

    A recently developed algorithm for quantitatively measuring the topology of large-scale structures in the universe was applied to a number of important observational data sets. The data sets included an Abell (1958) cluster sample out to Vmax = 22,600 km/sec, the Giovanelli and Haynes (1985) sample out to Vmax = 11,800 km/sec, the CfA sample out to Vmax = 5000 km/sec, the Thuan and Schneider (1988) dwarf sample out to Vmax = 3000 km/sec, and the Tully (1987) sample out to Vmax = 3000 km/sec. It was found that, when the topology is studied on smoothing scales significantly larger than the correlation length (i.e., smoothing length, lambda, not below 1200 km/sec), the topology is spongelike and is consistent with the standard model in which the structure seen today has grown from small fluctuations caused by random noise in the early universe. When the topology is studied on the scale of lambda of about 600 km/sec, a small shift is observed in the genus curve in the direction of a 'meatball' topology.

  4. How much does a tokamak reactor cost?

    NASA Astrophysics Data System (ADS)

    Freidberg, J.; Cerfon, A.; Ballinger, S.; Barber, J.; Dogra, A.; McCarthy, W.; Milanese, L.; Mouratidis, T.; Redman, W.; Sandberg, A.; Segal, D.; Simpson, R.; Sorensen, C.; Zhou, M.

    2017-10-01

    The cost of a fusion reactor is of critical importance to its ultimate acceptability as a commercial source of electricity. While there are general rules of thumb for scaling both the overnight cost and the levelized cost of electricity, the corresponding relations are not very accurate or universally agreed upon. We have carried out a series of scaling studies of tokamak reactor costs based on reasonably sophisticated plasma and engineering models. The analysis is largely analytic, requiring only a simple numerical code, thus allowing a very large number of designs. Importantly, the studies are aimed at plasma physicists rather than fusion engineers. The goals are to assess the pros and cons of steady state burning plasma experiments and reactors. One specific set of results discusses the benefits of higher magnetic fields, now possible because of the recent development of high-temperature rare-earth superconductors (REBCO); with this goal in mind, we calculate quantitative expressions, including both scaling and multiplicative constants, for cost and major radius as a function of central magnetic field.

  5. Extensions and evaluations of a general quantitative theory of forest structure and dynamics

    PubMed Central

    Enquist, Brian J.; West, Geoffrey B.; Brown, James H.

    2009-01-01

    Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161

  6. Experimental design and quantitative analysis of microbial community multiomics.

    PubMed

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  7. Engineering Digestion: Multiscale Processes of Food Digestion.

    PubMed

    Bornhorst, Gail M; Gouseti, Ourania; Wickham, Martin S J; Bakalis, Serafim

    2016-03-01

    Food digestion is a complex, multiscale process that has recently become of interest to the food industry due to the developing links between food and health or disease. Food digestion can be studied by using either in vitro or in vivo models, each having certain advantages or disadvantages. The recent interest in food digestion has resulted in a large number of studies in this area, yet few have provided an in-depth, quantitative description of digestion processes. To provide a framework to develop these quantitative comparisons, a summary is given here between digestion processes and parallel unit operations in the food and chemical industry. Characterization parameters and phenomena are suggested for each step of digestion. In addition to the quantitative characterization of digestion processes, the multiscale aspect of digestion must also be considered. In both food systems and the gastrointestinal tract, multiple length scales are involved in food breakdown, mixing, and absorption. These different length scales influence digestion processes independently as well as through interrelated mechanisms. To facilitate optimized development of functional food products, a multiscale, engineering approach may be taken to describe food digestion processes. A framework for this approach is described in this review, as well as examples that demonstrate the importance of process characterization as well as the multiple, interrelated length scales in the digestion process. © 2016 Institute of Food Technologists®

  8. Measuring the topology of large-scale structure in the universe

    NASA Technical Reports Server (NTRS)

    Gott, J. Richard, III

    1988-01-01

    An algorithm for quantitatively measuring the topology of large-scale structure has now been applied to a large number of observational data sets. The present paper summarizes and provides an overview of some of these observational results. On scales significantly larger than the correlation length, larger than about 1200 km/s, the cluster and galaxy data are fully consistent with a sponge-like random phase topology. At a smoothing length of about 600 km/s, however, the observed genus curves show a small shift in the direction of a meatball topology. Cold dark matter (CDM) models show similar shifts at these scales but not generally as large as those seen in the data. Bubble models, with voids completely surrounded on all sides by walls of galaxies, show shifts in the opposite direction. The CDM model is overall the most successful in explaining the data.

  9. Measuring the topology of large-scale structure in the universe

    NASA Astrophysics Data System (ADS)

    Gott, J. Richard, III

    1988-11-01

    An algorithm for quantitatively measuring the topology of large-scale structure has now been applied to a large number of observational data sets. The present paper summarizes and provides an overview of some of these observational results. On scales significantly larger than the correlation length, larger than about 1200 km/s, the cluster and galaxy data are fully consistent with a sponge-like random phase topology. At a smoothing length of about 600 km/s, however, the observed genus curves show a small shift in the direction of a meatball topology. Cold dark matter (CDM) models show similar shifts at these scales but not generally as large as those seen in the data. Bubble models, with voids completely surrounded on all sides by walls of galaxies, show shifts in the opposite direction. The CDM model is overall the most successful in explaining the data.

  10. A behavioral-level HDL description of SFQ logic circuits for quantitative performance analysis of large-scale SFQ digital systems

    NASA Astrophysics Data System (ADS)

    Matsuzaki, F.; Yoshikawa, N.; Tanaka, M.; Fujimaki, A.; Takai, Y.

    2003-10-01

    Recently, many single flux quantum (SFQ) logic circuits containing several thousands of Josephson junctions have been designed successfully by using digital domain simulation based on the hardware description language (HDL). In the present HDL-based design of SFQ circuits, a structure-level HDL description has been used, where circuits are made up of basic gate cells. However, in order to analyze large-scale SFQ digital systems, such as a microprocessor, a higher level of circuit abstraction is necessary to reduce the circuit simulation time. In this paper we investigate how to describe the functionality of large-scale SFQ digital circuits with a behavior-level HDL description. In this method, the functionality and timing of a circuit block are defined directly by describing its behavior in the HDL. Using this method, we can dramatically reduce the simulation time of large-scale SFQ digital circuits.

  11. Parameterizing atmosphere-land surface exchange for climate models with satellite data: A case study for the Southern Great Plains CART site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, W.

    High-resolution satellite data provide detailed, quantitative descriptions of land surface characteristics over large areas so that objective scale linkage becomes feasible. With the aid of satellite data, Sellers et al. and Wood and Lakshmi examined the linearity of processes scaled up from 30 m to 15 km. If the phenomenon is scale invariant, then the aggregated value of a function or flux is equivalent to the function computed from aggregated values of controlling variables. The linear relation may be realistic for limited land areas having no large surface contrasts to cause significant horizontal exchange. However, for areas with sharp surface contrasts, horizontal exchange and different dynamics in the atmospheric boundary may induce nonlinear interactions, such as at interfaces of land-water, forest-farm land, and irrigated crops-desert steppe. The linear approach, however, represents the simplest scenario, and is useful for developing an effective scheme for incorporating subgrid land surface processes into large-scale models. Our studies focus on coupling satellite data and ground measurements with a satellite-data-driven land surface model to parameterize surface fluxes for large-scale climate models. In this case study, we used surface spectral reflectance data from satellite remote sensing to characterize spatial and temporal changes in vegetation and associated surface parameters in an area of about 350 × 400 km covering the southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site of the US Department of Energy's Atmospheric Radiation Measurement (ARM) Program.
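
    The linearity argument is easy to demonstrate numerically: a flux law f can be safely aggregated only if mean(f(x)) equals f(mean(x)). A small sketch with hypothetical flux functions:

      # Checks the scale-invariance claim: for a linear flux law,
      # mean(f(x)) == f(mean(x)); nonlinearity breaks the equality.
      # The flux functions below are hypothetical illustrations.
      import numpy as np

      x = np.array([0.1, 0.2, 0.9, 1.0])   # subgrid surface wetness, say,
                                           # with a sharp land-water contrast

      def linear(w):
          return 3.0 * w + 1.0

      def nonlinear(w):                    # stand-in for a nonlinear exchange law
          return w ** 4

      for f, name in ((linear, "linear"), (nonlinear, "nonlinear")):
          agg_then_f = f(x.mean())
          f_then_agg = f(x).mean()
          print(f"{name:9s}: f(mean)={agg_then_f:.3f}  mean(f)={f_then_agg:.3f}")
      # linear:    identical -> subgrid detail can be aggregated safely
      # nonlinear: differs   -> subgrid variability must be preserved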

  12. TheCellMap.org: A Web-Accessible Database for Visualizing and Mining the Global Yeast Genetic Interaction Network

    PubMed Central

    Usaj, Matej; Tan, Yizhao; Wang, Wen; VanderSluis, Benjamin; Zou, Albert; Myers, Chad L.; Costanzo, Michael; Andrews, Brenda; Boone, Charles

    2017-01-01

    Providing access to quantitative genomic data is key to ensure large-scale data validation and promote new discoveries. TheCellMap.org serves as a central repository for storing and analyzing quantitative genetic interaction data produced by genome-scale Synthetic Genetic Array (SGA) experiments with the budding yeast Saccharomyces cerevisiae. In particular, TheCellMap.org allows users to easily access, visualize, explore, and functionally annotate genetic interactions, or to extract and reorganize subnetworks, using data-driven network layouts in an intuitive and interactive manner. PMID:28325812

  13. TheCellMap.org: A Web-Accessible Database for Visualizing and Mining the Global Yeast Genetic Interaction Network.

    PubMed

    Usaj, Matej; Tan, Yizhao; Wang, Wen; VanderSluis, Benjamin; Zou, Albert; Myers, Chad L; Costanzo, Michael; Andrews, Brenda; Boone, Charles

    2017-05-05

    Providing access to quantitative genomic data is key to ensure large-scale data validation and promote new discoveries. TheCellMap.org serves as a central repository for storing and analyzing quantitative genetic interaction data produced by genome-scale Synthetic Genetic Array (SGA) experiments with the budding yeast Saccharomyces cerevisiae. In particular, TheCellMap.org allows users to easily access, visualize, explore, and functionally annotate genetic interactions, or to extract and reorganize subnetworks, using data-driven network layouts in an intuitive and interactive manner. Copyright © 2017 Usaj et al.

  14. Quantitative patterns of stylistic influence in the evolution of literature.

    PubMed

    Hughes, James M; Foti, Nicholas J; Krakauer, David C; Rockmore, Daniel N

    2012-05-15

    Literature is a form of expression whose temporal structure, both in content and style, provides a historical record of the evolution of culture. In this work we take on a quantitative analysis of literary style and conduct the first large-scale temporal stylometric study of literature by using the vast holdings in the Project Gutenberg Digital Library corpus. We find temporal stylistic localization among authors through the analysis of the similarity structure in feature vectors derived from content-free word usage, nonhomogeneous decay rates of stylistic influence, and an accelerating rate of decay of influence among modern authors. Within a given time period we also find evidence for stylistic coherence with a given literary topic, such that writers in different fields adopt different literary styles. This study gives quantitative support to the notion of a literary "style of a time" with a strong trend toward increasingly contemporaneous stylistic influence.
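
    The core representation, feature vectors built from content-free (function) word frequencies compared by a similarity measure, can be sketched compactly; the word list and texts below are illustrative stand-ins, not the study's actual feature set:

      # Minimal stylometric comparison in the spirit of the study: represent
      # texts by frequencies of content-free (function) words and compare
      # with cosine similarity. Word list and texts are illustrative.
      import numpy as np

      FUNCTION_WORDS = ["the", "of", "and", "to", "in", "that", "it", "with"]

      def feature_vector(text: str) -> np.ndarray:
          words = text.lower().split()
          counts = np.array([words.count(w) for w in FUNCTION_WORDS], float)
          return counts / max(len(words), 1)

      def cosine(a: np.ndarray, b: np.ndarray) -> float:
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

      a = feature_vector("it was the best of times and the worst of times")
      b = feature_vector("the whale surfaced in the bay with it")
      print(f"stylistic similarity: {cosine(a, b):.3f}")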

  15. Sex Differences in Magical Ideation: A Community-Based Twin Study

    PubMed Central

    Karcher, Nicole R.; Slutske, Wendy S.; Kerns, John G.; Piasecki, Thomas M.; Martin, Nicholas G.

    2014-01-01

    Two questions regarding sex differences in magical ideation were investigated in this study: (1) whether there are mean level sex differences on the Magical Ideation Scale (MIS), and (2) whether there are quantitative and/or qualitative sex differences in the genetic contributions to variation on this scale. These questions were evaluated using data obtained from a large community sample of adult Australian twins (N=4,355) that included opposite-sex pairs. Participants completed a modified 15-item version of the MIS within a larger assessment battery. Women reported both higher means and variability on the MIS than men; this was also observed within families (in opposite-sex twin pairs). Biometric modeling indicated that the proportion of variation in MIS scores due to genetic influences (indicating quantitative sex differences) and the specific latent genetic contributions to this variation (indicating qualitative sex differences) were the same in men and women. These findings clarify the nature of sex differences in magical ideation and point to avenues for future research. PMID:24364500

  16. Large-Scale Diffraction Patterns from Circular Objects

    ERIC Educational Resources Information Center

    Rinard, Phillip M.

    1976-01-01

    Quantitatively investigates the diffraction of light by a U.S. penny and an aperture of the same size. Differences noted between the theory and measurements are discussed, with probable causes indicated. (Author/CP)

  17. The impact of library services in primary care trusts in NHS North West England: a large-scale retrospective quantitative study of online resource usage in relation to types of service.

    PubMed

    Bell, Katherine; Glover, Steven William; Brodie, Colin; Roberts, Anne; Gleghorn, Colette

    2009-06-01

    Within NHS North West England there are 24 primary care trusts (PCTs), all with access to different types of library services. This study aims to evaluate the impact the type of library service has on online resource usage. We conducted a large-scale retrospective quantitative study across all PCT staff in NHS NW England using Athens sessions log data. We studied the Athens log usage of 30,381 staff, with 8,273 active Athens accounts and 100,599 sessions from 1 January 2007 to 31 December 2007. In 2007, PCTs with outreach librarians achieved 43% penetration of staff with active Athens accounts compared with PCTs with their own library service (28.23%); PCTs with service level agreements (SLAs) with acute hospital library services (22.5%) and with no library service (19.68%). This pattern was also observed when we looked at the average number of Athens user sessions per person, and usage of Dialog Datastar databases and Proquest full text journal collections. Our findings have shown a correlation between e-resource usage and type of library service. Outreach librarians have proved to be an efficient model for promoting and driving up resource usage. PCTs with no library service have shown the lowest level of resource usage.

  18. A Method for Label-Free, Differential Top-Down Proteomics.

    PubMed

    Ntai, Ioanna; Toby, Timothy K; LeDuc, Richard D; Kelleher, Neil L

    2016-01-01

    Biomarker discovery in translational research has heavily relied on labeled and label-free quantitative bottom-up proteomics. Here, we describe a new approach to biomarker studies that utilizes high-throughput top-down proteomics and is the first to offer whole protein characterization and relative quantitation within the same experiment. Using yeast as a model, we report procedures for a label-free approach to quantify the relative abundance of intact proteins ranging from 0 to 30 kDa in two different states. In this chapter, we describe the integrated methodology for the large-scale profiling and quantitation of the intact proteome by liquid chromatography-mass spectrometry (LC-MS) without the need for metabolic or chemical labeling. This recent advance for quantitative top-down proteomics is best implemented with a robust and highly controlled sample preparation workflow before data acquisition on a high-resolution mass spectrometer, and the application of a hierarchical linear statistical model to account for the multiple levels of variance contained in quantitative proteomic comparisons of samples for basic and clinical research.

  19. Deformation and Failure Mechanisms of Shape Memory Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daly, Samantha Hayes

    2015-04-15

    The goal of this research was to understand the fundamental mechanics that drive the deformation and failure of shape memory alloys (SMAs). SMAs are difficult materials to characterize because of the complex phase transformations that give rise to their unique properties, including shape memory and superelasticity. These phase transformations occur across multiple length scales (one example being the martensite-austenite twinning that underlies macroscopic strain localization) and result in a large hysteresis. In order to optimize the use of this hysteretic behavior in energy storage and damping applications, we must first have a quantitative understanding of this transformation behavior. Prior results on shape memory alloys have been largely qualitative (i.e., mapping phase transformations through cracked oxide coatings or surface morphology). The PI developed and utilized new approaches to provide a quantitative, full-field characterization of phase transformation, conducting a comprehensive suite of experiments across multiple length scales and tying these results to theoretical and computational analysis. The research funded by this award utilized new combinations of scanning electron microscopy, diffraction, digital image correlation, and custom testing equipment and procedures to study phase transformation processes at a wide range of length scales, with a focus at small length scales with spatial resolution on the order of 1 nanometer. These experiments probe the basic connections between length scales during phase transformation. In addition to the insights gained on the fundamental mechanisms driving transformations in shape memory alloys, the unique experimental methodologies developed under this award are applicable to a wide range of solid-to-solid phase transformations and other strain localization mechanisms.

  20. Ketamine as a novel treatment for major depressive disorder and bipolar depression: a systematic review and quantitative meta-analysis.

    PubMed

    Lee, Ellen E; Della Selva, Megan P; Liu, Anson; Himelhoch, Seth

    2015-01-01

    Given the significant disability, morbidity and mortality associated with depression, the promising recent trials of ketamine highlight a novel intervention. A meta-analysis was conducted to assess the efficacy of ketamine in comparison with placebo for the reduction of depressive symptoms in patients who meet criteria for a major depressive episode. Two electronic databases were searched in September 2013 for English-language studies that were randomized placebo-controlled trials of ketamine treatment for patients with major depressive disorder or bipolar depression and utilized a standardized rating scale. Studies including participants receiving electroconvulsive therapy and adolescent/child participants were excluded. Five studies were included in the quantitative meta-analysis. The quantitative meta-analysis showed that ketamine significantly reduced depressive symptoms. The overall effect size at day 1 was large and statistically significant with an overall standardized mean difference of 1.01 (95% confidence interval 0.69-1.34) (P<.001), with the effects sustained at 7 days postinfusion. The heterogeneity of the studies was low and not statistically significant, and the funnel plot showed no publication bias. The large and statistically significant effect of ketamine on depressive symptoms supports a promising, new and effective pharmacotherapy with rapid onset, high efficacy and good tolerability. Copyright © 2015. Published by Elsevier Inc.
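
    The pooled standardized mean difference reported above comes from an inverse-variance combination across trials. A minimal sketch of that pooling step, with placeholder per-study effect sizes and variances rather than the actual trial data:

      # Sketch of the pooling step behind the reported effect size: an
      # inverse-variance fixed-effect combination of per-study standardized
      # mean differences (SMDs). The five (d, var) pairs are placeholders.
      import math

      studies = [(1.2, 0.10), (0.8, 0.08), (1.1, 0.15), (0.9, 0.12), (1.0, 0.09)]

      weights = [1.0 / v for _, v in studies]
      pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
      se = math.sqrt(1.0 / sum(weights))
      lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
      print(f"pooled SMD = {pooled:.2f}  (95% CI {lo:.2f} to {hi:.2f})")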

  1. The Effectiveness of a Mixed-Mode Survey on Domestic Violence in Curaçao: Response and Data Quality

    ERIC Educational Resources Information Center

    van Wijk, Nikil; de Leeuw, Edith; de Bruijn, Jeanne

    2015-01-01

    To collect reliable statistical data on domestic violence in Curaçao, we conducted a large-scale quantitative study (n = 816). To meet the special needs of the population and topic, we designed a tailored mixed-mode survey to assess the prevalence of domestic violence in Curaçao and its health consequences. Great care was taken to reduce…

  2. Extra dimension searches at hadron colliders to next-to-leading order-QCD

    NASA Astrophysics Data System (ADS)

    Kumar, M. C.; Mathews, Prakash; Ravindran, V.

    2007-11-01

    The quantitative impact of NLO-QCD corrections for searches of large and warped extra dimensions at hadron colliders is investigated for the Drell-Yan process. K-factors for various observables at hadron colliders are presented. Factorisation, renormalisation scale dependence and uncertainties due to various parton distribution functions are studied. Uncertainties arising from the error on experimental data are estimated using the MRST parton distribution functions.
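
    For reference, the K-factor quoted here is conventionally defined as the ratio of the next-to-leading-order to the leading-order differential cross section for a given observable O (the paper's specific scale and PDF choices are as described above):

      K(\mathcal{O}) = \frac{\mathrm{d}\sigma^{\mathrm{NLO}} / \mathrm{d}\mathcal{O}}{\mathrm{d}\sigma^{\mathrm{LO}} / \mathrm{d}\mathcal{O}}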

  3. Determination of plasma parameters from soft X-ray images for coronal holes /open magnetic field configurations/ and coronal large-scale structures /extended closed-field configurations/

    NASA Technical Reports Server (NTRS)

    Maxson, C. W.; Vaiana, G. S.

    1977-01-01

    In connection with high-quality solar soft X-ray images, the 'quiet' features of the inner corona have been separated into two sharply different components: the strongly reduced emission areas or coronal holes (CH) and the extended regions of looplike emission features or large-scale structures (LSS). Particular central meridian passage observations of the prominent CH1 on August 21, 1973, are selected for a quantitative study. Histogram photographic density distributions for full-disk images at other central meridian passages of CH1 are also presented, and the techniques of converting low photographic density data to deposited energy are discussed, with particular emphasis on the problems associated with the CH data.

  4. Lunar terrain mapping and relative-roughness analysis

    NASA Technical Reports Server (NTRS)

    Rowan, L. C.; Mccauley, J. F.; Holm, E. A.

    1971-01-01

    Terrain maps of the equatorial zone were prepared at scales of 1:2,000,000 and 1:1,000,000 to classify lunar terrain with respect to roughness and to provide a basis for selecting sites for Surveyor and Apollo landings, as well as for Ranger and Lunar Orbiter photographs. Lunar terrain was described by qualitative and quantitative methods and divided into four fundamental classes: maria, terrae, craters, and linear features. Some 35 subdivisions were defined and mapped throughout the equatorial zone, and, in addition, most of the map units were illustrated by photographs. The terrain types were analyzed quantitatively to characterize and order their relative roughness characteristics. For some morphologically homogeneous mare areas, relative roughness can be extrapolated to the large scales from measurements at small scales.

  5. Large-scale protein-protein interaction analysis in Arabidopsis mesophyll protoplasts by split firefly luciferase complementation.

    PubMed

    Li, Jian-Feng; Bush, Jenifer; Xiong, Yan; Li, Lei; McCormack, Matthew

    2011-01-01

    Protein-protein interactions (PPIs) constitute the regulatory network that coordinates diverse cellular functions. There are growing needs in plant research for creating protein interaction maps behind complex cellular processes and at a systems biology level. However, only a few approaches have been successfully used for large-scale surveys of PPIs in plants, each having advantages and disadvantages. Here we present split firefly luciferase complementation (SFLC) as a highly sensitive and noninvasive technique for in planta PPI investigation. In this assay, the separate halves of a firefly luciferase can come into close proximity and transiently restore its catalytic activity only when their fusion partners, namely the two proteins of interest, interact with each other. The assay gained quantitative readout and high-throughput potential when the Arabidopsis mesophyll protoplast system and a microplate luminometer were employed for protein expression and luciferase measurement, respectively. Using the SFLC assay, we could monitor the dynamics of rapamycin-induced and ascomycin-disrupted interaction between Arabidopsis FRB and human FKBP proteins in a near real-time manner. As a proof of concept for large-scale PPI survey, we further applied the SFLC assay to testing 132 binary PPIs among 8 auxin response factors (ARFs) and 12 Aux/IAA proteins from Arabidopsis. Our results demonstrated that the SFLC assay is ideal for in vivo quantitative PPI analysis in plant cells and is particularly powerful for large-scale binary PPI screens.

  6. An alternative to the search for single polymorphisms: toward molecular personality scales for the five-factor model.

    PubMed

    McCrae, Robert R; Scally, Matthew; Terracciano, Antonio; Abecasis, Gonçalo R; Costa, Paul T

    2010-12-01

    There is growing evidence that personality traits are affected by many genes, all of which have very small effects. As an alternative to the largely unsuccessful search for individual polymorphisms associated with personality traits, the authors identified large sets of potentially related single nucleotide polymorphisms (SNPs) and summed them to form molecular personality scales (MPSs) with from 4 to 2,497 SNPs. Scales were derived from two thirds of a large (N = 3,972) sample of individuals from Sardinia who completed the Revised NEO Personality Inventory (P. T. Costa, Jr., & R. R. McCrae, 1992) and were assessed in a genomewide association scan. When MPSs were correlated with the phenotype in the remaining one third of the sample, very small but significant associations were found for 4 of the 5 personality factors when the longest scales were examined. These data suggest that MPSs for Neuroticism, Openness to Experience, Agreeableness, and Conscientiousness (but not Extraversion) contain genetic information that can be refined in future studies, and the procedures described here should be applicable to other quantitative traits. PsycINFO Database Record (c) 2010 APA, all rights reserved.
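
    The scale-construction idea, summing many individually weak SNPs with association-direction signs learned in a training split, can be sketched in a few lines; the genotype codings and signs below are hypothetical:

      # Sketch of the molecular-personality-scale idea: sum trait-associated
      # allele counts, signing each SNP by the direction of its association
      # in the training split. SNP codings and signs are hypothetical.
      import numpy as np

      # genotypes: rows = individuals, columns = SNPs, coded 0/1/2 minor alleles
      genotypes = np.array([[0, 2, 1, 1],
                            [1, 0, 2, 0],
                            [2, 1, 0, 2]])
      signs = np.array([+1, -1, +1, +1])   # direction of each SNP-trait
                                           # correlation from the training
                                           # two-thirds of the sample

      mps_score = genotypes @ signs        # one molecular scale value per person
      print(mps_score)                     # correlate these with the phenotype
                                           # in the held-out third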

  7. A Systematic Review of Barriers and Facilitators to Minority Research Participation Among African Americans, Latinos, Asian Americans, and Pacific Islanders

    PubMed Central

    Duran, Nelida; Norris, Keith

    2014-01-01

    To assess the experienced or perceived barriers and facilitators to health research participation for major US racial/ethnic minority populations, we conducted a systematic review of qualitative and quantitative studies from a search on PubMed and Web of Science from January 2000 to December 2011. With 44 articles included in the review, we found distinct and shared barriers and facilitators. Despite different expressions of mistrust, all groups represented in these studies were willing to participate for altruistic reasons embedded in cultural and community priorities. Greater comparative understanding of barriers and facilitators to racial/ethnic minorities’ research participation can improve population-specific recruitment and retention strategies and could better inform future large-scale prospective quantitative and in-depth ethnographic studies. PMID:24328648

  8. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    ERIC Educational Resources Information Center

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  9. Algorithm and Application of Gcp-Independent Block Adjustment for Super Large-Scale Domestic High Resolution Optical Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Sun, Y. S.; Zhang, L.; Xu, B.; Zhang, Y.

    2018-04-01

    Accurate positioning of optical satellite imagery without ground control is a precondition for remote sensing applications and for small/medium-scale mapping of large areas abroad or with large numbers of images. In this paper, aiming at the geometric features of optical satellite imagery and building on the Alternating Direction Method of Multipliers (ADMM), a widely used optimization method for constrained problems, together with RFM least-squares block adjustment, we propose a GCP-independent block adjustment method for large-scale domestic high-resolution optical satellite imagery - GISIBA (GCP-Independent Satellite Imagery Block Adjustment) - which is easy to parallelize and highly efficient. In this method, virtual "average" control points are built to solve the rank-defect problem and to support qualitative and quantitative analysis in block adjustment without ground control. The test results show that the horizontal and vertical accuracies of multi-covered and multi-temporal satellite images are better than 10 m and 6 m, respectively. Meanwhile, the mosaicking problem between adjacent areas in large-area DOM production can be solved if public geographic information data are introduced as horizontal and vertical constraints in the block adjustment process. Finally, through experiments with GF-1 and ZY-3 satellite images over several typical test areas, the reliability, accuracy and performance of the developed procedure are presented and studied.

  10. Large-visual-angle microstructure inspired from quantitative design of Morpho butterflies' lamellae deviation using the FDTD/PSO method.

    PubMed

    Wang, Wanlin; Zhang, Wang; Chen, Weixin; Gu, Jiajun; Liu, Qinglei; Deng, Tao; Zhang, Di

    2013-01-15

    The wide angular range of the treelike structure in Morpho butterfly scales was investigated by finite-difference time-domain (FDTD)/particle-swarm-optimization (PSO) analysis. Using the FDTD method, different parameters in the Morpho butterflies' treelike structure were studied and their contributions to the angular dependence were analyzed. Then a wide angular range was realized by the PSO method by quantitatively designing the lamellae deviation (Δy), a crucial parameter for the angular range. The field map of the wide-range reflection in a large area was given to confirm the wide angular range. The tristimulus values and corresponding color coordinates for various viewing directions were calculated to confirm the blue color at different observation angles. The wide angular range realized by the FDTD/PSO method will assist us in understanding the scientific principles involved and also in designing artificial optical materials.
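
    A bare-bones PSO loop of the kind used here is shown below; the objective function is a cheap stand-in for the actual FDTD reflectance evaluation, which is far too costly to reproduce in a sketch:

      # Minimal particle-swarm optimization loop of the kind used to tune
      # the lamellae deviation; the objective is a hypothetical stand-in,
      # not the FDTD reflectance model from the paper.
      import numpy as np

      def objective(dy):                   # pretend the optimum is dy = 0.35
          return (dy - 0.35) ** 2

      rng = np.random.default_rng(1)
      n, iters = 20, 50
      x = rng.uniform(0.0, 1.0, n)         # candidate lamellae deviations
      v = np.zeros(n)
      pbest, pbest_f = x.copy(), objective(x)
      gbest = pbest[np.argmin(pbest_f)]

      for _ in range(iters):
          r1, r2 = rng.random(n), rng.random(n)
          v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
          x = np.clip(x + v, 0.0, 1.0)
          f = objective(x)
          better = f < pbest_f
          pbest[better], pbest_f[better] = x[better], f[better]
          gbest = pbest[np.argmin(pbest_f)]

      print(f"best lamellae deviation found: {gbest:.3f}")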

  11. Water balance model for Kings Creek

    NASA Technical Reports Server (NTRS)

    Wood, Eric F.

    1990-01-01

    Particular attention is given to the spatial variability that affects the representation of water balance at the catchment scale in the context of macroscale water-balance modeling. Remotely sensed data are employed for parameterization, and the resulting model is developed so that subgrid spatial variability is preserved and therefore influences the grid-scale fluxes of the model. The model permits the quantitative evaluation of the surface-atmospheric interactions related to the large-scale hydrologic water balance.
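
    The grid-scale water balance underlying such a model can be illustrated with a one-bucket sketch: storage gains precipitation and loses evapotranspiration and excess runoff. The parameter values are illustrative, not those of the Kings Creek model:

      # A one-bucket catchment water balance of the kind such models build
      # on. Capacity and runoff coefficient are illustrative placeholders.
      def step(storage, precip, pet, capacity=120.0, k_runoff=0.3):
          et = min(pet, storage)                   # moisture-limited ET
          storage = storage - et + precip
          runoff = k_runoff * max(storage - capacity, 0.0)
          return storage - runoff, et, runoff

      s = 100.0
      for p, pet in [(12.0, 3.0), (0.0, 4.5), (30.0, 2.0)]:   # mm per day
          s, et, q = step(s, p, pet)
          print(f"storage={s:6.1f}  ET={et:4.1f}  runoff={q:5.2f}")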

  12. Natural disasters and population mobility in Bangladesh.

    PubMed

    Gray, Clark L; Mueller, Valerie

    2012-04-17

    The consequences of environmental change for human migration have gained increasing attention in the context of climate change and recent large-scale natural disasters, but as yet relatively few large-scale and quantitative studies have addressed this issue. We investigate the consequences of climate-related natural disasters for long-term population mobility in rural Bangladesh, a region particularly vulnerable to environmental change, using longitudinal survey data from 1,700 households spanning a 15-y period. Multivariate event history models are used to estimate the effects of flooding and crop failures on local population mobility and long-distance migration while controlling for a large set of potential confounders at various scales. The results indicate that flooding has modest effects on mobility that are most visible at moderate intensities and for women and the poor. However, crop failures unrelated to flooding have strong effects on mobility in which households that are not directly affected but live in severely affected areas are the most likely to move. These results point toward an alternate paradigm of disaster-induced mobility that recognizes the significant barriers to migration for vulnerable households as well their substantial local adaptive capacity.

  13. Species diversity of edaphic mites (Acari: Oribatida) and effects of topography, soil properties and litter gradients on their qualitative and quantitative composition in 64 km² of forest in Amazonia.

    PubMed

    de Moraes, Jamile; Franklin, Elizabeth; de Morais, José Wellington; de Souza, Jorge Luiz Pereira

    2011-09-01

    Small-scale spatial distribution of oribatid mites has been investigated in Amazonia. In addition, medium- and large-scale studies are needed to establish the utility of these mites in detecting natural environmental variability, and to distinguish this variability from anthropogenic impacts. We are expanding the knowledge about oribatid mites in a wet upland forest reserve, and investigate whether a standardized and integrated protocol is an efficient way to assess the effects of environmental variables on their qualitative and quantitative composition on a large spatial scale inside an ecological reserve in Central Amazonia, Brazil. Samples for Berlese-Tullgren extraction were taken in 72 plots of 250 × 6 m distributed over 64 km². In total, 3,182 adult individuals from 82 species and 79 morphospecies were recorded, expanding the number of species known in the reserve from 149 to 254. Galumna, Rostrozetes and Scheloribates were the most speciose genera, and 57 species were rare. Rostrozetes ovulum, Pergalumna passimpuctata and Archegozetes longisetosus were the most abundant species, and the first two were the most frequent. Species number and abundance were not correlated with clay content, slope, pH and litter quantity. However, Principal Coordinate Analysis indicated that as the percentage of clay content, litter quantity and pH changed, the oribatid mite qualitative and quantitative composition also changed. The standardized protocol effectively captured the diversity, as we collected one of the largest registers of oribatid mite species for Amazonia. Moreover, biological and ecological data were integrated to capture the effects of environmental variables accounting for their diversity and abundance.
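
    The Principal Coordinate Analysis used above is classical multidimensional scaling: double-center the squared dissimilarity matrix and take the leading eigenvectors. A minimal sketch with a made-up three-plot dissimilarity matrix:

      # Classical principal coordinate analysis (PCoA): double-center a
      # dissimilarity matrix and take the leading eigenvectors. The tiny
      # distance matrix is illustrative, not the study's data.
      import numpy as np

      D = np.array([[0.0, 0.4, 0.9],       # e.g. Bray-Curtis dissimilarities
                    [0.4, 0.0, 0.7],       # among three sampling plots
                    [0.9, 0.7, 0.0]])

      n = D.shape[0]
      J = np.eye(n) - np.ones((n, n)) / n  # centering matrix
      B = -0.5 * J @ (D ** 2) @ J          # Gower double-centering
      eigvals, eigvecs = np.linalg.eigh(B)
      order = np.argsort(eigvals)[::-1]    # largest eigenvalues first

      coords = eigvecs[:, order] * np.sqrt(np.maximum(eigvals[order], 0.0))
      print(coords[:, :2])                 # ordination axes 1 and 2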

  14. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE PAGES

    Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...

    2017-02-16

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  15. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.; Halsey, William; Dehoff, Ryan

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  16. Quantitative Assessment of the Relationship between p21 rs1059234 Polymorphism and Cancer Risk.

    PubMed

    Huang, Yong-Sheng; Fan, Qian-Qian; Li, Chuang; Nie, Meng; Quan, Hong-Yang; Wang, Lin

    2015-01-01

    p21 is a cyclin-dependent kinase inhibitor, which can arrest cell proliferation and serve as a tumor suppressor. Though many studies have been published assessing the relationship between the p21 rs1059234 polymorphism and various cancer risks, there was no definite conclusion on this association. To derive a more precise quantitative assessment of the relationship, a large-scale meta-analysis of 5,963 cases and 8,405 controls from 16 eligible published case-control studies was performed. Our analysis suggested that rs1059234 was not associated with overall cancer risk for either the dominant model [(T/T+C/T) vs C/C, OR=1.00, 95% CI: 0.84-1.18] or the recessive model [T/T vs (C/C+C/T), OR=1.03, 95% CI: 0.93-1.15]. However, further stratified analysis showed rs1059234 was strongly associated with the risk of squamous cell carcinoma of head and neck (SCCHN). Thus, larger-scale primary studies are still required to further evaluate the interaction of the p21 rs1059234 polymorphism and cancer risk in specific cancer subtypes.

  17. Determinants of fruit and vegetable consumption among children and adolescents: a review of the literature. Part II: qualitative studies.

    PubMed

    Krølner, Rikke; Rasmussen, Mette; Brug, Johannes; Klepp, Knut-Inge; Wind, Marianne; Due, Pernille

    2011-10-14

    Large proportions of children do not fulfil the World Health Organization recommendation of eating at least 400 grams of fruit and vegetables (FV) per day. To promote an increased FV intake among children it is important to identify factors which influence their consumption. Both qualitative and quantitative studies are needed. Earlier reviews have analysed evidence from quantitative studies. The aim of this paper is to present a systematic review of qualitative studies of determinants of children's FV intake. Relevant studies were identified by searching Anthropology Plus, Cinahl, CSA illumine, Embase, International Bibliography of the Social Sciences, Medline, PsycINFO, and Web of Science using combinations of synonyms for FV intake, children/adolescents and qualitative methods as search terms. The literature search was completed by December 1st 2010. Papers were included if they applied qualitative methods to investigate 6-18-year-olds' perceptions of factors influencing their FV consumption. Quantitative studies, review studies, studies reported in other languages than English, and non-peer reviewed or unpublished manuscripts were excluded. The papers were reviewed systematically using standardised templates for summary of papers, quality assessment, and synthesis of findings across papers. The review included 31 studies, mostly based on US populations and focus group discussions. The synthesis identified the following potential determinants for FV intake which supplement the quantitative knowledge base: Time costs; lack of taste guarantee; satiety value; appropriate time/occasions/settings for eating FV; sensory and physical aspects; variety, visibility, methods of preparation; access to unhealthy food; the symbolic value of food for image, gender identity and social interaction with peers; short term outcome expectancies. The review highlights numerous potential determinants which have not been investigated thoroughly in quantitative studies. Future large scale quantitative studies should attempt to quantify the importance of these factors. Further, mechanisms behind gender, age and socioeconomic differences in FV consumption are proposed which should be tested quantitatively in order to better tailor interventions to vulnerable groups. Finally, the review provides input to the conceptualisation and measurements of concepts (i.e. peer influence, availability in schools) which may refine survey instruments and theoretical frameworks concerning eating behaviours.

  18. Urban area thermal monitoring: Liepaja case study using satellite and aerial thermal data

    NASA Astrophysics Data System (ADS)

    Gulbe, Linda; Caune, Vairis; Korats, Gundars

    2017-12-01

    The aim of this study is to explore large-scale (60 m/pixel) and small-scale (individual building level) temperature distribution patterns from thermal remote sensing data and to conclude what kind of information could be extracted from thermal remote sensing on a regular basis. The Landsat program provides frequent large-scale thermal images useful for analyzing city temperature patterns. During the study, the correlation of temperature patterns with vegetation content (based on NDVI) and building coverage (based on OpenStreetMap data) was examined. Landsat-based temperature patterns were independent of season, negatively correlated with vegetation content, and positively correlated with building coverage. Small-scale analysis included spatial and raster descriptors for polygons corresponding to individual building roofs, with the aim of evaluating roof insulation. Remote sensing and spatial descriptors correlated poorly with heat consumption data; however, the median and entropy of aerial thermal data can help identify poorly insulated roofs. Automated quantitative roof analysis has high potential for acquiring city-wide information about roof insulation, but its quality is limited by the quality of reference data; information on building types and roof materials would be crucial for further studies.
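
    The per-roof statistics highlighted above (median and entropy of the thermal pixels under each roof polygon) are straightforward to compute; the sketch below uses made-up pixel values in place of the masked aerial raster:

      # Per-roof descriptors of the kind the study relates to insulation:
      # median temperature and entropy of the temperature histogram.
      # Pixel values are synthetic; real input would come from the aerial
      # thermal raster masked by each roof polygon.
      import numpy as np

      def roof_descriptors(pixels, bins=16):
          hist, _ = np.histogram(pixels, bins=bins)
          p = hist / hist.sum()
          p = p[p > 0]
          entropy = -np.sum(p * np.log2(p))
          return float(np.median(pixels)), float(entropy)

      warm_uniform = np.random.default_rng(0).normal(5.0, 0.3, 500)
      cool_mixed = np.random.default_rng(1).normal(1.0, 1.5, 500)

      for name, px in (("roof A", warm_uniform), ("roof B", cool_mixed)):
          med, ent = roof_descriptors(px)
          print(f"{name}: median={med:.2f} C  entropy={ent:.2f} bits")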

  19. Racking Response of Reinforced Concrete Cut and Cover Tunnel

    DOT National Transportation Integrated Search

    2016-01-01

    Currently, the knowledge base and quantitative data sets concerning cut and cover tunnel seismic response are scarce. In this report, a large-scale experimental program is conducted to assess: i) stiffness, capacity, and potential seismically-induced...

  20. Bioregional monitoring design and occupancy estimation for two Sierra Nevadan amphibian taxa

    EPA Science Inventory

    Land-management agencies need quantitative, statistically rigorous monitoring data, often at large spatial and temporal scales, to support resource-management decisions. Monitoring designs typically must accommodate multiple ecological, logistical, political, and economic objec...

  1. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195

  2. Topology of large-scale structure. IV - Topology in two dimensions

    NASA Technical Reports Server (NTRS)

    Melott, Adrian L.; Cohen, Alexander P.; Hamilton, Andrew J. S.; Gott, J. Richard, III; Weinberg, David H.

    1989-01-01

    In a recent series of papers, an algorithm was developed for quantitatively measuring the topology of the large-scale structure of the universe and this algorithm was applied to numerical models and to three-dimensional observational data sets. In this paper, it is shown that topological information can be derived from a two-dimensional cross section of a density field, and analytic expressions are given for a Gaussian random field. The application of a two-dimensional numerical algorithm for measuring topology to cross sections of three-dimensional models is demonstrated.

  3. Jovian meteorology: Large-scale moist convection without a lower boundary

    NASA Technical Reports Server (NTRS)

    Gierasch, P. J.

    1975-01-01

    It is proposed that Jupiter's cloud bands represent large-scale convection whose character is determined by the phase change of water at a level where the temperature is about 275 K. It is argued that there are three important layers in the atmosphere: a tropopause layer where emission to space occurs; an intermediate layer between the tropopause and the water cloud base; and the deep layer below the water cloud. All arguments are only semi-quantitative. It is pointed out that these ingredients are essential to Jovian meteorology.

  4. Measuring safety climate in health care.

    PubMed

    Flin, R; Burns, C; Mearns, K; Yule, S; Robertson, E M

    2006-04-01

    To review quantitative studies of safety climate in health care to examine the psychometric properties of the questionnaires designed to measure this construct. A systematic literature review was undertaken to study sample and questionnaire design characteristics (source, number of items, scale type), construct validity (content validity, factor structure and internal reliability, concurrent validity), within-group agreement, and level of analysis. Twelve studies were examined. There was a lack of explicit theoretical underpinning for most questionnaires, and some instruments did not report standard psychometric criteria. Where this information was available, several questionnaires appeared to have limitations. More consideration should be given to psychometric factors in the design of healthcare safety climate instruments, especially as these are beginning to be used in large-scale surveys across healthcare organisations.

  5. Multiscale/multiresolution landslides susceptibility mapping

    NASA Astrophysics Data System (ADS)

    Grozavu, Adrian; Cătălin Stanga, Iulian; Valeriu Patriche, Cristian; Toader Juravle, Doru

    2014-05-01

    Within the European strategies, landslides are considered an important threat that requires detailed studies to identify areas where these processes could occur in the future and to design scientific and technical plans for landslide risk mitigation. To this end, assessing and mapping landslide susceptibility is an important preliminary step. Generally, landslide susceptibility at small scale (for large regions) can be assessed through a qualitative approach (expert judgement) based on a few variables, while studies at medium and large scale require a quantitative approach (e.g. multivariate statistics), a larger set of variables and, necessarily, a landslide inventory. Obviously, the results vary more or less from one scale to another, depending on the available input data, but also on the applied methodology. Since it is almost impossible to have a complete landslide inventory over large regions (e.g. at the continental level), it is very important to verify the compatibility and validity of results obtained at different scales, identifying the differences and fixing the inherent errors. This paper aims at assessing and mapping landslide susceptibility at the regional level through a multiscale-multiresolution approach, from small scale and low resolution to large scale and high resolution of data and results, comparing the compatibility of the results. While the former could be used for studies at the European and national levels, the latter allow validation of results, including through field surveys. The test area, namely the Barlad Plateau (more than 9000 sq. km), is located in Eastern Romania, covering a region where both the natural environment and the human factor create a causal context that favors these processes. The landslide predictors were initially derived from various databases available at the pan-European level and progressively completed and/or enhanced together with the scale and resolution: topography (from SRTM at 90 meters to digital elevation models based on topographical maps, 1:25,000 and 1:5,000), lithology (from geological maps, 1:200,000), land cover and land use (from CLC 2006 to maps derived from orthorectified aerial images, 0.5 meter resolution), rainfall (from Worldclim and ECAD to our own data), and seismicity (the seismic zonation of Romania). The landslide inventory was created as polygonal data based on aerial images (0.5 meter resolution), the information being aggregated at county level (NUTS 3) and, eventually, at communal level (LAU2). The methodological framework is based on logistic regression as a quantitative method and the analytic hierarchy process as a semi-qualitative method, both applied once identically for all scales and once recalibrated for each scale and resolution (from 1:1,000,000 at one km pixel resolution to 1:25,000 at ten meter resolution). The predictive performance of the two models was assessed using the ROC (Receiver Operating Characteristic) curve and the AUC (Area Under Curve) parameter; the results indicate a good correspondence between the susceptibility estimated for the test samples (0.855-0.890) and for the validation samples (0.830-0.865). Finally, the results were compared in pairs in order to fix the errors at small scale and low resolution and to optimize the methodology for landslide susceptibility mapping over large areas.
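
    The statistical core of such a workflow can be sketched in a few lines: fit a logistic regression of a binary landslide inventory on per-pixel predictors, then validate with the ROC curve's AUC. The predictors, coefficients, and data below are synthetic stand-ins, not values from the study.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        # Synthetic per-pixel predictors: slope (deg), a soft-lithology flag,
        # and annual rainfall (mm); y = 1 where the inventory maps a landslide.
        n = 5000
        slope = rng.uniform(0, 35, n)
        soft_litho = rng.integers(0, 2, n)
        rain = rng.normal(600, 100, n)
        logit = -8.0 + 0.15 * slope + 1.2 * soft_litho + 0.004 * rain
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([slope, soft_litho, rain])
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)

        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        susceptibility = model.predict_proba(X_te)[:, 1]  # mapped per pixel
        print("validation AUC:", round(roc_auc_score(y_te, susceptibility), 3))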

  6. A Small-scale Physical Model of the Lower Mississippi River for Studying the Potential of Medium- and Large-scale Diversions

    NASA Astrophysics Data System (ADS)

    Willson, C. S.

    2011-12-01

    Over the past several thousand years the Mississippi River has formed one of the world's largest deltas and much of the Louisiana coast. However, in the last 100 years or so, anthropogenic controls have been placed on the system to maintain important navigation routes and for flood control, resulting in the loss of the natural channel shifting necessary for replenishment of the deltaic coast with fresh sediment and resources. In addition, the high relative sea level rise in the lowermost portion of the river is causing a change in the distributary flow patterns of the river and its deposition center. River and sediment diversions are being proposed as a way to re-create some of the historical distribution of river water and sediments into the delta region. In response to a need for improving the understanding of the potential of medium- and large-scale river and sediment diversions, the state of Louisiana funded the construction of a small-scale physical model (SSPM) of the lower ~76 river miles (RM). The SSPM is a 1:12,000 horizontal, 1:500 vertical, highly distorted, movable-bed physical model designed to provide qualitative and semi-quantitative results regarding bulk noncohesive sediment transport characteristics in the river and through medium- and large-scale diversion structures. The SSPM was designed based on Froude similarity for the hydraulics and Shields similarity for sand transport, and has a sediment time scale of 1 prototype year to 30 model minutes, allowing for decadal-length studies of the land-building potential of diversions. Annual flow and sediment hydrographs were developed from historical records, and a uniform relative sea level rise of 3 feet in 100 years is used to account for the combined effects of eustatic sea level rise and subsidence. Data collected during the experiments include river stages, dredging amounts, high-resolution video of transport patterns within the main channel, and photographs of the sand deposition patterns in the diversion receiving areas. First, the similarity analysis that went into the model design will be presented, along with a discussion of the resulting limitations. Next, calibration and validation results will be shown, demonstrating the ability of the SSPM to capture the general lower Mississippi River sediment transport trends and deposition patterns. Third, results from a series of diversion experiments will be presented to show semi-quantitatively the effectiveness of diversion locations, sizes, and operating strategies on the quantities of sand diverted from the main river and the changes in main channel dredging volumes. These results are then correlated with recent field and numerical studies of the study area. This talk will close with a brief discussion of a new and improved physical model that will cover a larger domain and be designed to provide more quantitative results.
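
    The hydraulic scale ratios implied by Froude similarity for this distorted geometry can be worked through directly from the stated 1:12,000 horizontal and 1:500 vertical scales; the sketch below does so. Note that the reported sediment time scale (1 year to 30 minutes) comes from a separate Shields-based similarity for sand transport, not from this calculation.

        import math

        Lr_h = 1 / 12000   # horizontal length ratio (model/prototype)
        Lr_v = 1 / 500     # vertical length ratio (depth)

        # Froude similarity: Fr = V / sqrt(g * depth), so the velocity
        # ratio follows the square root of the vertical (depth) scale.
        Vr = math.sqrt(Lr_v)

        # Hydraulic time ratio = horizontal distance ratio / velocity ratio.
        Tr = Lr_h / Vr
        hours_per_prototype_year = 365.25 * 24 * Tr
        print(f"velocity ratio:       1:{1 / Vr:.1f}")
        print(f"hydraulic time ratio: 1:{1 / Tr:.0f}")
        print(f"1 prototype year = {hours_per_prototype_year:.1f} model hours"
              " (hydraulics only)")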

  7. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
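
    The scoring logic behind such interaction maps can be illustrated with the common multiplicative null model, in which the interaction score is the deviation of the measured double-mutant fitness from the product of the single-mutant fitnesses. This is a toy sketch of that null model only, not the matrix-approximation procedure described in the paper.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical single-mutant fitness values for two gene sets.
        f_query = rng.uniform(0.6, 1.0, size=50)
        f_array = rng.uniform(0.6, 1.0, size=200)

        # Multiplicative null: expected double-mutant fitness is the product
        # of single-mutant fitnesses; deviations score genetic interactions.
        expected = np.outer(f_query, f_array)
        observed = expected + rng.normal(0, 0.03, expected.shape)  # stand-ins

        # epsilon > 0: positive (alleviating); epsilon < 0: negative (synthetic)
        epsilon = observed - expected
        print("strongest negative interaction:", epsilon.min().round(3))
        print("strongest positive interaction:", epsilon.max().round(3))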

  8. The detection of large deletions or duplications in genomic DNA.

    PubMed

    Armour, J A L; Barton, D E; Cockburn, D J; Taylor, G R

    2002-11-01

    While methods for the detection of point mutations and small insertions or deletions in genomic DNA are well established, the detection of larger (>100 bp) genomic duplications or deletions can be more difficult. Most mutation scanning methods use PCR as a first step, but the subsequent analyses are usually qualitative rather than quantitative. Gene dosage methods based on PCR need to be quantitative (i.e., they should report molar quantities of starting material) or semi-quantitative (i.e., they should report gene dosage relative to an internal standard). Without some sort of quantitation, heterozygous deletions and duplications may be overlooked and therefore under-ascertained. Gene dosage methods provide the additional benefit of reporting allele drop-out in the PCR. This could impact on SNP surveys, where large-scale genotyping may miss null alleles. Here we review recent developments in techniques for the detection of this type of mutation and compare their relative strengths and weaknesses. We emphasize that comprehensive mutation analysis should include scanning for large insertions, deletions, and duplications. Copyright 2002 Wiley-Liss, Inc.

  9. Application of stakeholder-based and modelling approaches for supporting robust adaptation decision making under future climatic uncertainty and changing urban-agricultural water demand

    NASA Astrophysics Data System (ADS)

    Bhave, Ajay; Dessai, Suraje; Conway, Declan; Stainforth, David

    2016-04-01

    Deep uncertainty in future climate change and socio-economic conditions necessitates the use of assess-risk-of-policy approaches over predict-then-act approaches for adaptation decision making. Robust Decision Making (RDM) approaches embody this principle and help evaluate the ability of adaptation options to satisfy stakeholder preferences under wide-ranging future conditions. This study involves the simultaneous application of two RDM approaches, qualitative and quantitative, in the Cauvery River Basin in Karnataka (population ~23 million), India. The study aims to (a) determine robust water resources adaptation options for the 2030s and 2050s and (b) compare the usefulness of a qualitative stakeholder-driven approach with a quantitative modelling approach. To develop a large set of future scenarios, a combination of climatic and socio-economic narratives was used. Climatic narratives were developed through structured expert elicitation with a group of experts on the Indian Summer Monsoon. Socio-economic narratives were developed to reflect potential future urban and agricultural water demand. In the qualitative RDM approach, a stakeholder workshop helped elicit key vulnerabilities, water resources adaptation options, and performance criteria for evaluating options. During a second workshop, stakeholders discussed and evaluated adaptation options against the performance criteria for a large number of scenarios of climatic and socio-economic change in the basin. In the quantitative RDM approach, a Water Evaluation And Planning (WEAP) model was forced by precipitation and evapotranspiration data coherent with the climatic narratives, together with water demand data based on the socio-economic narratives. We find that, compared to business-as-usual conditions, options addressing urban water demand satisfy performance criteria across scenarios and provide co-benefits such as energy savings and reduced groundwater depletion, while options reducing agricultural water demand significantly affect downstream water availability. Water demand options demonstrate potential to improve environmental flow conditions and satisfy legal water supply requirements for downstream riparian states. On the other hand, currently planned large-scale infrastructural projects demonstrate reduced value in certain scenarios, illustrating the lock-in effects of large-scale infrastructure. From a methodological perspective, we find that while the stakeholder-driven approach revealed robust options in a resource-light manner and helped initiate much-needed interaction amongst stakeholders, the modelling approach provides complementary quantitative information. The study reveals robust adaptation options for this important basin and provides a strong methodological basis for future studies that support adaptation decision making.

  10. MEAN-FIELD MODELING OF AN α² DYNAMO COUPLED WITH DIRECT NUMERICAL SIMULATIONS OF RIGIDLY ROTATING CONVECTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masada, Youhei; Sano, Takayoshi, E-mail: ymasada@harbor.kobe-u.ac.jp, E-mail: sano@ile.osaka-u.ac.jp

    2014-10-10

    The mechanism of large-scale dynamos in rigidly rotating stratified convection is explored by direct numerical simulations (DNS) in Cartesian geometry. A mean-field dynamo model is also constructed using turbulent velocity profiles consistently extracted from the corresponding DNS results. By quantitative comparison between the DNS and our mean-field model, it is demonstrated that the oscillatory α² dynamo wave, excited and sustained in the convection zone, is responsible for large-scale magnetic activities such as cyclic polarity reversal and spatiotemporal migration. The results provide strong evidence that a nonuniformity of the α-effect, which is a natural outcome of rotating stratified convection, can be an important prerequisite for large-scale stellar dynamos, even without the Ω-effect.

  11. Large-scale multiplex absolute protein quantification of drug-metabolizing enzymes and transporters in human intestine, liver, and kidney microsomes by SWATH-MS: Comparison with MRM/SRM and HR-MRM/PRM.

    PubMed

    Nakamura, Kenji; Hirayama-Kurogi, Mio; Ito, Shingo; Kuno, Takuya; Yoneyama, Toshihiro; Obuchi, Wataru; Terasaki, Tetsuya; Ohtsuki, Sumio

    2016-08-01

    The purpose of the present study was to examine simultaneously the absolute protein amounts of 152 membrane and membrane-associated proteins, including 30 metabolizing enzymes and 107 transporters, in pooled microsomal fractions of human liver, kidney, and intestine by means of SWATH-MS with stable isotope-labeled internal standard peptides, and to compare the results with those obtained by MRM/SRM and high resolution (HR)-MRM/PRM. The protein expression levels of 27 metabolizing enzymes, 54 transporters, and six other membrane proteins were quantitated by SWATH-MS; other targets were below the lower limits of quantitation. Most of the values determined by SWATH-MS differed by less than 50% from those obtained by MRM/SRM or HR-MRM/PRM. Various metabolizing enzymes were expressed in liver microsomes more abundantly than in other microsomes. Ten, 13, and eight transporters listed as important for drugs by International Transporter Consortium were quantified in liver, kidney, and intestinal microsomes, respectively. Our results indicate that SWATH-MS enables large-scale multiplex absolute protein quantification while retaining similar quantitative capability to MRM/SRM or HR-MRM/PRM. SWATH-MS is expected to be useful methodology in the context of drug development for elucidating the molecular mechanisms of drug absorption, metabolism, and excretion in the human body based on protein profile information. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caillouet, Laurie; Vidal, Jean -Philippe; Sauquet, Eric

    This work proposes a daily high-resolution probabilistic reconstruction of precipitation and temperature fields in France over the 1871–2012 period built on the NOAA Twentieth Century global extended atmospheric reanalysis (20CR). The objective is to fill in the spatial and temporal data gaps in surface observations in order to improve our knowledge of local-scale climate variability from the late nineteenth century onwards. The SANDHY (Stepwise ANalogue Downscaling method for HYdrology) statistical downscaling method, initially developed for quantitative precipitation forecasting, is used here to bridge the scale gap between large-scale 20CR predictors and local-scale predictands from the Safran high-resolution near-surface reanalysis, available from 1958 onwards only. SANDHY provides a daily ensemble of 125 analogue dates over the 1871–2012 period for 608 climatically homogeneous zones paving France. Large precipitation biases in intermediary seasons are shown to occur in regions with high seasonal asymmetry like the Mediterranean. Moreover, winter and summer temperatures are respectively over- and under-estimated over the whole of France. Two analogue subselection methods are therefore developed with the aim of keeping the structure of the SANDHY method unchanged while reducing those seasonal biases. The calendar selection keeps the analogues closest to the target calendar day. The stepwise selection applies two new analogy steps based on similarity of the sea surface temperature (SST) and the large-scale 2 m temperature (T). Comparisons to the Safran reanalysis over 1959–2007 and to homogenized series over the whole twentieth century show that biases in the interannual cycle of precipitation and temperature are reduced with both methods. The stepwise subselection moreover leads to a large improvement of interannual correlation and a reduction of errors in seasonal temperature time series. While the calendar subselection is an easily applicable method suitable in a quantitative precipitation forecast context, the stepwise subselection allows for potential season shifts and SST trends and is therefore better suited for climate reconstructions and climate change studies. The probabilistic downscaling of 20CR over 1871–2012 with SANDHY combined with the stepwise subselection thus constitutes a sound framework for assessing recent observed meteorological events, as well as future events projected by climate change impact studies, and for putting them in a historical perspective.

  14. Fiber networks amplify active stress

    PubMed Central

    Ronceray, Pierre; Broedersz, Chase P.

    2016-01-01

    Large-scale force generation is essential for biological functions such as cell motility, embryonic development, and muscle contraction. In these processes, forces generated at the molecular level by motor proteins are transmitted by disordered fiber networks, resulting in large-scale active stresses. Although these fiber networks are well characterized macroscopically, this stress generation by microscopic active units is not well understood. Here we theoretically study force transmission in these networks. We find that collective fiber buckling in the vicinity of a local active unit results in a rectification of stress towards strongly amplified isotropic contraction. This stress amplification is reinforced by the networks’ disordered nature, but saturates for high densities of active units. Our predictions are quantitatively consistent with experiments on reconstituted tissues and actomyosin networks and shed light on the role of the network microstructure in shaping active stresses in cells and tissue. PMID:26921325

  15. Inquiring Minds Want to Know: Progress Report on SCALE-UP Physics at Penn State Erie

    NASA Astrophysics Data System (ADS)

    Hall, Jonathan

    2008-03-01

    SCALE-UP (Student Centered Activities for Large Enrollment University Programs) is a "studio" approach to learning developed by Bob Beichner at North Carolina State University. SCALE-UP was adapted for teaching and learning in the introductory calculus-based mechanics course at Penn State Erie, The Behrend College, starting in Spring 2007. We are presently doing quantitative and qualitative research on using inquiry-based learning with first-year college students, in particular how it affects female students and students from groups that are traditionally under-represented in STEM fields. Using field notes from classroom observations, focus groups, and the collection of quantitative data, the feedback generated by the research is also being used to improve the delivery of the course and in planning the adoption of SCALE-UP for the second-semester course on electromagnetism in Fall 2008.

  16. The “unreasonable effectiveness” of stratigraphic and geomorphic experiments

    NASA Astrophysics Data System (ADS)

    Paola, Chris; Straub, Kyle; Mohrig, David; Reinhardt, Liam

    2009-12-01

    The growth of quantitative analysis and prediction in Earth-surface science has been accompanied by growth in experimental stratigraphy and geomorphology. Experimenters have grown increasingly bold in targeting landscape elements from channel reaches up to entire erosional networks and depositional basins, often using very small facilities. The experiments produce spatial structure and kinematics that, although imperfect, compare well with natural systems despite differences of spatial scale, time scale, material properties, and number of active processes. Experiments have been particularly useful in studying a wide range of forms of self-organized (autogenic) complexity that occur in morphodynamic systems. Autogenic dynamics creates much of the spatial structure we see in the landscape and in preserved strata, and is strongly associated with sediment storage and release. The observed consistency between experimental and field systems despite large differences in governing dimensionless numbers is what we mean by "unreasonable effectiveness". We suggest that unreasonable experimental effectiveness arises from natural scale independence. We generalize existing ideas to relate internal similarity, in which a small part of a system is similar to the larger system, to external similarity, in which a small copy of a system is similar to the larger system. We propose that internal similarity implies external similarity, though not the converse. The external similarity of landscape experiments to natural landscapes suggests that natural scale independence may be even more characteristic of morphodynamics than it is of better-studied cases such as turbulence. We urge a shift in emphasis in experimental stratigraphy and geomorphology away from classical dynamical scaling and towards a quantitative understanding of the origins and limits of scale independence. Other research areas with strong growth potential in experimental surface dynamics include physical-biotic interactions, cohesive effects, stochastic processes, the interplay of structural and geomorphic self-organization, extraction of quantitative process information from landscape and stratigraphic records, and closer interaction between experimentation and theory.

  17. SLIDE - a web-based tool for interactive visualization of large-scale -omics data.

    PubMed

    Ghosh, Soumita; Datta, Abhik; Tan, Kaisen; Choi, Hyungwon

    2018-06-28

    Data visualization is often regarded as a post hoc step for verifying statistically significant results in the analysis of high-throughput data sets. This common practice leaves a large amount of raw data behind, from which more information can be extracted. However, existing solutions do not provide capabilities to explore large-scale raw datasets using biologically sensible queries, nor do they allow user-interaction-based real-time customization of graphics. To address these drawbacks, we have designed an open-source, web-based tool called Systems-Level Interactive Data Exploration, or SLIDE, to visualize large-scale -omics data interactively. SLIDE's interface makes it easier for scientists to explore quantitative expression data at multiple resolutions on a single screen. SLIDE is publicly available under the BSD license, both as an online version and as a stand-alone version, at https://github.com/soumitag/SLIDE. Supplementary information is available at Bioinformatics online.

  18. Chromatin as active matter

    NASA Astrophysics Data System (ADS)

    Agrawal, Ankit; Ganai, Nirmalendu; Sengupta, Surajit; Menon, Gautam I.

    2017-01-01

    Active matter models describe a number of biophysical phenomena at the cell and tissue scale. Such models explore the macroscopic consequences of driving specific soft condensed matter systems of biological relevance out of equilibrium through 'active' processes. Here, we describe how active matter models can be used to study the large-scale properties of chromosomes contained within the nuclei of human cells in interphase. We show that polymer models for chromosomes that incorporate inhomogeneous activity reproduce many general, yet little understood, features of large-scale nuclear architecture. These include: (i) the spatial separation of gene-rich, low-density euchromatin, predominantly found towards the centre of the nucleus, vis-à-vis gene-poor, denser heterochromatin, typically enriched in proximity to the nuclear periphery; (ii) the differential positioning of individual gene-rich and gene-poor chromosomes; (iii) the formation of chromosome territories; as well as (iv) the weak size-dependence of the positions of individual chromosome centres-of-mass relative to the nuclear centre that is seen in some cell types. Such structuring is induced purely by the combination of activity and confinement and is absent in thermal equilibrium. We systematically explore active matter models for chromosomes, discussing how our model can be generalized to study variations in chromosome positioning across different cell types. The approach and model we outline here represent a preliminary attempt towards a quantitative, first-principles description of the large-scale architecture of the cell nucleus.

  19. Large-scale precipitation estimation using Kalpana-1 IR measurements and its validation using GPCP and GPCC data

    NASA Astrophysics Data System (ADS)

    Prakash, Satya; Mahesh, C.; Gairola, Rakesh M.

    2011-12-01

    Large-scale precipitation estimation is very important for climate science because precipitation is a major component of the earth's water and energy cycles. In the present study, the GOES precipitation index technique has been applied to three-hourly Kalpana-1 satellite infrared (IR) images (0000, 0300, 0600, …, 2100 UTC) for rainfall estimation, in preparation for INSAT-3D. After the temperatures of all the pixels in a grid are known, they are distributed to generate a three-hourly 24-class histogram of brightness temperatures from the IR (10.5-12.5 μm) images for each 1.0° × 1.0° latitude/longitude box. Daily, monthly, and seasonal rainfall totals were then estimated from these three-hourly estimates for the entire south-west monsoon period of 2009. To investigate the potential of these rainfall estimates, the monthly and seasonal estimates were validated against the Global Precipitation Climatology Project and Global Precipitation Climatology Centre data. The validation results show that the present technique works very well for large-scale precipitation estimation, both qualitatively and quantitatively. The results also suggest that this simple IR-based technique can be used to estimate rainfall over tropical areas at larger temporal scales for climatological applications.
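
    The GOES precipitation index logic is simple enough to sketch: rainfall over a grid box is the fraction of IR pixels colder than a threshold (about 235 K in the published GPI) multiplied by a fixed rain rate (about 3 mm/h) and the accumulation period. The brightness temperatures below are synthetic.

        import numpy as np

        def gpi_rainfall(tb_kelvin, hours=3.0, t_thresh=235.0, rate_mm_h=3.0):
            """GOES Precipitation Index for one 1.0 x 1.0 degree box.

            tb_kelvin: IR (10.5-12.5 um) brightness temperatures of all
            pixels in the box. Rainfall = cold-cloud fraction * rate * hours.
            """
            frac_cold = np.mean(tb_kelvin < t_thresh)
            return frac_cold * rate_mm_h * hours

        # One 3-hourly Kalpana-1 scene for a single grid box (synthetic).
        tb = np.random.default_rng(2).normal(260, 20, size=400)
        print(f"3-hourly estimate: {gpi_rainfall(tb):.2f} mm")
        # Daily and monthly totals follow by summing the eight 3-hourly
        # estimates per day over the period of interest.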

  20. Phenazines affect biofilm formation by Pseudomonas aeruginosa in similar ways at various scales

    PubMed Central

    Ramos, Itzel; Dietrich, Lars E. P.; Price-Whelan, Alexa; Newman, Dianne K.

    2010-01-01

    Pseudomonads produce phenazines, a group of small, redox-active compounds with diverse physiological functions. In this study, we quantitatively compared the phenotypes of Pseudomonas aeruginosa strain PA14 and a mutant unable to synthesize phenazines in flow-cell and colony biofilms. Although phenazine production does not impact the ability of PA14 to attach to surfaces, as has been shown for Pseudomonas chlororaphis (Maddula, 2006; Maddula, 2008), it influences swarming motility and the surface-to-volume ratio of mature biofilms. These results indicate that phenazines affect biofilm development across a large range of scales, but in unique ways for different Pseudomonas species. PMID:20123017

  1. Selfing results in inbreeding depression of growth but not of gas exchange of surviving adult black spruce trees

    Treesearch

    Kurt Johnsen; John E. Major; Chris A. Maier

    2003-01-01

    Summary In most tree species, inbreeding greatly reduces seed production, seed viability, survival and growth. In a previous large-scale quantitative analysis of a black spruce (Picea mariana (Mill.) B.S.P.) diallel experiment, selfing had large deleterious effects on growth but no impact on stable carbon isotope discrimination (an...

  2. Quantitative stem cell biology: the threat and the glory.

    PubMed

    Pollard, Steven M

    2016-11-15

    Major technological innovations over the past decade have transformed our ability to extract quantitative data from biological systems at an unprecedented scale and resolution. These quantitative methods and associated large datasets should lead to an exciting new phase of discovery across many areas of biology. However, there is a clear threat: will we drown in these rivers of data? On 18th July 2016, stem cell biologists gathered in Cambridge for the 5th annual Cambridge Stem Cell Symposium to discuss 'Quantitative stem cell biology: from molecules to models'. This Meeting Review provides a summary of the data presented by each speaker, with a focus on quantitative techniques and the new biological insights that are emerging. © 2016. Published by The Company of Biologists Ltd.

  3. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.

  4. How Many Are Enough? A Quantitative Analysis of the Effects of the Number of Response Options on the Academic Performance of Students with Disabilities on Large-Scale Assessments

    ERIC Educational Resources Information Center

    Freeman, Sarah Reives

    2013-01-01

    The main focus of this study is to determine the effect of test design on the academic performance of students with disabilities participating in the NCEXTEND2 modified assessment program during the 2010-2011 school year. Participation of all students in state and federal accountability measure is required by No Child Left Behind (2001) and the…

  5. A Large-Scale Quantitative Proteomic Approach to Identifying Sulfur Mustard-Induced Protein Phosphorylation Cascades

    DTIC Science & Technology

    2010-01-01

    snapshot of SM-induced toxicity. Over the past few years, innovations in systems biology and biotechnology have led to important advances in our under...perturbations. SILAC has been used to study tumor metastasis (3, 4), focal adhesion-associated proteins, growth factor signaling, and insulin regulation (5...stained with colloidal Coomassie blue. After it was destained, the gel lane was excised into six regions, and each region was cut into 1 mm cubes

  6. In the eye of the beholder: the effect of rater variability and different rating scales on QTL mapping.

    PubMed

    Poland, Jesse A; Nelson, Rebecca J

    2011-02-01

    The agronomic importance of developing durably resistant cultivars has led to substantial research in the field of quantitative disease resistance (QDR) and, in particular, mapping quantitative trait loci (QTL) for disease resistance. The assessment of QDR is typically conducted by visual estimation of disease severity, which raises concern over the accuracy and precision of visual estimates. Although previous studies have examined the factors affecting the accuracy and precision of visual disease assessment in relation to the true value of disease severity, the impact of this variability on the identification of disease resistance QTL has not been assessed. In this study, the effects of rater variability and rating scales on mapping QTL for northern leaf blight resistance in maize were evaluated in a recombinant inbred line population grown under field conditions. The population of 191 lines was evaluated by 22 different raters using a direct percentage estimate, a 0-to-9 ordinal rating scale, or both. It was found that more experienced raters had higher precision and that using a direct percentage estimation of diseased leaf area produced higher precision than using an ordinal scale. QTL mapping was then conducted using the disease estimates from each rater using stepwise general linear model selection (GLM) and inclusive composite interval mapping (ICIM). For GLM, the same QTL were largely found across raters, though some QTL were only identified by a subset of raters. The magnitudes of estimated allele effects at identified QTL varied drastically, sometimes by as much as threefold. ICIM produced highly consistent results across raters and for the different rating scales in identifying the location of QTL. We conclude that, despite variability between raters, the identification of QTL was largely consistent among raters, particularly when using ICIM. However, care should be taken in estimating QTL allele effects, because this was highly variable and rater dependent.

  7. Bridging the gap between small and large scale sediment budgets? - A scaling challenge in the Upper Rhone Basin, Switzerland

    NASA Astrophysics Data System (ADS)

    Schoch, Anna; Blöthe, Jan; Hoffmann, Thomas; Schrott, Lothar

    2016-04-01

    A large number of sediment budgets have been compiled on different temporal and spatial scales in alpine regions. Detailed sediment budgets based on the quantification of a number of sediment storages (e.g. talus cones, moraine deposits) exist only for a few small-scale drainage basins (up to 10² km²). In contrast, large-scale sediment budgets (> 10³ km²) consider only long-term sediment sinks such as valley fills and lakes. Until now, these studies have often neglected small-scale sediment storages in the headwaters, even though the significance of these storages has been reported. A quantitative verification of whether headwaters function as sediment source regions is lacking. Despite substantial transport energy in mountain environments due to steep gradients and high relief, sediment flux in large river systems is frequently disconnected from alpine headwaters. This leads to significant storage of coarse-grained sediment along the flow path from rockwall source regions to large sedimentary sinks in major alpine valleys. To improve knowledge of sediment budgets in large-scale alpine catchments and to bridge the gap between small- and large-scale sediment budgets, we apply a multi-method approach comprising investigations on different spatial scales in the Upper Rhone Basin (URB). The URB is the largest inner-alpine basin in the European Alps, with a size of > 5400 km². It is a closed system, with Lake Geneva acting as the ultimate sink for suspended and clastic sediment. We examine the spatial pattern and volumes of sediment storages as well as the morphometry at the local and catchment-wide scales. We mapped sediment storages and bedrock in five sub-regions of the study area (Goms, Lötschen valley, Val d'Illiez, Vallée de la Liène, Turtmann valley) in the field and from high-resolution remote sensing imagery to investigate the spatial distribution of different sediment storage types (e.g. talus deposits, debris flow cones, alluvial fans). These sub-regions cover all three litho-tectonic units of the URB (Helvetic nappes, Penninic nappes, External massifs) and different catchment sizes to capture the inherent variability. Different parameters characterizing topography, surface characteristics, and vegetation cover are analyzed for each storage type. The data are then used in geostatistical models (PCA, stepwise logistic regression) to predict the spatial distribution of sediment storage for the whole URB. We further conduct morphometric analyses of the URB to gain information on the varying degree of glacial imprint and postglacial landscape evolution and their control on the spatial distribution of sediment storage in a large-scale drainage basin. Geophysical methods (ground penetrating radar and electrical resistivity tomography) are applied to different sediment storage types on the local scale to estimate mean thicknesses. Additional data from published studies are used to complement our dataset. We integrate the local data into the statistical model of the spatial distribution of sediment storages for the whole URB. Hence, we can extrapolate the stored sediment volumes to the regional scale in order to bridge the gap between small- and large-scale studies.

  8. Stable isotope dimethyl labelling for quantitative proteomics and beyond

    PubMed Central

    Hsu, Jue-Liang; Chen, Shu-Hui

    2016-01-01

    Stable-isotope reductive dimethylation, a cost-effective, simple, robust, reliable and easy-to- multiplex labelling method, is widely applied to quantitative proteomics using liquid chromatography-mass spectrometry. This review focuses on biological applications of stable-isotope dimethyl labelling for a large-scale comparative analysis of protein expression and post-translational modifications based on its unique properties of the labelling chemistry. Some other applications of the labelling method for sample preparation and mass spectrometry-based protein identification and characterization are also summarized. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644970

  9. Developing a Multiplexed Quantitative Cross-Linking Mass Spectrometry Platform for Comparative Structural Analysis of Protein Complexes.

    PubMed

    Yu, Clinton; Huszagh, Alexander; Viner, Rosa; Novitsky, Eric J; Rychnovsky, Scott D; Huang, Lan

    2016-10-18

    Cross-linking mass spectrometry (XL-MS) represents a recently popularized hybrid methodology for defining protein-protein interactions (PPIs) and analyzing structures of large protein assemblies. In particular, XL-MS strategies have been demonstrated to be effective in elucidating molecular details of PPIs at the peptide resolution, providing a complementary set of structural data that can be utilized to refine existing complex structures or direct de novo modeling of unknown protein structures. To study structural and interaction dynamics of protein complexes, quantitative cross-linking mass spectrometry (QXL-MS) strategies based on isotope-labeled cross-linkers have been developed. Although successful, these approaches are mostly limited to pairwise comparisons. In order to establish a robust workflow enabling comparative analysis of multiple cross-linked samples simultaneously, we have developed a multiplexed QXL-MS strategy, namely, QMIX (Quantitation of Multiplexed, Isobaric-labeled cross (X)-linked peptides) by integrating MS-cleavable cross-linkers with isobaric labeling reagents. This study has established a new analytical platform for quantitative analysis of cross-linked peptides, which can be directly applied for multiplexed comparisons of the conformational dynamics of protein complexes and PPIs at the proteome scale in future studies.

  10. Forest Connectivity Regions of Canada Using Circuit Theory and Image Analysis

    PubMed Central

    Pelletier, David; Lapointe, Marc-Élie; Wulder, Michael A.; White, Joanne C.; Cardille, Jeffrey A.

    2017-01-01

    Ecological processes are increasingly well understood over smaller areas, yet information regarding interconnections and the hierarchical nature of ecosystems remains less studied and understood. Information on connectivity over large areas with high-resolution source information provides both local detail and regional context. The emerging capacity to apply circuit theory to create maps of omnidirectional connectivity provides an opportunity for improved and quantitative depictions of forest connectivity, supporting the formation and testing of hypotheses about the density of animal movement, ecosystem structure, and related links to natural and anthropogenic forces. In this research, our goal was to delineate regions where connectivity regimes are similar across the boreal region of Canada, using new quantitative analyses for characterizing connectivity over large areas (e.g., millions of hectares). Utilizing the Earth Observation for Sustainable Development of forests (EOSD) circa 2000 Landsat-derived land-cover map, we created and analyzed a national-scale map of omnidirectional forest connectivity at 25 m resolution over 10,000 tiles of 625 km² each, spanning the forested regions of Canada. Using image recognition software to detect corridors, pinch points, and barriers to movement at multiple spatial scales in each tile, we developed a simple measure of the structural complexity of connectivity patterns in omnidirectional connectivity maps. We then mapped the Circuitscape resistance distance measure and used it in conjunction with the complexity data to study connectivity characteristics in each forested ecozone. Ecozone boundaries masked substantial systematic patterns in connectivity characteristics that are uncovered using a new classification of connectivity patterns, which revealed six clear groups of forest connectivity patterns found in Canada. The resulting maps allow exploration of omnidirectional forest connectivity patterns at full resolution while permitting quantitative analyses of connectivity over broad areas, informing modeling, planning, and monitoring efforts. PMID:28146573
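
    The resistance distance at the heart of circuit-theoretic connectivity can be computed from the pseudoinverse of the graph Laplacian. The sketch below does this for a toy grid graph standing in for a resistance surface; it illustrates the quantity only and is not the Circuitscape implementation.

        import numpy as np
        import networkx as nx

        # Toy 4-neighbour grid graph; edge conductances would normally be
        # derived from land-cover classes (uniform here).
        G = nx.grid_2d_graph(5, 5)
        for u, v in G.edges:
            G[u][v]["weight"] = 1.0

        # Resistance distance from the Laplacian pseudoinverse L+:
        # R(i, j) = L+[i, i] + L+[j, j] - 2 * L+[i, j]
        nodes = list(G.nodes)
        L = nx.laplacian_matrix(G, nodelist=nodes, weight="weight").toarray()
        Lp = np.linalg.pinv(L)
        i, j = nodes.index((0, 0)), nodes.index((4, 4))
        R = Lp[i, i] + Lp[j, j] - 2 * Lp[i, j]
        print("resistance distance between opposite corners:", round(R, 3))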

  11. A stochastic two-scale model for pressure-driven flow between rough surfaces

    PubMed Central

    Larsson, Roland; Lundström, Staffan; Wall, Peter; Almqvist, Andreas

    2016-01-01

    Seal surface topography typically consists of global-scale geometric features as well as local-scale roughness details, and homogenization-based approaches are, therefore, readily applied. These provide for resolving the global scale (large domain) with a relatively coarse mesh, while resolving the local scale (small domain) in high detail. As the total flow decreases, however, the flow pattern becomes tortuous, and this requires a larger local-scale domain to obtain a converged solution. A classical homogenization-based approach might therefore not be feasible for simulating very small flows. In order to study small flows, a model that allows feasibly sized local domains even at very small flow rates is developed. Realization was made possible by coupling the two scales with a stochastic element. Results from numerical experiments show that the present model is in better agreement with the direct deterministic one than the conventional homogenization type of model, both quantitatively in terms of flow rate and qualitatively in reflecting the flow pattern. PMID:27436975

  12. A robust quantitative near infrared modeling approach for blend monitoring.

    PubMed

    Mohan, Shikhar; Momose, Wataru; Katz, Jeffrey M; Hossain, Md Nayeem; Velez, Natasha; Drennen, James K; Anderson, Carl A

    2018-01-30

    This study demonstrates a material-sparing near-infrared (NIR) modeling approach for powder blend monitoring. In this new approach, gram-scale powder mixtures are subjected to compression loads to simulate the effect of scale using an Instron universal testing system. Models prepared by the new method development approach (small-scale method) and by a traditional method development approach (blender-scale method) were compared by simultaneously monitoring a 1 kg batch-size blend run. Both models demonstrated similar performance. The small-scale strategy significantly reduces the total resources expended to develop NIR calibration models for on-line blend monitoring. Further, this development approach does not require the actual equipment (i.e., blender) to which the method will be applied, only a similar optical interface. Thus, a robust on-line blend monitoring method can be fully developed before any large-scale blending experiment is viable, allowing the blend method to be used during scale-up and blend development trials. Copyright © 2017. Published by Elsevier B.V.
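
    NIR blend-monitoring calibrations of this kind are typically multivariate models; partial least squares (PLS) regression is assumed here as a common choice, not a detail stated in the abstract. The sketch below, with synthetic spectra, shows the calibration and cross-validation step that both the small-scale and blender-scale methods would share.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(3)

        # Synthetic calibration set: 60 compacted gram-scale mixtures,
        # 200-channel NIR spectra, API concentration (% w/w) as reference.
        conc = rng.uniform(5, 15, 60)
        basis = rng.normal(size=200)
        spectra = np.outer(conc, basis) + rng.normal(0, 0.5, (60, 200))

        pls = PLSRegression(n_components=3)
        pred = cross_val_predict(pls, spectra, conc, cv=5).ravel()
        rmsecv = np.sqrt(np.mean((pred - conc) ** 2))
        print(f"RMSECV: {rmsecv:.2f} % w/w")
        # The fitted model would then be applied to on-line spectra from the
        # 1 kg blend run to track blend uniformity in real time.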

  13. Reverse Fluorescence Enhancement and Colorimetric Bimodal Signal Readout Immunochromatography Test Strip for Ultrasensitive Large-Scale Screening and Postoperative Monitoring.

    PubMed

    Yao, Yingyi; Guo, Weisheng; Zhang, Jian; Wu, Yudong; Fu, Weihua; Liu, Tingting; Wu, Xiaoli; Wang, Hanjie; Gong, Xiaoqun; Liang, Xing-Jie; Chang, Jin

    2016-09-07

    Ultrasensitive and quantitative fast screening of cancer biomarkers by immunochromatography test strip (ICTS) remains challenging in the clinic. Gold nanoparticle (NP)-based ICTS with colorimetric readout enables quick spectrum screening but suffers from nonquantitative performance; although ICTS with fluorescence readout (FICTS) allows quantitative detection, its sensitivity still deserves more effort and attention. In this work, by taking advantage of colorimetric ICTS and FICTS, we describe a reverse fluorescence enhancement ICTS (rFICTS) with bimodal signal readout for ultrasensitive and quantitative fast screening of carcinoembryonic antigen (CEA). In the presence of the target, gold NP aggregation in the T line induces a colorimetric readout, allowing on-the-spot screening within 10 min by the naked eye. Meanwhile, the reverse fluorescence enhancement signal enables more accurate quantitative detection with better sensitivity (5.89 pg/mL for CEA), which is more than 2 orders of magnitude lower than that of conventional FICTS. The accuracy and stability of the rFICTS were investigated with more than 100 clinical serum samples for large-scale screening. Furthermore, this rFICTS also realized postoperative monitoring by detecting CEA in a patient with colon cancer and comparing the results with CT imaging diagnosis. These results indicate that this rFICTS is particularly suitable for point-of-care (POC) diagnostics in both resource-rich and resource-limited settings.

  14. An invariability-area relationship sheds new light on the spatial scaling of ecological stability.

    PubMed

    Wang, Shaopeng; Loreau, Michel; Arnoldi, Jean-Francois; Fang, Jingyun; Rahman, K Abd; Tao, Shengli; de Mazancourt, Claire

    2017-05-19

    The spatial scaling of stability is key to understanding ecological sustainability across scales and the sensitivity of ecosystems to habitat destruction. Here we propose the invariability-area relationship (IAR) as a novel approach to investigate the spatial scaling of stability. The shape and slope of IAR are largely determined by patterns of spatial synchrony across scales. When synchrony decays exponentially with distance, IARs exhibit three phases, characterized by steeper increases in invariability at both small and large scales. Such triphasic IARs are observed for primary productivity from plot to continental scales. When synchrony decays as a power law with distance, IARs are quasilinear on a log-log scale. Such quasilinear IARs are observed for North American bird biomass at both species and community levels. The IAR provides a quantitative tool to predict the effects of habitat loss on population and ecosystem stability and to detect regime shifts in spatial ecological systems, which are goals of relevance to conservation and policy.
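
    A minimal simulation makes the IAR concrete: define invariability as the squared mean of aggregated biomass divided by its variance, let pairwise synchrony decay exponentially with distance, and track invariability as the sampled area grows. All parameters below are invented.

        import numpy as np

        rng = np.random.default_rng(4)

        # 1-D chain of patches; synchrony decays exponentially with distance.
        n_patches, n_years, decay = 256, 400, 10.0
        idx = np.arange(n_patches)
        cov = np.exp(-np.abs(np.subtract.outer(idx, idx)) / decay)
        series = rng.multivariate_normal(np.full(n_patches, 100.0), cov,
                                         size=n_years)

        # Invariability of aggregated biomass: I = mean^2 / variance.
        for area in (1, 4, 16, 64, 256):
            total = series[:, :area].sum(axis=1)
            inv = total.mean() ** 2 / total.var()
            print(f"area={area:4d}  invariability={inv:12.1f}")
        # Plotting log(invariability) against log(area) traces out the IAR;
        # its slope changes reflect the decay of synchrony with distance.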

  15. [Advances in mass spectrometry-based approaches for neuropeptide analysis].

    PubMed

    Ji, Qianyue; Ma, Min; Peng, Xin; Jia, Chenxi

    2017-07-25

    Neuropeptides are an important class of endogenous bioactive substances involved in the function of the nervous system, connecting the brain with other neural and peripheral organs. Mass spectrometry-based neuropeptidomics is designed to study neuropeptides in a large-scale manner and to obtain important molecular information for further understanding the mechanisms of nervous system regulation and the pathogenesis of neurological diseases. This review summarizes the basic strategies for the study of neuropeptides using mass spectrometry, including sample preparation and processing, qualitative and quantitative methods, and mass spectrometry imaging.

  16. Light sheet theta microscopy for rapid high-resolution imaging of large biological samples.

    PubMed

    Migliori, Bianca; Datta, Malika S; Dupre, Christophe; Apak, Mehmet C; Asano, Shoh; Gao, Ruixuan; Boyden, Edward S; Hermanson, Ola; Yuste, Rafael; Tomer, Raju

    2018-05-29

    Advances in tissue clearing and molecular labeling methods are enabling unprecedented optical access to large intact biological systems. These developments fuel the need for high-speed microscopy approaches to image large samples quantitatively and at high resolution. While light sheet microscopy (LSM), with its high planar imaging speed and low photo-bleaching, can be effective, scaling up to larger imaging volumes has been hindered by the use of orthogonal light sheet illumination. To address this fundamental limitation, we have developed light sheet theta microscopy (LSTM), which uniformly illuminates samples from the same side as the detection objective, thereby eliminating limits on lateral dimensions without sacrificing the imaging resolution, depth, and speed. We present a detailed characterization of LSTM, and demonstrate its complementary advantages over LSM for rapid high-resolution quantitative imaging of large intact samples with high uniform quality. The reported LSTM approach is a significant step for the rapid high-resolution quantitative mapping of the structure and function of very large biological systems, such as a clarified thick coronal slab of human brain and uniformly expanded tissues, and also for rapid volumetric calcium imaging of highly motile animals, such as Hydra, undergoing non-isomorphic body shape changes.

  17. Kinetic method for the large-scale analysis of the binding mechanism of histone deacetylase inhibitors.

    PubMed

    Meyners, Christian; Baud, Matthias G J; Fuchter, Matthew J; Meyer-Almes, Franz-Josef

    2014-09-01

    Performing kinetic studies on protein-ligand interactions provides important information on complex formation and dissociation. Besides kinetic parameters such as association rates and residence times, kinetic experiments also reveal insights into reaction mechanisms. Exploiting intrinsic tryptophan fluorescence, a parallelized high-throughput Förster resonance energy transfer (FRET)-based reporter displacement assay with very low protein consumption was developed to enable the large-scale kinetic characterization of the binding of ligands to recombinant human histone deacetylases (HDACs) and a bacterial histone deacetylase-like amidohydrolase (HDAH) from Bordetella/Alcaligenes. For the binding of trichostatin A (TSA), suberoylanilide hydroxamic acid (SAHA), and two other SAHA derivatives to HDAH, two different modes of action, simple one-step binding and a two-step mechanism comprising initial binding and induced fit, were verified. In contrast to HDAH, all compounds bound to human HDAC1, HDAC6, and HDAC8 through a two-step mechanism. A quantitative view of the inhibitor-HDAC systems revealed two types of interaction, fast binding and slow dissociation. We argue that the relationship between quantitative kinetic and mechanistic information and the chemical structures of compounds will serve as a valuable tool for drug optimization. Copyright © 2014 Elsevier Inc. All rights reserved.
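
    The two mechanisms are commonly distinguished by how the observed rate constant k_obs from progress curves depends on ligand concentration: roughly linear for one-step binding (k_obs = k_on[L] + k_off) and hyperbolic, i.e. saturating, for a two-step induced fit. The sketch below fits both forms to synthetic k_obs values; the data and starting guesses are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        # Synthetic k_obs vs. inhibitor concentration from reporter-
        # displacement progress curves; values chosen to saturate at high [L].
        conc = np.array([0.5, 1, 2, 5, 10, 20, 50])            # uM
        kobs = np.array([0.9, 1.5, 2.4, 3.8, 4.6, 5.2, 5.6])   # 1/s

        linear = lambda L, kon, koff: kon * L + koff           # one-step
        hyper = lambda L, k2, K1, km2: k2 * L / (K1 + L) + km2 # induced fit

        p_lin, _ = curve_fit(linear, conc, kobs)
        p_hyp, _ = curve_fit(hyper, conc, kobs, p0=[5.0, 2.0, 0.5])

        for name, f, p in [("one-step", linear, p_lin),
                           ("induced fit", hyper, p_hyp)]:
            rss = np.sum((kobs - f(conc, *p)) ** 2)
            print(f"{name:12s} residual sum of squares = {rss:.3f}")
        # Saturation of k_obs at high [L] favours the two-step mechanism.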

  18. Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.

    PubMed

    Li, Zitong; Sillanpää, Mikko J

    2015-12-01

    Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era. Copyright © 2015 Elsevier Ltd. All rights reserved.
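
    In outline, functional mapping proceeds in two steps: fit a parametric growth curve to each line's time series, then test each marker for association with the fitted curve parameters. The sketch below works on simulated data with a logistic curve and a single biallelic marker; real functional-mapping methods are considerably more sophisticated.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import f_oneway

    def logistic(t, A, r, t0):
        """Logistic growth: asymptote A, rate r, inflection time t0."""
        return A / (1.0 + np.exp(-r * (t - t0)))

    rng = np.random.default_rng(0)
    t = np.linspace(0, 30, 12)                    # 12 automated timepoints
    genotype = rng.integers(0, 2, size=200)       # one marker, 200 lines
    A_true = 10 + 2 * genotype                    # marker shifts the asymptote
    Y = np.array([logistic(t, A, 0.3, 12) + rng.normal(0, 0.3, t.size)
                  for A in A_true])

    # Step 1: fit the growth curve to every line.
    params = np.array([curve_fit(logistic, t, y, p0=[10, 0.3, 12])[0] for y in Y])

    # Step 2: test the marker against each fitted curve parameter.
    for name, col in zip(["A", "r", "t0"], params.T):
        F, p = f_oneway(col[genotype == 0], col[genotype == 1])
        print(f"{name}: F = {F:.1f}, p = {p:.2g}")
    ```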

  19. Characterizing Listener Engagement with Popular Songs Using Large-Scale Music Discovery Data

    PubMed Central

    Kaneshiro, Blair; Ruan, Feng; Baker, Casey W.; Berger, Jonathan

    2017-01-01

    Music discovery in everyday situations has been facilitated in recent years by audio content recognition services such as Shazam. The widespread use of such services has produced a wealth of user data, specifying where and when a global audience takes action to learn more about music playing around them. Here, we analyze a large collection of Shazam queries of popular songs to study the relationship between the timing of queries and corresponding musical content. Our results reveal that the distribution of queries varies over the course of a song, and that salient musical events drive an increase in queries during a song. Furthermore, we find that the distribution of queries at the time of a song's release differs from the distribution following a song's peak and subsequent decline in popularity, possibly reflecting an evolution of user intent over the “life cycle” of a song. Finally, we derive insights into the data size needed to achieve consistent query distributions for individual songs. The combined findings of this study suggest that music discovery behavior, and other facets of the human experience of music, can be studied quantitatively using large-scale industrial data. PMID:28386241
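
    A sketch of how query-timing distributions can be compared between two periods of a song's life cycle, assuming each query is recorded as an offset (in seconds) into the song; the data below are synthetic stand-ins for real discovery logs.

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    def query_profile(query_times, song_length, n_bins=50):
        """Normalized histogram of query offsets over the course of a song."""
        hist, _ = np.histogram(query_times, bins=n_bins,
                               range=(0, song_length), density=True)
        return hist

    rng = np.random.default_rng(1)
    release_period = rng.beta(2, 5, 5000) * 240   # queries skewed toward the start
    post_peak = rng.beta(2, 2, 5000) * 240        # flatter profile later on

    print("peak query bin (release period):",
          query_profile(release_period, 240).argmax())

    # A two-sample KS test asks whether the two timing distributions differ.
    stat, p = ks_2samp(release_period, post_peak)
    print(f"KS statistic = {stat:.3f}, p = {p:.2g}")
    ```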

  20. Combining semi-automated image analysis techniques with machine learning algorithms to accelerate large-scale genetic studies.

    PubMed

    Atkinson, Jonathan A; Lobet, Guillaume; Noll, Manuel; Meyer, Patrick E; Griffiths, Marcus; Wells, Darren M

    2017-10-01

    Genetic analyses of plant root systems require large datasets of extracted architectural traits. To quantify such traits from images of root systems, researchers often have to choose between automated tools (that are prone to error and extract only a limited number of architectural traits) or semi-automated ones (that are highly time consuming). We trained a Random Forest algorithm to infer architectural traits from automatically extracted image descriptors. The training was performed on a subset of the dataset, then applied to its entirety. This strategy allowed us to (i) decrease the image analysis time by 73% and (ii) extract meaningful architectural traits based on image descriptors. We also show that these traits are sufficient to identify the quantitative trait loci that had previously been discovered using a semi-automated method. We have shown that combining semi-automated image analysis with machine learning algorithms has the power to increase the throughput of large-scale root studies. We expect that such an approach will enable the quantification of more complex root systems for genetic studies. We also believe that our approach could be extended to other areas of plant phenotyping. © The Authors 2017. Published by Oxford University Press.
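
    A minimal sketch of the train-on-a-subset strategy with a random forest, using synthetic descriptors and traits in place of the real image data, and scikit-learn as an assumed implementation (the abstract does not prescribe a specific library).

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)

    # Stand-in data: 1,000 root images, 40 automatically extracted descriptors,
    # one manually measured architectural trait per image.
    X = rng.normal(size=(1000, 40))
    y = 2.0 * X[:, 0] + X[:, 1] ** 2 + rng.normal(0, 0.1, 1000)

    # Train on a manually annotated subset, then predict the remainder.
    X_train, X_rest, y_train, y_rest = train_test_split(
        X, y, train_size=0.2, random_state=0)
    model = RandomForestRegressor(n_estimators=300, random_state=0)
    model.fit(X_train, y_train)
    print("R^2 on the unannotated remainder:", model.score(X_rest, y_rest))
    ```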

  1. Combining semi-automated image analysis techniques with machine learning algorithms to accelerate large-scale genetic studies

    PubMed Central

    Atkinson, Jonathan A.; Lobet, Guillaume; Noll, Manuel; Meyer, Patrick E.; Griffiths, Marcus

    2017-01-01

    Genetic analyses of plant root systems require large datasets of extracted architectural traits. To quantify such traits from images of root systems, researchers often have to choose between automated tools (that are prone to error and extract only a limited number of architectural traits) or semi-automated ones (that are highly time consuming). We trained a Random Forest algorithm to infer architectural traits from automatically extracted image descriptors. The training was performed on a subset of the dataset, then applied to its entirety. This strategy allowed us to (i) decrease the image analysis time by 73% and (ii) extract meaningful architectural traits based on image descriptors. We also show that these traits are sufficient to identify the quantitative trait loci that had previously been discovered using a semi-automated method. We have shown that combining semi-automated image analysis with machine learning algorithms has the power to increase the throughput of large-scale root studies. We expect that such an approach will enable the quantification of more complex root systems for genetic studies. We also believe that our approach could be extended to other areas of plant phenotyping. PMID:29020748

  2. Characterizing Listener Engagement with Popular Songs Using Large-Scale Music Discovery Data.

    PubMed

    Kaneshiro, Blair; Ruan, Feng; Baker, Casey W; Berger, Jonathan

    2017-01-01

    Music discovery in everyday situations has been facilitated in recent years by audio content recognition services such as Shazam. The widespread use of such services has produced a wealth of user data, specifying where and when a global audience takes action to learn more about music playing around them. Here, we analyze a large collection of Shazam queries of popular songs to study the relationship between the timing of queries and corresponding musical content. Our results reveal that the distribution of queries varies over the course of a song, and that salient musical events drive an increase in queries during a song. Furthermore, we find that the distribution of queries at the time of a song's release differs from the distribution following a song's peak and subsequent decline in popularity, possibly reflecting an evolution of user intent over the "life cycle" of a song. Finally, we derive insights into the data size needed to achieve consistent query distributions for individual songs. The combined findings of this study suggest that music discovery behavior, and other facets of the human experience of music, can be studied quantitatively using large-scale industrial data.

  3. Thymidylate synthase (TS) gene expression in primary lung cancer patients: a large-scale study in Japanese population.

    PubMed

    Tanaka, F; Wada, H; Fukui, Y; Fukushima, M

    2011-08-01

    Previous small studies showed lower thymidylate synthase (TS) expression in adenocarcinoma of the lung, which may explain the higher antitumor activity of TS-inhibiting agents such as pemetrexed. To quantitatively measure TS gene expression in a large-scale Japanese population (n = 2621) with primary lung cancer, laser-captured microdissected sections were cut from primary tumors, surrounding normal lung tissue, and involved nodes. The TS gene expression level in primary tumors was significantly higher than that in normal lung tissue (mean TS/β-actin, 3.4 and 1.0, respectively; P < 0.01), and higher still in involved nodes (mean TS/β-actin, 7.7; P < 0.01). Analyses of TS gene expression in primary tumors by histologic cell type revealed that small-cell carcinoma showed the highest TS expression (mean TS/β-actin, 13.8) and that squamous cell carcinoma showed higher TS expression than adenocarcinoma (mean TS/β-actin, 4.3 and 2.3, respectively; P < 0.01); TS gene expression increased significantly as the grade of tumor cell differentiation decreased. There was no significant difference in TS gene expression according to any other patient characteristic, including tumor progression. The lower TS expression in adenocarcinoma of the lung was thus confirmed in a large-scale study.

  4. Natural disasters and population mobility in Bangladesh

    PubMed Central

    Gray, Clark L.; Mueller, Valerie

    2012-01-01

    The consequences of environmental change for human migration have gained increasing attention in the context of climate change and recent large-scale natural disasters, but as yet relatively few large-scale and quantitative studies have addressed this issue. We investigate the consequences of climate-related natural disasters for long-term population mobility in rural Bangladesh, a region particularly vulnerable to environmental change, using longitudinal survey data from 1,700 households spanning a 15-y period. Multivariate event history models are used to estimate the effects of flooding and crop failures on local population mobility and long-distance migration while controlling for a large set of potential confounders at various scales. The results indicate that flooding has modest effects on mobility that are most visible at moderate intensities and for women and the poor. However, crop failures unrelated to flooding have strong effects on mobility in which households that are not directly affected but live in severely affected areas are the most likely to move. These results point toward an alternate paradigm of disaster-induced mobility that recognizes the significant barriers to migration for vulnerable households as well as their substantial local adaptive capacity. PMID:22474361

  5. A study to explore the use of orbital remote sensing to determine native arid plant distribution. [Arizona]

    NASA Technical Reports Server (NTRS)

    Mcginnies, W. G. (Principal Investigator); Conn, J. S.; Haase, E. F.; Lepley, L. K.; Musick, H. B.; Foster, K. E.

    1975-01-01

    The author has identified the following significant results. Research results include a method for determining the reflectivities of natural areas from ERTS data that takes into account sun angle and atmospheric effects on the radiance seen by the satellite sensor. Ground-truth spectral signature data were collected for various types of scenes, including ground with and without annuals and various shrubs. Large areas of varnished desert pavement are visible and mappable on ERTS and high-altitude aircraft imagery. Both a large-scale and a small-scale vegetation pattern were found to be correlated with the presence of desert pavement. A comparison of radiometric data with video recordings shows quantitatively that for most areas of desert vegetation, soils are the most influential factor in determining the signature of a scene. Additive and subtractive image processing techniques were applied in the darkroom to enhance vegetational aspects of ERTS imagery.

  6. The causality analysis of climate change and large-scale human crisis

    PubMed Central

    Zhang, David D.; Lee, Harry F.; Wang, Cong; Li, Baosheng; Pei, Qing; Zhang, Jane; An, Yulun

    2011-01-01

    Recent studies have shown strong temporal correlations between past climate changes and societal crises. However, the specific causal mechanisms underlying this relation have not been addressed. We explored quantitative responses of 14 fine-grained agro-ecological, socioeconomic, and demographic variables to climate fluctuations from A.D. 1500–1800 in Europe. Results show that cooling from A.D. 1560–1660 caused successive agro-ecological, socioeconomic, and demographic catastrophes, leading to the General Crisis of the Seventeenth Century. We identified a set of causal linkages between climate change and human crisis. Using temperature data and climate-driven economic variables, we simulated the alternation of defined “golden” and “dark” ages in Europe and the Northern Hemisphere during the past millennium. Our findings indicate that climate change was the ultimate cause, and climate-driven economic downturn was the direct cause, of large-scale human crises in preindustrial Europe and the Northern Hemisphere. PMID:21969578

  7. The causality analysis of climate change and large-scale human crisis.

    PubMed

    Zhang, David D; Lee, Harry F; Wang, Cong; Li, Baosheng; Pei, Qing; Zhang, Jane; An, Yulun

    2011-10-18

    Recent studies have shown strong temporal correlations between past climate changes and societal crises. However, the specific causal mechanisms underlying this relation have not been addressed. We explored quantitative responses of 14 fine-grained agro-ecological, socioeconomic, and demographic variables to climate fluctuations from A.D. 1500-1800 in Europe. Results show that cooling from A.D. 1560-1660 caused successive agro-ecological, socioeconomic, and demographic catastrophes, leading to the General Crisis of the Seventeenth Century. We identified a set of causal linkages between climate change and human crisis. Using temperature data and climate-driven economic variables, we simulated the alternation of defined "golden" and "dark" ages in Europe and the Northern Hemisphere during the past millennium. Our findings indicate that climate change was the ultimate cause, and climate-driven economic downturn was the direct cause, of large-scale human crises in preindustrial Europe and the Northern Hemisphere.

  8. Computerized image analysis for quantitative neuronal phenotyping in zebrafish.

    PubMed

    Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C

    2006-06-15

    An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting, with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step toward achieving accurate and reproducible quantification of neuronal phenotypes in large-scale or high-throughput zebrafish imaging studies.

  9. Flow turbulence topology in regular porous media: From macroscopic to microscopic scale with direct numerical simulation

    NASA Astrophysics Data System (ADS)

    Chu, Xu; Weigand, Bernhard; Vaikuntanathan, Visakh

    2018-06-01

    Microscopic analysis of turbulence topology in a regular porous medium is presented with a series of direct numerical simulations. The regular porous media consist of square cylinders in a staggered array. Triply periodic boundary conditions enable efficient investigations in a representative elementary volume. Three flow patterns—channel with sudden contraction, impinging surface, and wake—are observed and studied quantitatively, in contrast to the qualitative experimental studies reported in the literature. Among these, the shear layers in the channel show the highest turbulence intensity, developing under a favorable pressure gradient and shedding downstream under an adverse pressure gradient. The turbulent energy budget indicates a strong production rate after the flow contraction and strong dissipation on both the shear and impinging walls. Energy spectra and pre-multiplied spectra detect large-scale energetic structures in the shear layer and a breakup of scales in the impinging layer. However, these large-scale structures break into less energetic small structures at high Reynolds number conditions. This suggests an absence of coherent structures in densely packed porous media at high Reynolds numbers. Anisotropy analysis with a barycentric map shows that the turbulence in porous media is highly isotropic at the macro-scale, which is not the case at the micro-scale. Finally, proper orthogonal decomposition is employed to distinguish the energy-conserving structures. The results support the pore-scale prevalence hypothesis. However, energetic coherent structures are observed in the case of sparsely packed porous media.
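
    The barycentric map referred to above is a standard construction from the eigenvalues of the normalized Reynolds-stress anisotropy tensor; the sketch below is that textbook mapping, not the authors' code, and the stress tensor is illustrative.

    ```python
    import numpy as np

    def barycentric_weights(reynolds_stress):
        """Map a 3x3 Reynolds-stress tensor to barycentric weights
        (C1: one-component, C2: two-component, C3: isotropic); they sum to 1."""
        k = 0.5 * np.trace(reynolds_stress)
        b = reynolds_stress / (2.0 * k) - np.eye(3) / 3.0   # anisotropy tensor
        lam = np.sort(np.linalg.eigvalsh(b))[::-1]          # lam1 >= lam2 >= lam3
        return lam[0] - lam[1], 2.0 * (lam[1] - lam[2]), 3.0 * lam[2] + 1.0

    # A nearly isotropic stress tensor lands close to the isotropic vertex C3.
    R = np.diag([1.05, 1.00, 0.95])
    print(barycentric_weights(R))
    ```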

  10. [Quantitative estimation of vegetation cover and management factor in USLE and RUSLE models by using remote sensing data: a review].

    PubMed

    Wu, Chang-Guang; Li, Sheng; Ren, Hua-Dong; Yao, Xiao-Hua; Huang, Zi-Jie

    2012-06-01

    Soil loss prediction models such as the universal soil loss equation (USLE) and the revised universal soil loss equation (RUSLE) are useful tools for risk assessment of soil erosion and planning of soil conservation at the regional scale. A rational estimate of the vegetation cover and management factor, one of the most important parameters in USLE and RUSLE, is particularly important for accurate prediction of soil erosion. Traditional estimation based on field survey and measurement is time-consuming, laborious, and costly, and cannot rapidly extract the vegetation cover and management factor at the macro-scale. In recent years, the development of remote sensing technology has provided both data and methods for the estimation of the vegetation cover and management factor over broad geographic areas. This paper summarizes the research findings on the quantitative estimation of the vegetation cover and management factor using remote sensing data and analyzes the advantages and disadvantages of the various methods, with the aim of providing a reference for further research and for quantitative estimation of the vegetation cover and management factor at large scales.
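
    For orientation, the vegetation cover and management factor reviewed here is the C term of the (R)USLE equation, whose standard multiplicative form is the following (a textbook statement, not taken from this abstract):

    ```latex
    A = R \cdot K \cdot L \cdot S \cdot C \cdot P
    ```

    where A is the predicted annual soil loss, R the rainfall erosivity, K the soil erodibility, L and S the slope length and steepness factors, C the cover-management factor, and P the support practice factor.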

  11. Quantitative characterization of conformational-specific protein-DNA binding using a dual-spectral interferometric imaging biosensor.

    PubMed

    Zhang, Xirui; Daaboul, George G; Spuhler, Philipp S; Dröge, Peter; Ünlü, M Selim

    2016-03-14

    DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformational specific protein-DNA interactions.

  12. Monthly mean large-scale analyses of upper-tropospheric humidity and wind field divergence derived from three geostationary satellites

    NASA Technical Reports Server (NTRS)

    Schmetz, Johannes; Menzel, W. Paul; Velden, Christopher; Wu, Xiangqian; Vandeberg, Leo; Nieman, Steve; Hayden, Christopher; Holmlund, Kenneth; Geijo, Carlos

    1995-01-01

    This paper describes the results of a collaborative study between the European Space Operations Center, the European Organization for the Exploitation of Meteorological Satellites, the National Oceanic and Atmospheric Administration, and the Cooperative Institute for Meteorological Satellite Studies investigating the relationship between satellite-derived monthly mean fields of wind and humidity in the upper troposphere for March 1994. Three geostationary meteorological satellites, GOES-7, Meteosat-3, and Meteosat-5, are used to cover an area from roughly 160 deg W to 50 deg E. The wind fields are derived from tracking features in successive images of upper-tropospheric water vapor (WV) as depicted in the 6.5-micron absorption band. The upper-tropospheric relative humidity (UTH) is inferred from measured water vapor radiances with a physical retrieval scheme based on radiative forward calculations. Quantitative information on large-scale circulation patterns in the upper troposphere is possible with the dense spatial coverage of the WV wind vectors. The monthly mean wind field is used to estimate the large-scale divergence; values range between about -5 x 10^-6 and 5 x 10^-6 per second when averaged over a scale length of about 1000-2000 km. The spatial patterns of the UTH field and the divergence of the wind field closely resemble one another, suggesting that UTH patterns are principally determined by the large-scale circulation. Since the upper-tropospheric humidity absorbs upwelling radiation from lower-tropospheric levels and therefore contributes significantly to the atmospheric greenhouse effect, this work implies that studies on the climate relevance of water vapor should include three-dimensional modeling of the atmospheric dynamics. The fields of UTH and WV winds are useful parameters for a climate-monitoring system based on satellite data. The results from this 1-month analysis suggest the desirability of further GOES and Meteosat studies to characterize the changes in the upper-tropospheric moisture sources and sinks over the past decade.
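
    A sketch of how such a large-scale divergence field can be estimated from gridded monthly mean winds using finite differences on a sphere; the grid and wind values below are synthetic stand-ins for the satellite-derived vectors.

    ```python
    import numpy as np

    def horizontal_divergence(u, v, lat, lon):
        """Divergence (1/s) of a wind field on a regular lat-lon grid.
        u, v: 2D arrays (nlat, nlon) in m/s; lat, lon: 1D arrays in degrees."""
        R = 6.371e6                              # Earth radius, m
        lat_r, lon_r = np.radians(lat), np.radians(lon)
        coslat = np.cos(lat_r)[:, None]
        dudx = np.gradient(u, lon_r, axis=1) / (R * coslat)
        dvdy = np.gradient(v * coslat, lat_r, axis=0) / (R * coslat)
        return dudx + dvdy

    rng = np.random.default_rng(3)
    lat = np.arange(-50.0, 50.1, 2.5)            # avoid the poles (cos -> 0)
    lon = np.arange(-160.0, 50.1, 2.5)           # roughly 160 deg W to 50 deg E
    u = rng.normal(5.0, 2.0, (lat.size, lon.size))
    v = rng.normal(0.0, 2.0, (lat.size, lon.size))
    div = horizontal_divergence(u, v, lat, lon)
    print("divergence range (1/s):", div.min(), div.max())
    ```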

  13. Quantitative computational infrared imaging of buoyant diffusion flames

    NASA Astrophysics Data System (ADS)

    Newale, Ashish S.

    Studies of infrared radiation from turbulent buoyant diffusion flames impinging on structural elements have applications to the development of fire models. A numerical and experimental study of radiation from buoyant diffusion flames with and without impingement on a flat plate is reported. Quantitative images of the radiation intensity from the flames are acquired using a high-speed infrared camera. Large eddy simulations are performed using the Fire Dynamics Simulator (FDS, version 6). The species concentrations and temperatures from the simulations are used in conjunction with a narrow-band radiation model (RADCAL) to solve the radiative transfer equation. The computed infrared radiation intensities are rendered in the form of images and compared with the measurements. The measured and computed radiation intensities reveal necking and bulging with a characteristic frequency of 7.1 Hz, in agreement with previous empirical correlations. The results demonstrate the effects of the stagnation-point boundary layer on the upstream buoyant shear layer. The coupling between these two shear layers presents a model problem for the sub-grid-scale modeling necessary for future large eddy simulations.

  14. Machine learning for large-scale wearable sensor data in Parkinson's disease: Concepts, promises, pitfalls, and futures.

    PubMed

    Kubota, Ken J; Chen, Jason A; Little, Max A

    2016-09-01

    For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, "wearable," sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that "learn" from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses implications of this technology and a practical road map for realizing the full potential of this technology in PD research and practice. © 2016 International Parkinson and Movement Disorder Society.

  15. On the buckling of an elastic holey column

    PubMed Central

    Hazel, A. L.; Pihler-Puzović, D.

    2017-01-01

    We report the results of a numerical and theoretical study of buckling in elastic columns containing a line of holes. Buckling is a common failure mode of elastic columns under compression, found over scales ranging from metres in buildings and aircraft to tens of nanometers in DNA. This failure usually occurs through lateral buckling, described for slender columns by Euler’s theory. When the column is perforated with a regular line of holes, a new buckling mode arises, in which adjacent holes collapse in orthogonal directions. In this paper, we firstly elucidate how this alternate hole buckling mode coexists and interacts with classical Euler buckling modes, using finite-element numerical calculations with bifurcation tracking. We show how the preferred buckling mode is selected by the geometry, and discuss the roles of localized (hole-scale) and global (column-scale) buckling. Secondly, we develop a novel predictive model for the buckling of columns perforated with large holes. This model is derived without arbitrary fitting parameters, and quantitatively predicts the critical strain for buckling. We extend the model to sheets perforated with a regular array of circular holes and use it to provide quantitative predictions of their buckling. PMID:29225498
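
    As a reference point for the competing classical mode, the Euler critical strain of a pinned-pinned column follows from P_cr = pi^2 * E * I / L^2 and, expressed as a strain, is independent of the modulus. A minimal sketch with illustrative dimensions (not taken from the paper):

    ```python
    import numpy as np

    def euler_critical_strain(length, width, thickness):
        """Critical compressive strain for lateral (Euler) buckling of a
        pinned-pinned rectangular column: eps = pi^2 * I / (A * L^2)."""
        I = width * thickness ** 3 / 12.0   # second moment about the weak axis
        A = width * thickness               # cross-sectional area
        return np.pi ** 2 * I / (A * length ** 2)

    # The slenderer the column, the smaller the critical strain, so the global
    # Euler mode competes with the local hole-collapse mode described above.
    for L in (50.0, 100.0, 200.0):          # lengths in mm
        print(L, euler_critical_strain(L, width=10.0, thickness=5.0))
    ```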

  16. Electrical properties of 0.4 cm long single walled nanotubes

    NASA Astrophysics Data System (ADS)

    Yu, Zhen

    2005-03-01

    Centimeter-scale aligned carbon nanotube arrays are grown from nanoparticle/metal catalyst pads [1]. We find the nanotubes grow both with and "against the wind." A metal underlayer provides in-situ electrical contact to these long nanotubes with no post-growth processing needed. Using the electrically contacted nanotubes, we study the electrical transport of 0.4 cm long nanotubes [2]. From these data, we are able to determine the resistance of a nanotube as a function of length quantitatively, since the contact resistance is negligible in these long nanotubes. The source-drain I-V curves are quantitatively described by a classical, diffusive model. Our measurements show that the outstanding transport properties of nanotubes can be extended to the centimeter scale and open the door to large-scale integrated nanotube circuits with macroscopic dimensions. These are the longest electrically contacted single-walled nanotubes measured to date. [1] Zhen Yu, Shengdong Li, Peter J. Burke, "Synthesis of Aligned Arrays of Millimeter Long, Straight Single-Walled Carbon Nanotubes," Chemistry of Materials, 16(18), 3414-3416 (2004). [2] Shengdong Li, Zhen Yu, Christopher Rutherglen, Peter J. Burke, "Electrical properties of 0.4 cm long single-walled carbon nanotubes," Nano Letters, 4(10), 2003-2007 (2004).

  17. Knowledge-Guided Robust MRI Brain Extraction for Diverse Large-Scale Neuroimaging Studies on Humans and Non-Human Primates

    PubMed Central

    Wang, Yaping; Nie, Jingxin; Yap, Pew-Thian; Li, Gang; Shi, Feng; Geng, Xiujuan; Guo, Lei; Shen, Dinggang

    2014-01-01

    Accurate and robust brain extraction is a critical step in most neuroimaging analysis pipelines. In particular, for the large-scale multi-site neuroimaging studies involving a significant number of subjects with diverse age and diagnostic groups, accurate and robust extraction of the brain automatically and consistently is highly desirable. In this paper, we introduce population-specific probability maps to guide the brain extraction of diverse subject groups, including both healthy and diseased adult human populations, both developing and aging human populations, as well as non-human primates. Specifically, the proposed method combines an atlas-based approach, for coarse skull-stripping, with a deformable-surface-based approach that is guided by local intensity information and population-specific prior information learned from a set of real brain images for more localized refinement. Comprehensive quantitative evaluations were performed on the diverse large-scale populations of ADNI dataset with over 800 subjects (55∼90 years of age, multi-site, various diagnosis groups), OASIS dataset with over 400 subjects (18∼96 years of age, wide age range, various diagnosis groups), and NIH pediatrics dataset with 150 subjects (5∼18 years of age, multi-site, wide age range as a complementary age group to the adult dataset). The results demonstrate that our method consistently yields the best overall results across almost the entire human life span, with only a single set of parameters. To demonstrate its capability to work on non-human primates, the proposed method is further evaluated using a rhesus macaque dataset with 20 subjects. Quantitative comparisons with popularly used state-of-the-art methods, including BET, Two-pass BET, BET-B, BSE, HWA, ROBEX and AFNI, demonstrate that the proposed method performs favorably with superior performance on all testing datasets, indicating its robustness and effectiveness. PMID:24489639

  18. Global Relative Quantification with Liquid Chromatography-Matrix-assisted Laser Desorption Ionization Time-of-flight (LC-MALDI-TOF) - Cross-validation with LTQ-Orbitrap Proves Reliability and Reveals Complementary Ionization Preferences

    PubMed Central

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-01-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC–electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments. PMID:23788530

  19. Global relative quantification with liquid chromatography-matrix-assisted laser desorption ionization time-of-flight (LC-MALDI-TOF)--cross-validation with LTQ-Orbitrap proves reliability and reveals complementary ionization preferences.

    PubMed

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-10-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC-electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments.

  20. Infrared Multiphoton Dissociation for Quantitative Shotgun Proteomics

    PubMed Central

    Ledvina, Aaron R.; Lee, M. Violet; McAlister, Graeme C.; Westphall, Michael S.; Coon, Joshua J.

    2012-01-01

    We modified a dual-cell linear ion trap mass spectrometer to perform infrared multiphoton dissociation (IRMPD) in the low pressure trap of a dual-cell quadrupole linear ion trap (dual cell QLT) and perform large-scale IRMPD analyses of complex peptide mixtures. Upon optimization of activation parameters (precursor q-value, irradiation time, and photon flux), IRMPD subtly, but significantly outperforms resonant excitation CAD for peptides identified at a 1% false-discovery rate (FDR) from a yeast tryptic digest (95% confidence, p = 0.019). We further demonstrate that IRMPD is compatible with the analysis of isobaric-tagged peptides. Using fixed QLT RF amplitude allows for the consistent retention of reporter ions, but necessitates the use of variable IRMPD irradiation times, dependent upon precursor mass-to-charge (m/z). We show that IRMPD activation parameters can be tuned to allow for effective peptide identification and quantitation simultaneously. We thus conclude that IRMPD performed in a dual-cell ion trap is an effective option for the large-scale analysis of both unmodified and isobaric-tagged peptides. PMID:22480380

  1. Simulation of FRET dyes allows quantitative comparison against experimental data

    NASA Astrophysics Data System (ADS)

    Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander

    2018-03-01

    Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.
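
    The quantity compared against experiment is typically the trajectory-averaged transfer efficiency under the standard Foerster model. A minimal sketch with an illustrative dye-dye distance trajectory and Foerster radius (not the paper's parameters):

    ```python
    import numpy as np

    def fret_efficiency(r, R0):
        """Instantaneous FRET efficiency at donor-acceptor distance r (nm),
        Foerster radius R0 (nm): E = 1 / (1 + (r / R0)**6)."""
        return 1.0 / (1.0 + (r / R0) ** 6)

    rng = np.random.default_rng(5)
    r_traj = rng.normal(5.5, 0.8, 100_000)    # stand-in distance trajectory (nm)
    r_traj = r_traj[r_traj > 0]
    R0 = 5.4                                  # typical single-molecule dye pair

    # Average E over the trajectory, not E of the average distance; this is
    # what makes simulated efficiencies comparable to measured ones.
    print(f"<E> = {fret_efficiency(r_traj, R0).mean():.3f}")
    ```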

  2. Quantitative analysis of the evolution of novelty in cinema through crowdsourced keywords.

    PubMed

    Sreenivasan, Sameet

    2013-09-26

    The generation of novelty is central to any creative endeavor. Novelty generation and the relationship between novelty and individual hedonic value have long been subjects of study in social psychology. However, few studies have utilized large-scale datasets to quantitatively investigate these issues. Here we consider the domain of American cinema and explore these questions using a database of films spanning a 70 year period. We use crowdsourced keywords from the Internet Movie Database as a window into the contents of films, and prescribe novelty scores for each film based on occurrence probabilities of individual keywords and keyword-pairs. These scores provide revealing insights into the dynamics of novelty in cinema. We investigate how novelty influences the revenue generated by a film, and find a relationship that resembles the Wundt-Berlyne curve. We also study the statistics of keyword occurrence and the aggregate distribution of keywords over a 100 year period.
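
    One natural way to turn occurrence probabilities into novelty scores is pointwise surprisal (-log2 p) of a film's keywords and keyword pairs; the paper's exact prescription may differ, so the sketch below is a generic illustration on a toy corpus.

    ```python
    import numpy as np
    from collections import Counter
    from itertools import combinations

    def novelty_score(film_keywords, keyword_counts, pair_counts, n_films):
        """Mean surprisal (-log2 p) of a film's keywords and keyword pairs,
        with probabilities estimated from corpus occurrence frequencies."""
        s = [-np.log2(keyword_counts[k] / n_films) for k in film_keywords]
        for a, b in combinations(sorted(film_keywords), 2):
            p = pair_counts.get((a, b), 0.5) / n_films  # 0.5 = unseen-pair floor
            s.append(-np.log2(p))
        return float(np.mean(s))

    films = [{"heist", "betrayal"}, {"heist", "robot"}, {"romance", "betrayal"}]
    kw = Counter(k for f in films for k in f)
    pairs = Counter(p for f in films for p in combinations(sorted(f), 2))
    for f in films:
        print(sorted(f), round(novelty_score(f, kw, pairs, len(films)), 2))
    ```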

  3. Parametric studies with an atmospheric diffusion model that assesses toxic fuel hazards due to the ground clouds generated by rocket launches

    NASA Technical Reports Server (NTRS)

    Stewart, R. B.; Grose, W. L.

    1975-01-01

    Parametric studies were made with a multilayer atmospheric diffusion model to place quantitative limits on the uncertainty of predicting ground-level toxic rocket-fuel concentrations. Exhaust distributions in the ground cloud, stabilized cloud geometry, atmospheric coefficients, the effects of exhaust plume afterburning of carbon monoxide (CO), the assumed surface mixing-layer division in the model, and model sensitivity to different meteorological regimes were studied. Large-scale differences in ground-level predictions are quantitatively described. Cloud along-wind growth for several meteorological conditions is shown to be in error because of incorrect application of previous diffusion theory. In addition, rocket-plume calculations indicate that almost all of the rocket-motor carbon monoxide is afterburned to carbon dioxide (CO2), thus reducing the toxic hazards due to CO. The afterburning is also shown to have a significant effect on cloud stabilization height and on ground-level concentrations of exhaust products.
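
    The multilayer model itself is not reproduced here, but the single-layer Gaussian plume formula conveys how ground-level concentration estimates of this kind are computed; every input below is illustrative, not launch-specific.

    ```python
    import numpy as np

    def gaussian_plume(Q, u, y, z, sigma_y, sigma_z, H=0.0):
        """Steady-state Gaussian plume concentration with ground reflection.
        Q: source strength (g/s), u: wind speed (m/s), H: effective release
        height (m); sigma_y, sigma_z: dispersion lengths (m) evaluated at the
        receptor's downwind distance."""
        lateral = np.exp(-y ** 2 / (2 * sigma_y ** 2))
        vertical = (np.exp(-(z - H) ** 2 / (2 * sigma_z ** 2))
                    + np.exp(-(z + H) ** 2 / (2 * sigma_z ** 2)))
        return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Ground-level centerline concentration under an elevated stabilized cloud.
    print(gaussian_plume(Q=1e3, u=4.0, y=0.0, z=0.0,
                         sigma_y=160.0, sigma_z=80.0, H=600.0))
    ```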

  4. Quantitative analysis of the evolution of novelty in cinema through crowdsourced keywords

    PubMed Central

    Sreenivasan, Sameet

    2013-01-01

    The generation of novelty is central to any creative endeavor. Novelty generation and the relationship between novelty and individual hedonic value have long been subjects of study in social psychology. However, few studies have utilized large-scale datasets to quantitatively investigate these issues. Here we consider the domain of American cinema and explore these questions using a database of films spanning a 70 year period. We use crowdsourced keywords from the Internet Movie Database as a window into the contents of films, and prescribe novelty scores for each film based on occurrence probabilities of individual keywords and keyword-pairs. These scores provide revealing insights into the dynamics of novelty in cinema. We investigate how novelty influences the revenue generated by a film, and find a relationship that resembles the Wundt-Berlyne curve. We also study the statistics of keyword occurrence and the aggregate distribution of keywords over a 100 year period. PMID:24067890

  5. Determinants of fruit and vegetable consumption among children and adolescents: a review of the literature. Part II: qualitative studies

    PubMed Central

    2011-01-01

    Background Large proportions of children do not fulfil the World Health Organization recommendation of eating at least 400 grams of fruit and vegetables (FV) per day. To promote an increased FV intake among children it is important to identify factors which influence their consumption. Both qualitative and quantitative studies are needed. Earlier reviews have analysed evidence from quantitative studies. The aim of this paper is to present a systematic review of qualitative studies of determinants of children's FV intake. Methods Relevant studies were identified by searching Anthropology Plus, Cinahl, CSA illumine, Embase, International Bibliography of the Social Sciences, Medline, PsycINFO, and Web of Science using combinations of synonyms for FV intake, children/adolescents and qualitative methods as search terms. The literature search was completed by December 1st 2010. Papers were included if they applied qualitative methods to investigate 6-18-year-olds' perceptions of factors influencing their FV consumption. Quantitative studies, review studies, studies reported in other languages than English, and non-peer reviewed or unpublished manuscripts were excluded. The papers were reviewed systematically using standardised templates for summary of papers, quality assessment, and synthesis of findings across papers. Results The review included 31 studies, mostly based on US populations and focus group discussions. The synthesis identified the following potential determinants for FV intake which supplement the quantitative knowledge base: Time costs; lack of taste guarantee; satiety value; appropriate time/occasions/settings for eating FV; sensory and physical aspects; variety, visibility, methods of preparation; access to unhealthy food; the symbolic value of food for image, gender identity and social interaction with peers; short term outcome expectancies. Conclusions The review highlights numerous potential determinants which have not been investigated thoroughly in quantitative studies. Future large scale quantitative studies should attempt to quantify the importance of these factors. Further, mechanisms behind gender, age and socioeconomic differences in FV consumption are proposed which should be tested quantitatively in order to better tailor interventions to vulnerable groups. Finally, the review provides input to the conceptualisation and measurements of concepts (i.e. peer influence, availability in schools) which may refine survey instruments and theoretical frameworks concerning eating behaviours. PMID:21999291

  6. "What else are you worried about?" – Integrating textual responses into quantitative social science research

    PubMed Central

    Brümmer, Martin; Schmukle, Stefan C.; Goebel, Jan; Wagner, Gert G.

    2017-01-01

    Open-ended questions have routinely been included in large-scale survey and panel studies, yet there is some perplexity about how to actually incorporate the answers to such questions into quantitative social science research. Tools developed recently in the domain of natural language processing offer a wide range of options for the automated analysis of such textual data, but their implementation has lagged behind. In this study, we demonstrate straightforward procedures that can be applied to process and analyze textual data for the purposes of quantitative social science research. Using more than 35,000 textual answers to the question “What else are you worried about?” from participants of the German Socio-economic Panel Study (SOEP), we (1) analyzed characteristics of respondents that determined whether they answered the open-ended question, (2) used the textual data to detect relevant topics that were reported by the respondents, and (3) linked the features of the respondents to the worries they reported in their textual data. The potential uses as well as the limitations of the automated analysis of textual data are discussed. PMID:28759628
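
    As a sketch of the kind of automated topic detection described, a small bag-of-words topic model over toy English answers (the study's actual pipeline and its roughly 35,000 German texts are far larger and more carefully preprocessed):

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    answers = [
        "worried about my pension and retirement income",
        "the war abroad and political instability",
        "health of my parents and care costs",
        "losing my job, unemployment in the region",
        "rising rents and housing costs",
        "pension cuts when I retire",
    ]

    # Bag-of-words counts, then a small topic model over the answers.
    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(answers)
    lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)

    terms = vec.get_feature_names_out()
    for i, comp in enumerate(lda.components_):
        top = [terms[j] for j in comp.argsort()[-4:][::-1]]
        print(f"topic {i}: {', '.join(top)}")
    ```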

  7. "What else are you worried about?" - Integrating textual responses into quantitative social science research.

    PubMed

    Rohrer, Julia M; Brümmer, Martin; Schmukle, Stefan C; Goebel, Jan; Wagner, Gert G

    2017-01-01

    Open-ended questions have routinely been included in large-scale survey and panel studies, yet there is some perplexity about how to actually incorporate the answers to such questions into quantitative social science research. Tools developed recently in the domain of natural language processing offer a wide range of options for the automated analysis of such textual data, but their implementation has lagged behind. In this study, we demonstrate straightforward procedures that can be applied to process and analyze textual data for the purposes of quantitative social science research. Using more than 35,000 textual answers to the question "What else are you worried about?" from participants of the German Socio-economic Panel Study (SOEP), we (1) analyzed characteristics of respondents that determined whether they answered the open-ended question, (2) used the textual data to detect relevant topics that were reported by the respondents, and (3) linked the features of the respondents to the worries they reported in their textual data. The potential uses as well as the limitations of the automated analysis of textual data are discussed.

  8. Automated selected reaction monitoring software for accurate label-free protein quantification.

    PubMed

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.

  9. Quantitative properties of clustering within modern microscopic nuclear models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Volya, A.; Tchuvil’sky, Yu. M., E-mail: tchuvl@nucl-th.sinp.msu.ru

    2016-09-15

    A method for studying cluster spectroscopic properties of nuclear fragmentation, such as spectroscopic amplitudes, cluster form factors, and spectroscopic factors, is developed on the basis of modern precision nuclear models that take into account the mixing of large-scale shell-model configurations. Alpha-cluster channels are considered as an example. A mathematical proof of the need for taking into account the channel-wave-function renormalization generated by exchange terms of the antisymmetrization operator (Fliessbach effect) is given. Examples where this effect is confirmed by a high quality of the description of experimental data are presented. By and large, the method in question extends substantially the possibilities for studying clustering phenomena in nuclei and for improving the quality of their description.

  10. Scaling identity connects human mobility and social interactions.

    PubMed

    Deville, Pierre; Song, Chaoming; Eagle, Nathan; Blondel, Vincent D; Barabási, Albert-László; Wang, Dashun

    2016-06-28

    Massive datasets that capture human movements and social interactions have catalyzed rapid advances in our quantitative understanding of human behavior during the past years. One important aspect affecting both areas is the critical role space plays. Indeed, growing evidence suggests both our movements and communication patterns are associated with spatial costs that follow reproducible scaling laws, each characterized by its specific critical exponents. Although human mobility and social networks develop concomitantly as two prolific yet largely separated fields, we lack any known relationships between the critical exponents explored by them, despite the fact that they often study the same datasets. Here, by exploiting three different mobile phone datasets that capture simultaneously these two aspects, we discovered a new scaling relationship, mediated by a universal flux distribution, which links the critical exponents characterizing the spatial dependencies in human mobility and social networks. Therefore, the widely studied scaling laws uncovered in these two areas are not independent but connected through a deeper underlying reality.

  11. Scaling identity connects human mobility and social interactions

    PubMed Central

    Deville, Pierre; Song, Chaoming; Eagle, Nathan; Blondel, Vincent D.; Barabási, Albert-László; Wang, Dashun

    2016-01-01

    Massive datasets that capture human movements and social interactions have catalyzed rapid advances in our quantitative understanding of human behavior during the past years. One important aspect affecting both areas is the critical role space plays. Indeed, growing evidence suggests both our movements and communication patterns are associated with spatial costs that follow reproducible scaling laws, each characterized by its specific critical exponents. Although human mobility and social networks develop concomitantly as two prolific yet largely separated fields, we lack any known relationships between the critical exponents explored by them, despite the fact that they often study the same datasets. Here, by exploiting three different mobile phone datasets that capture simultaneously these two aspects, we discovered a new scaling relationship, mediated by a universal flux distribution, which links the critical exponents characterizing the spatial dependencies in human mobility and social networks. Therefore, the widely studied scaling laws uncovered in these two areas are not independent but connected through a deeper underlying reality. PMID:27274050

  12. [Modeling continuous scaling of NDVI based on fractal theory].

    PubMed

    Luan, Hai-Jun; Tian, Qing-Jiu; Yu, Tao; Hu, Xin-Li; Huang, Yan; Du, Ling-Tong; Zhao, Li-Min; Wei, Xi; Han, Jie; Zhang, Zhou-Wei; Li, Shao-Peng

    2013-07-01

    Scale effect is one of the most important scientific problems in remote sensing. The scale effect in quantitative remote sensing can be used to study the relationship between retrievals from images of different resolutions, and its study has become an effective way to confront challenges such as the validation of quantitative remote sensing products. Traditional up-scaling methods cannot describe the scale-dependent behavior of retrievals across an entire series of scales; moreover, they face serious parameter-correction issues because imaging parameters (geometric, spectral, etc.) vary between sensors. Utilizing a single-sensor image, a fractal methodology was applied to solve these problems. Taking NDVI (computed from land surface radiance) as an example and based on an Enhanced Thematic Mapper Plus (ETM+) image, a scheme was proposed to model the continuous scaling of retrievals. The experimental results indicated that: (1) for NDVI, a scale effect exists, and it can be described by a fractal model of continuous scaling; and (2) the fractal method is suitable for the validation of NDVI. These results show that fractal theory is an effective methodology for studying scaling in quantitative remote sensing.
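
    The underlying scale effect is easy to demonstrate: because NDVI is a nonlinear (ratio) retrieval, NDVI computed from aggregated bands differs from fine-scale NDVI aggregated afterwards. A sketch with synthetic reflectances; the paper's fractal model would then describe how the retrieval varies continuously across such a series of scales.

    ```python
    import numpy as np

    def ndvi(red, nir):
        """Normalized difference vegetation index."""
        return (nir - red) / (nir + red + 1e-12)

    def block_mean(a, f):
        """Aggregate a 2D array by block-averaging with integer factor f."""
        h, w = (a.shape[0] // f) * f, (a.shape[1] // f) * f
        return a[:h, :w].reshape(h // f, f, w // f, f).mean(axis=(1, 3))

    rng = np.random.default_rng(6)
    red = rng.uniform(0.05, 0.20, (512, 512))    # stand-in band reflectances
    nir = rng.uniform(0.25, 0.60, (512, 512))
    fine = ndvi(red, nir)

    for f in (2, 4, 8, 16, 32):
        retrieved = ndvi(block_mean(red, f), block_mean(nir, f)).mean()
        aggregated = block_mean(fine, f).mean()
        print(f"scale x{f}: retrieved={retrieved:.4f}  aggregated={aggregated:.4f}")
    ```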

  13. EmbryoMiner: A new framework for interactive knowledge discovery in large-scale cell tracking data of developing embryos.

    PubMed

    Schott, Benjamin; Traub, Manuel; Schlagenhauf, Cornelia; Takamiya, Masanari; Antritter, Thomas; Bartschat, Andreas; Löffler, Katharina; Blessing, Denis; Otte, Jens C; Kobitski, Andrei Y; Nienhaus, G Ulrich; Strähle, Uwe; Mikut, Ralf; Stegmaier, Johannes

    2018-04-01

    State-of-the-art light-sheet and confocal microscopes allow recording of entire embryos in 3D and over time (3D+t) for many hours. Fluorescently labeled structures can be segmented and tracked automatically in these terabyte-scale 3D+t images, resulting in thousands of cell migration trajectories that provide detailed insights into large-scale tissue reorganization at the cellular level. Here we present EmbryoMiner, a new interactive open-source framework suitable for in-depth analyses and comparisons of entire embryos, including an extensive set of trajectory features. Starting at the whole-embryo level, the framework can be used to iteratively focus on a region of interest within the embryo, to investigate and test specific trajectory-based hypotheses and to extract quantitative features from the isolated trajectories. Thus, the new framework provides a valuable new way to quantitatively compare corresponding anatomical regions in different embryos that were manually selected based on biological prior knowledge. As a proof of concept, we analyzed 3D+t light-sheet microscopy images of zebrafish embryos, showcasing potential user applications that can be performed using the new framework.
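
    Trajectory features of the kind such a framework extracts can be computed directly from tracked positions; the sketch below uses a synthetic random-walk track, and the feature names are illustrative rather than EmbryoMiner's actual API.

    ```python
    import numpy as np

    def trajectory_features(xyz):
        """Simple per-trajectory features from an (n_frames, 3) position array."""
        steps = np.diff(xyz, axis=0)
        step_len = np.linalg.norm(steps, axis=1)
        path_length = step_len.sum()
        net_disp = np.linalg.norm(xyz[-1] - xyz[0])
        return {
            "mean_speed": step_len.mean(),       # displacement per frame
            "path_length": path_length,
            "straightness": net_disp / max(path_length, 1e-12),  # 1 = straight
        }

    rng = np.random.default_rng(7)
    track = np.cumsum(rng.normal(0.0, 1.0, (300, 3)), axis=0)  # random-walk cell
    print(trajectory_features(track))
    ```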

  14. Scalable metadata environments (MDE): artistically impelled immersive environments for large-scale data exploration

    NASA Astrophysics Data System (ADS)

    West, Ruth G.; Margolis, Todd; Prudhomme, Andrew; Schulze, Jürgen P.; Mostafavi, Iman; Lewis, J. P.; Gossmann, Joachim; Singh, Rajvikram

    2014-02-01

    Scalable Metadata Environments (MDEs) are an artistic approach for designing immersive environments for large-scale data exploration in which users interact with data by forming multiscale patterns that they alternately disrupt and reform. Developed and prototyped as part of an art-science research collaboration, an MDE is defined as a 4D virtual environment structured by quantitative and qualitative metadata describing multidimensional data collections. Entire data sets (e.g., tens of millions of records) can be visualized and sonified at multiple scales and at different levels of detail so they can be explored interactively in real time within MDEs. They are designed to reflect similarities and differences in the underlying data or metadata such that patterns can be visually/aurally sorted in an exploratory fashion by an observer who is not familiar with the details of the mapping from data to visual, auditory, or dynamic attributes. While many approaches for visual and auditory data mining exist, MDEs are distinct in that they utilize qualitative and quantitative data and metadata to construct multiple interrelated conceptual coordinate systems. These "regions" function as conceptual lattices for scalable auditory and visual representations within virtual environments computationally driven by multi-GPU CUDA-enabled fluid dynamics systems.

  15. Liquidity crises on different time scales

    NASA Astrophysics Data System (ADS)

    Corradi, Francesco; Zaccaria, Andrea; Pietronero, Luciano

    2015-12-01

    We present an empirical analysis of the microstructure of financial markets and, in particular, of the static and dynamic properties of liquidity. We find that on relatively large time scales (15 min) large price fluctuations are connected to the failure of the subtle mechanism of compensation between the flows of market and limit orders: in other words, the missed revelation of the latent order book breaks the dynamical equilibrium between the flows, triggering the large price jumps. On smaller time scales (30 s), instead, the static depletion of the limit order book is an indicator of an intrinsic fragility of the system, which is related to a strongly nonlinear enhancement of the response. In order to quantify this phenomenon we introduce a measure of the liquidity imbalance present in the book and we show that it is correlated to both the sign and the magnitude of the next price movement. These findings provide a quantitative definition of the effective liquidity, which proves to be strongly dependent on the considered time scales.
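
    The liquidity-imbalance measure is not spelled out in the abstract; a generic signed volume imbalance over the best levels of the limit order book conveys the idea. The level volumes below are hypothetical and the function is a sketch, not the paper's definition.

      def book_imbalance(bid_volumes, ask_volumes):
          # Signed imbalance in [-1, 1]; positive values mean a bid-heavy book.
          vb, va = sum(bid_volumes), sum(ask_volumes)
          return (vb - va) / (vb + va)

      bids = [120, 80, 60]   # hypothetical shares at the 3 best bid levels
      asks = [40, 30, 55]    # hypothetical shares at the 3 best ask levels
      print(book_imbalance(bids, asks))  # > 0: suggests upward price pressure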

  16. Liquidity crises on different time scales.

    PubMed

    Corradi, Francesco; Zaccaria, Andrea; Pietronero, Luciano

    2015-12-01

    We present an empirical analysis of the microstructure of financial markets and, in particular, of the static and dynamic properties of liquidity. We find that on relatively large time scales (15 min) large price fluctuations are connected to the failure of the subtle mechanism of compensation between the flows of market and limit orders: in other words, the missed revelation of the latent order book breaks the dynamical equilibrium between the flows, triggering the large price jumps. On smaller time scales (30 s), instead, the static depletion of the limit order book is an indicator of an intrinsic fragility of the system, which is related to a strongly nonlinear enhancement of the response. In order to quantify this phenomenon we introduce a measure of the liquidity imbalance present in the book and we show that it is correlated to both the sign and the magnitude of the next price movement. These findings provide a quantitative definition of the effective liquidity, which proves to be strongly dependent on the considered time scales.

  17. Effect of Logarithmic and Linear Frequency Scales on Parametric Modelling of Tissue Dielectric Data.

    PubMed

    Salahuddin, Saqib; Porter, Emily; Meaney, Paul M; O'Halloran, Martin

    2017-02-01

    The dielectric properties of biological tissues have been studied widely over the past half-century. These properties are used in a vast array of applications, from determining the safety of wireless telecommunication devices to the design and optimisation of medical devices. The frequency-dependent dielectric properties are represented in closed-form parametric models, such as the Cole-Cole model, for use in numerical simulations which examine the interaction of electromagnetic (EM) fields with the human body. In general, the accuracy of EM simulations depends upon the accuracy of the tissue dielectric models. Typically, dielectric properties are measured using a linear frequency scale; however, use of the logarithmic scale has been suggested historically to be more biologically descriptive. Thus, the aim of this paper is to quantitatively compare the Cole-Cole fitting of broadband tissue dielectric measurements collected with both linear and logarithmic frequency scales. In this way, we can determine if appropriate choice of scale can minimise the fit error and thus reduce the overall error in simulations. Using a well-established fundamental statistical framework, the results of the fitting for both scales are quantified. It is found that commonly used performance metrics, such as the average fractional error, are unable to examine the effect of frequency scale on the fitting results due to the averaging effect that obscures large localised errors. This work demonstrates that the broadband fit for these tissues is quantitatively improved when the given data is measured with a logarithmic frequency scale rather than a linear scale, underscoring the importance of frequency scale selection in accurate wideband dielectric modelling of human tissues.
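
    As an illustration of the comparison, the sketch below fits the real part of a single-dispersion Cole-Cole model on a linear and a logarithmic frequency grid spanning the same band. The parameter values, noise level, and band are hypothetical (loosely resembling a high-water-content tissue), not the paper's measured data.

      import numpy as np
      from scipy.optimize import curve_fit

      def cole_cole_real(f, eps_inf, d_eps, tau, alpha):
          # Real part of a single-dispersion Cole-Cole model.
          w = 2 * np.pi * f
          return eps_inf + np.real(d_eps / (1 + (1j * w * tau) ** (1 - alpha)))

      true = np.array([4.0, 46.0, 7e-12, 0.1])     # hypothetical parameters
      f_lin = np.linspace(1e8, 2e10, 101)          # linear grid, 0.1-20 GHz
      f_log = np.logspace(8, np.log10(2e10), 101)  # log grid, same band

      for name, f in (("linear", f_lin), ("log", f_log)):
          y = cole_cole_real(f, *true)
          y += np.random.default_rng(2).normal(0.0, 0.05, f.size)
          popt, _ = curve_fit(cole_cole_real, f, y,
                              p0=(5.0, 40.0, 1e-11, 0.05), maxfev=20000)
          print(name, np.abs(popt - true) / true)  # relative parameter errors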

  18. Effect of Logarithmic and Linear Frequency Scales on Parametric Modelling of Tissue Dielectric Data

    PubMed Central

    Salahuddin, Saqib; Porter, Emily; Meaney, Paul M.; O’Halloran, Martin

    2016-01-01

    The dielectric properties of biological tissues have been studied widely over the past half-century. These properties are used in a vast array of applications, from determining the safety of wireless telecommunication devices to the design and optimisation of medical devices. The frequency-dependent dielectric properties are represented in closed-form parametric models, such as the Cole-Cole model, for use in numerical simulations which examine the interaction of electromagnetic (EM) fields with the human body. In general, the accuracy of EM simulations depends upon the accuracy of the tissue dielectric models. Typically, dielectric properties are measured using a linear frequency scale; however, use of the logarithmic scale has been suggested historically to be more biologically descriptive. Thus, the aim of this paper is to quantitatively compare the Cole-Cole fitting of broadband tissue dielectric measurements collected with both linear and logarithmic frequency scales. In this way, we can determine if appropriate choice of scale can minimise the fit error and thus reduce the overall error in simulations. Using a well-established fundamental statistical framework, the results of the fitting for both scales are quantified. It is found that commonly used performance metrics, such as the average fractional error, are unable to examine the effect of frequency scale on the fitting results due to the averaging effect that obscures large localised errors. This work demonstrates that the broadband fit for these tissues is quantitatively improved when the given data is measured with a logarithmic frequency scale rather than a linear scale, underscoring the importance of frequency scale selection in accurate wideband dielectric modelling of human tissues. PMID:28191324

  19. A Quantitative Socio-hydrological Characterization of Water Security in Large-Scale Irrigation Systems

    NASA Astrophysics Data System (ADS)

    Siddiqi, A.; Muhammad, A.; Wescoat, J. L., Jr.

    2017-12-01

    Large-scale, legacy canal systems, such as the irrigation infrastructure in the Indus Basin in Punjab, Pakistan, have been primarily conceived, constructed, and operated with a techno-centric approach. The emerging socio-hydrological approaches provide a new lens for studying such systems to potentially identify fresh insights for addressing contemporary challenges of water security. In this work, using the partial definition of water security as "the reliable availability of an acceptable quantity and quality of water", supply reliability is construed as a partial measure of water security in irrigation systems. A set of metrics is used to quantitatively study the reliability of surface supply in the canal systems of Punjab, Pakistan, using an extensive dataset of 10-daily surface water deliveries over a decade (2007-2016) and of high-frequency (10-minute) flow measurements over one year. The reliability quantification is based on a comparison of actual deliveries and entitlements, which are a combination of hydrological and social constructs. The socio-hydrological lens highlights critical issues of how flows are measured, monitored, perceived, and experienced from the perspectives of operators (government officials) and users (farmers). The analysis reveals varying levels of reliability (and by extension security) of supply when the data are examined across multiple temporal and spatial scales. The results shed new light on the evolution of water security (as partially measured by supply reliability) for surface irrigation in the Punjab province of Pakistan and demonstrate that "information security" (defined as reliable availability of sufficiently detailed data) is vital for enabling water security. It is found that forecasting and management (both social processes) lead to differences between entitlements and actual deliveries, and there is significant potential to positively affect supply reliability through interventions in the social realm.
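
    One simple way to turn delivery-versus-entitlement comparisons into a reliability number is the share of periods in which delivery falls within a tolerance of the entitlement. The tolerance, period count, and flow values below are hypothetical illustrations, not the paper's metrics.

      import numpy as np

      def supply_reliability(delivered, entitled, tolerance=0.10):
          # Share of periods with delivery within `tolerance` of entitlement,
          # plus the mean delivery-to-entitlement ratio.
          delivered = np.asarray(delivered, dtype=float)
          entitled = np.asarray(entitled, dtype=float)
          ratio = delivered / entitled
          within = np.abs(ratio - 1.0) <= tolerance
          return within.mean(), ratio.mean()

      # Hypothetical 10-daily deliveries vs. entitlements over one year.
      rng = np.random.default_rng(3)
      entitled = np.full(36, 100.0)                     # e.g., cusecs
      delivered = entitled * rng.normal(0.9, 0.15, 36)  # shortfall plus noise
      print(supply_reliability(delivered, entitled))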

  20. Will Systems Biology Deliver Its Promise and Contribute to the Development of New or Improved Vaccines? What Really Constitutes the Study of "Systems Biology" and How Might Such an Approach Facilitate Vaccine Design.

    PubMed

    Germain, Ronald N

    2017-10-16

    A dichotomy exists in the field of vaccinology about the promise versus the hype associated with application of "systems biology" approaches to rational vaccine design. Some feel it is the only way to efficiently uncover currently unknown parameters controlling desired immune responses or discover what elements actually mediate these responses. Others feel that traditional experimental, often reductionist, methods for incrementally unraveling complex biology provide a more solid way forward, and that "systems" approaches are costly ways to collect data without gaining true insight. Here I argue that both views are inaccurate. This is largely because of confusion about what can be gained from classical experimentation versus statistical analysis of large data sets (bioinformatics) versus methods that quantitatively explain emergent properties of complex assemblies of biological components, with the latter reflecting what was previously called "physiology." Reductionist studies will remain essential for generating detailed insight into the functional attributes of specific elements of biological systems, but such analyses lack the power to provide a quantitative and predictive understanding of global system behavior. But by employing (1) large-scale screening methods for discovery of unknown components and connections in the immune system (omics), (2) statistical analysis of large data sets (bioinformatics), and (3) the capacity of quantitative computational methods to translate these individual components and connections into models of emergent behavior (systems biology), we will be able to better understand how the overall immune system functions and to determine with greater precision how to manipulate it to produce desired protective responses. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  1. COMPARISON OF THE SINK CHARACTERISTICS OF THREE FULL-SCALE ENVIRONMENTAL CHAMBERS

    EPA Science Inventory

    The paper gives results of an investigation of the interaction of vapor-phase organic compounds with the interior surfaces of three large dynamic test chambers. A pattern of adsorption and reemission of the test compounds was observed in all three chambers. Quantitative compari...

  2. A case report of evaluating a large-scale health systems improvement project in an uncontrolled setting: a quality improvement initiative in KwaZulu-Natal, South Africa.

    PubMed

    Mate, Kedar S; Ngidi, Wilbroda Hlolisile; Reddy, Jennifer; Mphatswe, Wendy; Rollins, Nigel; Barker, Pierre

    2013-11-01

    New approaches are needed to evaluate quality improvement (QI) within large-scale public health efforts. This case report details challenges to large-scale QI evaluation and proposes solutions relying on adaptive study design. We used two sequential evaluative methods to study a QI effort to improve delivery of HIV preventive care in public health facilities in three districts in KwaZulu-Natal, South Africa, over a 3-year period. We initially used a cluster randomised controlled trial (RCT) design. During the RCT study period, tensions arose between intervention implementation and evaluation design due to loss of integrity of the randomisation unit over time, pressure to implement changes across the randomisation unit boundaries, and use of administrative rather than functional structures for the randomisation. In response to this loss of design integrity, we switched to a more flexible intervention design and a mixed-methods quasi-experimental evaluation relying on both a qualitative analysis and an interrupted time series quantitative analysis. Cluster RCT designs may not be optimal for evaluating complex interventions to improve implementation in uncontrolled 'real world' settings. More flexible, context-sensitive evaluation designs offer a better balance of the need to adjust the intervention during the evaluation to meet implementation challenges while providing the data required to evaluate effectiveness. Our case study involved HIV care in a resource-limited setting, but these issues likely apply to complex improvement interventions in other settings.
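
    Interrupted time series analyses of this kind are commonly implemented as segmented regression, with terms for the pre-intervention level and trend plus post-intervention changes in both. The sketch below uses made-up monthly data and is not the study's model.

      import numpy as np

      # Segmented regression for an interrupted time series:
      # y = b0 + b1*t + b2*post + b3*(t - t0)*post,
      # where `post` flags periods at or after the intervention time t0.
      t = np.arange(36)
      t0 = 18
      post = (t >= t0).astype(float)
      rng = np.random.default_rng(4)
      y = 50 + 0.2 * t + 8 * post + 0.9 * (t - t0) * post + rng.normal(0, 2, t.size)

      X = np.column_stack([np.ones_like(t), t, post, (t - t0) * post])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print(beta)  # [baseline level, baseline slope, level change, slope change]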

  3. A pilot rating scale for evaluating failure transients in electronic flight control systems

    NASA Technical Reports Server (NTRS)

    Hindson, William S.; Schroeder, Jeffery A.; Eshow, Michelle M.

    1990-01-01

    A pilot rating scale was developed to describe the effects of transients in helicopter flight-control systems on safety-of-flight and on pilot recovery action. The scale was applied to the evaluation of hardovers that could potentially occur in the digital flight-control system being designed for a variable-stability UH-60A research helicopter. Tests were conducted in a large moving-base simulator and in flight. The results of the investigation were combined with existing airworthiness criteria to determine quantitative reliability design goals for the control system.

  4. Individuals with autism have higher 8-Iso-PGF2α levels than controls, but no correlation with quantitative assay of Paraoxonase 1 serum levels.

    PubMed

    Pop, Bianca; Niculae, Alexandru-Ștefan; Pop, Tudor Lucian; Răchișan, Andreea Liana

    2017-12-01

    Autism spectrum disorder (ASD) represents a very large set of neurodevelopmental issues with diverse clinical outcomes. Various hypotheses have been put forth for the etiology of ASD, including issues pertaining to oxidative stress. In this study, we measured serum 8-iso-Prostaglandin F2α (8-iso-PGF2α, a product of non-enzymatically mediated polyunsaturated fatty acid oxidation) in a population of individuals with autism and a group of age- and sex-matched controls. A quantitative assay of Paraoxonase 1 (PON1) was also conducted. Data regarding comorbidities, structural MRI scans, medication, intelligence quotient (IQ) and Childhood Autism Rating Scale (CARS) scores were also included in our study. Our results show that patients diagnosed with autism have higher levels of 8-iso-PGF2α than their neurotypical counterparts. Levels of this particular metabolite, however, do not correlate with quantitative serum levels of Paraoxonase 1, which has been shown to be altered in individuals with autism. Neither 8-iso-PGF2α nor quantitative levels of PON1 provide any meaningful correlation with clinical or neuroimaging data in this study group. Future research should focus on providing data regarding PON1 phenotype, in addition to standard quantitative measurements, in relation to 8-iso-PGF2α as well as other clinical and structural brain findings.

  5. Psychometric Properties of the Quantitative Myasthenia Gravis Score and the Myasthenia Gravis Composite Scale.

    PubMed

    Barnett, Carolina; Merkies, Ingemar S J; Katzberg, Hans; Bril, Vera

    2015-09-02

    The Quantitative Myasthenia Gravis Score and the Myasthenia Gravis Composite are two commonly used outcome measures in myasthenia gravis. So far, their measurement properties have not been compared, so we aimed to study their psychometric properties using the Rasch model. A total of 251 patients with stable myasthenia gravis were assessed with both scales, and 211 patients returned for a second assessment. We studied fit to the Rasch model at the first visit, and compared item fit, thresholds, differential item functioning, local dependence, person separation index, and tests for unidimensionality. We also assessed test-retest reliability and estimated the minimal detectable change. Neither scale fit the Rasch model (χ², p < 0.05). The Myasthenia Gravis Composite had lower discrimination properties than the Quantitative Myasthenia Gravis Score (person separation index: 0.14 vs. 0.70). There was local dependence in both scales, as well as differential item functioning for ocular and generalized disease. Disordered thresholds were found in 6 (60%) items of the Myasthenia Gravis Composite and in 4 (31%) items of the Quantitative Myasthenia Gravis Score. Both tools had adequate test-retest reliability (ICCs > 0.8). The minimal detectable change was 4.9 points for the Myasthenia Gravis Composite and 4.3 points for the Quantitative Myasthenia Gravis Score. Neither scale fulfilled Rasch model expectations. The Quantitative Myasthenia Gravis Score (QMGS) has higher discrimination than the Myasthenia Gravis Composite. Both tools have items with disordered thresholds, differential item functioning and local dependency. There was evidence of multidimensionality in the QMGS. The minimal detectable change values are higher than the minimal significant changes reported in previous studies. These findings might inform future modifications of these tools.
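
    The person separation index contrasted above can be computed, in its reliability form, from person location estimates and their standard errors: the share of observed person variance not attributable to measurement error. The values below are hypothetical, not the study's estimates.

      import numpy as np

      def person_separation_index(theta, se):
          # Reliability-form PSI: (observed variance - mean error variance)
          # divided by observed variance of the person estimates.
          theta, se = np.asarray(theta), np.asarray(se)
          obs_var = theta.var(ddof=1)
          err_var = np.mean(se ** 2)
          return max(0.0, (obs_var - err_var) / obs_var)

      # Hypothetical person locations (logits) and their standard errors.
      theta = np.array([-1.2, -0.4, 0.1, 0.3, 0.9, 1.5])
      se = np.array([0.55, 0.50, 0.48, 0.49, 0.52, 0.60])
      print(person_separation_index(theta, se))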

  6. Multi-scale modeling of diffusion-controlled reactions in polymers: renormalisation of reactivity parameters.

    PubMed

    Everaers, Ralf; Rosa, Angelo

    2012-01-07

    The quantitative description of polymeric systems requires hierarchical modeling schemes, which bridge the gap between the atomic scale, relevant to chemical or biomolecular reactions, and the macromolecular scale, where the longest relaxation modes occur. Here, we use the formalism for diffusion-controlled reactions in polymers developed by Wilemski, Fixman, and Doi to discuss the renormalisation of the reactivity parameters in polymer models with varying spatial resolution. In particular, we show that the adjustments are independent of chain length. As a consequence, it is possible to match reaction times between descriptions with different resolutions for relatively short reference chains and to use the coarse-grained model to make quantitative predictions for longer chains. We illustrate our results by a detailed discussion of the classical problem of chain cyclization in the Rouse model, which offers the simplest example of a multi-scale description if we consider differently discretized Rouse models of the same physical system. Moreover, we are able to explore different combinations of compact and non-compact diffusion in the local and large-scale dynamics by varying the embedding dimension.

  7. Disproportionate photosynthetic decline and inverse relationship between constitutive and induced volatile emissions upon feeding of Quercus robur leaves by large larvae of gypsy moth (Lymantria dispar)

    PubMed Central

    Copolovici, Lucian; Pag, Andreea; Kännaste, Astrid; Bodescu, Adina; Tomescu, Daniel; Copolovici, Dana; Soran, Maria-Loredana; Niinemets, Ülo

    2018-01-01

    Gypsy moth (Lymantria dispar L., Lymantriinae) is a major pest of pedunculate oak (Quercus robur) forests in Europe, but how its infestations scale with foliage physiological characteristics, in particular with photosynthesis rates and emissions of volatile organic compounds, has not been studied. Unlike the majority of insect herbivores, large larvae of L. dispar rapidly consume leaf area and can also bite through tough tissues, including secondary and primary leaf veins. Given these rapid and devastating feeding responses, we hypothesized that infestation of Q. robur leaves by L. dispar leads to disproportionate scaling of leaf photosynthesis and constitutive isoprene emissions with damaged leaf area, and to less prominent enhancements of induced volatile release. Leaves with 0% (control) to 50% of leaf area removed by larvae were studied. Across this range of infestation severity, all physiological characteristics were quantitatively correlated with the degree of damage, but these traits changed disproportionately with it. The net assimilation rate was reduced by almost 10-fold and constitutive isoprene emissions by more than 7-fold, whereas the emissions of green leaf volatiles, monoterpenes, methyl salicylate and the homoterpene (3E)-4,8-dimethyl-1,3,7-nonatriene scaled negatively and almost linearly with net assimilation rate through damage treatments. This study demonstrates that feeding by large insect herbivores disproportionately alters photosynthetic rate and constitutive isoprene emissions. Furthermore, the leaves have a surprisingly large capacity for enhancement of induced emissions even when foliage photosynthetic function is severely impaired. PMID:29367792

  8. Large-scale label-free quantitative proteomics of the pea aphid-Buchnera symbiosis.

    PubMed

    Poliakov, Anton; Russell, Calum W; Ponnala, Lalit; Hoops, Harold J; Sun, Qi; Douglas, Angela E; van Wijk, Klaas J

    2011-06-01

    Many insects are nutritionally dependent on symbiotic microorganisms that have tiny genomes and are housed in specialized host cells called bacteriocytes. The obligate symbiosis between the pea aphid Acyrthosiphon pisum and the γ-proteobacterium Buchnera aphidicola (only 584 predicted proteins) is particularly amenable to molecular analysis because the genomes of both partners have been sequenced. To better define the symbiotic relationship between this aphid and Buchnera, we used large-scale, high-accuracy tandem mass spectrometry (nanoLC-LTQ-Orbitrap) to identify aphid and Buchnera proteins in the whole aphid body, purified bacteriocytes, isolated Buchnera cells and the residual bacteriocyte fraction. More than 1900 aphid and 400 Buchnera proteins were identified. All enzymes in amino acid metabolism annotated in the Buchnera genome were detected, reflecting the high (68%) coverage of the proteome and supporting the core function of Buchnera in the aphid symbiosis. Transporters mediating the transport of predicted metabolites were present in the bacteriocyte. Label-free spectral counting combined with hierarchical clustering allowed us to define the quantitative distribution of a subset of these proteins across both symbiotic partners, yielding no evidence for the selective transfer of proteins between the partners in either direction. This is the first quantitative proteome analysis of bacteriocyte symbiosis, providing a wealth of information about the molecular function of both the host cell and the bacterial symbiont.
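
    Spectral-count quantification is often normalized for protein length, for example as normalized spectral abundance factors (NSAF); whether this paper used that exact normalization is not stated in the abstract, so the sketch below is generic and the counts and lengths are hypothetical.

      def nsaf(spectral_counts, lengths):
          # Normalized spectral abundance factor: (SpC/L) scaled to sum to 1.
          saf = [c / l for c, l in zip(spectral_counts, lengths)]
          total = sum(saf)
          return [s / total for s in saf]

      # Hypothetical counts and protein lengths (residues) for three proteins.
      counts = [120, 45, 10]
      lengths = [450, 300, 220]
      print(nsaf(counts, lengths))  # relative abundances within the sample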

  9. Response of human populations to large-scale emergencies

    NASA Astrophysics Data System (ADS)

    Bagrow, James; Wang, Dashun; Barabási, Albert-László

    2010-03-01

    Until recently, little quantitative data regarding collective human behavior during dangerous events such as bombings and riots have been available, despite its importance for emergency management, safety and urban planning. Understanding how populations react to danger is critical for prediction, detection and intervention strategies. Using a large telecommunications dataset, we study for the first time the spatiotemporal, social and demographic response properties of people during several disasters, including a bombing, a city-wide power outage, and an earthquake. Call activity rapidly increases after an event and we find that, when faced with a truly life-threatening emergency, information rapidly propagates through a population's social network. Other events, such as sports games, do not exhibit this propagation.

  10. Quantifying patterns of research interest evolution

    NASA Astrophysics Data System (ADS)

    Jia, Tao; Wang, Dashun; Szymanski, Boleslaw

    Changing and shifting research interest is an integral part of a scientific career. Despite extensive investigations of various factors that influence a scientist's choice of research topics, quantitative assessments of mechanisms that give rise to macroscopic patterns characterizing research interest evolution of individual scientists remain limited. Here we perform a large-scale analysis of extensive publication records, finding that research interest change follows a reproducible pattern characterized by an exponential distribution. We identify three fundamental features responsible for the observed exponential distribution, which arise from a subtle interplay between exploitation and exploration in research interest evolution. We develop a random walk based model, which adequately reproduces our empirical observations. Our study presents one of the first quantitative analyses of macroscopic patterns governing research interest change, documenting a high degree of regularity underlying scientific research and individual careers.

  11. The Role of Endocytosis during Morphogenetic Signaling

    PubMed Central

    Gonzalez-Gaitan, Marcos; Jülicher, Frank

    2014-01-01

    Morphogens are signaling molecules that are secreted by a localized source and spread in a target tissue where they are involved in the regulation of growth and patterning. Both the activity of morphogenetic signaling and the kinetics of ligand spreading in a tissue depend on endocytosis and intracellular trafficking. Here, we review quantitative approaches to study how large-scale morphogen profiles and signals emerge in a tissue from cellular trafficking processes and endocytic pathways. Starting from the kinetics of endosomal networks, we discuss the role of cellular trafficking and receptor dynamics in the formation of morphogen gradients. These morphogen gradients scale during growth, which implies that overall tissue size influences cellular trafficking kinetics. Finally, we discuss how such morphogen profiles can be used to control tissue growth. We emphasize the role of theory in efforts to bridge between scales. PMID:24984777
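
    A standard quantitative idealization behind such gradient formation (used here only as background, not as this review's specific model) is the synthesis-diffusion-degradation picture, in which a ligand produced at x = 0 diffuses with coefficient D and is degraded at rate k, giving a steady-state exponential profile:

      \partial_t C = D\,\partial_x^2 C - kC, \qquad
      C_{\mathrm{ss}}(x) = C_0\, e^{-x/\lambda}, \qquad
      \lambda = \sqrt{D/k}.

    Scaling of the gradient with tissue size then amounts to the decay length λ growing in proportion to the tissue, for instance through growth-dependent changes in trafficking that alter D or k.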

  12. The Small College Administrative Environment.

    ERIC Educational Resources Information Center

    Buzza, Bonnie Wilson

    Environmental differences for speech departments at large and small colleges are not simply of scale; there are qualitative as well as quantitative differences. At small colleges, faculty are hired as teachers, rather than as researchers. Because speech teachers at small colleges must be generalists, and because it is often difficult to replace…

  13. Potential Large-Scale Production of Conjugated Soybean Oil Catalyzed by Photolyzed Iodine in Hexanes

    USDA-ARS?s Scientific Manuscript database

    A laboratory apparatus is described for the production of conjugated soybean oil (SBO) in pound quantities via irradiation with visible-light. Under our reaction conditions, quantitative conversions (determined by NMR spectroscopy) of SBO to conjugated SBO, in hexanes at reflux temperatures, were a...

  14. Scale dependence of the alignment between strain rate and rotation in turbulent shear flow

    NASA Astrophysics Data System (ADS)

    Fiscaletti, D.; Elsinga, G. E.; Attili, A.; Bisetti, F.; Buxton, O. R. H.

    2016-10-01

    The scale dependence of the statistical alignment tendencies of the eigenvectors of the strain-rate tensor e_i with the vorticity vector ω is examined in the self-preserving region of a planar turbulent mixing layer. Data from a direct numerical simulation are filtered at various length scales and the probability density functions of the magnitude of the alignment cosines between the two unit vectors, |e_i · ω̂|, are examined. It is observed that the alignment tendencies are insensitive to the concurrent large-scale velocity fluctuations, but are quantitatively affected by the nature of the concurrent large-scale velocity-gradient fluctuations. It is confirmed that the small-scale (local) vorticity vector is preferentially aligned in parallel with the large-scale (background) extensive strain-rate eigenvector e_1, in contrast to the global tendency for ω to be aligned in parallel with the intermediate strain-rate eigenvector [Hamlington et al., Phys. Fluids 20, 111703 (2008), 10.1063/1.3021055]. When only data from regions of the flow that exhibit strong swirling are included, the so-called high-enstrophy worms, the alignment tendencies are exaggerated with respect to the global picture. These findings support the notion that the production of enstrophy, responsible for a net cascade of turbulent kinetic energy from large scales to small scales, is driven by vorticity stretching due to the preferential parallel alignment between ω and the nonlocal e_1, and that the strongly swirling worms are kinematically significant to this process.
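
    At a single point, the alignment cosines are obtained from the symmetric part of the velocity-gradient tensor and the vorticity vector; the tensor values below are hypothetical, whereas in the study such statistics are accumulated over the filtered DNS fields.

      import numpy as np

      # Hypothetical velocity-gradient tensor A_ij = du_i/dx_j at one point.
      A = np.array([[ 0.10,  0.80, -0.30],
                    [-0.20, -0.05,  0.40],
                    [ 0.50, -0.10, -0.05]])

      S = 0.5 * (A + A.T)                       # strain-rate tensor
      omega = np.array([A[2, 1] - A[1, 2],      # vorticity vector, curl(u)
                        A[0, 2] - A[2, 0],
                        A[1, 0] - A[0, 1]])
      omega_hat = omega / np.linalg.norm(omega)

      vals, vecs = np.linalg.eigh(S)            # eigenvalues in ascending order
      order = np.argsort(vals)[::-1]            # e1 = most extensive eigenvector
      for i, k in enumerate(order, start=1):
          print(f"|e{i} . omega_hat| = {abs(vecs[:, k] @ omega_hat):.3f}")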

  15. Floods, floodplains, delta plains — A satellite imaging approach

    NASA Astrophysics Data System (ADS)

    Syvitski, James P. M.; Overeem, Irina; Brakenridge, G. Robert; Hannon, Mark

    2012-08-01

    Thirty-three lowland floodplains and their associated delta plains are characterized with data from three remote sensing systems (AMSR-E, SRTM and MODIS). These data provide new quantitative information to characterize Late Quaternary floodplain landscapes and their penchant for flooding over the last decade. Daily proxy records for discharge since 2002 and for each of the 33 river systems can be derived with novel Advanced Microwave Scanning Radiometer (AMSR-E) methods. A descriptive framework based on analysis of Shuttle Radar Topography Mission (SRTM) data is used to capture the major landscape-scale floodplain elements or zones: 1) container valleys with their long and narrow pathways of largely sediment transit and bypass, 2) floodplain depressions that act as loci for frequent flooding and sediment storage, 3) zones of nodal avulsions common to many continental scale rivers, and often located seaward of container valleys, and 4) coastal floodplains and delta plains that offer both sediment bypass and storage but under the influence of marine processes. The SRTM data allow mapping of smaller-scale architectural elements in an unprecedented, systematic manner. Floodplain depressions were found to play a major role, which may largely be overlooked in conceptual floodplain models. Lastly, MODIS data (independently and combined with AMSR-E) allows the tracking of flood hydrographs and pathways and sedimentation patterns on a near-daily timescale worldwide. These remote-sensing data show that 85% of the studied major river systems experienced extensive flooding in the last decade. A new quantitative paradigm of floodplain processes, honoring the frequency and extent of floods, can be developed through careful analysis of these new remotely sensed data.

  16. The memory remains: Understanding collective memory in the digital age

    PubMed Central

    García-Gavilanes, Ruth; Mollgaard, Anders; Tsvetkova, Milena; Yasseri, Taha

    2017-01-01

    Recently developed information communication technologies, particularly the Internet, have affected how we, both as individuals and as a society, create, store, and recall information. The Internet also provides us with a great opportunity to study memory using transactional large-scale data in a quantitative framework similar to the practice in natural sciences. We make use of online data by analyzing viewership statistics of Wikipedia articles on aircraft crashes. We study the relation between recent events and past events and particularly focus on understanding memory-triggering patterns. We devise a quantitative model that explains the flow of viewership from a current event to past events based on similarity in time, geography, topic, and the hyperlink structure of Wikipedia articles. We show that, on average, the secondary flow of attention to past events generated by these remembering processes is larger than the primary attention flow to the current event. We report these previously unknown cascading effects. PMID:28435881

  17. The memory remains: Understanding collective memory in the digital age.

    PubMed

    García-Gavilanes, Ruth; Mollgaard, Anders; Tsvetkova, Milena; Yasseri, Taha

    2017-04-01

    Recently developed information communication technologies, particularly the Internet, have affected how we, both as individuals and as a society, create, store, and recall information. The Internet also provides us with a great opportunity to study memory using transactional large-scale data in a quantitative framework similar to the practice in natural sciences. We make use of online data by analyzing viewership statistics of Wikipedia articles on aircraft crashes. We study the relation between recent events and past events and particularly focus on understanding memory-triggering patterns. We devise a quantitative model that explains the flow of viewership from a current event to past events based on similarity in time, geography, topic, and the hyperlink structure of Wikipedia articles. We show that, on average, the secondary flow of attention to past events generated by these remembering processes is larger than the primary attention flow to the current event. We report these previously unknown cascading effects.

  18. Large wood in the Snowy River estuary, Australia

    NASA Astrophysics Data System (ADS)

    Hinwood, Jon B.; McLean, Errol J.

    2017-02-01

    In this paper we report on 8 years of data collection and interpretation of large wood in the Snowy River estuary in southeastern Australia, providing quantitative data on the amount, sources, transport, decay, and geomorphic actions. No prior census data for an estuary is known to the authors despite their environmental and economic importance and the significant differences between a fluvial channel and an estuarine channel. Southeastern Australian estuaries contain a significant quantity of large wood that is derived from many sources, including river flood flows, local bank erosion, and anthropogenic sources. Wind and tide are shown to be as important as river flow in transporting and stranding large wood. Tidal action facilitates trapping of large wood on intertidal bars and shoals; but channels are wider and generally deeper, so log jams are less likely than in rivers. Estuarine large wood contributes to localised scour and accretion and hence to the modification of estuarine habitat, but in the study area it did not have large-scale impacts on the hydraulic gradients nor the geomorphology.

  19. Preservation of large-scale chromatin structure in FISH experiments

    PubMed Central

    Hepperger, Claudia; Otten, Simone; von Hase, Johann

    2006-01-01

    The nuclear organization of specific endogenous chromatin regions can be investigated only by fluorescence in situ hybridization (FISH). One of two fixation procedures is typically applied: (1) buffered formaldehyde or (2) hypotonic shock with methanol-acetic acid fixation followed by dropping of nuclei on glass slides and air drying. In this study, we compared the effects of these two procedures and some variations on nuclear morphology and on FISH signals. We analyzed mouse erythroleukemia and mouse embryonic stem cells because their clusters of subcentromeric heterochromatin provide an easy means to assess preservation of chromatin. Qualitative and quantitative analyses revealed that formaldehyde fixation provided good preservation of large-scale chromatin structures, while classical methanol-acetic acid fixation after hypotonic treatment severely impaired nuclear shape and led to disruption of chromosome territories, heterochromatin structures, and large transgene arrays. Our data show that such preparations do not faithfully reflect in vivo nuclear architecture. Supplementary material is available in the online version of this article at http://dx.doi.org/10.1007/s00412-006-0084-2 and is accessible for authorized users. PMID:17119992

  20. Quantitative and Functional Requirements for Bioluminescent Cancer Models.

    PubMed

    Feys, Lynn; Descamps, Benedicte; Vanhove, Christian; Vermeulen, Stefan; Vandesompele, J O; Vanderheyden, Katrien; Messens, Kathy; Bracke, Marc; De Wever, Olivier

    2016-01-01

    Bioluminescent cancer models are widely used but detailed quantification of the luciferase signal and functional comparison with a non-transfected control cell line are generally lacking. In the present study, we provide quantitative and functional tests for luciferase-transfected cells. We quantified the luciferase expression in BLM and HCT8/E11 transfected cancer cells, and examined the effect of long-term luciferin exposure. The present study also investigated functional differences between parental and transfected cancer cells. Our results showed that quantification of different single-cell-derived populations is performed best with droplet digital polymerase chain reaction. Quantification of luciferase protein level and luciferase bioluminescent activity is only useful when there is a significant difference in copy number. Continuous exposure of cell cultures to luciferin leads to inhibitory effects on mitochondrial activity, cell growth and bioluminescence. These inhibitory effects correlate with luciferase copy number. Cell culture and mouse xenograft assays showed no significant functional differences between luciferase-transfected and parental cells. Luciferase-transfected cells should be validated by quantitative and functional assays before starting large-scale experiments. Copyright © 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  1. Planar isotropy of passive scalar turbulent mixing with a mean perpendicular gradient.

    PubMed

    Danaila, L; Dusek, J; Le Gal, P; Anselmet, F; Brun, C; Pumir, A

    1999-08-01

    A recently proposed evolution equation [Vaienti et al., Physica D 85, 405 (1994)] for the probability density functions (PDF's) of turbulent passive scalar increments obtained under the assumptions of fully three-dimensional homogeneity and isotropy is submitted to validation using direct numerical simulation (DNS) results of the mixing of a passive scalar with a nonzero mean gradient by a homogeneous and isotropic turbulent velocity field. It is shown that this approach leads to a quantitatively correct balance between the different terms of the equation, in a plane perpendicular to the mean gradient, at small scales and at large Péclet number. A weaker assumption of homogeneity and isotropy restricted to the plane normal to the mean gradient is then considered to derive an equation describing the evolution of the PDF's as a function of the spatial scale and the scalar increments. A very good agreement between the theory and the DNS data is obtained at all scales. As a particular case of the theory, we derive a generalized form for the well-known Yaglom equation (the isotropic relation between the second-order moments for temperature increments and the third-order velocity-temperature mixed moments). This approach allows us to determine quantitatively how the integral scale properties influence the properties of mixing throughout the whole range of scales. In the simple configuration considered here, the PDF's of the scalar increments perpendicular to the mean gradient can be theoretically described once the sources of inhomogeneity and anisotropy at large scales are correctly taken into account.
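
    In one common normalization (prefactor conventions vary across the literature), the classical Yaglom relation mentioned above reads

      \left\langle \delta u_{\parallel}\,(\delta\theta)^{2} \right\rangle
        = -\frac{4}{3}\,\bar{\varepsilon}_{\theta}\, r, \qquad
      \delta\theta = \theta(\mathbf{x}+\mathbf{r}) - \theta(\mathbf{x}),

    where δu_∥ is the longitudinal velocity increment over separation r and ε̄_θ is the mean dissipation rate of scalar variance; the generalized form derived in the paper additionally retains the large-scale source terms associated with the mean gradient.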

  2. Structured Qualitative Research: Organizing “Mountains of Words” for Data Analysis, both Qualitative and Quantitative

    PubMed Central

    Johnson, Bruce D.; Dunlap, Eloise; Benoit, Ellen

    2008-01-01

    Qualitative research creates mountains of words. U.S. federal funding supports mostly structured qualitative research, which is designed to test hypotheses using semi-quantitative coding and analysis. The authors have 30 years of experience in designing and completing major qualitative research projects, mainly funded by the US National Institute on Drug Abuse [NIDA]. This article reports on strategies for planning, organizing, collecting, managing, storing, retrieving, analyzing, and writing about qualitative data so as to most efficiently manage the mountains of words collected in large-scale ethnographic projects. Multiple benefits accrue from this approach. Several different staff members can contribute to the data collection, even when working from remote locations. Field expenditures are linked to units of work so productivity is measured, many staff in various locations have access to use and analyze the data, quantitative data can be derived from data that is primarily qualitative, and improved efficiencies of resources are developed. The major difficulties involve a need for staff who can program and manage large databases, and who can be skillful analysts of both qualitative and quantitative data. PMID:20222777

  3. Statistical mechanics of neocortical interactions: A scaling paradigm applied to electroencephalography

    NASA Astrophysics Data System (ADS)

    Ingber, Lester

    1991-09-01

    A series of papers has developed a statistical mechanics of neocortical interactions (SMNI), deriving aggregate behavior of experimentally observed columns of neurons from statistical electrical-chemical properties of synaptic interactions. While not useful to yield insights at the single-neuron level, SMNI has demonstrated its capability in describing large-scale properties of short-term memory and electroencephalographic (EEG) systematics. The necessity of including nonlinear and stochastic structures in this development has been stressed. In this paper, a more stringent test is placed on SMNI: The algebraic and numerical algorithms previously developed in this and similar systems are brought to bear to fit large sets of EEG and evoked-potential data being collected to investigate genetic predispositions to alcoholism and to extract brain "signatures" of short-term memory. Using the numerical algorithm of very fast simulated reannealing, it is demonstrated that SMNI can indeed fit these data within experimentally observed ranges of its underlying neuronal-synaptic parameters, and the quantitative modeling results are used to examine physical neocortical mechanisms to discriminate high-risk and low-risk populations genetically predisposed to alcoholism. Since this study is a control to span relatively long time epochs, similar to earlier attempts to establish such correlations, this discrimination is inconclusive because of other neuronal activity which can mask such effects. However, the SMNI model is shown to be consistent with EEG data during selective attention tasks and with neocortical mechanisms describing short-term memory previously published using this approach. This paper explicitly identifies similar nonlinear stochastic mechanisms of interaction at the microscopic-neuronal, mesoscopic-columnar, and macroscopic-regional scales of neocortical interactions. These results give strong quantitative support for an accurate intuitive picture, portraying neocortical interactions as having common algebraic or physics mechanisms that scale across quite disparate spatial scales and functional or behavioral phenomena, i.e., describing interactions among neurons, columns of neurons, and regional masses of neurons.

  4. Quantitative elemental imaging of heterogeneous catalysts using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Trichard, F.; Sorbier, L.; Moncayo, S.; Blouët, Y.; Lienemann, C.-P.; Motto-Ros, V.

    2017-07-01

    Currently, the use of catalysis is widespread in almost all industrial processes; its use improves productivity, synthesis yields and waste treatment as well as decreases energy costs. The increasingly stringent requirements, in terms of reaction selectivity and environmental standards, impose progressively increasing accuracy and control of operations. Meanwhile, the development of characterization techniques has been challenging, and the techniques often require equipment with high complexity. In this paper, we demonstrate a novel elemental approach for performing quantitative space-resolved analysis with ppm-scale quantification limits and μm-scale resolution. This approach, based on laser-induced breakdown spectroscopy (LIBS), is distinguished by its simplicity, all-optical design, and speed of operation. This work analyzes palladium-based porous alumina catalysts, which are commonly used in the selective hydrogenation process, using the LIBS method. We report an exhaustive study of the quantification capability of LIBS and its ability to perform imaging measurements over a large dynamic range, typically from a few ppm to wt%. These results offer new insight into the use of LIBS-based imaging in the industry and paves the way for innumerable applications.

  5. Formation and representation: Critical analyses of identity, supply, and demand in science, technology, engineering, and mathematics

    NASA Astrophysics Data System (ADS)

    Mandayam Doddamane, Prabha

    2011-12-01

    Considerable research, policy, and programmatic efforts have been dedicated to addressing the participation of particular populations in STEM for decades. Each of these efforts claims equity-related goals; yet, they heavily frame the problem, through pervasive STEM pipeline model discourse, in terms of national needs, workforce supply, and competitiveness. This particular framing of the problem may, indeed, be counter to equity goals, especially when paired with policy that largely relies on statistical significance and broad aggregation of data over exploring the identities and experiences of the populations targeted for equitable outcomes in that policy. In this study, I used the mixed-methods approach of critical discourse and critical quantitative analyses to understand how the pipeline model ideology has become embedded within academic discourse, research, and data surrounding STEM education and work and to provide alternatives for quantitative analysis. Using critical theory as a lens, I first conducted a critical discourse analysis of contemporary STEM workforce studies with a particular eye to pipeline ideology. Next, I used that analysis to inform logistic regression analyses of the 2006 SESTAT data. This quantitative analysis compared and contrasted different ways of thinking about identity and retention. Overall, the findings of this study show that many subjective choices are made in the construction of the large-scale datasets used to inform much national science and engineering policy and that these choices greatly influence likelihood of retention outcomes.

  6. Quantitative characterization of conformational-specific protein-DNA binding using a dual-spectral interferometric imaging biosensor

    NASA Astrophysics Data System (ADS)

    Zhang, Xirui; Daaboul, George G.; Spuhler, Philipp S.; Dröge, Peter; Ünlü, M. Selim

    2016-03-01

    DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information on DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformational specific protein-DNA interactions. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr06785e

  7. Assessing Psychodynamic Conflict.

    PubMed

    Simmonds, Joshua; Constantinides, Prometheas; Perry, J Christopher; Drapeau, Martin; Sheptycki, Amanda R

    2015-09-01

    Psychodynamic psychotherapies suggest that symptomatic relief is provided, in part, by the resolution of psychic conflicts. Clinical researchers have used innovative methods to investigate such phenomena. This article aims to review the literature on quantitative psychodynamic conflict rating scales. An electronic search of the literature was conducted to retrieve quantitative observer-rated scales used to assess conflict, noting each measure's theoretical model, information source, and the training and clinical experience required. Scales were also examined for levels of reliability and validity. Five quantitative observer-rated conflict scales were identified. Reliability varied from poor to excellent, with each measure demonstrating good validity. However, the small number of studies and limited links to current conflict theory suggest that further clinical research is needed.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mezyk, Stephen P.; Mincher, Bruce J.; Nilsson, Mikael

    This document is the final report for the Nuclear Energy Universities Program (NEUP) grant 10-910 (DE-AC07-05ID14517) “Alpha Radiolysis of Nuclear Solvent Extraction Ligands used for An(III) and Ln(III) Separations”. The goal of this work was to obtain a quantitative understanding of the impacts of both low Linear Energy Transfer (LET, gamma-rays) and high LET (alpha particles) radiation chemistry occurring in future large-scale separations processes. This quantitative understanding of the major radiation effects on diluents and ligands is essential for optimal process implementation, and could result in significant cost savings in the future.

  9. Simulation Of Combat With An Expert System

    NASA Technical Reports Server (NTRS)

    Provenzano, J. P.

    1989-01-01

    Proposed expert system predicts outcomes of combat situations. Called COBRA (combat outcome based on rules for attrition), the system selects rules for mathematical modeling of losses and discrete events in combat according to previous experience. It is used with another software module known as the "Game". The Game/COBRA software system, consisting of the Game and COBRA modules, provides for both quantitative and qualitative aspects in simulations of battles. Although COBRA is intended for simulation of large-scale military exercises, the concepts embodied in it have much broader applicability. In industrial research, the knowledge-based system enables qualitative as well as quantitative simulations.

  10. Development of four self-report measures of job stressors and strain: Interpersonal Conflict at Work Scale, Organizational Constraints Scale, Quantitative Workload Inventory, and Physical Symptoms Inventory.

    PubMed

    Spector, P E; Jex, S M

    1998-10-01

    Despite the widespread use of self-report measures of both job-related stressors and strains, relatively few carefully developed scales for which validity data exist are available. In this article, we discuss 3 job stressor scales (Interpersonal Conflict at Work Scale, Organizational Constraints Scale, and Quantitative Workload Inventory) and 1 job strain scale (Physical Symptoms Inventory). Using meta-analysis, we combined the results of 18 studies to provide estimates of relations between our scales and other variables. Data showed moderate convergent validity for the 3 job stressor scales, suggesting some objectivity in these self-reports. Norms for each scale are provided.

  11. Quantitation of heparosan with heparin lyase III and spectrophotometry.

    PubMed

    Huang, Haichan; Zhao, Yingying; Lv, Shencong; Zhong, Weihong; Zhang, Fuming; Linhardt, Robert J

    2014-02-15

    Heparosan is the Escherichia coli K5 capsular polysaccharide and the key precursor for preparing bioengineered heparin. A rapid and effective quantitative method for detecting heparosan is important in the large-scale production of heparosan. Heparin lyase III (Hep III) effectively catalyzes heparosan depolymerization, forming unsaturated disaccharides that are measurable with a spectrophotometer at 232 nm. We report a new method for the quantitative detection of heparosan with heparin lyase III and spectrophotometry that is safer and more specific than the traditional carbazole assay. In an optimized detection system, heparosan at a minimum concentration of 0.60 g/L in fermentation broth can be detected. Copyright © 2013 Elsevier Inc. All rights reserved.
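
    The absorbance readout converts to concentration through the Beer-Lambert law, A = ε c l. A back-of-the-envelope sketch follows, in which the molar absorptivity, disaccharide molecular weight, and absorbance reading are assumed values for illustration, not the paper's calibration.

      # Estimate heparosan concentration from A232 after exhaustive Hep III
      # digestion, via Beer-Lambert (A = eps * c * l). All constants assumed.
      EPSILON_232 = 5500.0     # M^-1 cm^-1, assumed molar absorptivity of the
                               # unsaturated disaccharide product
      PATH_CM = 1.0            # cuvette path length, cm
      DISACCH_MW = 379.3       # g/mol, approximate disaccharide molecular weight

      def heparosan_g_per_l(a232, dilution=1.0):
          molar = a232 / (EPSILON_232 * PATH_CM) * dilution  # mol/L disaccharide
          return molar * DISACCH_MW                          # g/L heparosan

      print(heparosan_g_per_l(0.65, dilution=10))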

  12. CPTAC Evaluates Long-Term Reproducibility of Quantitative Proteomics Using Breast Cancer Xenografts | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    Liquid chromatography tandem-mass spectrometry (LC-MS/MS)- based methods such as isobaric tags for relative and absolute quantification (iTRAQ) and tandem mass tags (TMT) have been shown to provide overall better quantification accuracy and reproducibility over other LC-MS/MS techniques. However, large scale projects like the Clinical Proteomic Tumor Analysis Consortium (CPTAC) require comparisons across many genomically characterized clinical specimens in a single study and often exceed the capability of traditional iTRAQ-based quantification.

  13. An Introduction to Magnetospheric Physics by Means of Simple Models

    NASA Technical Reports Server (NTRS)

    Stern, D. P.

    1981-01-01

    The large-scale structure and behavior of the Earth's magnetosphere are discussed. The model is suitable for inclusion in courses on space physics, plasmas, astrophysics, or the Earth's environment, as well as for self-study. Nine quantitative problems, dealing with properties of linear superpositions of a dipole and a constant field, are presented. Topics covered include: open and closed models of the magnetosphere; field line motion; the role of magnetic merging (reconnection); magnetospheric convection; and the origin of the magnetopause, polar cusps, and high-latitude lobes.

  14. Enabling Interactive Measurements from Large Coverage Microscopy

    PubMed Central

    Bajcsy, Peter; Vandecreme, Antoine; Amelot, Julien; Chalfoun, Joe; Majurski, Michael; Brady, Mary

    2017-01-01

    Microscopy could be an important tool for characterizing stem cell products if quantitative measurements could be collected over multiple spatial and temporal scales. With cells changing state over time and being several orders of magnitude smaller than the cell product, modern microscopes are already capable of imaging large spatial areas, repeating imaging over time, and acquiring images over several spectra. However, characterizing stem cell products from such large image collections is challenging because of data size, the required computations, and the lack of interactive quantitative measurements needed to determine release criteria. We present a measurement web system consisting of available algorithms, extensions to a client-server framework using Deep Zoom, and the configuration know-how needed to provide the information for inspecting the quality of a cell product. The cell and other data sets are accessible via the prototype web-based system at http://isg.nist.gov/deepzoomweb. PMID:28663600

  15. Universality and predictability in molecular quantitative genetics.

    PubMed

    Nourmohammad, Armita; Held, Torsten; Lässig, Michael

    2013-12-01

    Molecular traits, such as gene expression levels or protein binding affinities, are increasingly accessible to quantitative measurement by modern high-throughput techniques. Such traits measure molecular functions and, from an evolutionary point of view, are important as targets of natural selection. We review recent developments in evolutionary theory and experiments that are expected to become building blocks of a quantitative genetics of molecular traits. We focus on universal evolutionary characteristics: these are largely independent of a trait's genetic basis, which is often at least partially unknown. We show that universal measurements can be used to infer selection on a quantitative trait, which determines its evolutionary mode of conservation or adaptation. Furthermore, universality is closely linked to predictability of trait evolution across lineages. We argue that universal trait statistics extend over a range of cellular scales and open new avenues for quantitative evolutionary systems biology.

  16. Large-scale impacts of herbivores on the structural diversity of African savannas

    PubMed Central

    Asner, Gregory P.; Levick, Shaun R.; Kennedy-Bowdoin, Ty; Knapp, David E.; Emerson, Ruth; Jacobson, James; Colgan, Matthew S.; Martin, Roberta E.

    2009-01-01

    African savannas are undergoing management intensification, and decision makers are increasingly challenged to balance the needs of large herbivore populations with the maintenance of vegetation and ecosystem diversity. Ensuring the sustainability of Africa's natural protected areas requires information on the efficacy of management decisions at large spatial scales, but often neither experimental treatments nor large-scale responses are available for analysis. Using a new airborne remote sensing system, we mapped the three-dimensional (3-D) structure of vegetation at a spatial resolution of 56 cm throughout 1640 ha of savanna after 6-, 22-, 35-, and 41-year exclusions of herbivores, as well as in unprotected areas, across Kruger National Park in South Africa. Areas in which herbivores were excluded over the short term (6 years) contained 38%–80% less bare ground compared with those that were exposed to mammalian herbivory. In the longer-term (> 22 years), the 3-D structure of woody vegetation differed significantly between protected and accessible landscapes, with up to 11-fold greater woody canopy cover in the areas without herbivores. Our maps revealed 2 scales of ecosystem response to herbivore consumption, one broadly mediated by geologic substrate and the other mediated by hillslope-scale variation in soil nutrient availability and moisture conditions. Our results are the first to quantitatively illustrate the extent to which herbivores can affect the 3-D structural diversity of vegetation across large savanna landscapes. PMID:19258457

  17. Photogrammetric portrayal of Mars topography.

    USGS Publications Warehouse

    Wu, S.S.C.

    1979-01-01

    Special photogrammetric techniques have been developed to portray Mars topography, using Mariner and Viking imaging and nonimaging topographic information and earth-based radar data. Topography is represented by the compilation of maps at three scales: global, intermediate, and very large scale. The global map is a synthesis of topographic information obtained from Mariner 9 and earth-based radar, compiled at a scale of 1:25,000,000 with a contour interval of 1 km; it gives a broad quantitative view of the planet. At intermediate scales, Viking Orbiter photographs of various resolutions are used to compile detailed contour maps of a broad spectrum of prominent geologic features; a contour interval as small as 20 m has been obtained from very high resolution orbital photography. Imagery from the Viking lander facsimile cameras permits construction of detailed, very large scale (1:10) topographic maps of the terrain surrounding the two landers; these maps have a contour interval of 1 cm. This paper presents several new detailed topographic maps of Mars.

  18. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method can accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
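
    One common way to formalize such an analysis is a multiplicative epistasis score with a resampling-based confidence interval. The sketch below illustrates that idea on synthetic data; it assumes the multiplicative null model and is not the paper's actual pipeline.

```python
import numpy as np

# Multiplicative epistasis for a quantitative trait: phenotypes are
# normalized to wild type and eps = W_ab - W_a * W_b. Data and the
# bootstrap interval are illustrative only.

def epistasis_score(wt, a, b, ab):
    wa = np.mean(a) / np.mean(wt)
    wb = np.mean(b) / np.mean(wt)
    wab = np.mean(ab) / np.mean(wt)
    return wab - wa * wb          # deviation from the multiplicative model

def bootstrap_ci(wt, a, b, ab, n=2000, seed=1):
    """95% bootstrap confidence interval for the epistasis score."""
    rng = np.random.default_rng(seed)
    res = lambda x: rng.choice(x, size=len(x))   # resample with replacement
    scores = [epistasis_score(res(wt), res(a), res(b), res(ab)) for _ in range(n)]
    return np.percentile(scores, [2.5, 97.5])

rng = np.random.default_rng(0)
wt = rng.normal(1.00, 0.05, 50)   # synthetic body-length measurements
a  = rng.normal(0.90, 0.05, 50)
b  = rng.normal(0.85, 0.05, 50)
ab = rng.normal(0.60, 0.05, 50)   # shorter than 0.90 * 0.85 -> negative epistasis
print(epistasis_score(wt, a, b, ab), bootstrap_ci(wt, a, b, ab))
```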

  19. Collaboration in the Humanities, Arts and Social Sciences in Australia

    ERIC Educational Resources Information Center

    Haddow, Gaby; Xia, Jianhong; Willson, Michele

    2017-01-01

    This paper reports on the first large-scale quantitative investigation into collaboration, demonstrated in co-authorship, by Australian humanities, arts and social sciences (HASS) researchers. Web of Science data were extracted for Australian HASS publications, with a focus on the softer social sciences, over the period 2004-2013. The findings…

  1. Quality Control Charts in Large-Scale Assessment Programs

    ERIC Educational Resources Information Center

    Schafer, William D.; Coverdale, Bradley J.; Luxenberg, Harlan; Jin, Ying

    2011-01-01

    There are relatively few examples of quantitative approaches to quality control in educational assessment and accountability contexts. Among the several techniques that are used in other fields, Shewhart charts have been found in a few instances to be applicable in educational settings. This paper describes Shewhart charts and gives examples of how…
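
    For orientation, an individuals-type Shewhart chart reduces to a center line and control limits computed from a baseline period. A minimal sketch follows; it uses the sample standard deviation for brevity (classical individuals charts estimate sigma from the moving range) and invented scores.

```python
import numpy as np

# Minimal Shewhart-style check: derive control limits from baseline
# assessment statistics, then flag a new observation that falls outside
# mean +/- 3 sigma. All numbers are illustrative.

def shewhart_limits(x, k=3.0):
    """Return (center, lower, upper) control limits for a 1-D baseline."""
    x = np.asarray(x, dtype=float)
    center, sigma = x.mean(), x.std(ddof=1)
    return center, center - k * sigma, center + k * sigma

baseline = np.array([500.1, 501.3, 499.8, 500.6, 498.9, 500.2, 499.5, 501.0])
center, lower, upper = shewhart_limits(baseline)
new_score = 504.9
status = "out of control" if not lower <= new_score <= upper else "in control"
print(f"limits [{lower:.1f}, {upper:.1f}]; new score {new_score}: {status}")
```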

  2. Evaluating Change in Medical School Curricula: How Did We Know Where We Were Going?

    ERIC Educational Resources Information Center

    Mahaffy, John; Gerrity, Martha S.

    1998-01-01

    Compares and contrasts the primary outcomes and methods used to evaluate curricular changes at eight medical schools participating in a large-scale medical curriculum development project. Describes how the evaluative data, both quantitative and qualitative, were collected, and how evaluation drove curricular change. Although the evaluations were…

  3. Landscape pattern metrics and regional assessment

    Treesearch

    Robert V. O'Neill; Kurt H. Riitters; J.D. Wickham; Bruce K. Jones

    1999-01-01

    The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop and interpret quantitative measures of spatial pattern: the landscape indices. This article reviews what is known about...

  4. Invasive Russian knapweed (Acroptilon repens) creates large patches almost entirely by rhizomic growth

    USDA-ARS?s Scientific Manuscript database

    Russian knapweed is an outcrossing perennial invasive weed in North America that can spread by both seed and horizontal rhizome growth leading to new shoots. The predominant mode of spread at the local and long-distance scales has not been quantitatively researched. We used Amplified Fragment Length...

  5. Using Association Mapping in Teosinte (Zea Mays ssp Parviglumis) to Investigate the Function of Selection-Candidate Genes

    USDA-ARS?s Scientific Manuscript database

    Large-scale screens of the maize genome identified 48 genes that show the putative signature of artificial selection during maize domestication or improvement. These selection-candidate genes may act as quantitative trait loci (QTL) that control the phenotypic differences between maize and its proge...

  6. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    ERIC Educational Resources Information Center

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  7. A Quantitative Mass Spectrometry-based Approach for Identifying Protein Kinase-Clients and Quantifying Kinase Activity

    USDA-ARS?s Scientific Manuscript database

    The Homo sapiens and Arabidopsis thaliana genomes are believed to encode >500 and >1,000 protein kinases, respectively. Despite this abundance, few bona fide kinase-client relationships have been described in detail. Mass spectrometry (MS)-based approaches have been integral to the large-scale mapp...

  8. A large scale joint analysis of flowering time reveals independent temperate adaptations in maize

    USDA-ARS?s Scientific Manuscript database

    Modulating days to flowering is a key mechanism in plants for adapting to new environments, and variation in days to flowering drives population structure by limiting mating. To elucidate the genetic architecture of flowering across maize, a quantitative trait, we mapped flowering in five global pop...

  9. Factors influencing stream fish recovery following a large-scale disturbance

    Treesearch

    William E. Ensign; Kurt T. Leftwich; Paul L. Angermeier; C. Andrew Dolloff

    1997-01-01

    The authors examined fish distribution and abundance in erosional habitat units in South Fork Roanoke River, VA, following a fish kill by using a reachwide sampling approach for 3 species and a representative-reach sampling approach for 10 species. Qualitative (presence-absence) and quantitative (relative abundance) estimates of distribution and abundance provided...

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaidheeswaran, Avinash; Shaffer, Franklin; Gopalan, Balaji

    Here, the statistics of fluctuating velocity components are studied in the riser of a closed-loop circulating fluidized bed with fluid catalytic cracking catalyst particles. Our analysis shows distinct similarities as well as deviations compared to existing theories and bench-scale experiments. The study confirms an anisotropic and non-Maxwellian distribution of fluctuating velocity components. The velocity distribution functions (VDFs) corresponding to transverse fluctuations exhibit symmetry and follow a stretched-exponential behavior up to three standard deviations. The form of the transverse VDF is largely determined by interparticle interactions. The tails become more overpopulated with an increase in particle loading. The observed deviations from the Gaussian distribution are represented using the leading-order term in the Sonine expansion, which is commonly used to approximate the VDFs in kinetic theory for granular flows. The vertical fluctuating VDFs are asymmetric, and the skewness shifts as the wall is approached. In comparison to transverse fluctuations, the vertical VDF is determined by the local hydrodynamics. This is an observation of particle velocity fluctuations in a large-scale system and their quantitative comparison with Maxwell-Boltzmann statistics.
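
    As a rough illustration of the analysis style described (not the authors' code), the sketch below computes moment statistics of a fluctuating-velocity sample and fits a stretched-exponential form to its distribution. The stand-in data are Laplace-distributed, for which the fitted exponent should come out near 1; a Gaussian sample would give 2.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import kurtosis, skew

# Characterize a velocity sample against Gaussian (per-component
# Maxwell-Boltzmann) statistics, then fit f(v) ~ exp(-|v/v0|**alpha)
# to the empirical VDF. Data are synthetic stand-ins.

def stretched_exp(v, a, v0, alpha):
    return a * np.exp(-np.abs(v / v0) ** alpha)

v = np.random.default_rng(0).laplace(scale=0.8, size=200_000)
print("skewness:", skew(v), "excess kurtosis:", kurtosis(v))  # both 0 for Gaussian

hist, edges = np.histogram(v, bins=200, density=True)
centers = 0.5 * (edges[1:] + edges[:-1])
mask = hist > 0
popt, _ = curve_fit(stretched_exp, centers[mask], hist[mask],
                    p0=(hist.max(), v.std(), 1.5))
print("fitted alpha:", popt[2])   # ~1 here; alpha = 2 recovers a Gaussian
```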

  11. Automated microscopy for high-content RNAi screening

    PubMed Central

    2010-01-01

    Fluorescence microscopy is one of the most powerful tools to investigate complex cellular processes such as cell division, cell motility, or intracellular trafficking. The availability of RNA interference (RNAi) technology and automated microscopy has opened the possibility to perform cellular imaging in functional genomics and other large-scale applications. Although imaging often dramatically increases the content of a screening assay, it poses new challenges to achieve accurate quantitative annotation and therefore needs to be carefully adjusted to the specific needs of individual screening applications. In this review, we discuss principles of assay design, large-scale RNAi, microscope automation, and computational data analysis. We highlight strategies for imaging-based RNAi screening adapted to different library and assay designs. PMID:20176920

  12. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics

    PubMed Central

    Deutsch, Eric W.; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L.

    2015-01-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include mass spectrometry to define protein sequence, protein:protein interactions, and protein post-translational modifications. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative mass spectrometry proteomics. It supports all major operating systems and instrument vendors via open data formats. Here we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of tandem mass spectrometry datasets, as well as some major upcoming features. PMID:25631240

  13. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics.

    PubMed

    Deutsch, Eric W; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L

    2015-08-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include MS to define protein sequence, protein:protein interactions, and protein PTMs. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative MS proteomics. It supports all major operating systems and instrument vendors via open data formats. Here, we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of MS/MS datasets, as well as some major upcoming features.

  14. How large a dataset should be in order to estimate scaling exponents and other statistics correctly in studies of solar wind turbulence

    NASA Astrophysics Data System (ADS)

    Rowlands, G.; Kiyani, K. H.; Chapman, S. C.; Watkins, N. W.

    2009-12-01

    Quantitative analysis of solar wind fluctuations is often performed in the context of intermittent turbulence and centers on methods to quantify statistical scaling, such as power spectra and structure functions, which assume a stationary process. The solar wind exhibits large-scale secular changes, so the question arises as to whether the timeseries of the fluctuations is non-stationary. One approach is to seek local stationarity by parsing the time interval over which statistical analysis is performed. Natural systems such as the solar wind thus unavoidably provide observations over restricted intervals. Consequently, due to a reduction of sample size leading to poorer estimates, a stationary stochastic process (time series) can yield anomalous time variation in the scaling exponents, suggestive of nonstationarity. The variance in the estimates of scaling exponents computed from an interval of N observations is known for finite-variance processes to vary as ~1/N as N becomes large, for certain statistical estimators; however, the convergence to this behavior depends on the details of the process and may be slow. We study the variation in the scaling of second-order moments of the time-series increments with N for a variety of synthetic and “real world” time series, and we find that, in particular for heavy-tailed processes, for realizable N one is far from this ~1/N limiting behavior. We propose a semiempirical estimate for the minimum N needed to make a meaningful estimate of the scaling exponents for model stochastic processes and compare these with some “real world” time series from the solar wind. With fewer datapoints the stationary timeseries becomes indistinguishable from a nonstationary process, and we illustrate this with nonstationary synthetic datasets. Reference article: K. H. Kiyani, S. C. Chapman and N. W. Watkins, Phys. Rev. E 79, 036109 (2009).
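
    The convergence experiment described here is easy to reproduce in miniature. The sketch below estimates the second-order structure-function exponent of a synthetic Brownian walk (true exponent 1) for increasing sample sizes N and shows the spread of estimates shrinking; the lag choices and names are illustrative.

```python
import numpy as np

# Estimate the second-order scaling exponent from increments of a 1-D
# series, then watch the estimator spread narrow as N grows. For a
# Brownian walk S2(l) ~ l, so the exponent should approach 1.

def zeta2(x, lags=(1, 2, 4, 8, 16)):
    """Second-order structure-function exponent via log-log regression."""
    s2 = [np.mean((x[l:] - x[:-l]) ** 2) for l in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(s2), 1)
    return slope

rng = np.random.default_rng(42)
for n in (10**3, 10**4, 10**5):
    est = [zeta2(np.cumsum(rng.standard_normal(n))) for _ in range(20)]
    print(n, f"mean={np.mean(est):.3f}", f"std={np.std(est):.3f}")
```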

  15. Helium ion microscopy of Lepidoptera scales.

    PubMed

    Boden, Stuart A; Asadollahbaik, Asa; Rutt, Harvey N; Bagnall, Darren M

    2012-01-01

    In this report, helium ion microscopy (HIM) is used to study the micro- and nanostructures responsible for structural color in the wings of two species of Lepidoptera from the Papilionidae family: Papilio ulysses (Blue Mountain Butterfly) and Parides sesostris (Emerald-patched Cattleheart). Electronic charging of uncoated scales from the wings of these butterflies, due to the incident ion beam, is successfully neutralized, leading to images displaying a large depth of field and a high level of surface detail, which would normally be obscured by the traditional coating methods used for scanning electron microscopy (SEM). The images are compared with those from variable-pressure SEM, demonstrating the superiority of HIM at high magnifications. In addition, the large depth-of-field capabilities of HIM are exploited through the creation of stereo pairs that allow exploration of the third dimension. Furthermore, the extraction of quantitative height information that matches well with cross-sectional transmission electron microscopy measurements from the literature is demonstrated.

  16. A scalable double-barcode sequencing platform for characterization of dynamic protein-protein interactions.

    PubMed

    Schlecht, Ulrich; Liu, Zhimin; Blundell, Jamie R; St Onge, Robert P; Levy, Sasha F

    2017-05-25

    Several large-scale efforts have systematically catalogued protein-protein interactions (PPIs) of a cell in a single environment. However, little is known about how the protein interactome changes across environmental perturbations. Current technologies, which assay one PPI at a time, are too low throughput to make it practical to study protein interactome dynamics. Here, we develop a highly parallel protein-protein interaction sequencing (PPiSeq) platform that uses a novel double barcoding system in conjunction with the dihydrofolate reductase protein-fragment complementation assay in Saccharomyces cerevisiae. PPiSeq detects PPIs at a rate that is on par with current assays and, in contrast with current methods, quantitatively scores PPIs with enough accuracy and sensitivity to detect changes across environments. Both PPI scoring and the bulk of strain construction can be performed with cell pools, making the assay scalable and easily reproduced across environments. PPiSeq is therefore a powerful new tool for large-scale investigations of dynamic PPIs.

  17. Evolution of Precipitation Structure During the November DYNAMO MJO Event: Cloud-Resolving Model Intercomparison and Cross Validation Using Radar Observations

    NASA Astrophysics Data System (ADS)

    Li, Xiaowen; Janiga, Matthew A.; Wang, Shuguang; Tao, Wei-Kuo; Rowe, Angela; Xu, Weixin; Liu, Chuntao; Matsui, Toshihisa; Zhang, Chidong

    2018-04-01

    The evolution of precipitation structures is simulated and compared with radar observations for the November Madden-Julian Oscillation (MJO) event during the DYNAmics of the MJO (DYNAMO) field campaign. Three ground-based, ship-borne, and spaceborne precipitation radars and three cloud-resolving models (CRMs) driven by observed large-scale forcing are used to study precipitation structures at different locations over the central equatorial Indian Ocean. Convective strength is represented by 0-dBZ echo-top heights, and convective organization by contiguous 17-dBZ areas. The multi-radar and multi-model framework allows for more stringent model validations. The emphasis is on testing models' ability to simulate subtle differences observed at different radar sites when the MJO event passed through. The results show that CRMs forced by site-specific large-scale forcing can reproduce not only common features in cloud populations but also subtle variations observed by different radars. The comparisons also revealed common deficiencies in CRM simulations, in which they underestimate radar echo-top heights for the strongest convection within large, organized precipitation features. Cross validations with multiple radars and models also enable quantitative comparisons in CRM sensitivity studies using different large-scale forcing, microphysical schemes and parameters, resolutions, and domain sizes. In terms of radar echo-top height temporal variations, many model sensitivity tests have better correlations than radar/model comparisons, indicating robustness in model performance on this aspect. It is further shown that well-validated model simulations could be used to constrain uncertainties in observed echo-top heights when the low-resolution surveillance scanning strategy is used.

  18. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics

    PubMed Central

    Röst, Hannes L.; Liu, Yansheng; D’Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C.; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-01-01

    Large scale, quantitative proteomic studies have become essential for the analysis of clinical cohorts, large perturbation experiments and systems biology studies. While next-generation mass spectrometric techniques such as SWATH-MS have substantially increased throughput and reproducibility, ensuring consistent quantification of thousands of peptide analytes across multiple LC-MS/MS runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we have developed the TRIC software which utilizes fragment ion data to perform cross-run alignment, consistent peak-picking and quantification for high throughput targeted proteomics. TRIC uses a graph-based alignment strategy based on non-linear retention time correction to integrate peak elution information from all LC-MS/MS runs acquired in a study. When compared to state-of-the-art SWATH-MS data analysis, the algorithm was able to reduce the identification error by more than 3-fold at constant recall, while correcting for highly non-linear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem (iPS) cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups and substantially increased the quantitative completeness and biological information in the data, providing insights into protein dynamics of iPS cells. Overall, this study demonstrates the importance of consistent quantification in highly challenging experimental setups, and proposes an algorithm to automate this task, constituting the last missing piece in a pipeline for automated analysis of massively parallel targeted proteomics datasets. PMID:27479329

  19. Quantitative measurement of the growth rate of the PHA-producing photosynthetic bacterium Rhodocyclus gelatinosus CBS-2 [polyhydroxyalkanoate]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfrum, E.J.; Weaver, P.F.

    Researchers at the National Renewable Energy Laboratory (NREL) have been investigating the use of model photosynthetic microorganisms that use sunlight and two-carbon organic substrates (e.g., ethanol, acetate) to produce biodegradable polyhydroxyalkanoate (PHA) copolymers as carbon storage compounds. Use of these biological PHAs in single-use plastics applications, followed by their post-consumer composting or anaerobic digestion, could reduce petroleum consumption as well as the overloading of landfills. The large-scale production of PHA polymers by photosynthetic bacteria will require large-scale reactor systems utilizing either sunlight or artificial illumination. The first step in the scale-up process is to quantify the microbial growth rates and the PHA production rates as a function of reaction conditions such as nutrient concentration, temperature, and light quality and intensity.

  20. The topology of large-scale structure. I - Topology and the random phase hypothesis. [galactic formation models

    NASA Technical Reports Server (NTRS)

    Weinberg, David H.; Gott, J. Richard, III; Melott, Adrian L.

    1987-01-01

    Many models for the formation of galaxies and large-scale structure assume a spectrum of random phase (Gaussian), small-amplitude density fluctuations as initial conditions. In such scenarios, the topology of the galaxy distribution on large scales relates directly to the topology of the initial density fluctuations. Here a quantitative measure of topology - the genus of contours in a smoothed density distribution - is described and applied to numerical simulations of galaxy clustering, to a variety of three-dimensional toy models, and to a volume-limited sample of the CfA redshift survey. For random phase distributions the genus of density contours exhibits a universal dependence on threshold density. The clustering simulations show that a smoothing length of 2-3 times the mass correlation length is sufficient to recover the topology of the initial fluctuations from the evolved galaxy distribution. Cold dark matter and white noise models retain a random phase topology at shorter smoothing lengths, but massive neutrino models develop a cellular topology.
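
    For reference, the "universal dependence on threshold density" for a random-phase (Gaussian) field is usually written as the Gaussian genus curve; the form below is a standard result stated here for orientation rather than quoted from this abstract. The threshold ν is measured in units of the field's standard deviation, and the amplitude A depends only on the power spectrum of the field.

```latex
% Universal genus curve of a Gaussian random field as a function of
% the density threshold \nu (in standard deviations):
g(\nu) \;=\; A\,\bigl(1 - \nu^{2}\bigr)\, e^{-\nu^{2}/2}
```

    Departures of measured genus curves from this shape (for example, the cellular topology of the massive neutrino models mentioned above) are what signal non-random-phase initial conditions.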

  1. Role of Correlations in the Collective Behavior of Microswimmer Suspensions

    NASA Astrophysics Data System (ADS)

    Stenhammar, Joakim; Nardini, Cesare; Nash, Rupert W.; Marenduzzo, Davide; Morozov, Alexander

    2017-07-01

    In this Letter, we study the collective behavior of a large number of self-propelled microswimmers immersed in a fluid. Using unprecedentedly large-scale lattice Boltzmann simulations, we reproduce the transition to bacterial turbulence. We show that, even well below the transition, swimmers move in a correlated fashion that cannot be described by a mean-field approach. We develop a novel kinetic theory that captures these correlations and is nonperturbative in the swimmer density. To provide an experimentally accessible measure of correlations, we calculate the diffusivity of passive tracers and reveal its nontrivial density dependence. The theory is in quantitative agreement with the lattice Boltzmann simulations and captures the asymmetry between pusher and puller swimmers below the transition to turbulence.

  2. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes, enabling the identification of protein species that exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure that valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and discusses various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis lead to confident results and ensure that quantitative proteomics delivers on its promise.

  3. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results than qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: (1) qualitative assessment (visual scoring) and (2) quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy-dispersive X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using the Friedman test followed by the Wilcoxon signed-ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded results similar to those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating results similar to those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield results similar to both the SEM analysis and elemental mapping.
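
    A quantitative ARI ultimately reduces to measuring the covered fraction of the bracket base in a segmented image and binning it. A minimal sketch, assuming a binary adhesive mask and the original 4-point ARI bins (the mask and thresholds are illustrative, not the study's segmentation):

```python
import numpy as np

# Map a segmented micrograph to a coverage fraction and an ARI-style
# score (0: no adhesive, 1: <50% covered, 2: >50% covered, 3: fully
# covered). `mask` marks adhesive pixels, e.g. from SEM or elemental maps.

def ari_score(mask: np.ndarray) -> tuple:
    frac = float(mask.mean())       # fraction of the base covered by adhesive
    if frac == 0.0:
        return frac, 0
    if frac < 0.5:
        return frac, 1
    if frac < 1.0:
        return frac, 2
    return frac, 3

mask = np.random.default_rng(3).random((512, 512)) < 0.42   # stand-in segmentation
print(ari_score(mask))   # -> (~0.42, 1)
```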

  4. Phase correlation imaging of unlabeled cell dynamics

    NASA Astrophysics Data System (ADS)

    Ma, Lihong; Rajshekhar, Gannavarpu; Wang, Ru; Bhaduri, Basanta; Sridharan, Shamira; Mir, Mustafa; Chakraborty, Arindam; Iyer, Rajashekar; Prasanth, Supriya; Millet, Larry; Gillette, Martha U.; Popescu, Gabriel

    2016-09-01

    We present phase correlation imaging (PCI) as a novel approach to study cell dynamics in a spatially resolved manner. PCI relies on quantitative phase imaging time-lapse data and, as such, functions in label-free mode, without the limitations associated with exogenous markers. The correlation time map output by PCI reports on the dynamics of intracellular mass transport. Specifically, we show that PCI can quantitatively extract the diffusion coefficient map associated with live cells, as well as with standard Brownian particles. Due to its high sensitivity to mass transport, PCI can be applied to studying the integrity of actin polymerization dynamics. Our results indicate that the cyto-D treatment blocking actin polymerization has a dominant effect at large spatial scales, in the region surrounding the cell. We found that PCI can distinguish between senescent and quiescent cells, which is currently extremely difficult without specific markers. We anticipate that PCI will be used alongside established, fluorescence-based techniques to enable valuable new studies of cell function.
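
    For purely diffusive transport, the temporal autocorrelation at spatial frequency q decays as exp(-Dq²t), so the correlation time is τ(q) = 1/(Dq²) and a diffusion coefficient can be read off a fit of 1/τ against q². The sketch below shows that fit on synthetic correlation times; it illustrates the dispersion-relation idea, not the PCI implementation itself.

```python
import numpy as np

# Recover a diffusion coefficient from correlation times tau(q) using
# the diffusive dispersion relation 1/tau = D * q**2. Synthetic data.

q = np.linspace(0.2, 2.0, 10)        # spatial frequency, 1/um
D_true = 0.5                         # um^2/s
noise = np.random.default_rng(0).lognormal(0.0, 0.05, q.size)
tau = noise / (D_true * q**2)        # noisy correlation times, s

D_fit, _ = np.polyfit(q**2, 1.0 / tau, 1)   # slope of 1/tau vs q^2
print(f"estimated D ~ {D_fit:.2f} um^2/s")  # close to 0.5
```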

  5. Proteomics wants cRacker: automated standardized data analysis of LC-MS derived proteomic data.

    PubMed

    Zauber, Henrik; Schulze, Waltraud X

    2012-11-02

    The large-scale analysis of thousands of proteins under various experimental conditions or in mutant lines has gained more and more importance in hypothesis-driven scientific research and systems biology in recent years. Quantitative analysis by large-scale proteomics using modern mass spectrometry usually results in long lists of peptide ion intensities. The main interest for most researchers, however, is to draw conclusions at the protein level. Postprocessing and combining the peptide intensities of a proteomic data set require expert knowledge, and the often repetitive and standardized manual calculations can be time-consuming. The analysis of complex samples can result in very large data sets (lists with several thousand to 100,000 peptide entries) that cannot easily be analyzed using standard spreadsheet programs. To improve the speed and consistency of the data analysis of LC-MS derived proteomic data, we developed cRacker. cRacker is an R-based program for automated downstream proteomic data analysis, including data normalization strategies for metabolic labeling and label-free quantitation. In addition, cRacker includes basic statistical analysis, such as clustering of data, or ANOVA and t tests for comparison between treatments. Results are presented in editable graphic formats and in list files.
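
    cRacker itself is an R program; purely to illustrate the kind of post-processing described (per-run total-intensity normalization followed by a peptide-to-protein roll-up), here is a minimal pandas sketch with made-up intensities:

```python
import pandas as pd

# Label-free post-processing in miniature: normalize each LC-MS run by
# its total ion intensity, then summarize peptides to proteins by the
# median. Column names and values are invented for illustration.

peptides = pd.DataFrame({
    "protein": ["P1", "P1", "P2", "P2"],
    "run_a":   [1.0e6, 8.0e5, 2.5e6, 2.2e6],
    "run_b":   [1.4e6, 9.5e5, 3.1e6, 2.9e6],
})
runs = ["run_a", "run_b"]

norm = peptides.copy()
norm[runs] = norm[runs] / norm[runs].sum()          # total-intensity normalization
proteins = norm.groupby("protein")[runs].median()   # peptide -> protein roll-up
print(proteins)
```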

  6. Intrinsic fluctuations of the proton saturation momentum scale in high multiplicity p+p collisions

    DOE PAGES

    McLerran, Larry; Tribedy, Prithwish

    2015-11-02

    High multiplicity events in p+p collisions are studied using the theory of the Color Glass Condensate. Here, we show that intrinsic fluctuations of the proton saturation momentum scale are needed in addition to the sub-nucleonic color charge fluctuations to explain the very high multiplicity tail of distributions in p+p collisions. It is presumed that the origin of such intrinsic fluctuations is non-perturbative in nature. Classical Yang-Mills simulations using the IP-Glasma model are performed to make quantitative estimations. Furthermore, we find that fluctuations as large as O(1) of the average value of the saturation momentum scale can lead to the rare high multiplicity events seen in p+p data at RHIC and LHC energies. Using the available data on multiplicity distributions, we try to constrain the distribution of the proton saturation momentum scale and make predictions for the multiplicity distribution in 13 TeV p+p collisions.

  7. Valleys' Asymmetric Characteristics of the Loess Plateau in Northwestern Shanxi Based on DEM

    NASA Astrophysics Data System (ADS)

    Duan, J.

    2016-12-01

    The valleys of the Loess Plateau in northwestern Shanxi show great asymmetry. Using multi-scale DEMs, high-resolution satellite images, and digital terrain analysis methods, this study puts forward a quantitative index to describe this asymmetric morphology. Several typical areas are selected to test and verify its spatial variability. Results show: (1) In terms of spatial distribution, the Pianguanhe, Xianchuanhe, and Yangjiachuan basins are the areas showing the most significant asymmetric characteristics. (2) In terms of scale, large-scale valleys show randomness, equilibrium, and relative symmetry, while small-scale valleys show directionality and asymmetry. (3) The asymmetric morphology is directional, most obviously in east-west trending valleys. Combined with field survey, its formation mechanism can be interpreted as follows: (1) uneven distribution of loess in the valleys; (2) differences in vegetation, water, heat conditions, and other factors produce differences in water erosion capability, which lead to the asymmetric characteristics.

  8. Primordial Magnetic Field Effects on the CMB and Large-Scale Structure

    DOE PAGES

    Yamazaki, Dai G.; Ichiki, Kiyotomo; Kajino, Toshitaka; ...

    2010-01-01

    Magnetic fields are everywhere in nature, and they play an important role in every astronomical environment which involves the formation of plasma and currents. It is natural therefore to suppose that magnetic fields could be present in the turbulent high-temperature environment of the big bang. Such a primordial magnetic field (PMF) would be expected to manifest itself in the cosmic microwave background (CMB) temperature and polarization anisotropies, and also in the formation of large-scale structure. In this paper, we summarize the theoretical framework which we have developed to calculate the PMF power spectrum to high precision. Using this formulation, we summarize calculations of the effects of a PMF which take accurate quantitative account of the time evolution of the cutoff scale. We review the constructed numerical program, which is without approximation, and an improvement over the approach used in a number of previous works for studying the effect of the PMF on the cosmological perturbations. We demonstrate how the PMF is an important cosmological physical process on small scales. We also summarize the current constraints on the PMF amplitude Bλ and the power spectral index nB which have been deduced from the available CMB observational data by using our computational framework.

  9. Developing stepped care treatment for depression (STEPS): study protocol for a pilot randomised controlled trial.

    PubMed

    Hill, Jacqueline J; Kuyken, Willem; Richards, David A

    2014-11-20

    Stepped care is recommended and implemented as a means to organise depression treatment. Compared with alternative systems, it is assumed to achieve equivalent clinical effects and greater efficiency. However, no trials have examined these assumptions. A fully powered trial of stepped care compared with intensive psychological therapy is required but a number of methodological and procedural uncertainties associated with the conduct of a large trial need to be addressed first. STEPS (Developing stepped care treatment for depression) is a mixed methods study to address uncertainties associated with a large-scale evaluation of stepped care compared with high-intensity psychological therapy alone for the treatment of depression. We will conduct a pilot randomised controlled trial with an embedded process study. Quantitative trial data on recruitment, retention and the pathway of patients through treatment will be used to assess feasibility. Outcome data on the effects of stepped care compared with high-intensity therapy alone will inform a sample size calculation for a definitive trial. Qualitative interviews will be undertaken to explore what people think of our trial methods and procedures and the stepped care intervention. A minimum of 60 patients with Major Depressive Disorder will be recruited from an Improving Access to Psychological Therapies service and randomly allocated to receive stepped care or intensive psychological therapy alone. All treatments will be delivered at clinic facilities within the University of Exeter. Quantitative patient-related data on depressive symptoms, worry and anxiety and quality of life will be collected at baseline and 6 months. The pilot trial and interviews will be undertaken concurrently. Quantitative and qualitative data will be analysed separately and then integrated. The outcomes of this study will inform the design of a fully powered randomised controlled trial to evaluate the effectiveness and efficiency of stepped care. Qualitative data on stepped care will be of immediate interest to patients, clinicians, service managers, policy makers and guideline developers. A more informed understanding of the feasibility of a large trial will be obtained than would be possible from a purely quantitative (or qualitative) design. Current Controlled Trials ISRCTN66346646 registered on 2 July 2014.

  10. Size and structure of Chlorella zofingiensis/FeCl3 flocs in a shear flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wyatt, Nicholas B.; O'Hern, Timothy J.; Shelden, Bion

    Flocculation is a promising method to overcome the economic hurdle of separating algae from its growth medium in large-scale operations. However, understanding the floc structure and the effects of shear on it is crucial to the large-scale implementation of this technique. The floc structure is important because it determines, in large part, the density and settling behavior of the algae. Freshwater algae floc size distributions and fractal dimensions are presented as a function of applied shear rate in a Couette cell using ferric chloride as a flocculant. Comparisons are made with measurements made for a polystyrene microparticle model system taken here as well as reported literature results. The algae floc size distributions are found to be self-preserving with respect to shear rate, consistent with literature data for polystyrene. Moreover, three fractal dimensions are calculated which quantitatively characterize the complexity of the floc structure. Low shear rates result in large, relatively densely packed flocs which elongate and fracture as the shear rate is increased. The results presented here provide crucial information for economically implementing flocculation as a large-scale algae harvesting strategy.
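
    A fractal dimension of the kind reported here is commonly estimated from the power-law scaling between a floc's projected area and its maximum length. A sketch with synthetic flocs follows; the scaling form, exponent, and noise model are illustrative assumptions, not the study's data.

```python
import numpy as np

# Estimate a two-dimensional fractal dimension D_f from the scaling
# A ~ L**D_f between projected floc area and maximum floc length,
# via log-log regression on synthetic image measurements.

rng = np.random.default_rng(7)
length = rng.uniform(20.0, 400.0, size=300)            # floc max length, um
area = 2.1 * length**1.78 * rng.lognormal(0.0, 0.1, 300)  # synthetic, D_f ~ 1.78

d_f, _ = np.polyfit(np.log(length), np.log(area), 1)   # slope = D_f
print(f"fractal dimension D_f ~ {d_f:.2f}")
```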

  11. From Fibrils to Toughness: Multi-Scale Mechanics of Fibrillating Interfaces in Stretchable Electronics

    PubMed Central

    van der Sluis, Olaf; Vossen, Bart; Geers, Marc

    2018-01-01

    Metal-elastomer interfacial systems, often encountered in stretchable electronics, demonstrate remarkably high interface fracture toughness values. Evidently, a large gap exists between the rather small adhesion energy levels at the microscopic scale ('intrinsic adhesion') and the large measured macroscopic work-of-separation. This energy gap is closed here by unravelling the underlying dissipative mechanisms through a systematic numerical/experimental multi-scale approach. This self-contained contribution collects and reviews previously published results and addresses the remaining open questions by providing new and independent results obtained from an alternative experimental set-up. In particular, the experimental studies on Cu-PDMS (poly(dimethylsiloxane)) samples conclusively reveal the essential role of fibrillation mechanisms at the micrometer scale during the metal-elastomer delamination process. The micro-scale numerical analyses on single and multiple fibrils show that the dynamic release of the stored elastic energy by multiple fibril fracture, including the interaction with the adjacent deforming bulk PDMS and its highly nonlinear behaviour, provides a mechanistic understanding of the high work-of-separation. An experimentally validated quantitative relation between the macroscopic work-of-separation and peel front height is established from the simulation results. Finally, it is shown that a micro-mechanically motivated shape of the traction-separation law in cohesive zone models is essential to describe the delamination process in fibrillating metal-elastomer systems in a physically meaningful way. PMID:29393908

  12. Research highlights: microfluidics meets big data.

    PubMed

    Tseng, Peter; Weaver, Westbrook M; Masaeli, Mahdokht; Owsley, Keegan; Di Carlo, Dino

    2014-03-07

    In this issue we highlight a collection of recent work in which microfluidic parallelization and automation have been employed to address the increasing need for large amounts of quantitative data concerning cellular function--from correlating microRNA levels to protein expression, increasing the throughput and reducing the noise when studying protein dynamics in single-cells, and understanding how signal dynamics encodes information. The painstaking dissection of cellular pathways one protein at a time appears to be coming to an end, leading to more rapid discoveries which will inevitably translate to better cellular control--in producing useful gene products and treating disease at the individual cell level. From these studies it is also clear that development of large scale mutant or fusion libraries, automation of microscopy, image analysis, and data extraction will be key components as microfluidics contributes its strengths to aid systems biology moving forward.

  13. BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.

    PubMed

    Huang, Hailiang; Tata, Sandeep; Prill, Robert J

    2013-01-01

    Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation, and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
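
    Written serially, the permutation-based empirical p-value that a system like BlueSNP distributes over a cluster looks like the following; the correlation test statistic and all names are illustrative, not BlueSNP's API.

```python
import numpy as np

# Empirical p-value for one SNP-phenotype association by permutation:
# shuffle the phenotype, recompute the test statistic, and count how
# often the null statistic meets or exceeds the observed one.

def empirical_p(genotype, phenotype, n_perm=10_000, seed=0):
    rng = np.random.default_rng(seed)
    obs = abs(np.corrcoef(genotype, phenotype)[0, 1])
    hits = sum(
        abs(np.corrcoef(genotype, rng.permutation(phenotype))[0, 1]) >= obs
        for _ in range(n_perm)
    )
    return (hits + 1) / (n_perm + 1)    # add-one smoothing avoids p = 0

g = np.random.default_rng(1).integers(0, 3, size=500)    # 0/1/2 allele counts
y = 0.1 * g + np.random.default_rng(2).normal(size=500)  # synthetic phenotype
print(empirical_p(g, y))
```

    In the MapReduce setting, each mapper evaluates a batch of permutations (or SNPs) independently and a reducer sums the exceedance counts, which is what makes this computation embarrassingly parallel.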

  14. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  15. Analysis of the ability of large-scale reanalysis data to define Siberian fire danger in preparation for future fire prediction

    NASA Astrophysics Data System (ADS)

    Soja, Amber; Westberg, David; Stackhouse, Paul, Jr.; McRae, Douglas; Jin, Ji-Zhong; Sukhinin, Anatoly

    2010-05-01

    Fire is the dominant disturbance that precipitates ecosystem change in boreal regions, and fire is largely under the control of weather and climate. Fire frequency, fire severity, area burned and fire season length are predicted to increase in boreal regions under current climate change scenarios. Therefore, changes in fire regimes have the potential to compel ecological change, moving ecosystems more quickly towards equilibrium with a new climate. The ultimate goal of this research is to assess the viability of large-scale (1°) data to be used to define fire weather danger and fire regimes, so that large-scale data can be confidently used to predict future fire regimes using large-scale fire weather data, like that available from current Intergovernmental Panel on Climate Change (IPCC) climate change scenarios. In this talk, we intend to: (1) evaluate Fire Weather Indices (FWI) derived using reanalysis and interpolated station data; (2) discuss the advantages and disadvantages of using these distinct data sources; and (3) highlight established relationships between large-scale fire weather data, area burned, active fires and ecosystems burned. Specifically, the Canadian Forestry Service (CFS) Fire Weather Index (FWI) will be derived using: (1) NASA Goddard Earth Observing System version 4 (GEOS-4) large-scale reanalysis and NASA Global Precipitation Climatology Project (GPCP) data; and National Climatic Data Center (NCDC) surface station-interpolated data. Requirements of the FWI are local noon surface-level air temperature, relative humidity, wind speed, and daily (noon-noon) rainfall. GEOS-4 reanalysis and NCDC station-interpolated fire weather indices are generally consistent spatially, temporally and quantitatively. Additionally, increased fire activity coincides with increased FWI ratings in both data products. Relationships have been established between large-scale FWI and area burned, fire frequency, and ecosystem types, and these can be used to estimate historic and future fire regimes.

  16. Hospital for Special Surgery Pediatric Functional Activity Brief Scale predicts physical fitness testing performance.

    PubMed

    Fabricant, Peter D; Robles, Alex; McLaren, Son H; Marx, Robert G; Widmann, Roger F; Green, Daniel W

    2014-05-01

    An eight-item activity scale was recently developed and validated for use as a prognostic tool in clinical research in children and adolescents. It is unclear, however, if this brief questionnaire is predictive of quantitative metrics of physical activity and fitness. The purposes of this study were to prospectively administer the Hospital for Special Surgery Pediatric Functional Activity Brief Scale to a large cohort of healthy adolescents to determine (1) if the activity scale exhibits any floor or ceiling effects; (2) if scores on the activity scale are correlated with standardized physical fitness metrics; and if so, (3) to determine the discrimination ability of the activity scale to differentiate between adolescents with healthy or unhealthy levels of aerobic capacity and calculate an appropriate cutoff value for its use as a screening tool. One hundred eighty-two adolescents (mean, 15.3 years old) prospectively completed the activity scale and four standardized metrics of physical fitness: pushups, sit-ups, shuttle run exercise (Progressive Aerobic Cardiovascular Endurance Run), and calculated VO2-max. Age, sex, and body mass index were also recorded. Pearson correlations, regression analyses, and receiver operating characteristic analyses were used to evaluate activity scale performance. The activity scale did not exhibit any floor or ceiling effects. Pushups (ρ = 0.28), sit-ups (ρ = 0.23), performance on the Progressive Aerobic Cardiovascular Endurance Run (ρ = 0.44), and VO2-max (ρ = 0.43) were all positively correlated with the activity scale score (Pearson correlations, all p < 0.001). Receiver operating characteristic analysis revealed that those with an activity score of ≤ 14 were at higher risk of having low levels of aerobic capacity. In the current study, activity score was free of floor and ceiling effects and predictive of all four physical fitness metrics. An activity score of ≤ 14 was associated with at-risk aerobic capacity previously shown to be associated with an increased risk of metabolic syndrome. This study is the first to prospectively validate an activity questionnaire against quantitative physical fitness assessments and provides further evidence substantiating its use in outcomes research and screening for healthy levels of childhood activity and fitness. Level I, diagnostic study.
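
    The cutoff analysis described (score ≤ 14) is the standard receiver operating characteristic workflow. The sketch below reproduces its shape on synthetic stand-in data, using Youden's J statistic to pick the threshold; the cohort, score distribution, and resulting cutoff are invented, not the study's.

```python
import numpy as np
from sklearn.metrics import roc_curve

# ROC-based cutoff selection for a screening score where LOWER values
# indicate risk. Labels and scores are synthetic stand-ins.

rng = np.random.default_rng(0)
at_risk = rng.integers(0, 2, size=182)                    # 1 = unhealthy aerobic capacity
score = 18 - 4 * at_risk + rng.normal(0, 3, size=182)     # activity-scale-like score

# Negate the score so that larger values indicate the positive class.
fpr, tpr, thresholds = roc_curve(at_risk, -score)
best = np.argmax(tpr - fpr)                               # maximize Youden's J
print("optimal cutoff: score <=", round(-thresholds[best], 1))
```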

  17. Comparison of Grid Nudging and Spectral Nudging Techniques for Dynamical Climate Downscaling within the WRF Model

    NASA Astrophysics Data System (ADS)

    Fan, X.; Chen, L.; Ma, Z.

    2010-12-01

    Climate downscaling has been an active research and application area in the past several decades, focusing on regional climate studies. Dynamical downscaling, in addition to statistical methods, has been widely used as advanced numerical weather and regional climate models have emerged. The use of numerical models ensures that a full set of climate variables is generated in the process of downscaling, and these variables are dynamically consistent due to the constraints of physical laws. While generating high-resolution regional climate, the large-scale climate patterns should be retained. To serve this purpose, nudging techniques, including grid analysis nudging and spectral nudging, have been used in different models. There are studies demonstrating the benefits and advantages of each nudging technique; however, the results are sensitive to many factors, such as the nudging coefficients and the amount of information nudged to, and thus the conclusions remain controversial. In a companion work we developed approaches for the quantitative assessment of downscaled climate; in this study, the two nudging techniques are subjected to extensive experiments in the Weather Research and Forecasting (WRF) model. Using the same model ensures fair comparability, and applying the quantitative assessments provides objectivity in the comparison. Three types of downscaling experiments were performed for one selected month. The first type serves as a baseline, in which large-scale information is communicated through the lateral boundary conditions only; the second uses grid analysis nudging; and the third uses spectral nudging. Emphasis is given to experiments with different nudging coefficients and with nudging different variables in grid analysis nudging; for spectral nudging, we focus on testing nudging coefficients and the choice of wave numbers to nudge on different model levels.
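
    The essential difference between the two techniques is which part of the model state is relaxed toward the driving fields. A toy one-dimensional illustration (not WRF's implementation): grid nudging relaxes every grid point, while spectral nudging relaxes only the retained low wave numbers, leaving smaller scales free to develop:

```python
import numpy as np

def grid_nudge(x, x_ref, g, dt):
    """Analysis (grid) nudging: relax the full field toward the reference."""
    return x - dt * g * (x - x_ref)

def spectral_nudge(x, x_ref, g, dt, n_waves):
    """Spectral nudging: relax only wave numbers <= n_waves toward the reference."""
    err = np.fft.rfft(x - x_ref)
    err[n_waves + 1:] = 0.0            # leave smaller scales untouched
    return x - dt * g * np.fft.irfft(err, n=x.size)

n = 256
grid = np.linspace(0, 2 * np.pi, n, endpoint=False)
x_ref = np.sin(grid)                               # large-scale driving field
x = np.sin(grid) + 0.3 * np.sin(20 * grid)         # model state with small-scale detail

xg = grid_nudge(x, x_ref, g=1e-3, dt=600.0)
xs = spectral_nudge(x, x_ref, g=1e-3, dt=600.0, n_waves=3)
# Grid nudging damps the wave-20 detail; spectral nudging leaves it intact.
print(np.abs(np.fft.rfft(xg))[20], np.abs(np.fft.rfft(xs))[20])
```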

  18. Development of multitissue microfluidic dynamic array for assessing changes in gene expression associated with channel catfish appetite, growth, metabolism, and intestinal health

    USDA-ARS?s Scientific Manuscript database

    Large-scale, gene expression methods allow for high throughput analysis of physiological pathways at a fraction of the cost of individual gene expression analysis. Systems, such as the Fluidigm quantitative PCR array described here, can provide powerful assessments of the effects of diet, environme...

  19. Acquiring a Variable Structure: An Interlanguage Analysis of Second Language Mood Use in Spanish

    ERIC Educational Resources Information Center

    Gudmestad, Aarnes

    2012-01-01

    This investigation connects issues in second language (L2) acquisition to topics in quantitative sociolinguistics by exploring the relationship between native-speaker (NS) and L2 variation. It is the first large-scale analysis of L2 mood use (the subjunctive-indicative contrast) in Spanish. It applies variationist findings on the range of…

  20. Considerations for interpreting probabilistic estimates of uncertainty of forest carbon

    Treesearch

    James E. Smith; Linda S. Heath

    2000-01-01

    Quantitative estimates of carbon inventories are needed as part of nationwide attempts to reduce net release of greenhouse gases and the associated climate forcing. Naturally, an appreciable amount of uncertainty is inherent in such large-scale assessments, especially since both science and policy issues are still evolving. Decision makers need an idea of the...

  1. The Relationship between English Language Learners' Language Proficiency and Standardized Test Scores

    ERIC Educational Resources Information Center

    Thakkar, Darshan

    2013-01-01

    It is generally theorized that English Language Learner (ELL) students do not succeed on state standardized tests because ELL students lack the cognitive academic language skills necessary to function on the large scale content assessments. The purpose of this dissertation was to test that theory. Through the use of quantitative methodology, ELL…

  2. Researching Returns Emanating from Participation in Adult Education Courses: A Quantitative Approach

    ERIC Educational Resources Information Center

    Panitsides, Eugenia

    2013-01-01

    Throughout contemporary literature, participants in adult education courses have been reported to acquire knowledge and skills, develop understanding and enhance self-confidence, parameters that induce changes in their personal lives, while enabling them to play a more active role in their family, community or work. In this vein, a large-scale,…

  3. Advancing effects analysis for integrated, large-scale wildfire risk assessment

    Treesearch

    Matthew P. Thompson; David E. Calkin; Julie W. Gilbertson-Day; Alan A. Ager

    2011-01-01

    In this article, we describe the design and development of a quantitative, geospatial risk assessment tool intended to facilitate monitoring trends in wildfire risk over time and to provide information useful in prioritizing fuels treatments and mitigation measures. The research effort is designed to develop, from a strategic view, a first approximation of how both...

  4. Women in Engineering in Turkey--A Large Scale Quantitative and Qualitative Examination

    ERIC Educational Resources Information Center

    Smith, Alice E.; Dengiz, Berna

    2010-01-01

    The underrepresentation of women in engineering is well known and unresolved. However, Turkey has witnessed a shift from virtually no female participation in engineering to across-the-board proportions that exceed those of other industrialised countries, within the 76 years since the founding of the Turkish Republic. This paper describes the largest…

  5. Health at the Sub-catchment Scale: Typhoid and Its Environmental Determinants in Central Division, Fiji.

    PubMed

    Jenkins, Aaron Peter; Jupiter, Stacy; Mueller, Ute; Jenney, Adam; Vosaki, Gandercillar; Rosa, Varanisese; Naucukidi, Alanieta; Mulholland, Kim; Strugnell, Richard; Kama, Mike; Horwitz, Pierre

    2016-12-01

    The impact of environmental change on transmission patterns of waterborne enteric diseases is a major public health concern. This study concerns the burden and spatial nature of enteric fever attributable to Salmonella Typhi infection in the Central Division, Republic of Fiji, at a sub-catchment scale over 30 months (2013-2015). Quantitative spatial analysis suggested relationships between environmental conditions of sub-catchments and the incidence and recurrence of typhoid fever. Average incidence per inhabited sub-catchment for the Central Division was high at 205.9/100,000, with cases recurring in each calendar year in 26% of sub-catchments. Although the numbers of cases were highest within dense, urban coastal sub-catchments, the incidence was highest in low-density mountainous rural areas. Significant environmental determinants at this scale suggest increased risk of exposure where sediment yields increase following runoff. The study suggests that populations living on large systems that broaden into meandering mid-reaches and floodplains with alluvial deposition are at greater risk compared to small populations living near small, erosional, high-energy headwaters and small streams unconnected to large hydrological networks. This study suggests that anthropogenic alteration of land cover and hydrology (particularly via fragmentation of riparian forest and connectivity between road and river networks) facilitates increased transmission of typhoid fever and that environmental transmission of typhoid fever is important in Fiji.

  6. Assessing Student Status and Progress in Science Reasoning and Quantitative Literacy at a Very Large Undergraduate Institution

    NASA Astrophysics Data System (ADS)

    Donahue, Megan; Kaplan, J.; Ebert-May, D.; Ording, G.; Melfi, V.; Gilliland, D.; Sikorski, A.; Johnson, N.

    2009-01-01

    The typical large liberal-arts, tier-one research university requires all of its graduates to achieve some minimal standards of quantitative literacy and scientific reasoning skills. But how do we know that what we are doing, as instructors and as a university, is working the way we think it should? At Michigan State University, a cross-disciplinary team of scientists, statisticians, and teacher education experts has begun a large-scale investigation of student mastery of quantitative and scientific skills, beginning with an assessment of 3,000 freshmen before they start their university careers. We will describe the process we used for developing and testing an instrument and for expanding faculty involvement and input on high-level goals. For this limited presentation, we will restrict the discussion mainly to the scientific reasoning perspective, but we will briefly mention some intriguing observations regarding quantitative literacy as well. This project represents the beginning of long-term, longitudinal tracking of the progress of students at our institution. We will discuss preliminary results from our 2008 assessment of incoming freshmen at Michigan State, and where we plan to go from here. We acknowledge local support from the Quality Fund of the Office of the Provost at MSU. We also acknowledge the Center for Assessment at James Madison University and the NSF for their support at the very beginning of our work.

  7. Do Junior High School Students Perceive Their Learning Environment as Constructivist?

    NASA Astrophysics Data System (ADS)

    Moustafa, Asely; Ben-Zvi-Assaraf, Orit; Eshach, Haim

    2013-08-01

    The purpose of this study is to examine the manner in which the features of a constructivist learning environment, and the mechanisms at its base, are expressed in junior high school students' conceptions. Our research is based on an integration of quantitative and qualitative approaches, designed to provide a wider-ranging and deeper understanding. Eight hundred and forty eighth- and ninth-grade students from over 15 schools participated in the study. Of the 840 students who completed the questionnaire, the explanations of 200 well-written questionnaires were further analyzed qualitatively. The findings of the study are presented in terms of the four scales employed in the CLES, namely the autonomy scale, the prior knowledge scale, the negotiation scale, and the student-centeredness scale. The quantitative results achieved here concur with parallel studies conducted around the world. The findings indicate that a considerable portion of the students perceive their learning environment as a constructivist one and report positive attitudes toward the way they are being taught. In terms of the qualitative results, however, it appears that in some cases the students' explanations reveal that, contrary to the bare quantitative results, some students do not in fact perceive their learning environment as being constructivist. This raises the question of whether the fact that students recognize the factors associated with constructivist teaching is indeed an indication that such teaching exists in practice. This finding emphasizes the importance of combining qualitative and quantitative methods for arriving at a balanced view of classroom occurrences.

  8. A genome-wide linkage scan for quantitative trait loci underlying obesity related phenotypes in 434 Caucasian families.

    PubMed

    Zhao, Lan-Juan; Xiao, Peng; Liu, Yong-Jun; Xiong, Dong-Hai; Shen, Hui; Recker, Robert R; Deng, Hong-Wen

    2007-03-01

    To identify quantitative trait loci (QTLs) that contribute to obesity, we performed a large-scale whole genome linkage scan (WGS) involving 4,102 individuals from 434 Caucasian families. The most pronounced linkage evidence was found at the genomic region 20p11-12 for fat mass (LOD = 3.31) and percentage fat mass (PFM) (LOD = 2.92). We also identified several regions showing suggestive linkage signals (threshold LOD = 1.9) for obesity phenotypes, including 5q35, 8q13, 10p12, and 17q11.

  9. Finite-Size Scaling of a First-Order Dynamical Phase Transition: Adaptive Population Dynamics and an Effective Model

    NASA Astrophysics Data System (ADS)

    Nemoto, Takahiro; Jack, Robert L.; Lecomte, Vivien

    2017-03-01

    We analyze large deviations of the time-averaged activity in the one-dimensional Fredrickson-Andersen model, both numerically and analytically. The model exhibits a dynamical phase transition, which appears as a singularity in the large deviation function. We analyze the finite-size scaling of this phase transition numerically, by generalizing an existing cloning algorithm to include a multicanonical feedback control: this significantly improves the computational efficiency. Motivated by these numerical results, we formulate an effective theory for the model in the vicinity of the phase transition, which accounts quantitatively for the observed behavior. We discuss potential applications of the numerical method and the effective theory in a range of more general contexts.

  10. The operations manual: a mechanism for improving the research process.

    PubMed

    Bowman, Ann; Wyman, Jean F; Peters, Jennifer

    2002-01-01

    The development and use of an operations manual has the potential to improve the capacity of nurse scientists to address the complex, multifaceted issues associated with conducting research in today's healthcare environment. An operations manual facilitates communication, standardizes training and evaluation, and enhances the development and standard implementation of clear policies, processes, and protocols. A 10-year review of methodology articles in relevant nursing journals revealed no attention to this topic. This article will discuss how an operations manual can improve the conduct of research methods and outcomes for both small-scale and large-scale research studies. It also describes the purpose and components of a prototype operations manual for use in quantitative research. The operations manual increases reliability and reproducibility of the research while improving the management of study processes. It can prevent costly and untimely delays or errors in the conduct of research.

  11. A theoretical study of hydrodynamic cavitation.

    PubMed

    Arrojo, S; Benito, Y

    2008-03-01

    The optimization of hydrodynamic cavitation as an advanced oxidation process (AOP) requires identifying the key parameters and studying their effects on the process. Specific simulations of hydrodynamic bubbles reveal that time scales play a major role in the process. Rarefaction/compression periods generate a number of opposing effects which have been demonstrated to be quantitatively different from those found in ultrasonic cavitation. Hydrodynamic cavitation can be upscaled and offers an energy-efficient way of generating cavitation. On the other hand, the large characteristic time scales hinder bubble collapse and generate a low number of cavitation cycles per unit time. By controlling the pressure pulse through a flexible cavitation chamber design, these limitations can be partially compensated. The chemical processes promoted by this technique are also different from those found in ultrasonic cavitation. Properties such as volatility or hydrophobicity determine the potential applicability of HC and therefore have to be taken into account.

  12. The Quantitative Analysis of User Behavior Online - Data, Models and Algorithms

    NASA Astrophysics Data System (ADS)

    Raghavan, Prabhakar

    By blending principles from mechanism design, algorithms, machine learning and massive distributed computing, the search industry has become good at optimizing monetization on sound scientific principles. This represents a successful and growing partnership between computer science and microeconomics. When it comes to understanding how online users respond to the content and experiences presented to them, we have more of a lacuna in the collaboration between computer science and certain social sciences. We will use a concrete technical example from image search results presentation, developing in the process some algorithmic and machine learning problems of interest in their own right. We then use this example to motivate the kinds of studies that need to grow between computer science and the social sciences; a critical element of this is the need to blend large-scale data analysis with smaller-scale eye-tracking and "individualized" lab studies.

  13. Reliable determination of oxygen and hydrogen isotope ratios in atmospheric water vapour adsorbed on 3A molecular sieve.

    PubMed

    Han, Liang-Feng; Gröning, Manfred; Aggarwal, Pradeep; Helliker, Brent R

    2006-01-01

    The isotope ratio of atmospheric water vapour is determined by wide-ranging feedback effects from the isotope ratio of water in biological water pools, soil surface horizons, open water bodies and precipitation. Accurate determination of atmospheric water vapour isotope ratios is important for a broad range of research areas, from leaf-scale to global-scale isotope studies. In spite of the importance of stable isotopic measurements of atmospheric water vapour, there is a paucity of published data available, largely because of the requirement for liquid nitrogen or dry ice for quantitative trapping of water vapour. We report results from a non-cryogenic method for quantitatively trapping atmospheric water vapour using 3A molecular sieve, although water is removed from the column using standard cryogenic methods. The molecular sieve column was conditioned with water of a known isotope ratio to 'set' the background signature of the molecular sieve. Two separate prototypes were developed, one for large collection volumes (3 mL) and one for small collection volumes (90 µL). Atmospheric water vapour was adsorbed to the column by pulling air through the column for several days to reach the desired final volume. Water was recovered from the column by baking at 250°C in a dry helium or nitrogen air stream and cryogenically trapped. For the large-volume apparatus, the recovered water differed from water that was simultaneously trapped by liquid nitrogen (the experimental control) by 2.6‰ with a standard deviation (SD) of 1.5‰ for δ²H and by 0.3‰ with a SD of 0.2‰ for δ¹⁸O. Water-vapour recovery was not satisfactory for the small-volume apparatus. Copyright © 2006 John Wiley & Sons, Ltd.

  14. Activity-Based Introductory Physics Reform *

    NASA Astrophysics Data System (ADS)

    Thornton, Ronald

    2004-05-01

    Physics education research has shown that learning environments that engage students and allow them to take an active part in their learning can lead to large conceptual gains compared to those of good traditional instruction. Examples of successful curricula and methods include Peer Instruction, Just in Time Teaching, RealTime Physics, Workshop Physics, Scale-Up, and Interactive Lecture Demonstrations (ILDs). RealTime Physics promotes interaction among students in a laboratory setting and makes use of powerful real-time data logging tools to teach concepts as well as quantitative relationships. An active learning environment is often difficult to achieve in large lecture sessions, and Workshop Physics and Scale-Up largely eliminate lectures in favor of collaborative student activities. Peer Instruction, Just in Time Teaching, and ILDs make lectures more interactive in complementary ways. This presentation will introduce these reforms and use ILDs with the audience to illustrate the types of curricula and tools described above. ILDs make use of real experiments, real-time data logging tools and student interaction to create an active learning environment in large lecture classes. A short video of students involved in interactive lecture demonstrations will be shown. The results of research studies at various institutions to measure the effectiveness of these methods will be presented.

  15. Determination of burn patient outcome by large-scale quantitative discovery proteomics

    PubMed Central

    Finnerty, Celeste C.; Jeschke, Marc G.; Qian, Wei-Jun; Kaushal, Amit; Xiao, Wenzhong; Liu, Tao; Gritsenko, Marina A.; Moore, Ronald J.; Camp, David G.; Moldawer, Lyle L.; Elson, Constance; Schoenfeld, David; Gamelli, Richard; Gibran, Nicole; Klein, Matthew; Arnoldo, Brett; Remick, Daniel; Smith, Richard D.; Davis, Ronald; Tompkins, Ronald G.; Herndon, David N.

    2013-01-01

    Objective Emerging proteomics techniques can be used to establish proteomic outcome signatures and to identify candidate biomarkers for survival following traumatic injury. We applied high-resolution liquid chromatography-mass spectrometry (LC-MS) and multiplex cytokine analysis to profile the plasma proteome of survivors and non-survivors of massive burn injury to determine the proteomic survival signature following a major burn injury. Design Proteomic discovery study. Setting Five burn hospitals across the U.S. Patients Thirty-two burn patients (16 non-survivors and 16 survivors), 19–89 years of age, were admitted within 96 h of injury to the participating hospitals with burns covering >20% of the total body surface area and required at least one surgical intervention. Interventions None. Measurements and Main Results We found differences in circulating levels of 43 proteins involved in the acute phase response, hepatic signaling, the complement cascade, inflammation, and insulin resistance. Thirty-two of the proteins identified were not previously known to play a role in the response to burn. IL-4, IL-8, GM-CSF, MCP-1, and β2-microglobulin correlated well with survival and may serve as clinical biomarkers. Conclusions These results demonstrate the utility of these techniques for establishing proteomic survival signatures and for use as a discovery tool to identify candidate biomarkers for survival. This is the first clinical application of a high-throughput, large-scale LC-MS-based quantitative plasma proteomic approach for biomarker discovery for the prediction of patient outcome following burn, trauma or critical illness. PMID:23507713

  16. Inferring cortical function in the mouse visual system through large-scale systems neuroscience.

    PubMed

    Hawrylycz, Michael; Anastassiou, Costas; Arkhipov, Anton; Berg, Jim; Buice, Michael; Cain, Nicholas; Gouwens, Nathan W; Gratiy, Sergey; Iyer, Ramakrishnan; Lee, Jung Hoon; Mihalas, Stefan; Mitelut, Catalin; Olsen, Shawn; Reid, R Clay; Teeter, Corinne; de Vries, Saskia; Waters, Jack; Zeng, Hongkui; Koch, Christof

    2016-07-05

    The scientific mission of the Project MindScope is to understand neocortex, the part of the mammalian brain that gives rise to perception, memory, intelligence, and consciousness. We seek to quantitatively evaluate the hypothesis that neocortex is a relatively homogeneous tissue, with smaller functional modules that perform a common computational function replicated across regions. We here focus on the mouse as a mammalian model organism with genetics, physiology, and behavior that can be readily studied and manipulated in the laboratory. We seek to describe the operation of cortical circuitry at the computational level by comprehensively cataloging and characterizing its cellular building blocks along with their dynamics and their cell type-specific connectivities. The project is also building large-scale experimental platforms (i.e., brain observatories) to record the activity of large populations of cortical neurons in behaving mice subject to visual stimuli. A primary goal is to understand the series of operations from visual input in the retina to behavior by observing and modeling the physical transformations of signals in the corticothalamic system. We here focus on the contribution that computer modeling and theory make to this long-term effort.

  17. A real-time interferometer technique for compressible flow research

    NASA Technical Reports Server (NTRS)

    Bachalo, W. D.; Houser, M. J.

    1984-01-01

    Strengths and shortcomings in the application of interferometric techniques to transonic flow fields are examined, and an improved method is elaborated. Such applications have demonstrated the value of interferometry in obtaining data for compressible flow research. With holographic techniques, interferometry may be applied in large-scale facilities without the use of expensive optics or elaborate vibration isolation equipment. Results obtained using holographic interferometry and other methods demonstrate that reliable qualitative and quantitative data can be acquired. Nevertheless, the conventional method can be difficult to set up and apply, and it cannot produce real-time data. A new interferometry technique is investigated that promises to be easier to apply and can provide real-time information. This single-beam technique has the necessary insensitivity to vibration for large-scale wind tunnel operations. Capabilities of the method and preliminary tests on some laboratory-scale flow fields are described.

  18. Control factors and scale analysis of annual river water, sediments and carbon transport in China.

    PubMed

    Song, Chunlin; Wang, Genxu; Sun, Xiangyang; Chang, Ruiying; Mao, Tianxu

    2016-05-11

    Under the context of dramatic human disturbance of river systems, the processes that control the transport of water, sediment, and carbon from river basins to coastal seas are not completely understood. Here we performed a quantitative synthesis for 121 sites across China to identify factors controlling annual river exports (Rc: runoff coefficient; TSSC: total suspended sediment concentration; TSSL: total suspended sediment loads; TOCL: total organic carbon loads) at different spatial scales. The results indicated that human activities such as dam construction and vegetation restoration might have a greater influence than climate on the transport of river sediment and carbon, although climate was a major driver of Rc. Multiple spatial-scale analyses indicated that Rc increased from the small to the medium scale by 20% and then decreased at the sizeable scale by 20%. TSSC decreased from the small to the sizeable scale but increased from the sizeable to the large scale; however, TSSL significantly decreased from small (768 g·m⁻²·a⁻¹) to medium spatial-scale basins (258 g·m⁻²·a⁻¹), and TOCL decreased from the medium to the large scale. Our results will improve the understanding of water, sediment and carbon transport processes and contribute to better water and land resource management strategies across spatial scales.
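
    A scale-class synthesis of this kind reduces to grouping per-site annual exports by basin-size class. A minimal sketch with invented values and illustrative class thresholds (not the paper's):

```python
import pandas as pd

# Hypothetical per-site table: basin area (km^2), annual runoff and precipitation (mm),
# suspended sediment load and organic carbon load (g m^-2 a^-1)
sites = pd.DataFrame({
    "area_km2":  [120, 850, 12_000, 95_000, 480_000],
    "runoff_mm": [310, 420, 520, 460, 380],
    "precip_mm": [900, 1000, 1150, 1100, 950],
    "tssl":      [800, 640, 260, 210, 150],
    "tocl":      [9.5, 8.1, 6.0, 4.2, 3.8],
})
sites["rc"] = sites["runoff_mm"] / sites["precip_mm"]   # runoff coefficient

# Spatial-scale classes (thresholds are illustrative only)
sites["scale"] = pd.cut(sites["area_km2"], bins=[0, 1e3, 1e5, 1e7],
                        labels=["small", "medium", "large"])
print(sites.groupby("scale", observed=True)[["rc", "tssl", "tocl"]].mean())
```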

  19. Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.

    PubMed

    He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan

    2009-07-01

    Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the local binary patterns (LBPs) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. Support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrates its accuracy and efficiency.
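
    The building blocks of this pipeline are standard. A sketch of the multiresolution uniform-LBP histogram and the resistor-average distance between two regions, using scikit-image; the paper's block-processing schemes and temporal dimension are omitted, and the input images are random stand-ins:

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(region: np.ndarray) -> np.ndarray:
    """Concatenate uniform-LBP histograms at three (P, R) resolutions."""
    feats = []
    for p, r in [(8, 1), (16, 2), (24, 3)]:
        codes = local_binary_pattern(region, p, r, method="uniform")
        # the 'uniform' method yields p + 2 distinct code values: 0..p+1
        hist, _ = np.histogram(codes, bins=p + 2, range=(0, p + 2), density=True)
        feats.append(hist)
    return np.concatenate(feats)

def resistor_average_distance(p, q, eps=1e-10):
    """RAD: the two directed KL divergences combined like parallel resistors."""
    p, q = p + eps, q + eps
    kl_pq = float(np.sum(p * np.log(p / q)))
    kl_qp = float(np.sum(q * np.log(q / p)))
    return kl_pq * kl_qp / (kl_pq + kl_qp)

rng = np.random.default_rng(1)
# Integer stand-ins for corresponding regions on the two sides of the face
left = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
right = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
print(resistor_average_distance(lbp_histogram(left), lbp_histogram(right)))
```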

  20. ICA model order selection of task co-activation networks.

    PubMed

    Ray, Kimberly L; McKay, D Reese; Fox, Peter M; Riedel, Michael C; Uecker, Angela M; Beckmann, Christian F; Smith, Stephen M; Fox, Peter T; Laird, Angela R

    2013-01-01

    Independent component analysis (ICA) has become a widely used method for extracting functional networks in the brain during rest and task. Historically, preferred ICA dimensionality has widely varied within the neuroimaging community, but typically varies between 20 and 100 components. This can be problematic when comparing results across multiple studies because of the impact ICA dimensionality has on the topology of its resultant components. Recent studies have demonstrated that ICA can be applied to peak activation coordinates archived in a large neuroimaging database (i.e., BrainMap Database) to yield whole-brain task-based co-activation networks. A strength of applying ICA to BrainMap data is that the vast amount of metadata in BrainMap can be used to quantitatively assess tasks and cognitive processes contributing to each component. In this study, we investigated the effect of model order on the distribution of functional properties across networks as a method for identifying the most informative decompositions of BrainMap-based ICA components. Our findings suggest dimensionality of 20 for low model order ICA to examine large-scale brain networks, and dimensionality of 70 to provide insight into how large-scale networks fractionate into sub-networks. We also provide a functional and organizational assessment of visual, motor, emotion, and interoceptive task co-activation networks as they fractionate from low to high model-orders.
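
    The model-order question can be explored with any ICA implementation by varying the number of components. A sketch with scikit-learn's FastICA on an invented stand-in matrix (BrainMap's modeled-activation data are not reproduced here):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
# Synthetic stand-in for a modeled-activation matrix: 500 experiments x 2000 voxels
sources = rng.laplace(size=(500, 30))           # 30 latent "networks"
mixing = rng.normal(size=(30, 2000))
data = sources @ mixing + 0.1 * rng.normal(size=(500, 2000))

for order in (20, 70):
    ica = FastICA(n_components=order, whiten="unit-variance",
                  max_iter=1000, random_state=0)
    ica.fit(data)
    maps = ica.components_                      # (components x voxels) spatial maps
    print(f"model order {order}: component map matrix {maps.shape}")
```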

  1. ICA model order selection of task co-activation networks

    PubMed Central

    Ray, Kimberly L.; McKay, D. Reese; Fox, Peter M.; Riedel, Michael C.; Uecker, Angela M.; Beckmann, Christian F.; Smith, Stephen M.; Fox, Peter T.; Laird, Angela R.

    2013-01-01

    Independent component analysis (ICA) has become a widely used method for extracting functional networks in the brain during rest and task. Historically, preferred ICA dimensionality has widely varied within the neuroimaging community, but typically varies between 20 and 100 components. This can be problematic when comparing results across multiple studies because of the impact ICA dimensionality has on the topology of its resultant components. Recent studies have demonstrated that ICA can be applied to peak activation coordinates archived in a large neuroimaging database (i.e., BrainMap Database) to yield whole-brain task-based co-activation networks. A strength of applying ICA to BrainMap data is that the vast amount of metadata in BrainMap can be used to quantitatively assess tasks and cognitive processes contributing to each component. In this study, we investigated the effect of model order on the distribution of functional properties across networks as a method for identifying the most informative decompositions of BrainMap-based ICA components. Our findings suggest dimensionality of 20 for low model order ICA to examine large-scale brain networks, and dimensionality of 70 to provide insight into how large-scale networks fractionate into sub-networks. We also provide a functional and organizational assessment of visual, motor, emotion, and interoceptive task co-activation networks as they fractionate from low to high model-orders. PMID:24339802

  2. Dynamics of Topological Excitations in a Model Quantum Spin Ice

    NASA Astrophysics Data System (ADS)

    Huang, Chun-Jiong; Deng, Youjin; Wan, Yuan; Meng, Zi Yang

    2018-04-01

    We study the quantum spin dynamics of a frustrated XXZ model on a pyrochlore lattice by using large-scale quantum Monte Carlo simulation and stochastic analytic continuation. In the low-temperature quantum spin ice regime, we observe signatures of coherent photon and spinon excitations in the dynamic spin structure factor. As the temperature rises to the classical spin ice regime, the photon disappears from the dynamic spin structure factor, whereas the dynamics of the spinon remain coherent in a broad temperature window. Our results provide experimentally relevant, quantitative information for the ongoing pursuit of quantum spin ice materials.

  3. History of ancient copper smelting pollution during Roman and Medieval times recorded in Greenland ice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Sungmin; Candelone, J.P.; Patterson, C.C.

    1996-04-12

    Determination of copper concentrations in Greenland ice dated from seven millennia ago to the present showed values exceeding natural levels, beginning about 2500 years ago. This early large-scale pollution of the atmosphere of the Northern Hemisphere is attributed to emissions from the crude, highly polluting smelting technologies used for copper production during Roman and medieval times, especially in Europe and China. This study opens the way to a quantitative assessment of the history of early metal production, which was instrumental in the development of human cultures during ancient eras. 27 refs., 1 fig., 2 tabs.

  4. Laboratory band strengths of methane and their application to the atmospheres of Jupiter, Saturn, Uranus, Neptune, and Titan. II - The red region 6000-7600 A

    NASA Technical Reports Server (NTRS)

    Lutz, B. L.; Owen, T.; Cess, R. D.

    1982-01-01

    Lutz et al. (1976) have reported the first quantitative analyses of the strengths of the blue-green bands of methane which dominate the visible spectra of the outer planets. The present investigation represents an extension of that first study to include a number of bands between 6000 and 7500 A. The objective of this extension is to establish the validity of the scaled numerical curve of growth of the first study further into the saturated region and to test the apparent pressure independence of the high-overtone bands over a large pressure range. In addition, it is desired to provide a set of homogeneously determined band strengths and curves of growth over a large spectral region and over a large range of band strengths. This will make it possible to investigate possible apparent dependences of planetary methane abundances on wavelength and band strength as a probe of the scattering processes in the planetary atmospheres.

  5. Determination of functional collective motions in a protein at atomic resolution using coherent neutron scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Liang; Jain, Nitin; Cheng, Xiaolin

    Protein function often depends on global, collective internal motions. However, the simultaneous quantitative experimental determination of the forms, amplitudes, and time scales of these motions has remained elusive. We demonstrate that a complete description of these large-scale dynamic modes can be obtained using coherent neutron-scattering experiments on perdeuterated samples. With this approach, a microscopic relationship between the structure, dynamics, and function in a protein, cytochrome P450cam, is established. The approach developed here should be of general applicability to protein systems.

  6. Determination of functional collective motions in a protein at atomic resolution using coherent neutron scattering

    DOE PAGES

    Hong, Liang; Jain, Nitin; Cheng, Xiaolin; ...

    2016-10-14

    Protein function often depends on global, collective internal motions. However, the simultaneous quantitative experimental determination of the forms, amplitudes, and time scales of these motions has remained elusive. We demonstrate that a complete description of these large-scale dynamic modes can be obtained using coherent neutron-scattering experiments on perdeuterated samples. With this approach, a microscopic relationship between the structure, dynamics, and function in a protein, cytochrome P450cam, is established. The approach developed here should be of general applicability to protein systems.

  7. Fast inertial particle manipulation in oscillating flows

    NASA Astrophysics Data System (ADS)

    Thameem, Raqeeb; Rallabandi, Bhargav; Hilgenfeldt, Sascha

    2017-05-01

    It is demonstrated that micron-sized particles suspended in fluid near oscillating interfaces experience strong inertial displacements above and beyond the fluid streaming. Experiments with oscillating bubbles show rectified particle lift over extraordinarily short (millisecond) times. A quantitative model on both the oscillatory and the steady time scales describes the particle displacement relative to the fluid motion. The formalism yields analytical predictions confirming the observed scaling behavior with particle size and experimental control parameters. It applies to a large class of oscillatory flows with applications from particle trapping to size sorting.

  8. Bicultural Work Motivation Scale for Asian American College Students

    ERIC Educational Resources Information Center

    Chen, Yung-Lung; Fouad, Nadya A.

    2016-01-01

    The bicultural work motivations of Asian Americans have not yet been comprehensively captured by contemporary vocational constructs and scales. For this study, we conducted two studies on the initial reliability and validity of the Bicultural Work Motivation Scale (BWMS) by combining qualitative and quantitative methods. First, a pilot study was…

  9. A simple model to quantitatively account for periodic outbreaks of the measles in the Dutch Bible Belt

    NASA Astrophysics Data System (ADS)

    Bier, Martin; Brak, Bastiaan

    2015-04-01

    In the Netherlands there has been nationwide vaccination against the measles since 1976. However, in small clustered communities of orthodox Protestants there is widespread refusal of the vaccine. After 1976, three large outbreaks with about 3000 reported cases of the measles have occurred among these orthodox Protestants. The outbreaks appear to occur about every twelve years. We show how a simple Kermack-McKendrick-like model can quantitatively account for the periodic outbreaks. Approximate analytic formulae connecting the period, size, and outbreak duration are derived. With an enhanced model we take the latency period into account. We also expand the model to follow how different age groups are affected. Like other researchers using other methods, we conclude that large-scale underreporting of the disease must occur.
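
    The Kermack-McKendrick machinery is compact enough to state directly. A minimal endemic-SIR sketch in which births replenish the susceptible pool and produce recurrent outbreaks; parameter values are illustrative measles-like rates, not the paper's fit:

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma, mu):
    """Endemic SIR: births enter S, deaths leave all compartments at rate mu."""
    s, i, r = y
    n = s + i + r
    return [mu * n - beta * s * i / n - mu * s,
            beta * s * i / n - (gamma + mu) * i,
            gamma * i - mu * r]

# Illustrative annual rates: R0 ~ 15 (measles-like), ~1-week infectious period
beta, gamma, mu = 780.0, 52.0, 0.012
sol = solve_ivp(sir, (0.0, 60.0), [0.10, 1e-5, 0.89999], args=(beta, gamma, mu),
                rtol=1e-8, dense_output=True)
t = np.linspace(0.0, 60.0, 6001)
i_t = sol.sol(t)[1]
# The inter-epidemic spacing emerges from beta, gamma, and the refill rate mu
peaks = t[1:-1][(i_t[1:-1] > i_t[:-2]) & (i_t[1:-1] > i_t[2:])]
print(np.round(np.diff(peaks[:6]), 1))
```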

  10. 3D-PTV around Operational Wind Turbines

    NASA Astrophysics Data System (ADS)

    Brownstein, Ian; Dabiri, John

    2016-11-01

    Laboratory studies and numerical simulations of wind turbines are typically constrained in how they can inform operational turbine behavior. Laboratory experiments are usually unable to match both pertinent parameters of full-scale wind turbines, the Reynolds number (Re) and the tip speed ratio, using scaled-down models. Additionally, numerical simulations of the flow around wind turbines are constrained by the large domain size and high Re that need to be simulated. When these simulations are performed, turbine geometry is typically simplified, resulting in flow structures near the rotor not being well resolved. In order to bypass these limitations, a quantitative flow visualization method was developed to take in situ measurements of the flow around wind turbines at the Field Laboratory for Optimized Wind Energy (FLOWE) in Lancaster, CA. The apparatus constructed was able to seed an approximately 9 m x 9 m x 5 m volume in the wake of the turbine using artificial snow. Quantitative measurements were obtained by tracking the evolution of the artificial snow using a four-camera setup. The methodology for calibrating and collecting data, as well as preliminary results detailing the flow around a 2 kW vertical-axis wind turbine (VAWT), will be presented.

  11. A centennial tribute to G.K. Gilbert's Hydraulic Mining Débris in the Sierra Nevada

    NASA Astrophysics Data System (ADS)

    James, L. A.; Phillips, J. D.; Lecce, S. A.

    2017-10-01

    G.K. Gilbert's (1917) classic monograph, Hydraulic-Mining Débris in the Sierra Nevada, is described and put into the context of modern geomorphic knowledge. The emphasis here is on large-scale applied fluvial geomorphology, but other key elements (e.g., coastal geomorphology) are also briefly covered. A brief synopsis outlines key elements of the monograph, followed by discussions of highly influential aspects including the integrated watershed perspective, the extreme example of anthropogenic sedimentation, computation of a quantitative, semidistributed sediment budget, and the advent of sediment-wave theory. Although Gilbert did not address concepts of equilibrium and grade in much detail, the rivers of the northwestern Sierra Nevada were highly disrupted and thrown into a condition of nonequilibrium. Therefore, concepts of equilibrium and grade, for which Gilbert's early work is often cited, are discussed. Gilbert's work is put into the context of complex nonlinear dynamics in geomorphic systems and how these concepts can be used to interpret the nonequilibrium systems described by Gilbert. Broad, basin-scale studies were common in the period, but few were as quantitative and empirically rigorous or employed such a range of methodologies as PP105. None demonstrated such an extreme case of anthropogeomorphic change.

  12. Nonlinear optical microscopy and ultrasound imaging of human cervical structure

    NASA Astrophysics Data System (ADS)

    Reusch, Lisa M.; Feltovich, Helen; Carlson, Lindsey C.; Hall, Gunnsteinn; Campagnola, Paul J.; Eliceiri, Kevin W.; Hall, Timothy J.

    2013-03-01

    The cervix softens and shortens as its collagen microstructure rearranges in preparation for birth, but premature change may lead to premature birth. The global preterm birth rate has not decreased despite decades of research, likely because cervical microstructure is poorly understood. Our group has developed a multilevel approach to evaluating the human cervix. We are developing quantitative ultrasound (QUS) techniques for noninvasive interrogation of cervical microstructure and corroborating those results with high-resolution images of microstructure from second harmonic generation imaging (SHG) microscopy. We obtain ultrasound measurements from hysterectomy specimens, prepare the tissue for SHG, and stitch together several hundred images to create a comprehensive view of large areas of cervix. The images are analyzed for collagen orientation and alignment with curvelet transform, and registered with QUS data, facilitating multiscale analysis in which the micron-scale SHG images and millimeter-scale ultrasound data interpretation inform each other. This novel combination of modalities allows comprehensive characterization of cervical microstructure in high resolution. Through a detailed comparative study, we demonstrate that SHG imaging both corroborates the quantitative ultrasound measurements and provides further insight. Ultimately, a comprehensive understanding of specific microstructural cervical change in pregnancy should lead to novel approaches to the prevention of preterm birth.

  13. Bayesian analysis of the dynamic cosmic web in the SDSS galaxy survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leclercq, Florent; Wandelt, Benjamin; Jasche, Jens, E-mail: florent.leclercq@polytechnique.org, E-mail: jasche@iap.fr, E-mail: wandelt@iap.fr

    Recent application of the Bayesian algorithm BORG to the Sloan Digital Sky Survey (SDSS) main sample galaxies resulted in the physical inference of the formation history of the observed large-scale structure from its origin to the present epoch. In this work, we use these inferences as inputs for a detailed probabilistic cosmic web-type analysis. To do so, we generate a large set of data-constrained realizations of the large-scale structure using a fast, fully non-linear gravitational model. We then perform a dynamic classification of the cosmic web into four distinct components (voids, sheets, filaments, and clusters) on the basis of the tidal field. Our inference framework automatically and self-consistently propagates typical observational uncertainties to web-type classification. As a result, this study produces accurate cosmographic classification of large-scale structure elements in the SDSS volume. By also providing the history of these structure maps, the approach allows an analysis of the origin and growth of the early traces of the cosmic web present in the initial density field and of the evolution of global quantities such as the volume and mass filling fractions of different structures. For the problem of web-type classification, the results described in this work constitute the first connection between theory and observations at non-linear scales including a physical model of structure formation and the demonstrated capability of uncertainty quantification. A connection between cosmology and information theory using real data also naturally emerges from our probabilistic approach. Our results constitute quantitative chrono-cosmography of the complex web-like patterns underlying the observed galaxy distribution.

  14. Synthesis and Preclinical Characterization of a Cationic Iodinated Imaging Contrast Agent (CA4+) and Its Use for Quantitative Computed Tomography of Ex Vivo Human Hip Cartilage.

    PubMed

    Stewart, Rachel C; Patwa, Amit N; Lusic, Hrvoje; Freedman, Jonathan D; Wathier, Michel; Snyder, Brian D; Guermazi, Ali; Grinstaff, Mark W

    2017-07-13

    Contrast agents that go beyond qualitative visualization and enable quantitative assessments of functional tissue performance represent the next generation of clinically useful imaging tools. An optimized and efficient large-scale synthesis of a cationic iodinated contrast agent (CA4+) is described for imaging articular cartilage. Contrast-enhanced CT (CECT) using CA4+ reveals significantly greater agent uptake of CA4+ in articular cartilage compared to that of similar anionic or nonionic agents, and CA4+ uptake follows Donnan equilibrium theory. The CA4+ CECT attenuation obtained from imaging ex vivo human hip cartilage correlates with the glycosaminoglycan content, equilibrium modulus, and coefficient of friction, which are key indicators of cartilage functional performance and osteoarthritis stage. Finally, preliminary toxicity studies in a rat model show no adverse events, and a pharmacokinetics study documents a peak plasma concentration 30 min after dosing, with the agent no longer present in vivo at 96 h via excretion in the urine.

  15. Old wine in new bottles: decanting systemic family process research in the era of evidence-based practice.

    PubMed

    Rohrbaugh, Michael J

    2014-09-01

    Social cybernetic (systemic) ideas from the early Family Process era, though emanating from qualitative clinical observation, have underappreciated heuristic potential for guiding quantitative empirical research on problem maintenance and change. The old conceptual wines we have attempted to repackage in new, science-friendly bottles include ironic processes (when "solutions" maintain problems), symptom-system fit (when problems stabilize relationships), and communal coping (when we-ness helps people change). Both self-report and observational quantitative methods have been useful in tracking these phenomena, and together the three constructs inform a team-based family consultation approach to working with difficult health and behavior problems. In addition, a large-scale, quantitatively focused effectiveness trial of family therapy for adolescent drug abuse highlights the importance of treatment fidelity and qualitative approaches to examining it. In this sense, echoing the history of family therapy research, our experience with juxtaposing quantitative and qualitative methods has gone full circle-from qualitative to quantitative observation and back again. © 2014 FPI, Inc.

  16. Old Wine in New Bottles: Decanting Systemic Family Process Research in the Era of Evidence-Based Practice†

    PubMed Central

    Rohrbaugh, Michael J.

    2015-01-01

    Social cybernetic (systemic) ideas from the early Family Process era, though emanating from qualitative clinical observation, have underappreciated heuristic potential for guiding quantitative empirical research on problem maintenance and change. The old conceptual wines we have attempted to repackage in new, science-friendly bottles include ironic processes (when “solutions” maintain problems), symptom-system fit (when problems stabilize relationships), and communal coping (when we-ness helps people change). Both self-report and observational quantitative methods have been useful in tracking these phenomena, and together the three constructs inform a team-based family consultation (FAMCON) approach to working with difficult health and behavior problems. In addition, a large-scale, quantitatively focused effectiveness trial of family therapy for adolescent drug abuse highlights the importance of treatment fidelity and qualitative approaches to examining it. In this sense, echoing the history of family therapy research, our experience with juxtaposing quantitative and qualitative methods has gone full circle – from qualitative to quantitative observation and back again. PMID:24905101

  17. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on quantification methods based on mass spectrometry. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for handling shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve quantification accuracy and dynamic range.
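
    freeQuant's exact algorithm is not reproduced here, but the spectral-counting family it extends is easy to state. A sketch of NSAF-style normalization, in which each protein's spectral count is divided by its sequence length and renormalized across the sample; the protein names and counts are hypothetical:

```python
def nsaf(spectral_counts: dict[str, int], lengths: dict[str, int]) -> dict[str, float]:
    """Normalized spectral abundance factor: (SpC / L) / sum over all proteins."""
    saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

counts = {"P450": 120, "ATP5A": 85, "COX4": 30}     # hypothetical identifications
lengths = {"P450": 414, "ATP5A": 553, "COX4": 169}  # residues per protein
for protein, value in sorted(nsaf(counts, lengths).items(), key=lambda kv: -kv[1]):
    print(f"{protein}\t{value:.3f}")
```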

  18. Quantitative image analysis of cellular heterogeneity in breast tumors complements genomic profiling.

    PubMed

    Yuan, Yinyin; Failmezger, Henrik; Rueda, Oscar M; Ali, H Raza; Gräf, Stefan; Chin, Suet-Feung; Schwarz, Roland F; Curtis, Christina; Dunning, Mark J; Bardwell, Helen; Johnson, Nicola; Doyle, Sarah; Turashvili, Gulisa; Provenzano, Elena; Aparicio, Sam; Caldas, Carlos; Markowetz, Florian

    2012-10-24

    Solid tumors are heterogeneous tissues composed of a mixture of cancer and normal cells, which complicates the interpretation of their molecular profiles. Furthermore, tissue architecture is generally not reflected in molecular assays, rendering this rich information underused. To address these challenges, we developed a computational approach based on standard hematoxylin and eosin-stained tissue sections and demonstrated its power in a discovery and validation cohort of 323 and 241 breast tumors, respectively. To deconvolute cellular heterogeneity and detect subtle genomic aberrations, we introduced an algorithm based on tumor cellularity to increase the comparability of copy number profiles between samples. We next devised a predictor for survival in estrogen receptor-negative breast cancer that integrated both image-based and gene expression analyses and significantly outperformed classifiers that use single data types, such as microarray expression signatures. Image processing also allowed us to describe and validate an independent prognostic factor based on quantitative analysis of spatial patterns between stromal cells, which are not detectable by molecular assays. Our quantitative, image-based method could benefit any large-scale cancer study by refining and complementing molecular assays of tumor samples.

  19. A Method for Semi-quantitative Assessment of Exposure to Pesticides of Applicators and Re-entry Workers: An Application in Three Farming Systems in Ethiopia.

    PubMed

    Negatu, Beyene; Vermeulen, Roel; Mekonnen, Yalemtshay; Kromhout, Hans

    2016-07-01

    To develop an inexpensive and easily adaptable semi-quantitative exposure assessment method to characterize exposure to pesticides among applicators and re-entry farmers and farm workers in Ethiopia. Two specific semi-quantitative exposure algorithms for pesticide applicators and re-entry workers were developed and applied to 601 farm workers employed in 3 distinctly different farming systems [small-scale irrigated, large-scale greenhouses (LSGH), and large-scale open (LSO)] in Ethiopia. The algorithm for applicators was based on exposure-modifying factors including application method, farm layout (open or closed), pesticide mixing conditions, cleaning of spraying equipment, intensity of pesticide application per day, utilization of personal protective equipment (PPE), personal hygienic behavior, annual frequency of application, and duration of employment at the farm. The algorithm for re-entry work was based on an expert-based re-entry exposure intensity score, utilization of PPE, personal hygienic behavior, annual frequency of re-entry work, and duration of employment at the farm. The algorithms allowed estimation of daily, annual, and cumulative lifetime exposure for applicators and re-entry workers by farming system, by gender, and by age group. For all metrics, the highest exposures occurred in LSGH for both applicators and female re-entry workers. For male re-entry workers, the highest cumulative exposure occurred in LSO farms. Female re-entry workers appeared to be more highly exposed on a daily or annual basis than male re-entry workers, but their cumulative exposures were similar because, on average, males had longer tenure. Factors related to intensity of exposure (such as application method and farm layout) were indicated as the main drivers of estimated potential exposure. Use of personal protection, hygienic behavior, and duration of employment of the surveyed farm workers contributed less to the contrast in exposure estimates. This study indicated that farmers' and farm workers' exposure to pesticides can be inexpensively characterized, ranked, and classified. Our method could be extended to assess exposure to specific active ingredients provided that detailed information on pesticides used is available. The resulting exposure estimates will consequently be used in occupational epidemiology studies in Ethiopia and other similar countries with few resources. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
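
    The multiplicative structure of such semi-quantitative algorithms can be sketched directly; every weight below is a hypothetical placeholder, not one of the study's calibrated factors:

```python
from dataclasses import dataclass

# Hypothetical modifier weights (the published algorithm's factors are not reproduced)
APPLICATION_METHOD = {"knapsack": 3.0, "boom_sprayer": 2.0, "hand_held": 4.0}
FARM_LAYOUT = {"open": 1.0, "closed": 2.0}        # closed layouts concentrate exposure
PPE_FACTOR = {"none": 1.0, "partial": 0.6, "full": 0.3}
HYGIENE_FACTOR = {"poor": 1.0, "good": 0.7}

@dataclass
class Applicator:
    method: str
    layout: str
    ppe: str
    hygiene: str
    days_per_year: int
    years: float

    def daily_score(self) -> float:
        return (APPLICATION_METHOD[self.method] * FARM_LAYOUT[self.layout]
                * PPE_FACTOR[self.ppe] * HYGIENE_FACTOR[self.hygiene])

    def annual_score(self) -> float:
        return self.daily_score() * self.days_per_year

    def cumulative_score(self) -> float:
        return self.annual_score() * self.years

worker = Applicator("knapsack", "closed", "partial", "poor", days_per_year=90, years=6.5)
print(worker.daily_score(), worker.annual_score(), worker.cumulative_score())
```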

  20. The genetic architecture of novel trophic specialists: larger effect sizes are associated with exceptional oral jaw diversification in a pupfish adaptive radiation.

    PubMed

    Martin, Christopher H; Erickson, Priscilla A; Miller, Craig T

    2017-01-01

    The genetic architecture of adaptation is fundamental to understanding the mechanisms and constraints governing diversification. However, most case studies focus on loss of complex traits or parallel speciation in similar environments. It is still unclear how the genetic architecture of these local adaptive processes compares to the architecture of evolutionary transitions contributing to morphological and ecological novelty. Here, we identify quantitative trait loci (QTL) between two trophic specialists in an excellent case study for examining the origins of ecological novelty: a sympatric radiation of pupfishes endemic to San Salvador Island, Bahamas, containing a large-jawed scale-eater and a short-jawed molluscivore with a skeletal nasal protrusion. These specialized niches and trophic traits are unique among over 2000 related species. Measurements of the fitness landscape on San Salvador demonstrate multiple fitness peaks and a larger fitness valley isolating the scale-eater from the putative ancestral intermediate phenotype of the generalist, suggesting that more large-effect QTL should contribute to its unique phenotype. We evaluated this prediction using an F2 intercross between these specialists. We present the first linkage map for pupfishes and detect significant QTL for sex and eight skeletal traits. Large-effect QTL contributed more to enlarged scale-eater jaws than the molluscivore nasal protrusion, consistent with predictions from the adaptive landscape. The microevolutionary genetic architecture of large-effect QTL for oral jaws parallels the exceptional diversification rates of oral jaws within the San Salvador radiation observed over macroevolutionary timescales and may have facilitated exceptional trophic novelty in this system. © 2016 John Wiley & Sons Ltd.

  1. Wind power for the electric-utility industry: Policy incentives for fuel conservation

    NASA Astrophysics Data System (ADS)

    March, F.; Dlott, E. H.; Korn, D. H.; Madio, F. R.; McArthur, R. C.; Vachon, W. A.

    1982-06-01

    A systematic method for evaluating the economics of solar-electric/conservation technologies as fuel-saving investments for electric utilities in the presence of changing federal incentive policies is presented. The focus is on wind energy conversion systems (WECS) as the solar technology closest to near-term large-scale implementation. Commercially available large WECS are described, along with computer models to calculate the economic impact of including WECS as 10% of the base-load generating capacity on a grid. A guide to legal structures and relationships that impinge on large-scale WECS utilization is developed, together with a quantitative examination of the installation of 1000 MWe of WECS capacity by a utility in the northeastern states. Engineering and financial analyses were performed, with results indicating the government policy changes necessary to encourage the entrance of utilities into the field of wind-power utilization.

  2. Large-scale structure non-Gaussianities with modal methods

    NASA Astrophysics Data System (ADS)

    Schmittfull, Marcel

    2016-10-01

    Relying on a separable modal expansion of the bispectrum, the implementation of a fast estimator for the full bispectrum of a 3d particle distribution is presented. The computational cost of accurate bispectrum estimation is negligible relative to simulation evolution, so the bispectrum can be used as a standard diagnostic whenever the power spectrum is evaluated. As an application, the time evolution of gravitational and primordial dark matter bispectra was measured in a large suite of N-body simulations. The bispectrum shape changes characteristically when the cosmic web becomes dominated by filaments and halos, therefore providing a quantitative probe of 3d structure formation. Our measured bispectra are determined by ~ 50 coefficients, which can be used as fitting formulae in the nonlinear regime and for non-Gaussian initial conditions. We also compare the measured bispectra with predictions from the Effective Field Theory of Large Scale Structures (EFTofLSS).
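
    The separability idea behind such fast estimators can be sketched for equilateral triangle configurations: restricting the density to a thin k-shell in Fourier space and cubing the filtered field in real space sums the bispectrum over all closed triangles with one FFT per shell. The numpy sketch below uses a schematic normalization (volume prefactors omitted) and invented grid parameters; it is not the paper's modal pipeline.

    ```python
    import numpy as np

    def equilateral_bispectrum(delta, box, k_centers, dk):
        """Schematic B(k,k,k) estimate of a gridded density field."""
        n = delta.shape[0]
        kf = 2 * np.pi / box                          # fundamental wavenumber
        k1d = np.fft.fftfreq(n, d=1.0 / n) * kf
        kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
        kmag = np.sqrt(kx**2 + ky**2 + kz**2)
        dhat = np.fft.fftn(delta)
        out = []
        for kc in k_centers:
            shell = np.abs(kmag - kc) < dk / 2        # thin spherical k-shell
            df = np.fft.ifftn(np.where(shell, dhat, 0)).real   # shell-filtered field
            nf = np.fft.ifftn(shell.astype(float)).real        # mode-counting field
            ntri = (nf**3).sum()                      # ~ number of closed triangles
            out.append((df**3).sum() / ntri if ntri else np.nan)
        return out

    delta = np.random.default_rng(0).normal(size=(32, 32, 32))  # stand-in for a snapshot
    print(equilateral_bispectrum(delta, box=100.0, k_centers=[0.3, 0.6], dk=0.1))
    ```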

  3. Weighing trees with lasers: advances, challenges and opportunities

    PubMed Central

    Boni Vicari, M.; Burt, A.; Calders, K.; Lewis, S. L.; Raumonen, P.; Wilkes, P.

    2018-01-01

    Terrestrial laser scanning (TLS) is providing exciting new ways to quantify tree and forest structure, particularly above-ground biomass (AGB). We show how TLS can address some of the key uncertainties and limitations of current approaches to estimating AGB based on empirical allometric scaling equations (ASEs) that underpin all large-scale estimates of AGB. TLS provides extremely detailed non-destructive measurements of tree form independent of tree size and shape. We show examples of three-dimensional (3D) TLS measurements from various tropical and temperate forests and describe how the resulting TLS point clouds can be used to produce quantitative 3D models of branch and trunk size, shape and distribution. These models can drastically improve estimates of AGB, provide new, improved large-scale ASEs, and deliver insights into a range of fundamental tree properties related to structure. Large quantities of detailed measurements of individual 3D tree structure also have the potential to open new and exciting avenues of research in areas where difficulties of measurement have until now prevented statistical approaches to detecting and understanding underlying patterns of scaling, form and function. We discuss these opportunities and some of the challenges that remain to be overcome to enable wider adoption of TLS methods. PMID:29503726
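
    The contrast between the two AGB routes can be made concrete in a few lines: an empirical ASE maps diameter, height and wood density to mass, while the TLS/QSM route multiplies reconstructed woody volume by basic wood density. The allometric coefficients below follow the widely used pan-tropical form of Chave et al. (2014), quoted from memory, and the input tree is invented; treat both as assumptions.

    ```python
    def agb_allometric(dbh_cm, height_m, rho_g_cm3, a=0.0673, b=0.976):
        """Pan-tropical-style ASE: AGB = a * (rho * D^2 * H)^b, in kg."""
        return a * (rho_g_cm3 * dbh_cm**2 * height_m) ** b

    def agb_from_tls(qsm_volume_m3, rho_kg_m3):
        """TLS route: QSM-reconstructed woody volume times wood density."""
        return qsm_volume_m3 * rho_kg_m3

    # Hypothetical tree: 45 cm DBH, 28 m tall, wood density 0.6 g/cm^3.
    print(f"ASE estimate: {agb_allometric(45.0, 28.0, 0.6):.0f} kg")
    print(f"QSM estimate: {agb_from_tls(3.0, 600.0):.0f} kg")
    ```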

  4. Cavallo's multiplier for in situ generation of high voltage

    NASA Astrophysics Data System (ADS)

    Clayton, S. M.; Ito, T. M.; Ramsey, J. C.; Wei, W.; Blatnik, M. A.; Filippone, B. W.; Seidel, G. M.

    2018-05-01

    A classic electrostatic induction machine, Cavallo's multiplier, is suggested for in situ production of very high voltage in cryogenic environments. The device is suitable for generating a large electrostatic field under conditions of very small load current. Operation of the Cavallo multiplier is analyzed, with quantitative description in terms of mutual capacitances between electrodes in the system. A demonstration apparatus was constructed, and measured voltages are compared to predictions based on measured capacitances in the system. The simplicity of the Cavallo multiplier makes it amenable to electrostatic analysis using finite element software, and electrode shapes can be optimized to take advantage of a high dielectric strength medium such as liquid helium. A design study is presented for a Cavallo multiplier in a large-scale, cryogenic experiment to measure the neutron electric dipole moment.
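
    A toy charge-transfer model shows why the output voltage of such an induction machine rises geometrically toward an asymptote: each cycle the carrier electrode delivers a fixed induced charge to the storage electrode, minus a back-induced term proportional to the accumulated voltage. All component values below are invented for illustration and are not from the demonstration apparatus.

    ```python
    def cavallo_voltage(cycles, q_induced=1e-9, c_store=10e-12, c_back=0.5e-12):
        """Voltage history of a toy Cavallo multiplier (assumed capacitances)."""
        v, history = 0.0, []
        for _ in range(cycles):
            v += (q_induced - c_back * v) / c_store   # net charge delivered this cycle
            history.append(v)
        return history

    v = cavallo_voltage(200)
    print(f"after 10 cycles: {v[9]:.0f} V; asymptote q/c_back = {1e-9 / 0.5e-12:.0f} V")
    ```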

  5. Convergence between biological, behavioural and genetic determinants of obesity.

    PubMed

    Ghosh, Sujoy; Bouchard, Claude

    2017-12-01

    Multiple biological, behavioural and genetic determinants or correlates of obesity have been identified to date. Genome-wide association studies (GWAS) have contributed to the identification of more than 100 obesity-associated genetic variants, but their roles in causal processes leading to obesity remain largely unknown. Most variants are likely to have tissue-specific regulatory roles through joint contributions to biological pathways and networks, through changes in gene expression that influence quantitative traits, or through the regulation of the epigenome. The recent availability of large-scale functional genomics resources provides an opportunity to re-examine obesity GWAS data to begin elucidating the function of genetic variants. Interrogation of knockout mouse phenotype resources provides a further avenue to test for evidence of convergence between genetic variation and biological or behavioural determinants of obesity.

  6. A Unified Theory of Impact Crises and Mass Extinctions: Quantitative Tests

    NASA Technical Reports Server (NTRS)

    Rampino, Michael R.; Haggerty, Bruce M.; Pagano, Thomas C.

    1997-01-01

    Several quantitative tests of a general hypothesis linking impacts of large asteroids and comets with mass extinctions of life are possible based on astronomical data, impact dynamics, and geological information. The waiting times of large-body impacts on the Earth, derived from the flux of Earth-crossing asteroids and comets and the estimated size of impacts capable of causing large-scale environmental disasters, predict that impacts of objects ≥ 5 km in diameter (≥ 10^7 Mt TNT equivalent) could be sufficient to explain the record of approximately 25 extinction pulses in the last 540 Myr, with the 5 recorded major mass extinctions related to impacts of the largest objects of ≥ 10 km in diameter (≥ 10^8 Mt events). Smaller impacts (approximately 10^6 Mt), with significant regional environmental effects, could be responsible for the lesser boundaries in the geologic record.
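
    The rate argument is easy to check on the back of an envelope. Assuming a cumulative power-law impactor flux N(>D) ∝ D^-2 normalized to one ≥ 10 km impact per ~100 Myr (both numbers are illustrative assumptions, not the paper's fit), the expected count of ≥ 5 km impacts over the Phanerozoic comes out near the ~25 extinction pulses cited above:

    ```python
    rate_10km = 1.0 / 100e6               # assumed: one >=10 km impact per 100 Myr
    slope = 2.0                           # assumed cumulative size-frequency slope
    rate_5km = rate_10km * (5.0 / 10.0) ** (-slope)   # N(>D) ~ D^-slope
    print(f"expected >=5 km impacts in 540 Myr: {rate_5km * 540e6:.0f}")   # ~22
    ```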

  7. Interdigital athlete's foot. The interaction of dermatophytes and resident bacteria.

    PubMed

    Leyden, J J; Kligman, A M

    1978-10-01

    Quantitative cultures in 140 cases of interdigital "athlete's foot" established the following clinical-microbiological correlations. In the mild, scaling, relatively asymptomatic variety (39 cases), fungi were recovered in 84% of cases. As the disease progressed to maceration, hyperkeratosis, and increased symptoms, recovery of fungi fell to 55% in moderately symptomatic and to 36% in severe cases. Symptomatic cases had increasing numbers of resident aerobic organisms, particularly large-colony diphtheroids. Experimental manipulations of the interspace microflora in volunteers, monitored with quantitative cultures, demonstrated that the symptomatic, macerated, hyperkeratotic process results from an overgrowth of resident organisms when the stratum corneum barrier is damaged by preexisting fungi, whereas overgrowth of the same organisms in normal, fungus-free interspaces does not produce lesions. These experiments support the conclusion that athlete's foot represents a continuum from a relatively asymptomatic, scaling eruption produced by fungi to a symptomatic, macerated, hyperkeratotic variety caused by an overgrowth of bacteria.

  8. Advances in imaging and quantification of electrical properties at the nanoscale using Scanning Microwave Impedance Microscopy (sMIM)

    NASA Astrophysics Data System (ADS)

    Friedman, Stuart; Stanke, Fred; Yang, Yongliang; Amster, Oskar

    Scanning Microwave Impedance Microscopy (sMIM) is an Atomic Force Microscopy (AFM) mode that enables imaging of unique contrast mechanisms and measurement of local permittivity and conductivity at the tens-of-nanometers length scale. sMIM has been applied to a variety of systems including nanotubes, nanowires, 2D materials, photovoltaics, and semiconductor devices. Early results were largely semi-quantitative. This talk will focus on techniques for extracting quantitative physical parameters such as permittivity, conductivity, doping concentrations, and thin-film properties from sMIM data. Particular attention will be paid to non-linear materials, where sMIM has been used to acquire nano-scale capacitance-voltage curves. These curves can be used to identify the dopant type (n vs. p) and doping level in doped semiconductors, both bulk samples and devices. Supported in part by DOE-SBIR DE-SC0009856.

  9. Quantum and classical ripples in graphene

    NASA Astrophysics Data System (ADS)

    Hašík, Juraj; Tosatti, Erio; Martoňák, Roman

    2018-04-01

    Thermal ripples of graphene are well understood at room temperature, but their quantum counterparts at low temperatures are in need of a realistic quantitative description. Here we present atomistic path-integral Monte Carlo simulations of freestanding graphene, which show upon cooling a striking classical-quantum evolution of height and angular fluctuations. The crossover takes place at ever-decreasing temperatures for ever-increasing wavelengths so that a completely quantum regime is never attained. Zero-temperature quantum graphene is flatter and smoother than classical graphene at large scales yet rougher at short scales. The angular fluctuation distribution of the normals can be quantitatively described by coexistence of two Gaussians, one classical strongly T -dependent and one quantum about 2° wide, of zero-point character. The quantum evolution of ripple-induced height and angular spread should be observable in electron diffraction in graphene and other two-dimensional materials, such as MoS2, bilayer graphene, boron nitride, etc.
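
    The two-Gaussian decomposition described above is straightforward to reproduce: draw synthetic normal-angle samples from a broad "classical" and a narrow "quantum" component, histogram them, and fit the mixture. The widths and weights below are invented stand-ins for actual simulation output.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def two_gauss(x, w, s_cl, s_q):
        g = lambda x, s: np.exp(-x**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
        return w * g(x, s_cl) + (1 - w) * g(x, s_q)

    rng = np.random.default_rng(0)
    angles = np.concatenate([rng.normal(0, 5.0, 40_000),    # classical, T-dependent part
                             rng.normal(0, 2.0, 60_000)])   # quantum ~2 deg zero-point part
    hist, edges = np.histogram(angles, bins=200, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    (w, s_cl, s_q), _ = curve_fit(two_gauss, centers, hist, p0=[0.5, 6.0, 1.5])
    print(f"weight={w:.2f}  sigma_classical={s_cl:.2f}  sigma_quantum={s_q:.2f} (degrees)")
    ```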

  10. Using occupancy estimation to assess the effectiveness of a regional multiple-species conservation plan: bats in the Pacific Northwest

    Treesearch

    Theodore Weller

    2008-01-01

    Regional conservation plans are increasingly used to plan for and protect biodiversity at large spatial scales; however, the means of quantitatively evaluating their effectiveness are rarely specified. Multiple-species approaches, particularly those that employ site-occupancy estimation, have been proposed as robust and efficient alternatives for assessing the status of...
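
    A minimal single-season occupancy likelihood shows the machinery such approaches rest on: each site contributes a term that mixes occupancy probability psi with detection probability p over repeated visits (binomial coefficients, constant per site, are dropped). The detection histories below are made up.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    histories = np.array([[1, 0, 1], [0, 0, 0], [1, 1, 0], [0, 0, 0], [0, 1, 0]])

    def neg_log_lik(params):
        psi, p = 1 / (1 + np.exp(-np.asarray(params)))   # logit -> probability
        d = histories.sum(axis=1)                        # detections per site
        k = histories.shape[1]                           # visits per site
        lik = psi * p**d * (1 - p)**(k - d)              # occupied-and-observed term
        lik = np.where(d == 0, lik + (1 - psi), lik)     # never-detected sites may be empty
        return -np.log(lik).sum()

    res = minimize(neg_log_lik, x0=[0.0, 0.0])
    psi, p = 1 / (1 + np.exp(-res.x))
    print(f"psi={psi:.2f}  p={p:.2f}")
    ```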

  11. Continental-scale simulation of burn probabilities, flame lengths, and fire size distribution for the United States

    Treesearch

    Mark A. Finney; Charles W. McHugh; Isaac Grenfell; Karin L. Riley

    2010-01-01

    Components of a quantitative risk assessment were produced by simulation of burn probabilities and fire behavior variation for 134 fire planning units (FPUs) across the continental U.S. The system uses fire growth simulation of ignitions modeled from relationships between large fire occurrence and the fire danger index Energy Release Component (ERC). Simulations of 10,...

  12. A Generational Divide in the Academic Profession: A Mixed Quantitative and Qualitative Approach to the Polish Case

    ERIC Educational Resources Information Center

    Kwiek, Marek

    2017-01-01

    In a recently changing Polish academic environment--following the large-scale higher education reforms of 2009-2012--different academic generations have to cope with different challenges. Polish academics have been strongly divided generationally, not only in terms of what they think and how they work but also in terms of what is academically…

  13. High-throughput method for the quantitation of metabolites and co-factors from homocysteine-methionine cycle for nutritional status assessment.

    PubMed

    Da Silva, Laeticia; Collino, Sebastiano; Cominetti, Ornella; Martin, Francois-Pierre; Montoliu, Ivan; Moreno, Sergio Oller; Corthesy, John; Kaput, Jim; Kussmann, Martin; Monteiro, Jacqueline Pontes; Guiraud, Seu Ping

    2016-09-01

    There is increasing interest in the profiling and quantitation of methionine-pathway metabolites for health management research. Currently, several analytical approaches are required to cover the metabolites and co-factors. We report the development and validation of a method for the simultaneous detection and quantitation of 13 metabolites in red blood cells. The method, validated in a cohort of healthy human volunteers, shows a high level of accuracy and reproducibility. The protocol provides robust coverage of central metabolites and co-factors in a single analysis and in a high-throughput fashion. In large-scale clinical settings, the use of such an approach will significantly advance the field of nutritional research in health and disease.

  14. Enablers and Barriers to Large-Scale Uptake of Improved Solid Fuel Stoves: A Systematic Review

    PubMed Central

    Puzzolo, Elisa; Stanistreet, Debbi; Pope, Daniel; Bruce, Nigel G.

    2013-01-01

    Background: Globally, 2.8 billion people rely on household solid fuels. Reducing the resulting adverse health, environmental, and development consequences will involve transitioning through a mix of clean fuels and improved solid fuel stoves (IS) of demonstrable effectiveness. To date, achieving uptake of IS has presented significant challenges. Objectives: We performed a systematic review of factors that enable or limit large-scale uptake of IS in low- and middle-income countries. Methods: We conducted systematic searches through multidisciplinary databases, specialist websites, and consulting experts. The review drew on qualitative, quantitative, and case studies and used standardized methods for screening, data extraction, critical appraisal, and synthesis. We summarized our findings as “factors” relating to one of seven domains—fuel and technology characteristics; household and setting characteristics; knowledge and perceptions; finance, tax, and subsidy aspects; market development; regulation, legislation, and standards; programmatic and policy mechanisms—and also recorded issues that impacted equity. Results: We identified 31 factors influencing uptake from 57 studies conducted in Asia, Africa, and Latin America. All domains matter. Although factors such as offering technologies that meet household needs and save fuel, user training and support, effective financing, and facilitative government action appear to be critical, none guarantee success: All factors can be influential, depending on context. The nature of available evidence did not permit further prioritization. Conclusions: Achieving adoption and sustained use of IS at a large scale requires that all factors, spanning household/community and program/societal levels, be assessed and supported by policy. We propose a planning tool that would aid this process and suggest further research to incorporate an evaluation of effectiveness. Citation: Rehfuess EA, Puzzolo E, Stanistreet D, Pope D, Bruce NG. 2014. Enablers and barriers to large-scale uptake of improved solid fuel stoves: a systematic review. Environ Health Perspect 122:120–130; http://dx.doi.org/10.1289/ehp.1306639 PMID:24300100

  15. HFSB-seeding for large-scale tomographic PIV in wind tunnels

    NASA Astrophysics Data System (ADS)

    Caridi, Giuseppe Carlo Alp; Ragni, Daniele; Sciacchitano, Andrea; Scarano, Fulvio

    2016-12-01

    A new system for large-scale tomographic particle image velocimetry in low-speed wind tunnels is presented. The system relies upon the use of sub-millimetre helium-filled soap bubbles as flow tracers, which scatter light with intensity several orders of magnitude higher than micron-sized droplets. With respect to a single bubble generator, the system increases the rate of bubble emission by means of transient accumulation and rapid release. The governing parameters of the system are identified and discussed, namely the bubble production rate, the accumulation and release times, and the size of the bubble injector and its location with respect to the wind tunnel contraction. The relations between these parameters, the resulting spatial concentration of tracers, and the measurement dynamic spatial range are obtained and discussed. Large-scale experiments are carried out in a large low-speed wind tunnel with a 2.85 × 2.85 m2 test section, where a vertical-axis wind turbine of 1 m diameter is operated. Time-resolved tomographic PIV measurements are taken over a measurement volume of 40 × 20 × 15 cm3, allowing quantitative analysis of the tip-vortex structure and its dynamical evolution.

  16. Quantitative phosphoproteomic analysis of early seed development in rice (Oryza sativa L.).

    PubMed

    Qiu, Jiehua; Hou, Yuxuan; Tong, Xiaohong; Wang, Yifeng; Lin, Haiyan; Liu, Qing; Zhang, Wen; Li, Zhiyong; Nallamilli, Babi R; Zhang, Jian

    2016-02-01

    Rice (Oryza sativa L.) seed serves as a major food source for over half of the global population. Though it has long been recognized that phosphorylation plays an essential role in rice seed development, the phosphorylation events and dynamics in this process remain largely unknown. Here, we report the first large-scale identification of rice seed phosphoproteins and phosphosites using a quantitative phosphoproteomic approach. Thorough proteomic studies of pistils and of seeds at 3 and 7 days after pollination resulted in the identification of 3885, 4313 and 4135 phosphopeptides, respectively. A total of 2487 proteins were differentially phosphorylated among the three stages, including Kip related protein 1, Rice basic leucine zipper factor 1, Rice prolamin box binding factor, and numerous other master regulators of rice seed development. Moreover, differentially phosphorylated proteins may be extensively involved in the biosynthesis and signaling pathways of phytohormones such as auxin, gibberellin, abscisic acid and brassinosteroid. Our results strongly indicate that protein phosphorylation is a key mechanism regulating cell proliferation and enlargement, phytohormone biosynthesis and signaling, grain filling, and grain quality during rice seed development. Overall, this study enhances our understanding of the rice phosphoproteome and sheds new light on the regulatory mechanisms of rice seed development.

  17. Application of a Machine Learning Classification Algorithm and Buffer Analysis in Tourism Regional Planning

    NASA Astrophysics Data System (ADS)

    Zhang, T. H.; Ji, H. W.; Hu, Y.; Ye, Q.; Lin, Y.

    2018-04-01

    Remote Sensing (RS) and Geography Information System (GIS) technologies are widely used in ecological analysis and regional planning. With the advantages of large-scale monitoring, combination of point and area, multiple time phases and repeated observation, they are suitable for monitoring and analyzing environmental information over a large range. In this study, a support vector machine (SVM) classification algorithm is used to monitor land use and land cover change (LUCC) and then to quantitatively evaluate the ecology of the Chaohu Lake tourism area. The automatic classification and the quantitative spatial-temporal analysis of the Chaohu Lake basin are realized by analyzing multi-temporal and multispectral satellite images, DEM data and slope information. Furthermore, an ecological buffer zone analysis is also carried out to set the buffer width for each catchment area surrounding Chaohu Lake. LUCC monitoring from 1992 to 2015 shows clear effects of human activities. Since the Chaohu Lake basin is at a crucial stage of rapid urbanization, the application of RS and GIS techniques can effectively provide a scientific basis for land use planning, ecological management, environmental protection and tourism resources development in the Chaohu Lake basin.
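
    The classification step reduces to a standard supervised-learning pattern: one feature vector per pixel (spectral bands plus DEM-derived elevation and slope) and a land-cover label. The scikit-learn sketch below uses random stand-in data; the band count, class names and hyperparameters are assumptions, not the study's settings.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 8))        # 6 spectral bands + elevation + slope (assumed)
    y = rng.integers(0, 5, size=2000)     # e.g. water/urban/forest/cropland/bare

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
    print(f"overall accuracy: {clf.score(X_te, y_te):.2f}")   # ~chance on random data
    ```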

  18. Microwave Remote Sensing and the Cold Land Processes Field Experiment

    NASA Technical Reports Server (NTRS)

    Kim, Edward J.; Cline, Don; Davis, Bert; Hildebrand, Peter H. (Technical Monitor)

    2001-01-01

    The Cold Land Processes Field Experiment (CLPX) has been designed to advance our understanding of the terrestrial cryosphere. Developing a more complete understanding of fluxes, storage, and transformations of water and energy in cold land areas is a critical focus of the NASA Earth Science Enterprise Research Strategy, the NASA Global Water and Energy Cycle (GWEC) Initiative, the Global Energy and Water Cycle Experiment (GEWEX), and the GEWEX Americas Prediction Project (GAPP). The movement of water and energy through cold regions in turn plays a large role in ecological activity and biogeochemical cycles. Quantitative understanding of cold land processes over large areas will require synergistic advancements in 1) understanding how cold land processes, most comprehensively understood at local or hillslope scales, extend to larger scales, 2) improved representation of cold land processes in coupled and uncoupled land-surface models, and 3) a breakthrough in large-scale observation of hydrologic properties, including snow characteristics, soil moisture, the extent of frozen soils, and the transition between frozen and thawed soil conditions. The CLPX plan has been developed through the efforts of over 60 interested scientists who have participated in the NASA Cold Land Processes Working Group (CLPWG). This group is charged with assessing, planning, and implementing the background science, technology, and application infrastructure required to support successful land-surface hydrology remote sensing space missions. A major product of the experiment will be a comprehensive legacy data set that will energize many aspects of cold land processes research. The CLPX will focus on developing the quantitative understanding, models, and measurements necessary to extend our local-scale understanding of water fluxes, storage, and transformations to regional and global scales. The experiment will particularly emphasize developing a strong synergism between process-oriented understanding, land-surface models, and microwave remote sensing. The experimental design is a multi-sensor, multi-scale (1 ha to 160,000 km2) approach to providing the comprehensive data set necessary to address the experiment objectives. A description focusing on the microwave remote sensing components (ground, airborne, and spaceborne) of the experiment will be presented.

  19. Past and present cosmic structure in the SDSS DR7 main sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jasche, J.; Leclercq, F.; Wandelt, B.D., E-mail: jasche@iap.fr, E-mail: florent.leclercq@polytechnique.org, E-mail: wandelt@iap.fr

    2015-01-01

    We present a chrono-cosmography project, aiming at the inference of the four dimensional formation history of the observed large scale structure from its origin to the present epoch. To do so, we perform a full-scale Bayesian analysis of the northern galactic cap of the Sloan Digital Sky Survey (SDSS) Data Release 7 main galaxy sample, relying on a fully probabilistic, physical model of the non-linearly evolved density field. Besides inferring initial conditions from observations, our methodology naturally and accurately reconstructs non-linear features at the present epoch, such as walls and filaments, corresponding to high-order correlation functions generated by late-time structure formation. Our inference framework self-consistently accounts for typical observational systematic and statistical uncertainties such as noise, survey geometry and selection effects. We further account for luminosity dependent galaxy biases and automatic noise calibration within a fully Bayesian approach. As a result, this analysis provides highly-detailed and accurate reconstructions of the present density field on scales larger than ∼ 3 Mpc/h, constrained by SDSS observations. This approach also leads to the first quantitative inference of plausible formation histories of the dynamic large scale structure underlying the observed galaxy distribution. The results described in this work constitute the first full Bayesian non-linear analysis of the cosmic large scale structure with the demonstrated capability of uncertainty quantification. Some of these results will be made publicly available along with this work. The level of detail of inferred results and the high degree of control on observational uncertainties pave the path towards high precision chrono-cosmography, the subject of simultaneously studying the dynamics and the morphology of the inhomogeneous Universe.

  20. Comparison of the Single Molecule Dynamics of Linear and Circular DNAs in Planar Extensional Flows

    NASA Astrophysics Data System (ADS)

    Li, Yanfei; Hsiao, Kai-Wen; Brockman, Christopher; Yates, Daniel; McKenna, Gregory; Schroeder, Charles; San Francisco, Michael; Kornfield, Julie; Anderson, Rae

    2015-03-01

    Chain topology has a profound impact on the flow behaviors of single macromolecules. The absence of free ends separates circular polymers from other chain architectures, i.e., linear, star, and branched. In the present work, we study the single chain dynamics of large circular and linear DNA molecules by comparing the relaxation dynamics, steady state coil-stretch transition, and transient molecular individualism behaviors for the two types of macromolecules. To this end, large circular DNA molecules were biologically synthesized and studied in a microfluidic device that has a cross-slot geometry to develop a stagnation point extensional flow. Although the relaxation time of rings scales in the same way as for the linear analog, the circular polymers show quantitatively different behaviors in the steady state extension and qualitatively different behaviors during a transient stretch. The existence of some commonality between these two topologies is proposed. Texas Tech University John R. Bradford Endowment.

  1. Identifying Preserved Storm Events on Beaches from Trenches and Cores

    NASA Astrophysics Data System (ADS)

    Wadman, H. M.; Gallagher, E. L.; McNinch, J.; Reniers, A.; Koktas, M.

    2014-12-01

    Recent research suggests that even small-scale variations in grain size in the shallow stratigraphy of sandy beaches can significantly influence large-scale morphology change. However, few quantitative studies of variations in shallow stratigraphic layers, as differentiated by variations in mean grain size, have been conducted, in no small part due to the difficulty of collecting undisturbed sediment cores in the energetic lower beach and swash zone. Because of this lack of quantitative stratigraphic grain-size data, most coastal morphology models assume that uniform grain sizes dominate sandy beaches, allowing for little to no temporal or spatial variation in grain-size heterogeneity. In a first-order attempt to quantify small-scale temporal and spatial variations in beach stratigraphy, thirty-five vibracores were collected at the USACE Field Research Facility (FRF), Duck, NC, in March-April 2014 using the FRF's Coastal Research and Amphibious Buggy (CRAB). Vibracores were collected at set locations along a cross-shore profile from the toe of the dune to a water depth of ~1 m in the surf zone. Vibracores were repeatedly collected from the same locations throughout a tidal cycle, as well as before and after a nor'easter event. In addition, two ~1.5-m-deep trenches (each ~14 m in length) were dug in the cross-shore and along-shore directions after coring was completed, to allow better interpretation of the stratigraphic sequences observed in the vibracores. The elevations of coherent stratigraphic layers, as revealed in vibracore-based fence diagrams and trench data, are used to relate specific observed stratigraphic sequences to individual storm events observed at the FRF. These data provide a first-order quantitative examination of the small-scale temporal and spatial variability of shallow grain size along an open, sandy coastline. The data will be used to refine morphological model predictions to include variations in grain size and the associated shallow stratigraphy.

  2. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
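
    The digital-trait extraction step can be illustrated in a few lines: segment plant pixels from an image and reduce the mask to scalar architecture traits such as projected area, height and width. A random array stands in for a real image here and the threshold is arbitrary; IH's actual pipeline is considerably richer.

    ```python
    import numpy as np

    img = np.random.default_rng(3).random((240, 320))   # stand-in for a grayscale image
    mask = img > 0.98                                   # "plant" pixels (arbitrary threshold)
    rows, cols = np.nonzero(mask)
    traits = {
        "area_px": int(mask.sum()),
        "height_px": int(rows.max() - rows.min()) if rows.size else 0,
        "width_px": int(cols.max() - cols.min()) if cols.size else 0,
    }
    print(traits)
    ```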

  3. Large-Scale SNP Discovery and Genotyping for Constructing a High-Density Genetic Map of Tea Plant Using Specific-Locus Amplified Fragment Sequencing (SLAF-seq)

    PubMed Central

    Ma, Chun-Lei; Jin, Ji-Qiang; Li, Chun-Fang; Wang, Rong-Kai; Zheng, Hong-Kun; Yao, Ming-Zhe; Chen, Liang

    2015-01-01

    Genetic maps are important tools in plant genomics and breeding. The present study reports the large-scale discovery of single nucleotide polymorphisms (SNPs) for genetic map construction in tea plant. We developed a total of 6,042 valid SNP markers using specific-locus amplified fragment sequencing (SLAF-seq), and subsequently mapped them into the previous framework map. The final map contained 6,448 molecular markers, distributing on fifteen linkage groups corresponding to the number of tea plant chromosomes. The total map length was 3,965 cM, with an average inter-locus distance of 1.0 cM. This map is the first SNP-based reference map of tea plant, as well as the most saturated one developed to date. The SNP markers and map resources generated in this study provide a wealth of genetic information that can serve as a foundation for downstream genetic analyses, such as the fine mapping of quantitative trait loci (QTL), map-based cloning, marker-assisted selection, and anchoring of scaffolds to facilitate the process of whole genome sequencing projects for tea plant. PMID:26035838

  4. Statistics of velocity fluctuations of Geldart A particles in a circulating fluidized bed riser

    DOE PAGES

    Vaidheeswaran, Avinash; Shaffer, Franklin; Gopalan, Balaji

    2017-11-21

    Here, the statistics of fluctuating velocity components are studied in the riser of a closed-loop circulating fluidized bed with fluid catalytic cracking catalyst particles. Our analysis shows distinct similarities as well as deviations compared to existing theories and bench-scale experiments. The study confirms anisotropic and non-Maxwellian distributions of the fluctuating velocity components. The velocity distribution functions (VDFs) corresponding to transverse fluctuations exhibit symmetry and follow a stretched-exponential behavior up to three standard deviations. The form of the transverse VDF is largely determined by interparticle interactions. The tails become more overpopulated with an increase in particle loading. The observed deviations from the Gaussian distribution are represented using the leading-order term in the Sonine expansion, which is commonly used to approximate VDFs in the kinetic theory of granular flows. The vertical fluctuating VDFs are asymmetric, and their skewness shifts as the wall is approached. In comparison to the transverse fluctuations, the vertical VDF is determined by the local hydrodynamics. This constitutes an observation of particle velocity fluctuations in a large-scale system and a quantitative comparison with Maxwell-Boltzmann statistics.
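
    The leading-order Sonine correction mentioned above has a simple closed form: the Maxwellian is multiplied by 1 + a2·S2(c²), where a2 measures the deviation and S2 is the second Sonine (associated Laguerre) polynomial. The sketch below uses the single-component (1d) form of S2; polynomial conventions vary across the granular-flow literature, so treat the exact coefficients as an assumption.

    ```python
    import numpy as np

    def sonine_vdf(c, a2):
        """Maxwellian times leading-order Sonine correction; c in thermal units."""
        phi = np.exp(-c**2 / 2) / np.sqrt(2 * np.pi)      # Maxwell-Boltzmann baseline
        s2 = c**4 / 2 - 3 * c**2 / 2 + 3 / 8              # 2nd Sonine polynomial (1d form)
        return phi * (1 + a2 * s2)

    c = np.linspace(-4, 4, 9)
    print(np.round(sonine_vdf(c, a2=0.1), 4))   # a2 > 0 overpopulates the tails
    ```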

  5. Molecular Imaging of Kerogen and Minerals in Shale Rocks across Micro- and Nano- Scales

    NASA Astrophysics Data System (ADS)

    Hao, Z.; Bechtel, H.; Sannibale, F.; Kneafsey, T. J.; Gilbert, B.; Nico, P. S.

    2016-12-01

    Fourier transform infrared (FTIR) spectroscopy is a reliable and non-destructive quantitative method for evaluating the mineralogy and kerogen content/maturity of shale rocks, although it is traditionally difficult to assess organic and mineralogical heterogeneity at micrometer and nanometer scales due to the diffraction limit of infrared light. However, it is precisely at these scales that the kerogen and mineral content of shale rocks and their formation determine the quality of a shale gas reserve, the gas flow mechanisms and the gas production. It is therefore necessary to develop new approaches that can image across both micro- and nano-scales. In this presentation, we describe two new molecular imaging approaches that obtain kerogen and mineral information in shale rocks at unprecedentedly high spatial resolution, and a cross-scale quantitative multivariate analysis method that provides rapid geochemical characterization of large samples. The two imaging approaches are enhanced in the near field by a Ge hemisphere (GE) and by a metallic scanning probe (SINS), respectively. The GE method is a modified microscopic attenuated total reflectance (ATR) method which rapidly captures a chemical image of the shale rock surface at 1 to 5 micrometer resolution with a large field of view of 600 × 600 micrometers, while SINS probes the surface at 20 nm resolution, providing a chemically "deconvoluted" map at the nano-pore level. The detailed geochemical distribution at the nanoscale is then used to build a machine learning model that generates a self-calibrated chemical distribution map at the micrometer scale from the GE images. A number of geochemical contents across these two important scales are observed and analyzed, including minerals (oxides, carbonates, sulphides), organics (carbohydrates, aromatics), and adsorbed gases. These approaches are self-calibrated, optics friendly and non-destructive, so they hold the potential to monitor shale gas flow in real time inside the micro- or nano-pore network, which is of great interest for optimizing shale gas extraction.

  6. Global-Mindedness and Intercultural Competence: A Quantitative Study of Pre-Service Teachers

    ERIC Educational Resources Information Center

    Cui, Qi

    2013-01-01

    This study assessed pre-service teachers' levels of global-mindedness and intercultural competence using the Global-Mindedness Scale (GMS) and the Cultural Intelligence Scale (CQS) and investigated the correlation between the two. The study examined whether the individual scale factors such as gender, perceived competence in non-native language or…

  7. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  8. High-Throughput Analysis and Automation for Glycomics Studies.

    PubMed

    Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize the glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also expanding interest in the use of glycomics, for example in Genome-Wide Association Studies, to follow changes in the glycosylation patterns of biological tissues and fluids with the progress of certain diseases, including cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large-scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures, yet glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges of each of these technologies. The issues considered include the reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.

  9. Role of the ocean in climate changes

    NASA Technical Reports Server (NTRS)

    Gulev, Sergey K.

    1992-01-01

    The present program, aimed at the study of ocean climate change, was prepared by a group of scientists from the State Oceanographic Institute, the Academy of Science of Russia, the Academy of Science of Ukraine and Moscow State University. It is a natural evolution of ideas and achievements developed under national and international ocean research projects such as SECTIONS, WOCE, TOGA, JGOFS and others. Two primary goals are set in the ROCC program. (1) Quantitative description of the global interoceanic 'conveyor' and its role in the formation of large-scale anomalies in the North Atlantic. The objectives on the way to this goal are: to obtain reliable estimates of year-to-year variations of heat and water exchange between the Atlantic Ocean and the atmosphere; to establish and understand the physics of long-period variations in meridional heat and fresh water transport (MHT and MFWT) in the Atlantic Ocean; to analyze the general mechanisms that form the MHT and MFWT in low latitudes (Ekman flux), middle latitudes (western boundary currents) and high latitudes (deep convection) of the North Atlantic; and to establish and quantitatively describe how global changes are realized in SST, surface salinity, sea level and sea ice data. (2) Development of an observational system aimed at tracing climate changes in the North Atlantic. This goal merges the following objectives: to find the proper sites that form the interannual variations of MHT; to study the deep circulation at the 'key' points; to develop circulation models reflecting the principal features of interoceanic circulation; and to define the global and local response of the atmospheric circulation to large-scale processes in the Atlantic Ocean.

  10. Spatial patterns of schistosomiasis in Burkina Faso: relevance of human mobility and water resources development

    NASA Astrophysics Data System (ADS)

    Perez-Saez, Javier; Bertuzzo, Enrico; Frohelich, Jean-Marc; Mande, Theophile; Ceperley, Natalie; Sou, Mariam; Yacouba, Hamma; Maiga, Hamadou; Sokolow, Susanne; De Leo, Giulio; Casagrandi, Renato; Gatto, Marino; Mari, Lorenzo; Rinaldo, Andrea

    2015-04-01

    We study the spatial geography of schistosomiasis in the African context of Burkina Faso by means of a spatially explicit model of disease dynamics and spread. The relevance of our work lies in its ability to describe quantitatively a geographic stratification of the disease burden capable of reproducing important spatial differences, and in identifying the drivers and controls of disease spread. Among the latter, we consider specifically the development and management of water resources, which have been singled out empirically as an important risk factor for schistosomiasis. The model includes remotely acquired and objectively manipulated information on the distributions of population, infrastructure, elevation and climatic drivers. It also includes a general description of human mobility and a first-order characterization of the ecology of the intermediate host of the parasite causing the disease, based on maximum entropy learning of relevant environmental covariates. Spatial patterns of the disease were analyzed about the disease-free equilibrium by extracting and mapping suitable eigenvectors of the Jacobian matrix subsuming the stability properties of the system. Human mobility was found to be a primary control of both pathogen invasion success and the overall distribution of disease burden. The effects of water resources development were studied by accounting for the (prior and posterior) average distances of human settlements from water bodies that may serve as suitable habitats for the intermediate host of the parasite. Water developments, in combination with human mobility, were quantitatively related to disease spread into regions previously nearly disease-free and to large-scale empirical incidence patterns. We conclude that, while the model still needs refinements based on field and epidemiological evidence, the framework proposed provides a powerful tool for large-scale, long-term public health planning and management of schistosomiasis.
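
    The eigenvector analysis described above boils down to linear algebra on the Jacobian at the disease-free equilibrium: a positive dominant eigenvalue signals pathogen invasion, and the matching eigenvector gives the spatial pattern of the emerging burden. The toy matrix below couples invented local growth rates through a random mobility matrix; it only mimics the structure of the real model.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 6                                                # number of communities
    mobility = rng.random((n, n))
    np.fill_diagonal(mobility, 0)
    mobility /= mobility.sum(axis=1, keepdims=True)      # row-stochastic coupling
    local_growth = rng.uniform(-0.5, 0.3, n)             # local transmission minus recovery
    J = np.diag(local_growth) + 0.2 * mobility           # toy Jacobian at the DFE

    eigvals, eigvecs = np.linalg.eig(J)
    k = np.argmax(eigvals.real)
    print("invasion possible:", bool(eigvals.real[k] > 0))
    print("burden pattern:", np.round(np.abs(eigvecs[:, k]), 2))
    ```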

  11. Quantitative evaluation of first, second, and third generation hairpin systems reveals the limit of mammalian vector-based RNAi

    PubMed Central

    Watanabe, Colin; Cuellar, Trinna L.; Haley, Benjamin

    2016-01-01

    ABSTRACT Incorporating miRNA-like features into vector-based hairpin scaffolds has been shown to augment small RNA processing and RNAi efficiency. Therefore, defining an optimal, native hairpin context may obviate a need for hairpin-specific targeting design schemes, which confound the movement of functional siRNAs into shRNA/artificial miRNA backbones, or large-scale screens to identify efficacious sequences. Thus, we used quantitative cell-based assays to compare separate third generation artificial miRNA systems, miR-E (based on miR-30a) and miR-3G (based on miR-16-2 and first described in this study) to widely-adopted, first and second generation formats in both Pol-II and Pol-III expression vector contexts. Despite their unique structures and strandedness, and in contrast to first and second-generation RNAi triggers, the third generation formats operated with remarkable similarity to one another, and strong silencing was observed with a significant fraction of the evaluated target sequences within either promoter context. By pairing an established siRNA design algorithm with the third generation vectors we could readily identify targeting sequences that matched or exceeded the potency of those discovered through large-scale sensor-based assays. We find that third generation hairpin systems enable the maximal level of siRNA function, likely through enhanced processing and accumulation of precisely-defined guide RNAs. Therefore, we predict future gains in RNAi potency will come from improved hairpin expression and identification of optimal siRNA-intrinsic silencing properties rather than further modification of these scaffolds. Consequently, third generation systems should be the primary format for vector-based RNAi studies; miR-3G is advantageous due to its small expression cassette and simplified, cost-efficient cloning scheme. PMID:26786363

  12. Genetic Diversity of Globally Dispersed Lacustrine Group I Haptophytes: Implications for Quantitative Temperature Reconstructions

    NASA Astrophysics Data System (ADS)

    Richter, N.; Longo, W. M.; Amaral-Zettler, L. A.; Huang, Y.

    2017-12-01

    There are significant uncertainties surrounding the forcings that drive terrestrial temperature changes on local and regional scales. Quantitative temperature reconstructions from terrestrial sites, such as lakes, help to unravel the fundamental processes that drive changes in temperature on different temporal and spatial scales. Recent studies at Brown University show that distinct alkenones, long chain ketones produced by haptophytes, are found in many freshwater, alkaline lakes in the Northern Hemisphere, highlighting these systems as targets for quantitative continental temperature reconstructions. These freshwater alkenones are produced by the Group I haptophyte phylotype and are characterized by a distinct signature: the presence of isomeric tri-unsaturated ketones and absence of alkenoates. There are currently no cultured representatives of the "Group I" haptophytes, hence they are only known based on their rRNA gene signatures. Here we present robust evidence that Northern Hemispheric freshwater, alkaline lakes with the characteristic "Group I" alkenone signature all host the same clade of Isochrysidales haptophytes. We employed next generation DNA amplicon sequencing to target haptophyte specific hypervariable regions of the large and small-subunit ribosomal RNA gene from 13 different lakes from three continents (i.e., North America, Europe, and Asia). Combined with previously published sequences, our genetic data show that the Group I haptophyte is genetically diverse on a regional and global scale, and even within the same lake. We present two case studies from a suite of five lakes in Alaska and three in Iceland to assess the impact of various environmental factors affecting Group I diversity and alkenone production. Despite the genetic diversity in this group, the overall ketone signature is conserved. Based on global surface sediment samples and in situ Alaskan lake calibrations, alkenones produced by different operational taxonomic units of the Group I haptophytes appear to display consistent responses to temperature changes, and thus can be used for paleotemperature reconstructions with a universally defined calibration.
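
    The final calibration step works like any proxy inversion: compute an unsaturation index from the relative abundances of the C37 alkenones and invert a linear index-temperature relation. The index shown below is the common lacustrine form that includes the tetra-unsaturated ketone; the slope and intercept are invented placeholders, not the Alaskan calibration.

    ```python
    def u37k(c37_2, c37_3, c37_4):
        """Lacustrine-style alkenone unsaturation index."""
        return (c37_2 - c37_4) / (c37_2 + c37_3 + c37_4)

    def temperature_c(u, slope=0.03, intercept=-0.3):
        """Invert an assumed linear calibration U = slope * T + intercept."""
        return (u - intercept) / slope

    u = u37k(c37_2=40.0, c37_3=45.0, c37_4=15.0)   # relative abundances (made up)
    print(f"U37k = {u:.3f}  ->  T ~ {temperature_c(u):.1f} C")
    ```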

  13. Sedimentary Framework of an Inner Continental Shelf Sand-Ridge System, West-Central Florida

    NASA Astrophysics Data System (ADS)

    Locker, S. D.; Hine, A. C.; Wright, A. K.; Duncan, D. S.

    2002-12-01

    The west-central Florida inner continental shelf is a dynamic environment subject to current flows on a variety of temporal and spatial scales. A site survey program, undertaken in support of the Office of Naval Research's Mine Burial Prediction program, is focused on the sedimentary framework and sediment accumulation patterns in 10-18 m water depth. Our specific goals are to image the shallow subsurface and to monitor changes in bedform distribution patterns that coincide with physical-processes studies ongoing in the area. Methods of study include side-scan sonar imaging, boomer and chirp subbottom profiling, and sedimentary facies analysis using surface sediment sampling and vibracoring. A well-defined sand-ridge system was imaged, trending obliquely to the west-Florida coastline. The side-scan data clearly show extensive three-dimensional structure within these large-scale NW-SE trending sedimentary bedforms. The sand ridges are commonly about 1 km wide and 4-8 km long. The characteristics of these ridges are distinctly different from those of the sand ridges in <8 m of water that we have previously studied. Ridges in the offshore area tend to be thicker, have a flatter morphology, and exhibit fewer smaller-scale sand waves. Sand-ridge thickness ranges from 2 to 3 meters, and the ridges typically consist of a fining-upward medium-to-fine quartz sand facies with occasional centimeter-scale coarser-grained carbonate-rich intervals. Time-series investigations tracking the shift in position of the sand-ridge margins have found no detectable net annual movement. However, significant resuspension and bedform development accompany high-energy events such as winter cold-front passages. Thus the large-scale bedforms (sand ridges) are in a state of dynamic equilibrium with the average annual hydrodynamic regime. Repeated field surveys will focus on monitoring small-scale sedimentological and stratal framework changes that will be integrated with the quantitative process studies.

  14. Policy Guidance From a Multi-scale Suite of Natural Field and Digital Laboratories of Change: Hydrological Catchment Studies of Nutrient and Pollutant Source Releases, Waterborne Transport-Transformations and Mass Flows in Water Ecosystems

    NASA Astrophysics Data System (ADS)

    Destouni, G.

    2008-12-01

    Continental fresh water transports and loads excess nutrients and pollutants from various land-surface sources, through the landscape, into downstream inland and coastal water environments. Our ability to understand, predict and control the eutrophication and pollution pressures on inland, coastal and marine water ecosystems relies on our ability to quantify these mass flows. This paper synthesizes a series of hydro-biogeochemical studies of nutrient and pollutant sources, transport-transformations and mass flows in catchment areas across a range of scales, from continental, through regional and national, to individual drainage basin scales. Main findings on continental scales include correlations between country/catchment area, population and GDP and the associated pollutant and nutrient loading, which differ significantly between world regions with different development levels. On regional scales, important systematic near-coastal gaps are identified in the national monitoring of nutrient and pollutant loads from land to the sea. Combining the characteristics of the unmonitored near-coastal areas with the relevant regional correlations of nutrient and pollutant loads with these characteristics shows that the unmonitored nutrient and pollutant mass loads to the sea may often be as large as, or greater than, the monitored river loads. Process studies on individual basin scales show long-term nutrient and pollutant memories in the soil-groundwater systems of the basins, which may continue to sustain large mass loading to inland and coastal waters long after mitigation of the sources. Linked hydro-biogeochemical-economic model studies finally demonstrate significant comparative advantages of policies that demand explicit quantitative accounting of the uncertainties implied by these monitoring gaps, long-term nutrient-pollution memories and time lags, and other knowledge, data and model limitations, instead of the now common neglect or subjective implicit handling of such uncertainties in strategies and practices for combating water pollution and eutrophication.

  15. Thermal runaway of metal nano-tips during intense electron emission

    NASA Astrophysics Data System (ADS)

    Kyritsakis, A.; Veske, M.; Eimre, K.; Zadin, V.; Djurabekova, F.

    2018-06-01

    When an electron-emitting tip is subjected to very high electric fields, plasma forms even under ultra-high-vacuum conditions. This phenomenon, known as a vacuum arc, causes catastrophic surface modifications and constitutes a major limiting factor not only for modern electron sources, but also for many large-scale applications such as particle accelerators and fusion reactors. Although vacuum arcs have been studied thoroughly, the physical mechanisms that lead from intense electron emission to plasma ignition are still unclear. In this article, we give insights into the atomic-scale processes taking place in metal nanotips under intense field emission conditions. We use multi-scale atomistic simulations that concurrently include field-induced forces, electron emission with finite-size and space-charge effects, and Nottingham and Joule heating. We find that when a sufficiently high electric field is applied to the tip, the emission-generated heat partially melts it and the field-induced force elongates and sharpens it. This initiates a positive-feedback thermal runaway process, which eventually causes evaporation of large fractions of the tip. The reported mechanism can explain the origin of the neutral atoms necessary to initiate plasma, a missing key process required to explain the ignition of a vacuum arc. Our simulations provide a quantitative description of the conditions leading to runaway, which shall be valuable for both field emission applications and vacuum arc studies.

  16. Patterns of Metabolite Changes Identified from Large-Scale Gene Perturbations in Arabidopsis Using a Genome-Scale Metabolic Network

    PubMed Central

    Kim, Taehyong; Dreher, Kate; Nilo-Poyanco, Ricardo; Lee, Insuk; Fiehn, Oliver; Lange, Bernd Markus; Nikolau, Basil J.; Sumner, Lloyd; Welti, Ruth; Wurtele, Eve S.; Rhee, Seung Y.

    2015-01-01

    Metabolomics enables quantitative evaluation of metabolic changes caused by genetic or environmental perturbations. However, little is known about how perturbing a single gene changes the metabolic system as a whole and which network and functional properties are involved in this response. To answer this question, we investigated the metabolite profiles from 136 mutants with single gene perturbations of functionally diverse Arabidopsis (Arabidopsis thaliana) genes. Fewer than 10 metabolites were changed significantly relative to the wild type in most of the mutants, indicating that the metabolic network was robust to perturbations of single metabolic genes. These changed metabolites were closer to each other in a genome-scale metabolic network than expected by chance, supporting the notion that the genetic perturbations changed the network more locally than globally. Surprisingly, the changed metabolites were close to the perturbed reactions in only 30% of the mutants of the well-characterized genes. To determine the factors that contributed to the distance between the observed metabolic changes and the perturbation site in the network, we examined nine network and functional properties of the perturbed genes. Only the isozyme number affected the distance between the perturbed reactions and changed metabolites. This study revealed patterns of metabolic changes from large-scale gene perturbations and relationships between characteristics of the perturbed genes and metabolic changes. PMID:25670818
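
    Measuring the "closeness" of changed metabolites to the perturbed reaction is a shortest-path computation on a bipartite reaction-metabolite graph. The networkx sketch below uses an invented three-reaction toy network purely to show the distance metric.

    ```python
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("R1", "glucose"), ("R1", "g6p"),     # reaction-metabolite edges
        ("R2", "g6p"), ("R2", "f6p"),
        ("R3", "f6p"), ("R3", "pyruvate"),
    ])

    perturbed = "R1"
    changed = ["f6p", "pyruvate"]             # metabolites with significant changes
    dists = [nx.shortest_path_length(G, perturbed, m) for m in changed]
    print(f"mean network distance from perturbation: {sum(dists) / len(dists):.1f}")
    ```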

  17. Enablers and barriers to large-scale uptake of improved solid fuel stoves: a systematic review.

    PubMed

    Rehfuess, Eva A; Puzzolo, Elisa; Stanistreet, Debbi; Pope, Daniel; Bruce, Nigel G

    2014-02-01

    Globally, 2.8 billion people rely on household solid fuels. Reducing the resulting adverse health, environmental, and development consequences will involve transitioning through a mix of clean fuels and improved solid fuel stoves (IS) of demonstrable effectiveness. To date, achieving uptake of IS has presented significant challenges. We performed a systematic review of factors that enable or limit large-scale uptake of IS in low- and middle-income countries. We conducted systematic searches of multidisciplinary databases and specialist websites and consulted experts. The review drew on qualitative, quantitative, and case studies and used standardized methods for screening, data extraction, critical appraisal, and synthesis. We summarized our findings as "factors" relating to one of seven domains: fuel and technology characteristics; household and setting characteristics; knowledge and perceptions; finance, tax, and subsidy aspects; market development; regulation, legislation, and standards; and programmatic and policy mechanisms. We also recorded issues that impacted equity. We identified 31 factors influencing uptake from 57 studies conducted in Asia, Africa, and Latin America. All domains matter. Although factors such as offering technologies that meet household needs and save fuel, user training and support, effective financing, and facilitative government action appear to be critical, none guarantee success: all factors can be influential, depending on context. The nature of the available evidence did not permit further prioritization. Achieving adoption and sustained use of IS at a large scale requires that all factors, spanning household/community and program/societal levels, be assessed and supported by policy. We propose a planning tool that would aid this process and suggest further research to incorporate an evaluation of effectiveness.

  18. European large-scale farmland investments and the land-water-energy-food nexus

    NASA Astrophysics Data System (ADS)

    Siciliano, Giuseppina; Rulli, Maria Cristina; D'Odorico, Paolo

    2017-12-01

    The escalating human demand for food, water, energy, fibres and minerals has resulted in increasing commercial pressures on land and water resources, which are partly reflected by the recent increase in transnational land investments. Studies have shown that many of the land-water issues associated with land acquisitions are directly related to the areas of energy and food production. This paper explores the land-water-energy-food nexus in relation to large-scale farmland investments pursued by investors from European countries. The analysis is based on a "resource assessment approach" which evaluates the linkages between land acquisitions for agricultural (including both energy and food production) and forestry purposes, and the availability of land and water in the target countries. To that end, the water appropriated by agricultural and forestry production is quantitatively assessed and its impact on water resource availability is analysed. The analysis is meant to provide useful information to investors from EU countries and to policy makers on aspects of resource acquisition, scarcity, and access, to promote responsible land investments in the target countries.

  19. Evaluation of effectiveness of various devices for attenuation of trailing vortices based on model tests in a large towing basin

    NASA Technical Reports Server (NTRS)

    Kirkman, K. L.; Brown, C. E.; Goodman, A.

    1973-01-01

    The effectiveness of various candidate aircraft-wing devices for attenuation of trailing vortices generated by large aircraft is evaluated on the basis of experiments conducted with a 0.03-scale model of a Boeing 747 transport aircraft, using a technique developed at the HYDRONAUTICS Ship Model Basin. Emphasis is on the effects produced by these devices in the far field (up to 8 kilometers downstream of the full-scale generating aircraft), where the unaltered vortex wakes could still be hazardous to small following aircraft. The evaluation is based primarily on quantitative measurements of the respective vortex velocity distributions made by means of hot-film probe traverses in a transverse plane at selected stations downstream. The effects of these altered wakes on the rolling moment induced on a small following aircraft are also studied using a modified lifting-surface theory, with a synthesized Gates Learjet as a typical example. Lift and drag measurements concurrently obtained in the model tests are used to appraise the effects of each device investigated on the performance characteristics of the generating aircraft.

  20. Fiber networks amplify active stress

    NASA Astrophysics Data System (ADS)

    Lenz, Martin; Ronceray, Pierre; Broedersz, Chase

    Large-scale force generation is essential for biological functions such as cell motility, embryonic development, and muscle contraction. In these processes, forces generated at the molecular level by motor proteins are transmitted by disordered fiber networks, resulting in large-scale active stresses. While fiber networks are well characterized macroscopically, this stress generation by microscopic active units is not well understood. I will present a comprehensive theoretical study of force transmission in these networks. I will show that the linear, small-force response of the networks is remarkably simple, as the macroscopic active stress depends only on the geometry of the force-exerting unit. In contrast, as non-linear buckling occurs around these units, local active forces are rectified towards isotropic contraction and strongly amplified. This stress amplification is reinforced by the networks' disordered nature, but saturates for high densities of active units. I will show that our predictions are quantitatively consistent with experiments on reconstituted tissues and actomyosin networks, and that they shed light on the role of the network microstructure in shaping active stresses in cells and tissue.

  1. Scale-invariant properties of public-debt growth

    NASA Astrophysics Data System (ADS)

    Petersen, A. M.; Podobnik, B.; Horvatic, D.; Stanley, H. E.

    2010-05-01

    Public debt is one of the important economic variables that quantitatively describes a nation's economy. Because bankruptcy is a risk faced even by institutions as large as governments (e.g., Iceland), national debt should be strictly controlled with respect to national wealth. Also, the problem of eliminating extreme poverty in the world is closely connected to the study of extremely poor debtor nations. We analyze the time evolution of national public debt and find "convergence": initially less-indebted countries increase their debt more quickly than initially more-indebted countries. We also analyze the public debt-to-GDP ratio R, a proxy for default risk, and approximate the probability density function P(R) with a Gamma distribution, which can be used to establish thresholds for sustainable debt. We also observe "convergence" in R: countries with initially small R increase their R more quickly than countries with initially large R. The scaling relationships for debt and R have practical applications, e.g., the Maastricht Treaty requires members of the European Monetary Union to maintain R < 0.6.
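
    For intuition, the following Python sketch fits a Gamma distribution to debt-to-GDP ratios and evaluates a Maastricht-style threshold; the ratios here are synthetic, not the paper's national statistics.

    ```python
    # Sketch: approximate P(R) for debt-to-GDP ratios with a Gamma fit and
    # derive an exceedance probability for a threshold. Data are synthetic.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    R = rng.gamma(shape=2.0, scale=0.25, size=200)   # synthetic ratios

    shape, loc, scale = stats.gamma.fit(R, floc=0)   # fit P(R), loc fixed at 0
    p_exceed = 1 - stats.gamma.cdf(0.6, shape, loc=loc, scale=scale)
    print(f"fitted shape={shape:.2f}, scale={scale:.2f}")
    print(f"P(R > 0.6) = {p_exceed:.2f}  (Maastricht-style threshold)")
    ```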

  2. Large-Scale SRM Screen of Urothelial Bladder Cancer Candidate Biomarkers in Urine.

    PubMed

    Duriez, Elodie; Masselon, Christophe D; Mesmin, Cédric; Court, Magali; Demeure, Kevin; Allory, Yves; Malats, Núria; Matondo, Mariette; Radvanyi, François; Garin, Jérôme; Domon, Bruno

    2017-04-07

    Urothelial bladder cancer is a condition associated with high recurrence and substantial morbidity and mortality. Noninvasive urinary tests that would detect bladder cancer and tumor recurrence are required to significantly improve patient care. Over the past decade, numerous bladder cancer candidate biomarkers have been identified in the context of extensive proteomics or transcriptomics studies. To translate these findings into clinically useful biomarkers, the systematic evaluation of these candidates remains the bottleneck. Such evaluation involves large-scale quantitative LC-SRM (liquid chromatography-selected reaction monitoring) measurements, targeting hundreds of signature peptides by monitoring thousands of transitions in a single analysis. The design of highly multiplexed SRM analyses is driven by several factors: throughput, robustness, selectivity and sensitivity. Because of the complexity of the samples to be analyzed, some measurements (transitions) can suffer interference from coeluting isobaric species, resulting in biased or inconsistent estimated peptide/protein levels. The assessment of the quality of SRM data is therefore critical to allow such inconsistent data to be flagged. We describe an efficient and robust method to process large SRM data sets, including the processing of the raw data, the detection of low-quality measurements, the normalization of the signals for each protein, and the estimation of protein levels. Using this methodology, a variety of proteins previously associated with bladder cancer have been assessed through the analysis of urine samples from a large cohort of cancer patients and corresponding controls, in an effort to establish a priority list of the most promising candidates to guide subsequent clinical validation studies.
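
    A toy Python sketch of one way to flag interfered transitions, in the spirit of the quality assessment described above: transitions of the same peptide should co-vary across samples, so a transition poorly correlated with the consensus profile is suspect. The data, threshold, and rule are illustrative, not the authors' algorithm.

    ```python
    # Sketch of interference flagging in SRM data. Rows are transitions of
    # one peptide, columns are samples; values and threshold are invented.
    import numpy as np

    signals = np.array([
        [1.0, 2.0, 1.5, 3.0, 2.5, 1.2],
        [1.1, 2.1, 1.4, 3.2, 2.4, 1.3],
        [0.9, 1.9, 1.6, 2.9, 2.6, 1.1],
        [5.0, 1.0, 4.0, 1.5, 6.0, 2.0],  # transition with simulated interference
    ])

    consensus = np.median(signals, axis=0)
    for i, t in enumerate(signals):
        r = np.corrcoef(t, consensus)[0, 1]
        status = "OK" if r > 0.8 else "FLAGGED (possible interference)"
        print(f"transition {i}: r={r:.2f} {status}")
    ```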

  3. Application of image analysis in studies of quantitative disease resistance, exemplified using common bacterial blight-common bean pathosystem.

    PubMed

    Xie, Weilong; Yu, Kangfu; Pauls, K Peter; Navabi, Alireza

    2012-04-01

    The effectiveness of image analysis (IA) compared with an ordinal visual scale, for quantitative measurement of disease severity, its application in quantitative genetic studies, and its effect on the estimates of genetic parameters were investigated. Studies were performed using eight backcross-derived families of common bean (Phaseolus vulgaris) (n = 172) segregating for the molecular marker SU91, known to be associated with a quantitative trait locus (QTL) for resistance to common bacterial blight (CBB), caused by Xanthomonas campestris pv. phaseoli and X. fuscans subsp. fuscans. Even though both IA and visual assessments were highly repeatable, IA was more sensitive in detecting quantitative differences between bean genotypes. The CBB phenotypic difference between the two SU91 genotypic groups was consistently more than fivefold for IA assessments but generally only two- to threefold for visual assessments. Results suggest that the visual assessment results in overestimation of the effect of QTL in genetic studies. This may have been caused by lack of additivity and uneven intervals of the visual scale. Although visual assessment of disease severity is a useful tool for general selection in breeding programs, assessments using IA may be more suitable for phenotypic evaluations in quantitative genetic studies involving CBB resistance as well as other foliar diseases.
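
    As a rough illustration of image-based severity scoring (not the IA pipeline used in the study), the fraction of lesioned leaf area can be computed from simple pixel rules; the random image and thresholds below are placeholders.

    ```python
    # Sketch: disease severity as the diseased fraction of leaf pixels,
    # from crude color-threshold rules on a stand-in RGB array.
    import numpy as np

    rgb = np.random.default_rng(1).random((100, 100, 3))   # stand-in leaf image

    leaf = rgb.sum(axis=2) > 0.5                           # crude background mask
    lesion = (rgb[..., 0] > rgb[..., 1] + 0.1) & leaf      # crude lesion rule

    severity = lesion.sum() / leaf.sum()
    print(f"diseased area fraction: {severity:.3f}")
    ```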
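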

  4. Quantitative phylogenetic assessment of microbial communities in diverse environments.

    PubMed

    von Mering, C; Hugenholtz, P; Raes, J; Tringe, S G; Doerks, T; Jensen, L J; Ward, N; Bork, P

    2007-02-23

    The taxonomic composition of environmental communities is an important indicator of their ecology and function. We used a set of protein-coding marker genes, extracted from large-scale environmental shotgun sequencing data, to provide a more direct, quantitative, and accurate picture of community composition than that provided by traditional ribosomal RNA-based approaches depending on the polymerase chain reaction. Mapping marker genes from four diverse environmental data sets onto a reference species phylogeny shows that certain communities evolve faster than others. The method also enables determination of preferred habitats for entire microbial clades and provides evidence that such habitat preferences are often remarkably stable over time.

  5. Large-scale human skin lipidomics by quantitative, high-throughput shotgun mass spectrometry.

    PubMed

    Sadowski, Tomasz; Klose, Christian; Gerl, Mathias J; Wójcik-Maciejewicz, Anna; Herzog, Ronny; Simons, Kai; Reich, Adam; Surma, Michal A

    2017-03-07

    The lipid composition of human skin is essential for its function; however, the simultaneous quantification of a wide range of stratum corneum (SC) and sebaceous lipids is not trivial. We developed and validated a quantitative, high-throughput, shotgun mass spectrometry-based platform for lipid analysis of tape-stripped SC skin samples. It features coverage of 16 lipid classes; total quantification to the level of individual lipid molecules; and high reproducibility and high-throughput capabilities. With this method we conducted a large lipidomic survey of 268 human SC samples, in which we investigated the relationship between sampling depth and lipid composition, examined lipidome variability in samples from 14 different sampling sites on the human body, and finally assessed the impact of age and sex on lipidome variability in 104 healthy subjects. We found sebaceous lipids to constitute an abundant component of the SC lipidome, as they diffuse into the topmost SC layers, forming a gradient. Lipidomic variability with respect to sampling depth, site and subject is considerable, and is mainly attributed to sebaceous lipids, while stratum corneum lipids vary less. This stresses the importance of sampling design and the role of sebaceous lipids in skin studies.

  6. Multiscale factors affecting human attitudes toward snow leopards and wolves.

    PubMed

    Suryawanshi, Kulbhushansingh R; Bhatia, Saloni; Bhatnagar, Yash Veer; Redpath, Stephen; Mishra, Charudutt

    2014-12-01

    The threat posed by large carnivores to livestock and humans makes peaceful coexistence between them difficult. Effective implementation of conservation laws and policies depends on the attitudes of local residents toward the target species. There are many known correlates of human attitudes toward carnivores, but they have only been assessed at the scale of the individual. Because human societies are organized hierarchically, attitudes are presumably influenced by different factors at different scales of social organization, but this scale dependence has not been examined. We used structured interview surveys to quantitatively assess the attitudes of a Buddhist pastoral community toward snow leopards (Panthera uncia) and wolves (Canis lupus). We interviewed 381 individuals from 24 villages within 6 study sites across the high-elevation Spiti Valley in the Indian Trans-Himalaya. We gathered information on key explanatory variables that together captured variation in individual and village-level socioeconomic factors. We used hierarchical linear models to examine how the effect of these factors on human attitudes changed with the scale of analysis from the individual to the community. Factors significant at the individual level were gender, education, and age of the respondent (for wolves and snow leopards), number of income sources in the family (wolves), agricultural production, and large-bodied livestock holdings (snow leopards). At the community level, the significant factors included the number of smaller-bodied herded livestock killed by wolves and mean agricultural production (wolves) and village size and large livestock holdings (snow leopards). Our results show that scaling up from the individual to higher levels of social organization can highlight important factors that influence attitudes of people toward wildlife and toward formal conservation efforts in general. Such scale-specific information can help managers apply conservation measures at appropriate scales. Our results reiterate the need for conflict management programs to be multipronged. © 2014 Society for Conservation Biology.
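
    A hedged Python sketch of the kind of two-level model described above, using statsmodels' MixedLM with a village-level random intercept; the column names and synthetic data are hypothetical stand-ins for the survey variables.

    ```python
    # Sketch: hierarchical (mixed-effects) model with individual-level fixed
    # effects and a village-level random intercept. Data are simulated.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(7)
    n = 120
    df = pd.DataFrame({
        "village": rng.choice(list("ABCDEF"), n),
        "education": rng.integers(0, 4, n),
        "age": rng.integers(18, 80, n),
    })
    # Synthetic attitude score with a village-level random intercept.
    v_eff = {v: rng.normal(0, 0.5) for v in "ABCDEF"}
    df["attitude"] = (3 + 0.3 * df["education"] - 0.01 * df["age"]
                      + df["village"].map(v_eff) + rng.normal(0, 1, n))

    model = smf.mixedlm("attitude ~ education + age", df, groups=df["village"])
    print(model.fit().summary())
    ```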

  7. Distribution of Wild Mammal Assemblages along an Urban–Rural–Forest Landscape Gradient in Warm-Temperate East Asia

    PubMed Central

    Saito, Masayuki; Koike, Fumito

    2013-01-01

    Urbanization may alter mammal assemblages via habitat loss, food subsidies, and other factors related to human activities. The general distribution patterns of wild mammal assemblages along urban–rural–forest landscape gradients have not been studied, although many studies have focused on a single species or taxon, such as rodents. We quantitatively evaluated the effects of the urban–rural–forest gradient and spatial scale on the distributions of large and mid-sized mammals in the world's largest metropolitan area in warm-temperate Asia using nonspecific camera-trapping along two linear transects spanning from the urban zone in the Tokyo metropolitan area to surrounding rural and forest landscapes. Many large and mid-sized species generally decreased from forest landscapes to urban cores, although some species preferred anthropogenic landscapes. Sika deer (Cervus nippon), Reeves' muntjac (Muntiacus reevesi), Japanese macaque (Macaca fuscata), Japanese squirrel (Sciurus lis), Japanese marten (Martes melampus), Japanese badger (Meles anakuma), and wild boar (Sus scrofa) generally dominated the mammal assemblage of the forest landscape. Raccoon (Procyon lotor), raccoon dog (Nyctereutes procyonoides), and Japanese hare (Lepus brachyurus) dominated the mammal assemblage in the intermediate zone (i.e., rural and suburban landscape). Cats (feral and free-roaming housecats; Felis catus) were common in the urban assemblage. The key spatial scales for forest species were more than 4000-m radius, indicating that conservation and management plans for these mammal assemblages should be considered on large spatial scales. However, small green spaces will also be important for mammal conservation in the urban landscape, because an indigenous omnivore (raccoon dog) had a smaller key spatial scale (500-m radius) than those of forest mammals. Urbanization was generally the most important factor in the distributions of mammals, and it is necessary to consider the spatial scale of management according to the degree of urbanization. PMID:23741495

  8. Distribution of wild mammal assemblages along an urban-rural-forest landscape gradient in warm-temperate East Asia.

    PubMed

    Saito, Masayuki; Koike, Fumito

    2013-01-01

    Urbanization may alter mammal assemblages via habitat loss, food subsidies, and other factors related to human activities. The general distribution patterns of wild mammal assemblages along urban-rural-forest landscape gradients have not been studied, although many studies have focused on a single species or taxon, such as rodents. We quantitatively evaluated the effects of the urban-rural-forest gradient and spatial scale on the distributions of large and mid-sized mammals in the world's largest metropolitan area in warm-temperate Asia using nonspecific camera-trapping along two linear transects spanning from the urban zone in the Tokyo metropolitan area to surrounding rural and forest landscapes. Many large and mid-sized species generally decreased from forest landscapes to urban cores, although some species preferred anthropogenic landscapes. Sika deer (Cervus nippon), Reeves' muntjac (Muntiacus reevesi), Japanese macaque (Macaca fuscata), Japanese squirrel (Sciurus lis), Japanese marten (Martes melampus), Japanese badger (Meles anakuma), and wild boar (Sus scrofa) generally dominated the mammal assemblage of the forest landscape. Raccoon (Procyon lotor), raccoon dog (Nyctereutes procyonoides), and Japanese hare (Lepus brachyurus) dominated the mammal assemblage in the intermediate zone (i.e., rural and suburban landscape). Cats (feral and free-roaming housecats; Felis catus) were common in the urban assemblage. The key spatial scales for forest species were more than 4000-m radius, indicating that conservation and management plans for these mammal assemblages should be considered on large spatial scales. However, small green spaces will also be important for mammal conservation in the urban landscape, because an indigenous omnivore (raccoon dog) had a smaller key spatial scale (500-m radius) than those of forest mammals. Urbanization was generally the most important factor in the distributions of mammals, and it is necessary to consider the spatial scale of management according to the degree of urbanization.

  9. Genetic and environmental determinants of violence risk in psychotic disorders: a multivariate quantitative genetic study of 1.8 million Swedish twins and siblings.

    PubMed

    Sariaslan, A; Larsson, H; Fazel, S

    2016-09-01

    Patients diagnosed with psychotic disorders (for example, schizophrenia and bipolar disorder) have elevated risks of committing violent acts, particularly if they are comorbid with substance misuse. Despite recent insights from quantitative and molecular genetic studies demonstrating considerable pleiotropy in the genetic architecture of these phenotypes, there is currently a lack of large-scale studies that have specifically examined the aetiological links between psychotic disorders and violence. Using a sample of all Swedish individuals born between 1958 and 1989 (n=3 332 101), we identified a total of 923 259 twin-sibling pairs. Patients were identified using the National Patient Register using validated algorithms based on International Classification of Diseases (ICD) 8-10. Univariate quantitative genetic models revealed that all phenotypes (schizophrenia, bipolar disorder, substance misuse, and violent crime) were highly heritable (h² = 53-71%). Multivariate models further revealed that schizophrenia was a stronger predictor of violence (r=0.32; 95% confidence interval: 0.30-0.33) than bipolar disorder (r=0.23; 0.21-0.25), and large proportions (51-67%) of these phenotypic correlations were explained by genetic factors shared between each disorder, substance misuse, and violence. Importantly, we found that genetic influences that were unrelated to substance misuse explained approximately a fifth (21%; 20-22%) of the correlation with violent criminality in bipolar disorder but none of the same correlation in schizophrenia (P for bipolar disorder < 0.001; P for schizophrenia = 0.55). These findings highlight the problems of not disentangling common and unique sources of covariance across genetically similar phenotypes, as the latter sources may include aetiologically important clues. Clinically, these findings underline the importance of assessing risk of different phenotypes together and integrating interventions for psychiatric disorders, substance misuse, and violence.
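
    The study fits full multivariate twin models; as a back-of-envelope illustration of univariate twin heritability only, Falconer's formula estimates h² from MZ and DZ twin correlations (the correlations below are invented):

    ```python
    # Back-of-envelope univariate twin heritability via Falconer's formula:
    # h2 = 2 (rMZ - rDZ). The paper uses full SEM models; this sketch and
    # the correlations are purely illustrative.
    r_mz, r_dz = 0.60, 0.30   # hypothetical twin correlations for a phenotype
    h2 = 2 * (r_mz - r_dz)    # additive genetic variance share
    c2 = 2 * r_dz - r_mz      # shared-environment share
    e2 = 1 - r_mz             # unique-environment share
    print(f"h2={h2:.2f}, c2={c2:.2f}, e2={e2:.2f}")
    ```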

  10. Small- and Large-Effect Quantitative Trait Locus Interactions Underlie Variation in Yeast Sporulation Efficiency

    PubMed Central

    Lorenz, Kim; Cohen, Barak A.

    2012-01-01

    Quantitative trait loci (QTL) with small effects on phenotypic variation can be difficult to detect and analyze. Because of this a large fraction of the genetic architecture of many complex traits is not well understood. Here we use sporulation efficiency in Saccharomyces cerevisiae as a model complex trait to identify and study small-effect QTL. In crosses where the large-effect quantitative trait nucleotides (QTN) have been genetically fixed we identify small-effect QTL that explain approximately half of the remaining variation not explained by the major effects. We find that small-effect QTL are often physically linked to large-effect QTL and that there are extensive genetic interactions between small- and large-effect QTL. A more complete understanding of quantitative traits will require a better understanding of the numbers, effect sizes, and genetic interactions of small-effect QTL. PMID:22942125

  11. Histomorphometry and cortical robusticity of the adult human femur.

    PubMed

    Miszkiewicz, Justyna Jolanta; Mahoney, Patrick

    2018-01-13

    Recent quantitative analyses of human bone microanatomy, as well as theoretical models that propose bone microstructure and gross anatomical associations, have started to reveal insights into biological links that may facilitate remodeling processes. However, relationships between bone size and the underlying cortical bone histology remain largely unexplored. The goal of this study is to determine the extent to which static indicators of bone remodeling and vascularity, measured using histomorphometric techniques, relate to femoral midshaft cortical width and robusticity. Using previously published and new quantitative data from 450 adult human male (n = 233) and female (n = 217) femora, we determine if these aspects of femoral size relate to bone microanatomy. Scaling relationships are explored and interpreted within the context of tissue form and function. Analyses revealed that the area and diameter of Haversian canals and secondary osteons, and densities of secondary osteons and osteocyte lacunae from the sub-periosteal region of the posterior midshaft femur cortex were significantly, but not consistently, associated with femoral size. Cortical width and bone robusticity were correlated with osteocyte lacunae density and scaled with positive allometry. Diameter and area of osteons and Haversian canals decreased as the width of cortex and bone robusticity increased, revealing a negative allometric relationship. These results indicate that microscopic products of cortical bone remodeling and vascularity are linked to femur size. Allometric relationships between more robust human femora with thicker cortical bone and histological products of bone remodeling correspond with principles of bone functional adaptation. Future studies may benefit from exploring scaling relationships between bone histomorphometric data and measurements of bone macrostructure.

  12. A Census of Large-scale (≥10 PC), Velocity-coherent, Dense Filaments in the Northern Galactic Plane: Automated Identification Using Minimum Spanning Tree

    NASA Astrophysics Data System (ADS)

    Wang, Ke; Testi, Leonardo; Burkert, Andreas; Walmsley, C. Malcolm; Beuther, Henrik; Henning, Thomas

    2016-09-01

    Large-scale gaseous filaments with lengths up to the order of 100 pc are on the upper end of the filamentary hierarchy of the Galactic interstellar medium (ISM). Their association with respect to the Galactic structure and their role in Galactic star formation are of great interest from both an observational and theoretical point of view. Previous "by-eye" searches, combined together, have started to uncover the Galactic distribution of large filaments, yet inherent bias and small sample size limit conclusive statistical results from being drawn. Here, we present (1) a new, automated method for identifying large-scale velocity-coherent dense filaments, and (2) the first statistics and the Galactic distribution of these filaments. We use a customized minimum spanning tree algorithm to identify filaments by connecting voxels in the position-position-velocity space, using the Bolocam Galactic Plane Survey spectroscopic catalog. In the range 7.5° ≤ l ≤ 194°, we have identified 54 large-scale filaments and derived their mass (~10³-10⁵ M⊙), length (10-276 pc), linear mass density (54-8625 M⊙ pc⁻¹), aspect ratio, linearity, velocity gradient, temperature, fragmentation, Galactic location, and orientation angle. The filaments concentrate along major spiral arms. They are widely distributed across the Galactic disk, with 50% located within ±20 pc of the Galactic mid-plane and 27% running in the centers of spiral arms. On the order of 1% of the molecular ISM is confined in large filaments. Massive star formation is more favorable in large filaments compared to elsewhere. This is the first comprehensive catalog of large filaments that can be useful for a quantitative comparison with spiral structures and numerical simulations.
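
    A minimal Python sketch of MST-based identification in position-position-velocity space: build a distance graph over catalog sources, take the minimum spanning tree, and prune long edges. The coordinates, axis scalings, and cut length are illustrative, not the paper's customized algorithm.

    ```python
    # Sketch: minimum spanning tree over toy PPV catalog points, pruning
    # edges longer than a cut length to leave filament candidates.
    import numpy as np
    from scipy.spatial import distance_matrix
    from scipy.sparse.csgraph import minimum_spanning_tree

    rng = np.random.default_rng(2)
    ppv = rng.random((30, 3)) * [50.0, 2.0, 10.0]   # toy (l, b, v) catalog

    mst = minimum_spanning_tree(distance_matrix(ppv, ppv)).toarray()
    edges = np.argwhere((mst > 0) & (mst < 5.0))     # drop edges > cut length
    print(f"{len(edges)} MST edges kept after pruning")
    ```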

  13. Extracting Primordial Non-Gaussianity from Large Scale Structure in the Post-Planck Era

    NASA Astrophysics Data System (ADS)

    Dore, Olivier

    Astronomical observations have become a unique tool to probe fundamental physics. Cosmology, in particular, has emerged as a data-driven science whose phenomenological modeling has achieved great success: in the post-Planck era, key cosmological parameters are measured to percent precision. A single model reproduces a wealth of astronomical observations involving very distinct physical processes at different times. This success leads to fundamental physical questions. One of the most salient is the origin of the primordial perturbations that grew to form the large-scale structures we now observe. More and more cosmological observables point to inflationary physics as the origin of the structure observed in the universe. Inflationary physics predicts the statistical properties of the primordial perturbations, which are thought to be slightly non-Gaussian. The detection of this small deviation from Gaussianity represents the next frontier in early-universe physics. To measure it would provide direct, unique and quantitative insights about the physics at play when the universe was only a fraction of a second old, thus probing energies untouchable otherwise. On par with the well-known relic gravitational-wave radiation (the famous "B-modes"), it is one of the few probes of inflation. This departure from Gaussianity leads to a very specific signature in the large-scale clustering of galaxies. By observing large-scale structure, we can thus establish a direct connection with fundamental theories of the early universe. In the post-Planck era, large-scale structures are our most promising pathway to measuring this primordial signal. Current estimates suggest that the next generation of space- or ground-based large-scale structure surveys (e.g. the ESA EUCLID or NASA WFIRST missions) might enable a detection of this signal. This potentially huge payoff requires us to solidify the theoretical predictions supporting these measurements. Even if the exact signal we are looking for is of unknown amplitude, it is obvious that we must measure it as well as these groundbreaking data sets will permit. We propose to develop the supporting theoretical work to the point where the complete non-Gaussian signature can be extracted from these data sets. We will do so by developing three complementary directions: (1) we will develop the appropriate formalism to measure and model galaxy clustering on the largest scales; (2) we will study the impact of non-Gaussianity on higher-order statistics, the most promising statistics for our purpose; and (3) we will make explicit the connection between these observables and the microphysics of a large class of inflation models, and also identify fundamental limitations to this interpretation.

  14. Crowdsourcing scoring of immunohistochemistry images: Evaluating Performance of the Crowd and an Automated Computational Method

    NASA Astrophysics Data System (ADS)

    Irshad, Humayun; Oh, Eun-Yeong; Schmolze, Daniel; Quintana, Liza M.; Collins, Laura; Tamimi, Rulla M.; Beck, Andrew H.

    2017-02-01

    The assessment of protein expression in immunohistochemistry (IHC) images provides important diagnostic, prognostic and predictive information for guiding cancer diagnosis and therapy. Manual scoring of IHC images represents a logistical challenge, as the process is labor intensive and time consuming. Over the last decade, computational methods have been developed to enable the application of quantitative methods for the analysis and interpretation of protein expression in IHC images. These methods have not yet replaced manual scoring for the assessment of IHC in the majority of diagnostic laboratories and in many large-scale research studies. An alternative approach is crowdsourcing the quantification of IHC images to an undefined crowd. The aim of this study is to quantify IHC images for labeling of ER status with two different crowdsourcing approaches, image-labeling and nuclei-labeling, and to compare their performance with automated methods. Crowdsourcing-derived scores achieved greater concordance with the pathologist interpretations for both the image-labeling and nuclei-labeling tasks (83% and 87%, respectively) than the automated method did (81%) on 5,338 TMA images from 1,853 breast cancer patients. This analysis shows that crowdsourcing the scoring of protein expression in IHC images is a promising new approach for large-scale cancer molecular pathology studies.

  15. Large-scale geographic variation in distribution and abundance of Australian deep-water kelp forests.

    PubMed

    Marzinelli, Ezequiel M; Williams, Stefan B; Babcock, Russell C; Barrett, Neville S; Johnson, Craig R; Jordan, Alan; Kendrick, Gary A; Pizarro, Oscar R; Smale, Dan A; Steinberg, Peter D

    2015-01-01

    Despite the significance of marine habitat-forming organisms, little is known about their large-scale distribution and abundance in deeper waters, where they are difficult to access. Such information is necessary to develop sound conservation and management strategies. Kelps are the main habitat-formers on temperate reefs worldwide; however, these habitats are highly sensitive to environmental change. The kelp Ecklonia radiata is the major habitat-forming organism on subtidal reefs in temperate Australia. Here, we provide large-scale ecological data encompassing the latitudinal distribution of these kelp forests along the continent, which is a necessary first step towards quantitative inferences about the effects of climatic change and other stressors on these valuable habitats. We used the Autonomous Underwater Vehicle (AUV) facility of Australia's Integrated Marine Observing System (IMOS) to survey 157,000 m2 of seabed, of which ca 13,000 m2 were used to quantify kelp cover at multiple spatial scales (10-100 m to 100-1,000 km) and depths (15-60 m) across several regions ca 2-6° of latitude apart along the east and west coasts of Australia. We investigated the large-scale geographic variation in distribution and abundance of deep-water kelp (>15 m depth) and their relationships with physical variables. Kelp cover generally increased with latitude despite great variability at smaller spatial scales. The maximum depth of kelp occurrence was 40-50 m. Kelp latitudinal distribution along the continent was most strongly related to water temperature and substratum availability. These extensive survey data, coupled with ongoing AUV missions, will allow for the detection of long-term shifts in the distribution and abundance of habitat-forming kelp and the organisms they support on a continental scale, and provide information necessary for the successful implementation and management of conservation reserves.

  16. Flow topologies and turbulence scales in a jet-in-cross-flow

    DOE PAGES

    Oefelein, Joseph C.; Ruiz, Anthony M.; Lacaze, Guilhem

    2015-04-03

    This study presents a detailed analysis of the flow topologies and turbulence scales in the jet-in-cross-flow experiment of [Su and Mungal JFM 2004]. The analysis is performed using the Large Eddy Simulation (LES) technique with a highly resolved grid and time-step and well-controlled boundary conditions. This enables quantitative agreement with the first and second moments of turbulence statistics measured in the experiment. LES is used to perform the analysis since experimental measurements of time-resolved 3D fields are still in their infancy and because sampling periods are generally limited with direct numerical simulation. A major focal point is the comprehensive characterization of the turbulence scales and their evolution. Time-resolved probes are used with long sampling periods to obtain maps of the integral scales, Taylor microscales, and turbulent kinetic energy spectra. Scalar-fluctuation scales are also quantified. In the near field, coherent structures are clearly identified, both in physical and spectral space. Along the jet centerline, turbulence scales grow according to a classical one-third power law. However, the derived maps of turbulence scales reveal strong inhomogeneities in the flow. From the modeling perspective, these insights are useful for designing optimized grids and improving numerical predictions in similar configurations.
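
    For reference, an integral time scale can be estimated from a time-resolved probe signal as the integral of its autocorrelation up to the first zero crossing; the Python sketch below uses a synthetic signal in place of an LES probe record.

    ```python
    # Sketch: integral time scale from the autocorrelation of a velocity
    # time series. The smoothed random signal stands in for probe data.
    import numpy as np

    rng = np.random.default_rng(3)
    dt, n = 1e-3, 4000
    u = np.convolve(rng.standard_normal(n), np.ones(50) / 50, mode="same")
    u -= u.mean()

    acf = np.correlate(u, u, mode="full")[n - 1:] / (u.var() * n)
    zero = np.argmax(acf < 0)                 # first zero crossing
    T_int = np.trapz(acf[:zero], dx=dt)       # integral time scale
    print(f"integral time scale ~ {T_int * 1e3:.2f} ms")
    ```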

  17. A Validity and Reliability Study of the Attitudes toward Sustainable Development Scale

    ERIC Educational Resources Information Center

    Biasutti, Michele; Frate, Sara

    2017-01-01

    This article describes the development and validation of the Attitudes toward Sustainable Development scale, a quantitative 20-item scale that measures Italian university students' attitudes toward sustainable development. A total of 484 undergraduate students completed the questionnaire. The validity and reliability of the scale was statistically…

  18. A New, Continuous 5400 Yr-long Paleotsunami Record from Lake Huelde, Chiloe Island, South Central Chile.

    NASA Astrophysics Data System (ADS)

    Kempf, P.; Moernaut, J.; Vandoorne, W.; Van Daele, M. E.; Pino, M.; Urrutia, R.; De Batist, M. A. O.

    2014-12-01

    After the last decade of extreme tsunami events, with catastrophic damage to infrastructure and a horrendous number of casualties, it is clear that more and better paleotsunami records are needed to improve our understanding of the recurrence intervals and intensities of large-scale tsunamis. Coastal lakes (e.g. Bradley Lake, Cascadia; Kelsey et al., 2005) have the potential to contain long and continuous sedimentary records, which is an important asset in view of the centennial- to millennial-scale recurrence times of great tsunami-triggering earthquakes. Lake Huelde, a coastal lake on Chiloé Island (42.5°S), Chile, is located in the middle of the Valdivia segment, which is known for having produced the strongest instrumentally recorded earthquake ever, in 1960 AD (Mw 9.5), and other large earthquakes prior to that: 1837 AD, 1737 AD (no report of a tsunami) and 1575 AD (Lomnitz, 1970, 2004; Cisternas et al., 2005). We present a new 5400-yr-long paleotsunami record, with a Bayesian age-depth model based on 23 radiocarbon dates, that exceeds all previous paleotsunami records from the Valdivia segment, both in terms of length and of continuity. Eighteen events are described, and a semi-quantitative measure of the event intensity at the study area is given, revealing at least two predecessors of the 1960 AD event in the mid to late Holocene that are equal in intensity. The resulting implications from the age-depth model and from the semi-quantitative intensity reconstruction are discussed in this contribution.

  19. Scaling properties of European research units

    PubMed Central

    Jamtveit, Bjørn; Jettestuen, Espen; Mathiesen, Joachim

    2009-01-01

    A quantitative characterization of the scale-dependent features of research units may provide important insight into how such units are organized and how they grow. The relative importance of top-down versus bottom-up controls on their growth may be revealed by their scaling properties. Here we show that the number of support staff in Scandinavian research units, ranging in size from 20 to 7,800 staff members, is related to the number of academic staff by a power law. The scaling exponent of ≈1.30 is broadly consistent with a simple hierarchical model of the university organization. Similar scaling behavior between small and large research units with a wide range of ambitions and strategies argues against top-down control of the growth. Top-down effects, and externally imposed effects from changing political environments, can be observed as fluctuations around the main trend. The observed scaling law implies that cost-benefit arguments for merging research institutions into larger and larger units may have limited validity unless the productivity per academic staff and/or the quality of the products are considerably higher in larger institutions. Despite the hierarchical structure of most large-scale research units in Europe, the network structures represented by the academic component of such units are strongly antihierarchical and suboptimal for efficient communication within individual units. PMID:19625626
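
    The reported exponent comes from a power-law fit of support staff against academic staff; a minimal version of such a fit is ordinary least squares in log-log space, sketched here on synthetic data generated around the reported exponent of about 1.30.

    ```python
    # Sketch: estimate the scaling exponent beta in S ~ A^beta by a
    # log-log least-squares fit. Staff numbers are synthetic.
    import numpy as np

    rng = np.random.default_rng(4)
    A = np.geomspace(20, 7800, 40)                      # academic staff sizes
    S = 0.05 * A**1.30 * rng.lognormal(0, 0.2, A.size)  # synthetic support staff

    beta, log_c = np.polyfit(np.log(A), np.log(S), 1)
    print(f"estimated scaling exponent beta = {beta:.2f}")
    ```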

  20. Receptor signaling clusters in the immune synapse

    DOE PAGES

    Dustin, Michael L.; Groves, Jay T.

    2012-02-23

    Signaling processes between various immune cells involve large-scale spatial reorganization of receptors and signaling molecules within the cell-cell junction. These structures, now collectively referred to as immune synapses, interleave physical and mechanical processes with the cascades of chemical reactions that constitute signal transduction systems. Molecular-level clustering, spatial exclusion, and long-range directed transport are all emerging as key regulatory mechanisms. The study of these processes is drawing researchers from the physical sciences to join the effort and represents a rapidly growing branch of biophysical chemistry. Recent advances in physical and quantitative analyses of signaling within immune synapses are reviewed here.

  1. Receptor signaling clusters in the immune synapse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dustin, Michael L.; Groves, Jay T.

    2012-02-23

    Signaling processes between various immune cells involve large-scale spatial reorganization of receptors and signaling molecules within the cell-cell junction. These structures, now collectively referred to as immune synapses, interleave physical and mechanical processes with the cascades of chemical reactions that constitute signal transduction systems. Molecular-level clustering, spatial exclusion, and long-range directed transport are all emerging as key regulatory mechanisms. The study of these processes is drawing researchers from the physical sciences to join the effort and represents a rapidly growing branch of biophysical chemistry. Recent advances in physical and quantitative analyses of signaling within immune synapses are reviewed here.

  2. Impact of playing American professional football on long-term brain function.

    PubMed

    Amen, Daniel G; Newberg, Andrew; Thatcher, Robert; Jin, Yi; Wu, Joseph; Keator, David; Willeumier, Kristen

    2011-01-01

    The authors recruited 100 active and former National Football League players, representing 27 teams and all positions. Players underwent a clinical history, brain SPECT imaging, qEEG, and multiple neuropsychological measures, including MicroCog. Relative to a healthy comparison group, players showed globally decreased perfusion, especially in the prefrontal, temporal, parietal, and occipital lobes and cerebellar regions. Quantitative EEG findings were consistent, showing elevated slow waves in the frontal and temporal regions. Significant decreases from normal values were found on most neuropsychological tests. This is the first large-scale brain-imaging study to demonstrate significant differences consistent with a chronic brain trauma pattern in professional football players.

  3. Emergent Societal Effects of Crimino-Social Forces in an Animat Agent Model

    NASA Astrophysics Data System (ADS)

    Scogings, Chris J.; Hawick, Ken A.

    Societal behaviour can be studied at a causal level by perturbing a stable multi-agent model with new microscopic behaviours and observing the statistical response over an ensemble of simulated model systems. We report on the effects of introducing criminal and law-enforcing behaviours into a large scale animat agent model and describe the complex spatial agent patterns and population changes that result. Our well-established predator-prey substrate model provides a background framework against which these new microscopic behaviours can be trialled and investigated. We describe some quantitative results and some surprising conclusions concerning the overall societal health when individually anti-social behaviour is introduced.

  4. Knowledge-driven computational modeling in Alzheimer's disease research: Current state and future trends.

    PubMed

    Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J

    2017-11-01

    Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  5. Methodological uncertainty in quantitative prediction of human hepatic clearance from in vitro experimental systems.

    PubMed

    Hallifax, D; Houston, J B

    2009-03-01

    Mechanistic prediction of unbound drug clearance from human hepatic microsomes and hepatocytes correlates with in vivo clearance but is both systematically low (10-20% of in vivo clearance) and highly variable, based on detailed assessments of published studies. Metabolic capacity (Vmax) of commercially available human hepatic microsomes and cryopreserved hepatocytes is log-normally distributed within wide (30-150-fold) ranges; Km is also log-normally distributed and effectively independent of Vmax, implying considerable variability in intrinsic clearance. Despite wide overlap, average capacity is 2-20-fold (depending on the P450 enzyme) greater in microsomes than hepatocytes, when both are normalised (scaled to whole liver). The in vitro ranges contrast with the relatively narrow ranges of clearance among clinical studies. The high in vitro variation probably reflects unresolved phenotypical variability among liver donors and practicalities in the processing of human liver into in vitro systems. A significant contribution from the latter is supported by evidence of low reproducibility (severalfold) of activity in cryopreserved hepatocytes, and in microsomes prepared from the same cells, between separate occasions of thawing of cells from the same liver. The large uncertainty which exists in human hepatic in vitro systems appears to dominate the overall uncertainty of in vitro-in vivo extrapolation, including uncertainties within scaling, modelling and drug-dependent effects. As such, any notion of quantitative prediction of clearance appears severely challenged.
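
    For context, the standard scale-up from microsomal intrinsic clearance to whole-liver hepatic clearance uses physiological scaling factors and the well-stirred liver model; the sketch below uses typical textbook parameter values, not values from this paper.

    ```python
    # Sketch: well-stirred model scale-up of microsomal intrinsic clearance.
    # All parameter values are generic textbook numbers, purely illustrative.
    Q_h = 20.7          # hepatic blood flow, mL/min/kg body weight
    fu = 0.1            # fraction unbound in blood
    CLint_mic = 25.0    # intrinsic clearance, uL/min/mg microsomal protein
    MPPGL = 40.0        # mg microsomal protein per g liver
    LW = 21.4           # g liver per kg body weight

    CLint = CLint_mic * MPPGL * LW / 1000.0        # mL/min/kg, scaled CLint
    CLh = Q_h * fu * CLint / (Q_h + fu * CLint)    # well-stirred model
    print(f"scaled CLint = {CLint:.1f} mL/min/kg; predicted CLh = {CLh:.1f}")
    ```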

  6. N-tag probability law of the symmetric exclusion process

    NASA Astrophysics Data System (ADS)

    Poncet, Alexis; Bénichou, Olivier; Démery, Vincent; Oshanin, Gleb

    2018-06-01

    The symmetric exclusion process (SEP), in which particles hop symmetrically on a discrete line with hard-core constraints, is a paradigmatic model of subdiffusion in confined systems. This anomalous behavior is a direct consequence of strong spatial correlations induced by the requirement that the particles cannot overtake each other. Although this fact has been recognized qualitatively for a long time, until now there has been no full quantitative determination of these correlations. Here we study the joint probability distribution of an arbitrary number of tagged particles in the SEP. We determine analytically its large-time limit for an arbitrary density of particles, and its full dynamics in the high-density limit. In this limit, we obtain the time-dependent large deviation function of the problem and unveil a universal scaling form shared by the cumulants.
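
    A minimal Monte Carlo sketch of the SEP itself (not the analytical machinery of the paper): particles attempt symmetric hops on a ring, moves into occupied sites are rejected, and a tagged particle's displacement is tracked. Sizes and step counts are arbitrary.

    ```python
    # Sketch: symmetric exclusion process on a ring with a tagged particle.
    import numpy as np

    rng = np.random.default_rng(5)
    L, N, steps = 200, 100, 20000          # ring size, particles, hop attempts
    pos = np.sort(rng.choice(L, N, replace=False))
    occupied = set(pos)
    tag, x_tag = 0, 0.0                    # tagged particle index, net shift

    for _ in range(steps):
        i = rng.integers(N)
        step = rng.choice([-1, 1])         # symmetric hop attempt
        new = (pos[i] + step) % L
        if new not in occupied:            # hard-core exclusion
            occupied.remove(pos[i]); occupied.add(new)
            pos[i] = new
            if i == tag:
                x_tag += step
    print(f"tagged particle net displacement: {x_tag:.0f}")
    ```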

  7. Measuring the Beginning: A Quantitative Study of the Transition to Higher Education

    ERIC Educational Resources Information Center

    Brooman, Simon; Darwent, Sue

    2014-01-01

    This quantitative study measures change in certain factors known to influence success of first-year students during the transition to higher education: self-efficacy, autonomous learning and social integration. A social integration scale was developed with three subscales: "sense of belonging", "relationship with staff" and…

  8. Observed Differences between North American Snow Extent and Snow Depth Variability

    NASA Astrophysics Data System (ADS)

    Ge, Y.; Gong, G.

    2006-12-01

    Snow extent and snow depth are two related characteristics of a snowpack, but they need not be mutually consistent. Differences between these two variables at local scales are readily apparent. However, at the larger scales that interact with atmospheric circulation and climate, snow extent is typically the variable used, while snow depth is often assumed to be of minor importance and/or consistent with snow extent, though this is rarely verified. In this study, a new regional/continental-scale gridded dataset derived from field observations is utilized to quantitatively evaluate the relationship between snow extent and snow depth over North America. Various statistical methods are applied to assess the mutual consistency of monthly snow depth vs. snow extent, including correlations, composites and principal components. Results indicate that snow depth variations are significant in their own right, and that depth and extent anomalies are largely unrelated, especially over broad high-latitude regions north of the snowline. In the vicinity of the snowline, where precipitation and ablation can affect both snow extent and snow depth, the two variables vary concurrently, especially in autumn and spring. It is also found that deeper winter snow translates into larger snow-covered area in the subsequent spring/summer season, which suggests a possible influence of winter snow depth on summer climate. The observed lack of mutual consistency at continental/regional scales suggests that snowpack depth variations may be of sufficiently large magnitude, spatial scope and temporal duration to influence regional-to-hemispheric climate, in a manner unrelated to the more extensively studied snow extent variations.

  9. Protein kinetic signatures of the remodeling heart following isoproterenol stimulation.

    PubMed

    Lam, Maggie P Y; Wang, Ding; Lau, Edward; Liem, David A; Kim, Allen K; Ng, Dominic C M; Liang, Xiangbo; Bleakley, Brian J; Liu, Chenguang; Tabaraki, Jason D; Cadeiras, Martin; Wang, Yibin; Deng, Mario C; Ping, Peipei

    2014-04-01

    Protein temporal dynamics play a critical role in time-dimensional pathophysiological processes, including the gradual cardiac remodeling that occurs in early-stage heart failure. Methods for quantitative assessments of protein kinetics are lacking, and despite knowledge gained from single-protein studies, integrative views of the coordinated behavior of multiple proteins in cardiac remodeling are scarce. Here, we developed a workflow that integrates deuterium oxide (2H2O) labeling, high-resolution mass spectrometry (MS), and custom computational methods to systematically interrogate in vivo protein turnover. Using this workflow, we characterized the in vivo turnover kinetics of 2,964 proteins in a mouse model of β-adrenergic-induced cardiac remodeling. The data provided a quantitative and longitudinal view of cardiac remodeling at the molecular level, revealing widespread kinetic regulations in calcium signaling, metabolism, proteostasis, and mitochondrial dynamics. We translated the workflow to human studies, creating a reference dataset of 496 plasma protein turnover rates from 4 healthy adults. The approach is applicable to short, minimal label enrichment and can be performed on as little as a single biopsy, thereby overcoming critical obstacles to clinical investigations. The protein turnover quantitation experiments and computational workflow described here should be widely applicable to large-scale biomolecular investigations of human disease mechanisms with a temporal perspective.

  10. Protein kinetic signatures of the remodeling heart following isoproterenol stimulation

    PubMed Central

    Lam, Maggie P.Y.; Wang, Ding; Lau, Edward; Liem, David A.; Kim, Allen K.; Ng, Dominic C.M.; Liang, Xiangbo; Bleakley, Brian J.; Liu, Chenguang; Tabaraki, Jason D.; Cadeiras, Martin; Wang, Yibin; Deng, Mario C.; Ping, Peipei

    2014-01-01

    Protein temporal dynamics play a critical role in time-dimensional pathophysiological processes, including the gradual cardiac remodeling that occurs in early-stage heart failure. Methods for quantitative assessments of protein kinetics are lacking, and despite knowledge gained from single-protein studies, integrative views of the coordinated behavior of multiple proteins in cardiac remodeling are scarce. Here, we developed a workflow that integrates deuterium oxide (2H2O) labeling, high-resolution mass spectrometry (MS), and custom computational methods to systematically interrogate in vivo protein turnover. Using this workflow, we characterized the in vivo turnover kinetics of 2,964 proteins in a mouse model of β-adrenergic–induced cardiac remodeling. The data provided a quantitative and longitudinal view of cardiac remodeling at the molecular level, revealing widespread kinetic regulations in calcium signaling, metabolism, proteostasis, and mitochondrial dynamics. We translated the workflow to human studies, creating a reference dataset of 496 plasma protein turnover rates from 4 healthy adults. The approach is applicable to short, minimal label enrichment and can be performed on as little as a single biopsy, thereby overcoming critical obstacles to clinical investigations. The protein turnover quantitation experiments and computational workflow described here should be widely applicable to large-scale biomolecular investigations of human disease mechanisms with a temporal perspective. PMID:24614109
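
    The core kinetic quantity in such turnover studies is a rate constant per protein; a hedged sketch of the underlying fit, A(t) = A_max(1 - e^(-kt)), on invented enrichment data (not the authors' computational pipeline):

    ```python
    # Sketch: fit a protein turnover rate k from the rise of heavy-label
    # enrichment toward plateau. Time points and enrichments are synthetic.
    import numpy as np
    from scipy.optimize import curve_fit

    t = np.array([0.5, 1, 2, 4, 8, 16])                  # days of labeling
    A = np.array([0.05, 0.10, 0.17, 0.27, 0.34, 0.37])   # label enrichment

    model = lambda t, k, A_max: A_max * (1 - np.exp(-k * t))
    (k, A_max), _ = curve_fit(model, t, A, p0=(0.2, 0.4))
    print(f"turnover rate k = {k:.2f}/day, half-life = {np.log(2)/k:.1f} days")
    ```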

  11. An Integrative Structural Health Monitoring System for the Local/Global Responses of a Large-Scale Irregular Building under Construction

    PubMed Central

    Park, Hyo Seon; Shin, Yunah; Choi, Se Woon; Kim, Yousok

    2013-01-01

    In this study, a practical and integrative SHM system was developed and applied to a large-scale irregular building under construction, where many challenging issues exist. In the proposed sensor network, customized energy-efficient wireless sensing units (sensor nodes, repeater nodes, and master nodes) were employed, and comprehensive communication from the sensor nodes to the remote monitoring server was conducted wirelessly. The long-term (13-month) monitoring results recorded from a large number of sensors (75 vibrating-wire strain gauges, 10 inclinometers, and 3 laser displacement sensors) indicated that the construction event exhibiting the largest influence on structural behavior was the removal of the bents that had been temporarily installed to support the free ends of the cantilevered members during their construction. The safety of each member could be confirmed based on the quantitative evaluation of each response. Furthermore, it was confirmed that the relation between these responses (i.e., deflection, strain, and inclination) can provide information about the global behavior of structures induced by specific events. Analysis of the measurement results demonstrates that the proposed sensor network system is capable of automatic and real-time monitoring and can be applied and utilized for both the safety evaluation and the precise implementation of buildings under construction. PMID:23860317

  12. SNPassoc: an R package to perform whole genome association studies.

    PubMed

    González, Juan R; Armengol, Lluís; Solé, Xavier; Guinó, Elisabet; Mercader, Josep M; Estivill, Xavier; Moreno, Víctor

    2007-03-01

The popularization of large-scale genotyping projects has led to the widespread adoption of genetic association studies as the tool of choice in the search for single nucleotide polymorphisms (SNPs) underlying susceptibility to complex diseases. Although the analysis of individual SNPs is a relatively trivial task, when the number of SNPs is large and multiple genetic models need to be explored, a tool to automate the analyses becomes necessary. To address this issue, we developed SNPassoc, an R package to carry out the most common analyses in whole genome association studies. These analyses include descriptive statistics and exploratory analysis of missing values, calculation of Hardy-Weinberg equilibrium, analysis of association based on generalized linear models (for either quantitative or binary traits), and analysis of multiple SNPs (haplotype and epistasis analysis). Package SNPassoc is available from CRAN at http://cran.r-project.org. A tutorial is available on Bioinformatics online and at http://davinci.crg.es/estivill_lab/snpassoc.
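
    Among the per-SNP checks SNPassoc automates is the Hardy-Weinberg equilibrium test. SNPassoc itself is an R package; for illustration, here is a minimal sketch of the underlying chi-square computation in Python, with hypothetical genotype counts:

    ```python
    # Hardy-Weinberg equilibrium chi-square test for a biallelic SNP,
    # one of the per-SNP checks SNPassoc automates. Counts are hypothetical.
    from scipy.stats import chi2

    def hwe_chi2(n_AA, n_Aa, n_aa):
        n = n_AA + n_Aa + n_aa
        p = (2 * n_AA + n_Aa) / (2 * n)           # allele frequency of A
        q = 1.0 - p
        expected = [p * p * n, 2 * p * q * n, q * q * n]
        observed = [n_AA, n_Aa, n_aa]
        stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
        return stat, chi2.sf(stat, df=1)          # 1 df: 3 classes - 2 estimated

    stat, pval = hwe_chi2(298, 489, 213)
    print(f"chi2 = {stat:.2f}, p = {pval:.3f}")
    ```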

  13. Genetic Structures of Copy Number Variants Revealed by Genotyping Single Sperm

    PubMed Central

    Luo, Minjie; Cui, Xiangfeng; Fredman, David; Brookes, Anthony J.; Azaro, Marco A.; Greenawalt, Danielle M.; Hu, Guohong; Wang, Hui-Yun; Tereshchenko, Irina V.; Lin, Yong; Shentu, Yue; Gao, Richeng; Shen, Li; Li, Honghua

    2009-01-01

Background Copy number variants (CNVs) occupy a significant portion of the human genome and may have important roles in meiotic recombination, human genome evolution, and gene expression. Many genetic diseases may be underlain by CNVs. However, because of the presence of multiple copies, the variability in copy number, and the diploidy of the human genome, the detailed genetic structure of CNVs cannot be readily studied with available techniques. Methodology/Principal Findings Single sperm samples were used as the primary subjects for the study so that CNV haplotypes in the sperm donors could be studied individually. Forty-eight CNVs characterized in a previous study were analyzed using a microarray-based high-throughput genotyping method after multiplex amplification. Seventeen single nucleotide polymorphisms (SNPs) were also included as controls. Two single-base variants, either allelic or paralogous, could be discriminated for all markers. Microarray data were used to resolve SNP alleles and CNV haplotypes, and to quantitatively assess the numbers and compositions of the paralogous segments in each CNV haplotype. Conclusions/Significance This is the first large-scale study of the genetic structure of CNVs. The resulting information may help in understanding the evolution of the human genome, provide insight into many genetic processes, and discriminate between CNVs and SNPs. The highly sensitive, high-throughput experimental system with haploid sperm samples as subjects may be used to facilitate detailed large-scale CNV analysis. PMID:19384415

  14. In-depth Qualitative and Quantitative Profiling of Tyrosine Phosphorylation Using a Combination of Phosphopeptide Immunoaffinity Purification and Stable Isotope Dimethyl Labeling*

    PubMed Central

    Boersema, Paul J.; Foong, Leong Yan; Ding, Vanessa M. Y.; Lemeer, Simone; van Breukelen, Bas; Philp, Robin; Boekhorst, Jos; Snel, Berend; den Hertog, Jeroen; Choo, Andre B. H.; Heck, Albert J. R.

    2010-01-01

Several mass spectrometry-based assays have emerged for the quantitative profiling of cellular tyrosine phosphorylation. Ideally, these methods should reveal the exact sites of tyrosine phosphorylation, be quantitative, and not be cost-prohibitive. The latter is often an issue, as typically several milligrams of (stable isotope-labeled) starting protein material are required to enable the detection of low-abundance phosphotyrosine peptides. Here, we adopted and refined a peptide-centric immunoaffinity purification approach for the quantitative analysis of tyrosine phosphorylation by combining it with a cost-effective stable isotope dimethyl labeling method. Using just two LC-MS/MS runs, we were able to identify by mass spectrometry more than 1100 unique non-redundant phosphopeptides in HeLa cells from about 4 mg of starting material without requiring any further affinity enrichment, as close to 80% of the identified peptides were tyrosine-phosphorylated peptides. Stable isotope dimethyl labeling could be incorporated prior to the immunoaffinity purification, even for the large quantities (mg) of peptide material used, enabling the quantification of differences in tyrosine phosphorylation upon pervanadate treatment or epidermal growth factor stimulation. Analysis of the epidermal growth factor-stimulated HeLa cells, a frequently used model system for tyrosine phosphorylation, resulted in the quantification of 73 regulated unique phosphotyrosine peptides. The quantitative data were found to be exceptionally consistent with the literature, evidencing that such a targeted quantitative phosphoproteomics approach can provide reproducible results. In general, the combination of immunoaffinity purification of tyrosine-phosphorylated peptides with large-scale stable isotope dimethyl labeling provides a cost-effective approach that can alleviate variation in sample preparation and analysis, as samples can be combined early on. Using this approach, a rather complete qualitative and quantitative picture of tyrosine phosphorylation signaling events can be generated. PMID:19770167

  15. Earthquakes in the Laboratory: Continuum-Granular Interactions

    NASA Astrophysics Data System (ADS)

    Ecke, Robert; Geller, Drew; Ward, Carl; Backhaus, Scott

    2013-03-01

Earthquakes in nature feature large tectonic plate motion at large scales of 10-100 km and local properties of the earth on the scale of the rupture width, of the order of meters. Fault gouge often fills the gap between the large slipping plates and may play an important role in the nature and dynamics of earthquake events. We have constructed a laboratory-scale experiment that represents a similitude scale model of this general earthquake description. Two photo-elastic plates (50 cm x 25 cm x 1 cm) confine approximately 3000 bi-disperse nylon rods (diameters 0.12 and 0.16 cm, height 1 cm) in a gap of approximately 1 cm. The plates are held rigidly along their outer edges, with one edge held fixed while the other is driven at constant speed over a range of about 5 cm. The local stresses exerted on the plates are measured using their photo-elastic response, the local relative motions of the plates, i.e., the local strains, are determined by the relative motion of small ball bearings attached to the top surface, and the configurations of the nylon rods are investigated using particle tracking tools. We find that this system has properties similar to real earthquakes and are exploring these "lab-quake" events with the quantitative tools we have developed.

  16. Text mixing shapes the anatomy of rank-frequency distributions

    NASA Astrophysics Data System (ADS)

    Williams, Jake Ryland; Bagrow, James P.; Danforth, Christopher M.; Dodds, Peter Sheridan

    2015-05-01

    Natural languages are full of rules and exceptions. One of the most famous quantitative rules is Zipf's law, which states that the frequency of occurrence of a word is approximately inversely proportional to its rank. Though this "law" of ranks has been found to hold across disparate texts and forms of data, analyses of increasingly large corpora since the late 1990s have revealed the existence of two scaling regimes. These regimes have thus far been explained by a hypothesis suggesting a separability of languages into core and noncore lexica. Here we present and defend an alternative hypothesis that the two scaling regimes result from the act of aggregating texts. We observe that text mixing leads to an effective decay of word introduction, which we show provides accurate predictions of the location and severity of breaks in scaling. Upon examining large corpora from 10 languages in the Project Gutenberg eBooks collection, we find emphatic empirical support for the universality of our claim.
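
    Concretely, Zipf's law asserts f(r) ∝ r^(-α) with α ≈ 1, so a rank-frequency curve is close to a straight line on log-log axes, and a break between two slopes marks the two scaling regimes discussed above. A minimal sketch of computing a rank-frequency distribution and estimating the exponent ("book.txt" is a placeholder for any plain-text corpus):

    ```python
    # Rank-frequency distribution of a text and a crude log-log slope
    # estimate of the Zipf exponent alpha (f ~ r^-alpha).
    import re
    from collections import Counter
    import numpy as np

    text = open("book.txt", encoding="utf-8").read().lower()  # placeholder corpus
    words = re.findall(r"[a-z']+", text)
    freqs = np.array(sorted(Counter(words).values(), reverse=True), dtype=float)
    ranks = np.arange(1, len(freqs) + 1, dtype=float)

    # Fit log f = -alpha * log r + c over the upper ranks (single-regime region).
    upper = ranks <= 1000
    slope, intercept = np.polyfit(np.log(ranks[upper]), np.log(freqs[upper]), 1)
    print(f"Zipf exponent alpha ≈ {-slope:.2f}")
    ```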

  17. String-like collective motion in the α- and β-relaxation of a coarse-grained polymer melt

    NASA Astrophysics Data System (ADS)

    Pazmiño Betancourt, Beatriz A.; Starr, Francis W.; Douglas, Jack F.

    2018-03-01

    Relaxation in glass-forming liquids occurs as a multi-stage hierarchical process involving cooperative molecular motion. First, there is a "fast" relaxation process dominated by the inertial motion of the molecules whose amplitude grows upon heating, followed by a longer time α-relaxation process involving both large-scale diffusive molecular motion and momentum diffusion. Our molecular dynamics simulations of a coarse-grained glass-forming polymer melt indicate that the fast, collective motion becomes progressively suppressed upon cooling, necessitating large-scale collective motion by molecular diffusion for the material to relax approaching the glass-transition. In each relaxation regime, the decay of the collective intermediate scattering function occurs through collective particle exchange motions having a similar geometrical form, and quantitative relationships are derived relating the fast "stringlet" collective motion to the larger scale string-like collective motion at longer times, which governs the temperature-dependent activation energies associated with both thermally activated molecular diffusion and momentum diffusion.
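
    For reference, the collective (coherent) intermediate scattering function whose decay is analyzed here has the standard definition (notation ours, not quoted from the paper):

    ```latex
    F(\mathbf{q},t) = \frac{1}{N} \Big\langle \sum_{j=1}^{N}\sum_{k=1}^{N}
        e^{\, i\,\mathbf{q}\cdot\left[\mathbf{r}_j(t)-\mathbf{r}_k(0)\right]} \Big\rangle
    ```

    where r_j(t) is the position of particle j at time t; the decay of F(q,t) at the wavevector of the first peak of the static structure factor defines the α-relaxation time.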

  18. Large-scale assembly bias of dark matter halos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lazeyras, Titouan; Musso, Marcello; Schmidt, Fabian, E-mail: titouan@mpa-garching.mpg.de, E-mail: mmusso@sas.upenn.edu, E-mail: fabians@mpa-garching.mpg.de

We present precise measurements of the assembly bias of dark matter halos, i.e. the dependence of halo bias on properties other than mass, using curved 'separate universe' N-body simulations which effectively incorporate an infinite-wavelength matter overdensity into the background density. This method measures the LIMD (local-in-matter-density) bias parameters b_n in the large-scale limit. We focus on the dependence of the first two Eulerian biases b_1^E and b_2^E on four halo properties: the concentration, spin, mass accretion rate, and ellipticity. We quantitatively compare our results with previous works in which assembly bias was measured on fairly small scales. Despite this difference, our findings are in good agreement with previous results. We also look at the joint dependence of bias on two halo properties in addition to the mass. Finally, using the excursion set peaks model, we attempt to shed new light on how assembly bias arises in this analytical model.
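
    For orientation, the LIMD bias parameters are defined through the expansion of the halo overdensity in the local matter overdensity, and the separate-universe technique measures them as responses of the mean halo abundance to a long-wavelength overdensity δ_L. Schematically (standard definitions, not quoted from the paper):

    ```latex
    % LIMD bias expansion of the halo overdensity:
    \delta_h(\mathbf{x}) = \sum_{n \ge 1} \frac{b_n^E}{n!}\, \delta^n(\mathbf{x}),
    % separate-universe measurement at first order:
    \qquad
    b_1^L = \frac{1}{\bar{n}_h}\frac{\mathrm{d}\bar{n}_h}{\mathrm{d}\delta_L},
    \qquad
    b_1^E = 1 + b_1^L .
    ```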

  19. Text mixing shapes the anatomy of rank-frequency distributions.

    PubMed

    Williams, Jake Ryland; Bagrow, James P; Danforth, Christopher M; Dodds, Peter Sheridan

    2015-05-01

    Natural languages are full of rules and exceptions. One of the most famous quantitative rules is Zipf's law, which states that the frequency of occurrence of a word is approximately inversely proportional to its rank. Though this "law" of ranks has been found to hold across disparate texts and forms of data, analyses of increasingly large corpora since the late 1990s have revealed the existence of two scaling regimes. These regimes have thus far been explained by a hypothesis suggesting a separability of languages into core and noncore lexica. Here we present and defend an alternative hypothesis that the two scaling regimes result from the act of aggregating texts. We observe that text mixing leads to an effective decay of word introduction, which we show provides accurate predictions of the location and severity of breaks in scaling. Upon examining large corpora from 10 languages in the Project Gutenberg eBooks collection, we find emphatic empirical support for the universality of our claim.

  20. Spatial and temporal patterns of stranded intertidal marine debris: is there a picture of global change?

    PubMed

    Browne, Mark Anthony; Chapman, M Gee; Thompson, Richard C; Amaral Zettler, Linda A; Jambeck, Jenna; Mallos, Nicholas J

    2015-06-16

Floating and stranded marine debris is widespread. Increasing sea levels and altered rainfall, solar radiation, wind speed, waves, and oceanic currents associated with climatic change are likely to transfer more debris from coastal cities into marine and coastal habitats. Marine debris causes economic and ecological impacts, but understanding their scope requires quantitative information on spatial patterns and trends in the amounts and types of debris at a global scale. There are very few large-scale programs to measure debris, but many peer-reviewed and published scientific studies of marine debris describe local patterns. Unfortunately, methods of defining debris, sampling, and interpreting patterns in space or time vary considerably among studies; if data could be synthesized across studies, a global picture of the problem might be available. We analyzed 104 published scientific papers on marine debris in order to determine how to evaluate this. Although many studies were well designed to answer specific questions, the varying definitions of what constitutes marine debris, the methods used to measure it, and the scale and scope of the studies mean that no general picture can emerge from this wealth of data. These problems are detailed to guide future studies, and guidelines are provided to enable the collection of more comparable data to better manage this growing problem.

  1. Estimation of sediment yield from subsequent expanded landslides after heavy rainfalls : a case study in central Hokkaido, Japan

    NASA Astrophysics Data System (ADS)

    Koshimizu, K.; Uchida, T.

    2015-12-01

Initial large-scale sediment yields caused by heavy rainfall or major storms leave a strong impression. Previous studies focusing on landslide management investigated the initial sediment movement and its mechanism. However, integrated management of catchment-scale sediment movements requires estimating the sediment yield produced by the subsequent expansion of landslides due to rainfall, in addition to the initial landslide movement. This study presents a quantitative analysis of expanded landslides, based on a survey of the Shukushubetsu River basin at the foot of the Hidaka mountain range in central Hokkaido, Japan. This area recorded heavy rainfall in 2003, reaching a maximum daily precipitation of 388 mm. We extracted the expanded landslides from 2003 to 2008 using aerial photographs taken over the river area. In particular, we calculated the probability of expansion for each landslide, the ratio of the landslide area in 2008 to that in 2003, and the amount of expanded landslide area corresponding to each initial landslide area. As a result, the probability of expansion for each landslide was estimated at about 24%. In addition, each expanded landslide area is smaller than the initial landslide area. Furthermore, the expanded landslide area in 2008 amounts to approximately 7% of the corresponding landslide area in 2003. Therefore, the sediment yield from subsequent expanded landslides is comparable to, or slightly greater than, the sediment yield under typical base flow. Thus, we conclude that the sediment yield from subsequent landslide expansion is lower than the initial large-scale sediment yield caused by heavy rainfall in terms of its effect on the management of catchment-scale sediment movement.

  2. Functional neuroimaging for addiction medicine: From mechanisms to practical considerations.

    PubMed

    Ekhtiari, Hamed; Faghiri, Ashkan; Oghabian, Mohammad-Ali; Paulus, Martin P

    2016-01-01

During the last 20 years, neuroimaging with functional magnetic resonance imaging (fMRI) in people with drug addictions has introduced a wide range of quantitative biomarkers derived from the brain's regional- or network-level activities during different cognitive functions. These quantitative biomarkers could potentially be used for assessment, planning, prediction, and monitoring in "addiction medicine" during screening, acute intoxication, admission to a program, completion of an acute program, admission to a long-term program, and postgraduation follow-up. In this chapter, we briefly review the main neurocognitive targets for fMRI studies associated with addictive behaviors, the main study types using fMRI among drug-dependent individuals, and potential applications for fMRI in addiction medicine. The main challenges and limitations in extending fMRI studies and evidence toward clinical applications in addiction medicine are also discussed. There is still a significant gap between the available evidence from group-based fMRI studies and personalized decisions in daily practice in addiction medicine. It will be important to fill this gap with large-scale clinical trials and longitudinal studies using fMRI measures, guided by a well-defined strategic plan for the future. © 2016 Elsevier B.V. All rights reserved.

  3. Smile line assessment comparing quantitative measurement and visual estimation.

    PubMed

    Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie

    2011-02-01

    Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
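
    The reliability statistic used here, Cohen's kappa, corrects the observed agreement p_o for the agreement p_e expected by chance: κ = (p_o - p_e)/(1 - p_e). A minimal sketch with a hypothetical two-rater confusion matrix on a 3-grade scale:

    ```python
    # Cohen's kappa for two raters on a 3-grade smile-line scale.
    # The confusion matrix is hypothetical.
    import numpy as np

    conf = np.array([[30,  4,  1],    # rows: rater 1 (low/average/high)
                     [ 5, 50,  6],    # cols: rater 2
                     [ 2,  3, 21]], dtype=float)
    n = conf.sum()
    p_o = np.trace(conf) / n                            # observed agreement
    p_e = (conf.sum(axis=1) @ conf.sum(axis=0)) / n**2  # chance agreement
    kappa = (p_o - p_e) / (1 - p_e)
    print(f"kappa = {kappa:.2f}")
    ```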

  4. Current trends in quantitative proteomics - an update.

    PubMed

    Li, H; Han, J; Pan, J; Liu, T; Parker, C E; Borchers, C H

    2017-05-01

Proteins can provide insights into biological processes at the functional level, so they are very promising biomarker candidates. The quantification of proteins in biological samples has been routinely used for the diagnosis of diseases and monitoring the treatment. Although large-scale protein quantification in complex samples is still a challenging task, a great amount of effort has been made to advance the technologies that enable quantitative proteomics. Seven years ago, in 2009, we wrote an article about the current trends in quantitative proteomics. In writing this current paper, we realized that, today, we have an even wider selection of potential tools for quantitative proteomics. These tools include new derivatization reagents, novel sampling formats, new types of analyzers and scanning techniques, and recently developed software to assist in assay development and data analysis. In this review article, we will discuss these innovative methods, and their current and potential applications in proteomics. Copyright © 2017 John Wiley & Sons, Ltd.

  5. Single-cell-type quantitative proteomic and ionomic analysis of epidermal bladder cells from the halophyte model plant Mesembryanthemum crystallinum to identify salt-responsive proteins.

    PubMed

    Barkla, Bronwyn J; Vera-Estrella, Rosario; Raymond, Carolyn

    2016-05-10

    Epidermal bladder cells (EBC) are large single-celled, specialized, and modified trichomes found on the aerial parts of the halophyte Mesembryanthemum crystallinum. Recent development of a simple but high throughput technique to extract the contents from these cells has provided an opportunity to conduct detailed single-cell-type analyses of their molecular characteristics at high resolution to gain insight into the role of these cells in the salt tolerance of the plant. In this study, we carry out large-scale complementary quantitative proteomic studies using both a label (DIGE) and label-free (GeLC-MS) approach to identify salt-responsive proteins in the EBC extract. Additionally we perform an ionomics analysis (ICP-MS) to follow changes in the amounts of 27 different elements. Using these methods, we were able to identify 54 proteins and nine elements that showed statistically significant changes in the EBC from salt-treated plants. GO enrichment analysis identified a large number of transport proteins but also proteins involved in photosynthesis, primary metabolism and Crassulacean acid metabolism (CAM). Validation of results by western blot, confocal microscopy and enzyme analysis helped to strengthen findings and further our understanding into the role of these specialized cells. As expected EBC accumulated large quantities of sodium, however, the most abundant element was chloride suggesting the sequestration of this ion into the EBC vacuole is just as important for salt tolerance. This single-cell type omics approach shows that epidermal bladder cells of M. crystallinum are metabolically active modified trichomes, with primary metabolism supporting cell growth, ion accumulation, compatible solute synthesis and CAM. Data are available via ProteomeXchange with identifier PXD004045.

  6. Quantitative impedance characterization of sub-10 nm scale capacitors and tunnel junctions with an interferometric scanning microwave microscope.

    PubMed

    Wang, Fei; Clément, Nicolas; Ducatteau, Damien; Troadec, David; Tanbakuchi, Hassan; Legrand, Bernard; Dambrine, Gilles; Théron, Didier

    2014-10-10

    We present a method to characterize sub-10 nm capacitors and tunnel junctions by interferometric scanning microwave microscopy (iSMM) at 7.8 GHz. At such device scaling, the small water meniscus surrounding the iSMM tip should be reduced by proper tip tuning. Quantitative impedance characterization of attofarad range capacitors is achieved using an 'on-chip' calibration kit facing thousands of nanodevices. Nanoscale capacitors and tunnel barriers were detected through variations in the amplitude and phase of the reflected microwave signal, respectively. This study promises quantitative impedance characterization of a wide range of emerging functional nanoscale devices.
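
    A parallel-plate estimate shows why such sub-10 nm junctions fall in the attofarad range. The geometry and permittivity below are hypothetical, chosen only to set the scale:

    ```python
    # Parallel-plate estimate showing why sub-10 nm capacitors fall in the
    # attofarad range. Geometry and permittivity are hypothetical.
    EPS0 = 8.854e-12          # vacuum permittivity, F/m

    def plate_capacitance(eps_r, side_nm, gap_nm):
        area = (side_nm * 1e-9) ** 2              # plate area, m^2
        return EPS0 * eps_r * area / (gap_nm * 1e-9)

    c = plate_capacitance(eps_r=4.0, side_nm=10.0, gap_nm=1.0)
    print(f"C ≈ {c * 1e18:.1f} aF")               # ~3.5 aF
    ```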

  7. Measuring short-term post-fire forest recovery across a burn severity gradient in a mixed pine-oak forest using multi-sensor remote sensing techniques

    DOE PAGES

    Meng, Ran; Wu, Jin; Zhao, Feng; ...

    2018-06-01

Understanding post-fire forest recovery is pivotal to the study of forest dynamics and the global carbon cycle. Field-based studies have indicated a convex response of forest recovery rate to burn severity at the individual tree level, related to fire-induced tree mortality; however, these findings were constrained in spatial/temporal extent and were not detectable by traditional optical remote sensing studies, largely owing to the confounding effect of understory recovery. For this work, we examined whether the combined use of multi-sensor remote sensing techniques (i.e., 1 m simultaneous airborne imaging spectroscopy and LiDAR and 2 m satellite multi-spectral imagery) to separate canopy recovery from understory recovery would make it possible to quantify post-fire forest recovery rate spanning a large gradient in burn severity over large scales. Our study was conducted in a mixed pine-oak forest in Long Island, NY, three years after a top-killing fire. We remotely detected an initial increase and then decline of forest recovery rate with burn severity across the burned area, with a maximum canopy area-based recovery rate of 10% per year at the moderate forest burn severity class. More intriguingly, such remotely detected convex relationships also held at the species level, with pine trees being more resilient to high burn severity and having a higher maximum recovery rate (12% per year) than oak trees (4% per year). These results are among the first quantitative evidence showing the effects of fire-adaptive strategies on post-fire forest recovery, derived from relatively large spatial-temporal domains. Our study thus provides the methodological advance needed to link multi-sensor remote sensing techniques to monitoring forest dynamics in a spatially explicit manner over large scales, with important implications for fire-related forest management and for constraining/benchmarking fire effect schemes in ecological process models.

  8. Measuring short-term post-fire forest recovery across a burn severity gradient in a mixed pine-oak forest using multi-sensor remote sensing techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Ran; Wu, Jin; Zhao, Feng

Understanding post-fire forest recovery is pivotal to the study of forest dynamics and the global carbon cycle. Field-based studies have indicated a convex response of forest recovery rate to burn severity at the individual tree level, related to fire-induced tree mortality; however, these findings were constrained in spatial/temporal extent and were not detectable by traditional optical remote sensing studies, largely owing to the confounding effect of understory recovery. For this work, we examined whether the combined use of multi-sensor remote sensing techniques (i.e., 1 m simultaneous airborne imaging spectroscopy and LiDAR and 2 m satellite multi-spectral imagery) to separate canopy recovery from understory recovery would make it possible to quantify post-fire forest recovery rate spanning a large gradient in burn severity over large scales. Our study was conducted in a mixed pine-oak forest in Long Island, NY, three years after a top-killing fire. We remotely detected an initial increase and then decline of forest recovery rate with burn severity across the burned area, with a maximum canopy area-based recovery rate of 10% per year at the moderate forest burn severity class. More intriguingly, such remotely detected convex relationships also held at the species level, with pine trees being more resilient to high burn severity and having a higher maximum recovery rate (12% per year) than oak trees (4% per year). These results are among the first quantitative evidence showing the effects of fire-adaptive strategies on post-fire forest recovery, derived from relatively large spatial-temporal domains. Our study thus provides the methodological advance needed to link multi-sensor remote sensing techniques to monitoring forest dynamics in a spatially explicit manner over large scales, with important implications for fire-related forest management and for constraining/benchmarking fire effect schemes in ecological process models.

  9. Primary production in the Delta: Then and now

    USGS Publications Warehouse

    Cloern, James E.; Robinson, April; Richey, Amy; Grenier, Letitia; Grossinger, Robin; Boyer, Katharyn E.; Burau, Jon; Canuel, Elizabeth A.; DeGeorge, John F.; Drexler, Judith Z.; Enright, Chris; Howe, Emily R.; Kneib, Ronald; Mueller-Solger, Anke; Naiman, Robert J.; Pinckney, James L.; Safran, Samuel M.; Schoellhamer, David H.; Simenstad, Charles A.

    2016-01-01

    To evaluate the role of restoration in the recovery of the Delta ecosystem, we need to have clear targets and performance measures that directly assess ecosystem function. Primary production is a crucial ecosystem process, which directly limits the quality and quantity of food available for secondary consumers such as invertebrates and fish. The Delta has a low rate of primary production, but it is unclear whether this was always the case. Recent analyses from the Historical Ecology Team and Delta Landscapes Project provide quantitative comparisons of the areal extent of 14 habitat types in the modern Delta versus the historical Delta (pre-1850). Here we describe an approach for using these metrics of land use change to: (1) produce the first quantitative estimates of how Delta primary production and the relative contributions from five different producer groups have been altered by large-scale drainage and conversion to agriculture; (2) convert these production estimates into a common currency so the contributions of each producer group reflect their food quality and efficiency of transfer to consumers; and (3) use simple models to discover how tidal exchange between marshes and open water influences primary production and its consumption. Application of this approach could inform Delta management in two ways. First, it would provide a quantitative estimate of how large-scale conversion to agriculture has altered the Delta's capacity to produce food for native biota. Second, it would provide restoration practitioners with a new approach—based on ecosystem function—to evaluate the success of restoration projects and gauge the trajectory of ecological recovery in the Delta region.

  10. Intercontinental convergence of stream fish community traits along geomorphic and hydraulic gradients

    USGS Publications Warehouse

    Lamouroux, N.; Poff, N.L.; Angermeier, P.L.

    2002-01-01

    Community convergence across biogeographically distinct regions suggests the existence of key, repeated, evolutionary mechanisms relating community characteristics to the environment. However, convergence studies at the community level often involve only qualitative comparisons of the environment and may fail to identify which environmental variables drive community structure. We tested the hypothesis that the biological traits of fish communities on two continents (Europe and North America) are similarly related to environmental conditions. Specifically, from observations of individual fish made at the microhabitat scale (a few square meters) within French streams, we generated habitat preference models linking traits of fish species to local scale hydraulic conditions (Froude number), Using this information, we then predicted how hydraulics and geomorphology at the larger scale of stream reaches (several pool-riffle sequences) should quantitatively influence the trait composition of fish communities. Trait composition for fishes in stream reaches with low Froude number at low flow or high proportion of pools was predicted as nonbenthic, large, fecund, long-lived, nonstreamlined, and weak swimmers. We tested our predictions in contrasting stream reaches in France (n = 11) and Virginia, USA (n = 76), using analyses of covariance to quantify the relative influence of continent vs. physical habitat variables on fish traits. The reach-scale convergence analysis indicated that trait proportions in the communities differed between continents (up to 55% of the variance in each trait was explained by "continent"), partly due to distinct evolutionary histories. However, within continents, trait proportions were comparably related to the hydraulic and geomorphic variables (up to 54% of the variance within continents explained). In particular, a synthetic measure of fish traits in reaches was well explained (50% of its variance) by the Froude number independently of the continent. The effect of physical variables did not differ across continents for most traits, confirming our predictions qualitatively and quantitatively. Therefore, despite phylogenetic and historical differences between continents, fish communities of France and Virginia exhibit convergence in biological traits related to hydraulics and geomorphology. This convergence reflects morphological and behavioral adaptations to physical stress in streams. This study supports the existence of a habitat template for ecological strategies. Some key quantitative variables that define this habitat template can be identified by characterizing how individual organisms use their physical environment, and by using dimensionless physical variables that reveal common energetic properties in different systems. Overall, quantitative tests of community convergence are efficient tools to demonstrate that some community traits are predictable from environmental features.
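
    The reach-scale hydraulic predictor used here, the Froude number, is the standard dimensionless ratio of inertial to gravitational forces in open-channel flow:

    ```latex
    \mathrm{Fr} = \frac{U}{\sqrt{g\,D}}
    ```

    where U is the mean flow velocity, D the mean depth, and g the gravitational acceleration; Fr < 1 characterizes the slow, deep, pool-like reaches predicted to favor nonbenthic, weak-swimming fishes.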

  11. Multi-scale modeling of microstructure dependent intergranular brittle fracture using a quantitative phase-field based method

    DOE PAGES

    Chakraborty, Pritam; Zhang, Yongfeng; Tonks, Michael R.

    2015-12-07

The fracture behavior of brittle materials is strongly influenced by their underlying microstructure, which needs explicit consideration for accurate prediction of fracture properties and the associated scatter. In this work, a hierarchical multi-scale approach is pursued to model microstructure-sensitive brittle fracture. A quantitative phase-field based fracture model is utilized to capture the complex crack growth behavior in the microstructure, and the related parameters are calibrated from lower-length-scale atomistic simulations instead of engineering-scale experimental data. The workability of this approach is demonstrated by performing porosity-dependent intergranular fracture simulations in UO2 and comparing the predictions with experiments.

  12. Multi-scale modeling of microstructure dependent intergranular brittle fracture using a quantitative phase-field based method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, Pritam; Zhang, Yongfeng; Tonks, Michael R.

The fracture behavior of brittle materials is strongly influenced by their underlying microstructure, which needs explicit consideration for accurate prediction of fracture properties and the associated scatter. In this work, a hierarchical multi-scale approach is pursued to model microstructure-sensitive brittle fracture. A quantitative phase-field based fracture model is utilized to capture the complex crack growth behavior in the microstructure, and the related parameters are calibrated from lower-length-scale atomistic simulations instead of engineering-scale experimental data. The workability of this approach is demonstrated by performing porosity-dependent intergranular fracture simulations in UO2 and comparing the predictions with experiments.

  13. Advancing the speed, sensitivity and accuracy of biomolecular detection using multi-length-scale engineering

    PubMed Central

    Kelley, Shana O.; Mirkin, Chad A.; Walt, David R.; Ismagilov, Rustem F.; Toner, Mehmet; Sargent, Edward H.

    2015-01-01

    Rapid progress in identifying disease biomarkers has increased the importance of creating high-performance detection technologies. Over the last decade, the design of many detection platforms has focused on either the nano or micro length scale. Here, we review recent strategies that combine nano- and microscale materials and devices to produce large improvements in detection sensitivity, speed and accuracy, allowing previously undetectable biomarkers to be identified in clinical samples. Microsensors that incorporate nanoscale features can now rapidly detect disease-related nucleic acids expressed in patient samples. New microdevices that separate large clinical samples into nanocompartments allow precise quantitation of analytes, and microfluidic systems that utilize nanoscale binding events can detect rare cancer cells in the bloodstream more accurately than before. These advances will lead to faster and more reliable clinical diagnostic devices. PMID:25466541

  14. Advancing the speed, sensitivity and accuracy of biomolecular detection using multi-length-scale engineering

    NASA Astrophysics Data System (ADS)

    Kelley, Shana O.; Mirkin, Chad A.; Walt, David R.; Ismagilov, Rustem F.; Toner, Mehmet; Sargent, Edward H.

    2014-12-01

    Rapid progress in identifying disease biomarkers has increased the importance of creating high-performance detection technologies. Over the last decade, the design of many detection platforms has focused on either the nano or micro length scale. Here, we review recent strategies that combine nano- and microscale materials and devices to produce large improvements in detection sensitivity, speed and accuracy, allowing previously undetectable biomarkers to be identified in clinical samples. Microsensors that incorporate nanoscale features can now rapidly detect disease-related nucleic acids expressed in patient samples. New microdevices that separate large clinical samples into nanocompartments allow precise quantitation of analytes, and microfluidic systems that utilize nanoscale binding events can detect rare cancer cells in the bloodstream more accurately than before. These advances will lead to faster and more reliable clinical diagnostic devices.

  15. Systematic Construction of Kinetic Models from Genome-Scale Metabolic Networks

    PubMed Central

    Smallbone, Kieran; Klipp, Edda; Mendes, Pedro; Liebermeister, Wolfram

    2013-01-01

    The quantitative effects of environmental and genetic perturbations on metabolism can be studied in silico using kinetic models. We present a strategy for large-scale model construction based on a logical layering of data such as reaction fluxes, metabolite concentrations, and kinetic constants. The resulting models contain realistic standard rate laws and plausible parameters, adhere to the laws of thermodynamics, and reproduce a predefined steady state. These features have not been simultaneously achieved by previous workflows. We demonstrate the advantages and limitations of the workflow by translating the yeast consensus metabolic network into a kinetic model. Despite crudely selected data, the model shows realistic control behaviour, a stable dynamic, and realistic response to perturbations in extracellular glucose concentrations. The paper concludes by outlining how new data can continuously be fed into the workflow and how iterative model building can assist in directing experiments. PMID:24324546
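
    The "realistic standard rate laws" referenced here are typically saturable laws of the Michaelis-Menten family. A minimal sketch of simulating such a kinetic model to its steady state, using a hypothetical two-step pathway rather than the yeast consensus model itself:

    ```python
    # Minimal kinetic-model sketch: a two-step pathway S -> M -> P with
    # irreversible Michaelis-Menten rate laws, integrated to steady state.
    # Parameters are hypothetical, not from the yeast consensus model.
    from scipy.integrate import solve_ivp

    def mm(v_max, km, s):
        """Irreversible Michaelis-Menten rate law."""
        return v_max * s / (km + s)

    def rhs(t, y):
        m = y[0]                                # M is the free internal metabolite
        v1 = mm(v_max=1.0, km=0.5, s=2.0)       # S clamped at 2.0 (boundary species)
        v2 = mm(v_max=1.5, km=0.3, s=m)
        return [v1 - v2]

    sol = solve_ivp(rhs, (0, 100), [0.0], rtol=1e-8)
    print(f"steady-state [M] ≈ {sol.y[0, -1]:.3f}")   # analytic value ~0.343
    ```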

  16. Stream network analysis and geomorphic flood plain mapping from orbital and suborbital remote sensing imagery application to flood hazard studies in central Texas

    NASA Technical Reports Server (NTRS)

    Baker, V. R. (Principal Investigator); Holz, R. K.; Hulke, S. D.; Patton, P. C.; Penteado, M. M.

    1975-01-01

    The author has identified the following significant results. Development of a quantitative hydrogeomorphic approach to flood hazard evaluation was hindered by (1) problems of resolution and definition of the morphometric parameters which have hydrologic significance, and (2) mechanical difficulties in creating the necessary volume of data for meaningful analysis. Measures of network resolution such as drainage density and basin Shreve magnitude indicated that large scale topographic maps offered greater resolution than small scale suborbital imagery and orbital imagery. The disparity in network resolution capabilities between orbital and suborbital imagery formats depends on factors such as rock type, vegetation, and land use. The problem of morphometric data analysis was approached by developing a computer-assisted method for network analysis. The system allows rapid identification of network properties which can then be related to measures of flood response.
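
    Of the morphometric parameters at issue, drainage density is representative; it is defined as total channel length per unit basin area,

    ```latex
    D_d = \frac{\sum_i L_i}{A},
    ```

    so any channels too small to resolve on a given map or image directly bias the measured value downward, which is the resolution problem described above.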

  17. Differences in flood hazard projections in Europe – their causes and consequences for decision making

    USGS Publications Warehouse

    Kundzewicz, Z. W.; Krysanova, V.; Dankers, R.; Hirabayashi, Y.; Kanae, S.; Hattermann, F. F.; Huang, S.; Milly, Paul C.D.; Stoffel, M.; Driessen, P.P.J.; Matczak, P.; Quevauviller, P.; Schellnhuber, H.-J.

    2017-01-01

This paper interprets differences in flood hazard projections over Europe and identifies likely sources of discrepancy. Further, it discusses potential implications of these differences for flood risk reduction and adaptation to climate change. The discrepancy in flood hazard projections calls for caution, especially among decision makers in charge of water resources management, flood risk reduction, and climate change adaptation at regional to local scales. Because it is naïve to expect trustworthy quantitative projections of future flood hazard to become available, flood risk reduction should instead focus on mapping current and future risks and vulnerability hotspots and on improving the situation there. Although an intercomparison of flood hazard projections is carried out in this paper and differences are identified and interpreted, it does not seem possible to recommend which large-scale studies may be considered most credible in particular areas of Europe.

  18. Direct observation of pitting corrosion evolutions on carbon steel surfaces at the nano-to-micro- scales.

    PubMed

    Guo, Peng; La Plante, Erika Callagon; Wang, Bu; Chen, Xin; Balonis, Magdalena; Bauchy, Mathieu; Sant, Gaurav

    2018-05-22

The Cl⁻-induced corrosion of metals and alloys is of relevance to a wide range of engineered materials, structures, and systems. Because of the challenges in studying pitting corrosion in a quantitative and statistically significant manner, its kinetics remain poorly understood. Herein, by direct, nano- to micro-scale observations using vertical scanning interferometry (VSI), we examine the temporal evolution of pitting corrosion on AISI 1045 carbon steel over large surface areas in Cl⁻-free and Cl⁻-enriched solutions. Special focus is paid to the nucleation and growth of pits and the associated formation of roughened regions on steel surfaces. By statistical analysis of hundreds of individual pits, three stages of pitting corrosion, namely induction, propagation, and saturation, are quantitatively distinguished. By quantifying the kinetics of these processes, we contextualize our current understanding of electrochemical corrosion within a framework that considers spatial dynamics and morphology evolution. In the presence of Cl⁻ ions, corrosion is highly accelerated due to multiple autocatalytic factors, including destabilization of protective surface oxide films and preservation of aggressive microenvironments within the pits, both of which promote continued pit nucleation and growth. These findings offer new insights into predicting and modeling steel corrosion processes in mid-pH aqueous environments.

  19. Nonlinear optical microscopy and ultrasound imaging of human cervical structure

    PubMed Central

    Reusch, Lisa M.; Feltovich, Helen; Carlson, Lindsey C.; Hall, Gunnsteinn; Campagnola, Paul J.; Eliceiri, Kevin W.

    2013-01-01

The cervix softens and shortens as its collagen microstructure rearranges in preparation for birth, but premature change may lead to premature birth. The global preterm birth rate has not decreased despite decades of research, likely because cervical microstructure is poorly understood. Our group has developed a multilevel approach to evaluating the human cervix. We are developing quantitative ultrasound (QUS) techniques for noninvasive interrogation of cervical microstructure and corroborating those results with high-resolution images of microstructure from second harmonic generation imaging (SHG) microscopy. We obtain ultrasound measurements from hysterectomy specimens, prepare the tissue for SHG, and stitch together several hundred images to create a comprehensive view of large areas of cervix. The images are analyzed for collagen orientation and alignment with curvelet transform, and registered with QUS data, facilitating multiscale analysis in which the micron-scale SHG images and millimeter-scale ultrasound data interpretation inform each other. This novel combination of modalities allows comprehensive characterization of cervical microstructure in high resolution. Through a detailed comparative study, we demonstrate that SHG imaging both corroborates the quantitative ultrasound measurements and provides further insight. Ultimately, a comprehensive understanding of specific microstructural cervical change in pregnancy should lead to novel approaches to the prevention of preterm birth. PMID:23412434

  20. Scaling Up, "Writ Small": Using an Assessment for Learning Audit Instrument to Stimulate Site-Based Professional Development, One School at a Time

    ERIC Educational Resources Information Center

    Lysaght, Zita; O'Leary, Michael

    2017-01-01

    Exploiting the potential that Assessment for Learning (AfL) offers to optimise student learning is contingent on both teachers' knowledge and use of AfL and the fidelity with which this translates into their daily classroom practices. Quantitative data derived from the use of an Assessment for Learning Audit Instrument (AfLAI) with a large sample…

  1. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
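
    A representative "digital trait" of the kind such frameworks extract is projected shoot area, i.e., the count of plant pixels after segmentation. A minimal sketch using a crude green-channel threshold ("plant.png" is a placeholder image; IH's own pipelines are considerably richer):

    ```python
    # Minimal digital-trait sketch: projected shoot area as the count of
    # green-dominant pixels in an RGB image. IH's pipelines are richer.
    import numpy as np
    from PIL import Image

    img = np.asarray(Image.open("plant.png").convert("RGB"), dtype=float)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    plant_mask = (g > r + 10) & (g > b + 10)      # crude greenness test
    print("projected shoot area (pixels):", int(plant_mask.sum()))
    ```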

  2. Launching of Active Galactic Nuclei Jets

    NASA Astrophysics Data System (ADS)

    Tchekhovskoy, Alexander

    As black holes accrete gas, they often produce relativistic, collimated outflows, or jets. Jets are expected to form in the vicinity of a black hole, making them powerful probes of strong-field gravity. However, how jet properties (e.g., jet power) connect to those of the accretion flow (e.g., mass accretion rate) and the black hole (e.g., black hole spin) remains an area of active research. This is because what determines a crucial parameter that controls jet properties—the strength of large-scale magnetic flux threading the black hole—remains largely unknown. First-principles computer simulations show that due to this, even if black hole spin and mass accretion rate are held constant, the simulated jet powers span a wide range, with no clear winner. This limits our ability to use jets as a quantitative diagnostic tool of accreting black holes. Recent advances in computer simulations demonstrated that accretion disks can accumulate large-scale magnetic flux on the black hole, until the magnetic flux becomes so strong that it obstructs gas infall and leads to a magnetically-arrested disk (MAD). Recent evidence suggests that central black holes in jetted active galactic nuclei and tidal disruptions are surrounded by MADs. Since in MADs both the black hole magnetic flux and the jet power are at their maximum, well-defined values, this opens up a new vista in the measurements of black hole masses and spins and quantitative tests of accretion and jet theory.

  3. Should fatty acid signature proportions sum to 1 for diet estimation?

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Budge, Suzanne M.; Thiemann, Gregory W.

    2016-01-01

    Knowledge of predator diets, including how diets might change through time or differ among predators, provides essential insights into their ecology. Diet estimation therefore remains an active area of research within quantitative ecology. Quantitative fatty acid signature analysis (QFASA) is an increasingly common method of diet estimation. QFASA is based on a data library of prey signatures, which are vectors of proportions summarizing the fatty acid composition of lipids, and diet is estimated as the mixture of prey signatures that most closely approximates a predator’s signature. Diets are typically estimated using proportions from a subset of all fatty acids that are known to be solely or largely influenced by diet. Given the subset of fatty acids selected, the current practice is to scale their proportions to sum to 1.0. However, scaling signature proportions has the potential to distort the structural relationships within a prey library and between predators and prey. To investigate that possibility, we compared the practice of scaling proportions with two alternatives and found that the traditional scaling can meaningfully bias diet estimators under some conditions. Two aspects of the prey types that contributed to a predator’s diet influenced the magnitude of the bias: the degree to which the sums of unscaled proportions differed among prey types and the identifiability of prey types within the prey library. We caution investigators against the routine scaling of signature proportions in QFASA.
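
    At its core, QFASA solves a constrained mixture problem: find nonnegative diet proportions, summing to 1, whose mixture of mean prey signatures best matches the predator signature. The sketch below uses a plain least-squares distance and hypothetical signatures; the published method uses different distance measures, so this only illustrates the optimization structure:

    ```python
    # Minimal QFASA-style diet estimate: nonnegative diet proportions that
    # sum to 1 and whose mixture of prey fatty-acid signatures best matches
    # a predator signature. Least-squares distance; signatures hypothetical.
    import numpy as np
    from scipy.optimize import minimize

    prey = np.array([[0.50, 0.30, 0.20],    # mean signature, prey type 1
                     [0.20, 0.60, 0.20],    # prey type 2
                     [0.10, 0.20, 0.70]])   # prey type 3
    predator = np.array([0.28, 0.42, 0.30])

    def loss(pi):
        return np.sum((pi @ prey - predator) ** 2)

    res = minimize(loss, x0=np.full(3, 1 / 3),
                   bounds=[(0, 1)] * 3,
                   constraints=[{"type": "eq", "fun": lambda pi: pi.sum() - 1}])
    print("estimated diet proportions:", np.round(res.x, 3))
    ```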

  4. NeuroCa: integrated framework for systematic analysis of spatiotemporal neuronal activity patterns from large-scale optical recording data

    PubMed Central

    Jang, Min Jee; Nam, Yoonkey

    2015-01-01

Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of ∼1000 neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements. PMID:26229973
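
    One step that such toolboxes automate is spike detection on the normalized fluorescence trace ΔF/F0, often a threshold-crossing operation. A minimal sketch on a synthetic trace (NeuroCa's actual algorithms are more elaborate):

    ```python
    # Minimal calcium spike detection: compute dF/F0 and flag rising
    # threshold crossings. Trace is synthetic; NeuroCa is more elaborate.
    import numpy as np

    rng = np.random.default_rng(0)
    f = 100 + rng.normal(0, 1, 600)             # baseline fluorescence trace
    f[150:160] += 30; f[400:410] += 25          # two synthetic transients

    f0 = np.percentile(f, 20)                   # robust baseline estimate
    dff = (f - f0) / f0
    above = dff > 3 * dff.std()
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    print("spike onsets at frames:", onsets)
    ```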

  5. Towards a Quantitative Use of Satellite Remote Sensing in Crop Growth Models for Large Scale Agricultural Production Estimate (Invited)

    NASA Astrophysics Data System (ADS)

    Defourny, P.

    2013-12-01

The development of better agricultural monitoring capabilities is clearly considered a critical step for strengthening food production information and market transparency, thanks to timely information about crop status, crop area, and yield forecasts. The documentation of global production will help tackle price volatility by allowing local, national, and international operators to make decisions and anticipate market trends with reduced uncertainty. Several operational agricultural monitoring systems currently operate at national and international scales. Most are based on methods derived from pioneering experiments completed some decades ago, and use remote sensing to qualitatively compare one year to others to estimate the risk of deviation from a normal year. The GEO Agricultural Monitoring Community of Practice has described the current monitoring capabilities at the national and global levels, and an overall diagram summarized the diverse relationships between satellite EO and agricultural information. There is now a large gap between the current operational large-scale systems and the scientific state of the art in crop remote sensing, probably because the latter has mainly focused on local studies. The poor availability of suitable in-situ and satellite data over extended areas hampers large-scale demonstrations, preventing the much-needed upscaling research effort. For cropland extent, this paper reports a recent research achievement using the full ENVISAT MERIS 300 m archive in the context of the ESA Climate Change Initiative. A flexible combination of classification methods, depending on the region of the world, allows mapping land cover as well as global croplands at 300 m for the period 2008-2012. This wall-to-wall product is then compared with the FP7 Geoland-2 results obtained using a Landsat-based sampling strategy over the IGADD countries. On the other hand, vegetation indices and biophysical variables such as the Green Area Index (GAI), fAPAR, and fcover, usually retrieved from MODIS, MERIS, and SPOT-Vegetation, describe the quality of green vegetation development. The GLOBAM (Belgium) and EU FP-7 MOCCCASIN (Russia) projects improved the standard products and were demonstrated at large scale. The GAI retrieved from MODIS time series using a purity index criterion successfully depicted the inter-annual variability. Furthermore, the quantitative assimilation of these GAI time series into a crop growth model improved yield estimates over several years. These results showed that GAI assimilation works best at the district or provincial level. In the context of GEO Agriculture, the Joint Experiment of Crop Assessment and Monitoring (JECAM) was designed to enable the global agricultural monitoring community to compare such methods and results over a variety of regional cropping systems. For a network of test sites around the world, satellite and field measurements are currently being collected and will be made available for collaborative effort. This experiment should facilitate international standards for data products and reporting, eventually supporting the development of a global system of systems for agricultural crop assessment and monitoring.

  6. Novel Autism Subtype-Dependent Genetic Variants Are Revealed by Quantitative Trait and Subphenotype Association Analyses of Published GWAS Data

    PubMed Central

    Hu, Valerie W.; Addington, Anjene; Hyman, Alexander

    2011-01-01

    The heterogeneity of symptoms associated with autism spectrum disorders (ASDs) has presented a significant challenge to genetic analyses. Even when associations with genetic variants have been identified, it has been difficult to associate them with a specific trait or characteristic of autism. Here, we report that quantitative trait analyses of ASD symptoms combined with case-control association analyses using distinct ASD subphenotypes identified on the basis of symptomatic profiles result in the identification of highly significant associations with 18 novel single nucleotide polymorphisms (SNPs). The symptom categories included deficits in language usage, non-verbal communication, social development, and play skills, as well as insistence on sameness or ritualistic behaviors. Ten of the trait-associated SNPs, or quantitative trait loci (QTL), were associated with more than one subtype, providing partial replication of the identified QTL. Notably, none of the novel SNPs is located within an exonic region, suggesting that these hereditary components of ASDs are more likely related to gene regulatory processes (or gene expression) than to structural or functional changes in gene products. Seven of the QTL reside within intergenic chromosomal regions associated with rare copy number variants that have been previously reported in autistic samples. Pathway analyses of the genes associated with the QTL identified in this study implicate neurological functions and disorders associated with autism pathophysiology. This study underscores the advantage of incorporating both quantitative traits as well as subphenotypes into large-scale genome-wide analyses of complex disorders. PMID:21556359

  7. Organizing "mountains of words" for data analysis, both qualitative and quantitative.

    PubMed

    Johnson, Bruce D; Dunlap, Eloise; Benoit, Ellen

    2010-04-01

    Qualitative research creates mountains of words. U.S. federal funding supports mostly structured qualitative research, which is designed to test hypotheses using semiquantitative coding and analysis. This article reports on strategies for planning, organizing, collecting, managing, storing, retrieving, analyzing, and writing about qualitative data so as to most efficiently manage the mountains of words collected in large-scale ethnographic projects. Multiple benefits accrue from this approach: field expenditures are linked to units of work so that productivity can be measured; many staff in various locations can access and analyze the data; quantitative data can be derived from data that are primarily qualitative; and resources are used more efficiently.

  8. A two-factor error model for quantitative steganalysis

    NASA Astrophysics Data System (ADS)

    Böhme, Rainer; Ker, Andrew D.

    2006-02-01

    Quantitative steganalysis refers to the exercise not only of detecting the presence of hidden stego messages in carrier objects, but also of estimating the secret message length. This problem is well studied, with many detectors proposed but only a sparse analysis of errors in the estimators. A deep understanding of the error model, however, is a fundamental requirement for the assessment and comparison of different detection methods. This paper presents a rationale for a two-factor model for sources of error in quantitative steganalysis, and shows evidence from a dedicated large-scale nested experimental set-up with a total of more than 200 million attacks. Apart from general findings about the distribution functions found in both classes of errors, their respective weight is determined, and implications for statistical hypothesis tests in benchmarking scenarios or regression analyses are demonstrated. The results are based on a rigorous comparison of five different detection methods under many different external conditions, such as size of the carrier, previous JPEG compression, and colour channel selection. We include analyses demonstrating the effects of local variance and cover saturation on the different sources of error, as well as presenting the case for a relative bias model for between-image error.
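
    The abstract states the model only verbally; one plausible formalization of the two-factor decomposition, including the relative between-image bias argued for at the end (symbols mine), is

        \[ \hat{q} = (1 + b_{i})\, q + \varepsilon_{i,m}, \]

    where q is the true relative message length, b_i a cover-image-specific (between-image) relative bias, and \varepsilon_{i,m} a within-image error that varies with the particular embedded message m; benchmarking detectors then requires separating the dispersion of b_i from that of \varepsilon_{i,m}.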

  9. Scaling laws in the dynamics of crime growth rate

    NASA Astrophysics Data System (ADS)

    Alves, Luiz G. A.; Ribeiro, Haroldo V.; Mendes, Renio S.

    2013-06-01

    The increasing number of crimes in areas with large concentrations of people has made cities one of the main sources of violence. Understanding how crime rates grow and how they relate to city size goes beyond an academic question, being a central issue for contemporary society. Here, we characterize and analyze quantitative aspects of murders in the period from 1980 to 2009 in Brazilian cities. We find that the distributions of the annual, biannual and triannual logarithmic homicide growth rates exhibit the same functional form at distinct scales, that is, a scale-invariant behavior. We also identify asymptotic power-law decay relations between the standard deviations of these three growth rates and the initial size. Further, we discuss similarities with complex organizations.
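
    In the notation usual for growth-rate studies (symbols mine; the abstract states the findings only verbally), the results amount to the logarithmic growth rate r = \ln[S(t+\Delta t)/S(t)] of homicide counts S having a distribution P(r) of the same functional form for \Delta t of one, two and three years, while its standard deviation decays with initial size as an asymptotic power law,

        \[ \sigma(S_{0}) \sim S_{0}^{-\beta}, \qquad \beta > 0. \]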

  10. Analysis on the restriction factors of the green building scale promotion based on DEMATEL

    NASA Astrophysics Data System (ADS)

    Wenxia, Hong; Zhenyao, Jiang; Zhao, Yang

    2017-03-01

    In order to promote the large-scale development of green building in China, the DEMATEL method was used to classify the factors influencing green building development into three parts: the green building market, green technology and the macro economy. Through the DEMATEL model, the interaction mechanism of each part was analyzed. The mutual influence of each barrier factor affecting green building promotion was quantitatively analysed, and the key factors for the development of green building in China were determined. In addition, some implementation strategies for promoting large-scale green building development are put forward. This research has important reference and practical value for making policies to promote green building.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banerjee, Arka; Dalal, Neal, E-mail: abanerj6@illinois.edu, E-mail: dalaln@illinois.edu

    We present a new method for simulating cosmologies that contain massive particles with thermal free streaming motion, such as massive neutrinos or warm/hot dark matter. This method combines particle and fluid descriptions of the thermal species to eliminate the shot noise known to plague conventional N-body simulations. We describe this method in detail, along with results for a number of test cases to validate our method, and check its range of applicability. Using this method, we demonstrate that massive neutrinos can produce a significant scale-dependence in the large-scale biasing of deep voids in the matter field. We show that this scale-dependence may be quantitatively understood using an extremely simple spherical expansion model which reproduces the behavior of the void bias for different neutrino parameters.

  12. The nature of micro CMEs within coronal holes

    NASA Astrophysics Data System (ADS)

    Bothmer, Volker; Nistico, Giuseppe; Zimbardo, Gaetano; Patsourakos, Spiros; Bosman, Eckhard

    Whilst investigating the origin and characteristics of coronal jets and large-scale CMEs identified in data from the SECCHI (Sun Earth Connection Coronal and Heliospheric Investigation) instrument suites on board the two STEREO satellites, we discovered transient events that originated in the low corona with a morphology resembling that of typical three-part structured coronal mass ejections (CMEs), but occurring on considerably smaller spatial scales. In this presentation we show evidence for the existence of small-scale CMEs from inside coronal holes and present quantitative estimates of their speeds and masses. We interpret the origin and evolution of micro CMEs as a natural consequence of the emergence of small-scale magnetic bipoles related to the Sun's ever-changing photospheric magnetic flux on various scales and their interactions with the ambient plasma and magnetic field. The analysis of CMEs is performed within the framework of the EU Erasmus and FP7 SOTERIA projects.

  13. Efficient Constant-Time Complexity Algorithm for Stochastic Simulation of Large Reaction Networks.

    PubMed

    Thanh, Vo Hong; Zunino, Roberto; Priami, Corrado

    2017-01-01

    Exact stochastic simulation is an indispensable tool for the quantitative study of biochemical reaction networks. The simulation realizes the time evolution of the model by randomly choosing a reaction to fire, with probability proportional to the reaction propensity, and updating the system state accordingly. Two computationally expensive tasks in simulating large biochemical networks are the selection of the next reaction firing and the update of reaction propensities due to state changes. We present in this work a new exact algorithm that optimizes both of these simulation bottlenecks. Our algorithm employs composition-rejection sampling on the propensity bounds of reactions to select the next reaction firing. The selection of next reaction firings is independent of the number of reactions, while the update of propensities is skipped and performed only when necessary. The algorithm therefore scales favorably in computational complexity when simulating large reaction networks. We benchmark our new algorithm against the state-of-the-art algorithms available in the literature to demonstrate its applicability and efficiency.
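
    A minimal sketch of the composition-rejection selection step described above (illustrative only, not the authors' implementation; a true constant-time version maintains the group sums incrementally instead of recomputing them on every call, and all propensity bounds are assumed strictly positive):

        import random
        from collections import defaultdict
        from math import frexp

        def build_groups(bounds):
            # Partition reaction indices by the binary exponent of their
            # propensity upper bound: group g holds bounds in [2**(g-1), 2**g).
            groups = defaultdict(list)
            for j, ub in enumerate(bounds):
                _, g = frexp(ub)          # ub = m * 2**g with 0.5 <= m < 1
                groups[g].append(j)
            return groups

        def select_reaction(propensities, bounds, groups):
            # Composition: choose a group with probability proportional to
            # its summed propensity bounds.
            totals = {g: sum(bounds[j] for j in m) for g, m in groups.items()}
            r = random.uniform(0.0, sum(totals.values()))
            for g, t in totals.items():
                if r < t:
                    break
                r -= t
            # Rejection: draw a uniform candidate within the group and accept
            # it with probability a_j / 2**g (2**g dominates every bound, and
            # hence every propensity, in group g).
            while True:
                j = random.choice(groups[g])
                if random.random() * 2.0 ** g <= propensities[j]:
                    return j

    Because each group spans less than a factor of two in bound, the expected number of rejection rounds stays below two, which is what makes the selection cost independent of the number of reactions.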

  14. Comparative Study of Seven Commercial Kits for Human DNA Extraction from Urine Samples Suitable for DNA Biomarker-Based Public Health Studies

    PubMed Central

    El Bali, Latifa; Diman, Aurélie; Bernard, Alfred; Roosens, Nancy H. C.; De Keersmaecker, Sigrid C. J.

    2014-01-01

    Human genomic DNA extracted from urine could be an interesting tool for large-scale public health studies involving characterization of genetic variations or DNA biomarkers, as a result of the simple and noninvasive collection method. These studies, involving many samples, require a rapid, easy, and standardized extraction protocol. Moreover, for practicability, it is necessary to collect urine at a moment other than the first void and to store it appropriately until analysis. The present study compared seven commercial kits to select the most appropriate urinary human DNA extraction procedure for epidemiological studies. DNA yield was determined using different quantification methods: two classical ones, i.e., NanoDrop and PicoGreen, and two species-specific real-time quantitative (q)PCR assays, because DNA extracted from urine contains not only human but also microbial DNA, which contributes largely to the total DNA yield. In addition, the kits giving a good yield were also tested for the presence of PCR inhibitors. Further comparisons were performed regarding the sampling time and the storage conditions. Finally, as a proof-of-concept, an important gene related to smoking was genotyped using the developed tools. We could select one well-performing kit for human DNA extraction from urine, suitable for molecular diagnostic real-time qPCR-based assays targeting genetic variations and applicable to large-scale studies. In addition, successful genotyping was possible using DNA extracted from urine stored at −20°C for several months, and an acceptable yield could also be obtained from urine collected at different moments during the day, which is particularly important for public health studies. PMID:25365790

  15. Correlation between quantitative PCR and Culture-Based methods for measuring Enterococcus spp. over various temporal scales at three California marine beaches

    EPA Science Inventory

    Several studies have examined how fecal indicator bacteria (FIB) measurements compare between quantitative polymerase chain reaction (QPCR) and the culture methods it is intended to replace. Here we extend those studies by examining the stability of that relationship within a be...

  16. Determination of Factors Affecting Preschool Teacher Candidates' Attitudes towards Science Teaching

    ERIC Educational Resources Information Center

    Timur, Betul

    2012-01-01

    The purpose of this study was to determine preschool teacher candidates' attitudes towards science teaching and to examine the reasons behind their attitudes in depth. In this study, mixed methods were used including quantitative and qualitative data. Quantitative data gained by attitudes towards science teaching scale, qualitative data gained by…

  17. Revealing Less Derived Nature of Cartilaginous Fish Genomes with Their Evolutionary Time Scale Inferred with Nuclear Genes

    PubMed Central

    Renz, Adina J.; Meyer, Axel; Kuraku, Shigehiro

    2013-01-01

    Cartilaginous fishes, divided into Holocephali (chimaeras) and Elasmobranchii (sharks, rays and skates), occupy a key phylogenetic position among extant vertebrates for reconstructing vertebrate evolutionary processes. An accurate evolutionary time scale is indispensable for a better understanding of the relationship between phenotypic and molecular evolution in cartilaginous fishes. However, our current knowledge of the time scale of cartilaginous fish evolution relies largely on estimates from mitochondrial DNA sequences. In this study, making the best use of the still partial, but large-scale, sequencing data of cartilaginous fish species, we estimate the divergence times between the major cartilaginous fish lineages employing nuclear genes. By rigorous orthology assessment based on available genomic and transcriptomic sequence resources for cartilaginous fishes, we selected 20 protein-coding genes in the nuclear genome, spanning 2973 amino acid residues. Our analysis, based on Bayesian inference, resulted in a mean divergence time of 421 Ma, the late Silurian, for the Holocephali-Elasmobranchii split, and 306 Ma, the late Carboniferous, for the split between sharks and rays/skates. By applying these results and other documented divergence times, we measured the relative evolutionary rate of the Hox A cluster sequences in the cartilaginous fish lineages, which turned out to be lower than that of tetrapod lineages by a factor of at least 2.4. The obtained time scale enables the mapping of phenotypic and molecular changes in a quantitative framework. It is of great interest to corroborate the less derived nature of cartilaginous fish at the molecular level as a genome-wide phenomenon. PMID:23825540

  18. Revealing less derived nature of cartilaginous fish genomes with their evolutionary time scale inferred with nuclear genes.

    PubMed

    Renz, Adina J; Meyer, Axel; Kuraku, Shigehiro

    2013-01-01

    Cartilaginous fishes, divided into Holocephali (chimaeras) and Elasmobranchii (sharks, rays and skates), occupy a key phylogenetic position among extant vertebrates for reconstructing vertebrate evolutionary processes. An accurate evolutionary time scale is indispensable for a better understanding of the relationship between phenotypic and molecular evolution in cartilaginous fishes. However, our current knowledge of the time scale of cartilaginous fish evolution relies largely on estimates from mitochondrial DNA sequences. In this study, making the best use of the still partial, but large-scale, sequencing data of cartilaginous fish species, we estimate the divergence times between the major cartilaginous fish lineages employing nuclear genes. By rigorous orthology assessment based on available genomic and transcriptomic sequence resources for cartilaginous fishes, we selected 20 protein-coding genes in the nuclear genome, spanning 2973 amino acid residues. Our analysis, based on Bayesian inference, resulted in a mean divergence time of 421 Ma, the late Silurian, for the Holocephali-Elasmobranchii split, and 306 Ma, the late Carboniferous, for the split between sharks and rays/skates. By applying these results and other documented divergence times, we measured the relative evolutionary rate of the Hox A cluster sequences in the cartilaginous fish lineages, which turned out to be lower than that of tetrapod lineages by a factor of at least 2.4. The obtained time scale enables the mapping of phenotypic and molecular changes in a quantitative framework. It is of great interest to corroborate the less derived nature of cartilaginous fish at the molecular level as a genome-wide phenomenon.

  19. Quantifying adsorption-induced deformation of nanoporous materials on different length scales

    PubMed Central

    Morak, Roland; Braxmeier, Stephan; Ludescher, Lukas; Hüsing, Nicola; Reichenauer, Gudrun

    2017-01-01

    A new in situ setup combining small-angle neutron scattering (SANS) and dilatometry was used to measure water-adsorption-induced deformation of a monolithic silica sample with hierarchical porosity. The sample exhibits a disordered framework consisting of macropores and struts containing two-dimensional hexagonally ordered cylindrical mesopores. The use of an H2O/D2O water mixture with zero scattering length density as an adsorptive allows a quantitative determination of the pore lattice strain from the shift of the corresponding diffraction peak. This radial strut deformation is compared with the simultaneously measured macroscopic length change of the sample with dilatometry, and differences between the two quantities are discussed on the basis of the deformation mechanisms effective at the different length scales. It is demonstrated that the SANS data also provide a facile way to quantitatively determine the adsorption isotherm of the material by evaluating the incoherent scattering contribution of H2O at large scattering vectors. PMID:29021735

  20. A comparison of refuse attenuation in laboratory and field scale lysimeters.

    PubMed

    Youcai, Zhao; Luochun, Wang; Renhua, Hua; Dimin, Xu; Guowei, Gu

    2002-01-01

    For this study, small and middle scale laboratory lysimeters and a large scale field lysimeter in situ in the Shanghai Refuse Landfill, with refuse weights of 187,600 and 10,800,000 kg, respectively, were created. These lysimeters were compared in terms of leachate quality (pH, concentrations of COD, BOD and NH3-N), refuse composition (biodegradable matter and volatile solids) and surface settlement over a monitoring period of 0-300 days. The objectives of this study were to explore the similarities and disparities between laboratory and field scale lysimeters, and to compare the degradation behavior of refuse in the intensive reaction phase in lysimeters of different scales. Quantitative relationships of leachate quality and refuse composition with placement time show that the degradation behavior of refuse depends heavily on the scale of the lysimeter and the parameters of concern, especially in the starting period of 0-6 months. However, some similarities exist between laboratory and field lysimeters after 4-6 months of placement, because COD and BOD concentrations in leachate in the field lysimeter decrease regularly in a pattern parallel to those in the laboratory lysimeters. NH3-N, volatile solids (VS) and biodegradable matter (BDM) also gradually decrease in parallel in this intensive reaction phase for lysimeters of all scales as the refuse ages. Though the specific values differ among the different scale lysimeters, laboratory lysimeters of sufficient scale appear basically applicable for a rough simulation of a real landfill, especially for illustrating the degradation pattern and mechanism. Settlement of the refuse surface is roughly proportional to the initial refuse height.

  1. MRMPROBS: a data assessment and metabolite identification tool for large-scale multiple reaction monitoring based widely targeted metabolomics.

    PubMed

    Tsugawa, Hiroshi; Arita, Masanori; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Bamba, Takeshi; Fukusaki, Eiichiro

    2013-05-21

    We developed a new software program, MRMPROBS, for widely targeted metabolomics using the large-scale multiple reaction monitoring (MRM) mode. This strategy has become increasingly popular for the simultaneous analysis of up to several hundred metabolites with high sensitivity, selectivity, and quantitative capability. However, the traditional method of assessing measured metabolomics data without probabilistic criteria is not only time-consuming but also often subjective and ad hoc. Our program overcomes these problems by detecting and identifying metabolites automatically, separating isomeric metabolites, and removing background noise using a probabilistic score defined as the odds ratio from an optimized multivariate logistic regression model. The program also provides a user-friendly graphical interface to curate and organize data matrices and to apply principal component analyses and statistical tests. As a demonstration, we conducted a widely targeted metabolome analysis (152 metabolites) of propagating Saccharomyces cerevisiae measured at 15 time points by gas and liquid chromatography coupled to triple quadrupole mass spectrometry. MRMPROBS is a useful and practical tool for the assessment of large-scale MRM data, applicable to any instrument or experimental condition.
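
    As a hedged illustration of how such an odds-ratio score can be computed (feature names and coefficients below are hypothetical, not MRMPROBS internals):

        import math

        def peak_score(features, weights, intercept):
            # features: hypothetical peak descriptors, e.g. co-elution of the
            # target and qualifier transitions, retention-time deviation and
            # peak-shape similarity; weights/intercept would come from a
            # logistic regression fitted to annotated training peaks.
            z = intercept + sum(w * x for w, x in zip(weights, features))
            p = 1.0 / (1.0 + math.exp(-z))   # posterior probability of a true peak
            return p / (1.0 - p)             # odds ratio used as the score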

  2. The brainstem reticular formation is a small-world, not scale-free, network

    PubMed Central

    Humphries, M.D; Gurney, K; Prescott, T.J

    2005-01-01

    Recently, it has been demonstrated that several complex systems may have simple graph-theoretic characterizations as so-called ‘small-world’ and ‘scale-free’ networks. These networks have also been applied to the gross neural connectivity between primate cortical areas and the nervous system of Caenorhabditis elegans. Here, we extend this work to a specific neural circuit of the vertebrate brain—the medial reticular formation (RF) of the brainstem—and, in doing so, we have made three key contributions. First, this work constitutes the first model (and quantitative review) of this important brain structure for over three decades. Second, we have developed the first graph-theoretic analysis of vertebrate brain connectivity at the neural network level. Third, we propose simple metrics to quantitatively assess the extent to which the networks studied are small-world or scale-free. We conclude that the medial RF is configured to create small-world (implying coherent rapid-processing capabilities), but not scale-free, type networks under assumptions which are amenable to quantitative measurement. PMID:16615219
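
    The paper proposes its own metrics; the sketch below instead shows the now-standard small-world index computed against size-matched random graphs (my choice of baseline, not necessarily the authors' exact formulation), assuming a connected input graph and a baseline with non-zero clustering:

        import networkx as nx

        def small_world_index(G, n_rand=20, seed=0):
            # Ratio of normalized clustering to normalized path length;
            # values well above 1 suggest a small-world topology.
            C = nx.average_clustering(G)
            L = nx.average_shortest_path_length(G)
            Cr = Lr = 0.0
            for i in range(n_rand):
                R = nx.gnm_random_graph(G.number_of_nodes(),
                                        G.number_of_edges(), seed=seed + i)
                # keep the largest component so path length is defined
                R = R.subgraph(max(nx.connected_components(R), key=len))
                Cr += nx.average_clustering(R) / n_rand
                Lr += nx.average_shortest_path_length(R) / n_rand
            return (C / Cr) / (L / Lr)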

  3. Multiplexed and Microparticle-based Analyses: Quantitative Tools for the Large-Scale Analysis of Biological Systems

    PubMed Central

    Nolan, John P.; Mandy, Francis

    2008-01-01

    While the term flow cytometry refers to the measurement of cells, the approach of making sensitive multiparameter optical measurements in a flowing sample stream is a very general analytical approach. The past few years have seen an explosion in the application of flow cytometry technology to molecular analysis and measurements using microparticles as solid supports. While microsphere-based molecular analyses using flow cytometry date back three decades, the need for highly parallel quantitative molecular measurements that has arisen from various genomic and proteomic advances has driven the development of particle encoding technology to enable highly multiplexed assays. Multiplexed particle-based immunoassays are now commonplace, and new assays to study genes, protein function, and molecular assembly are emerging. Numerous efforts are underway to extend the multiplexing capabilities of microparticle-based assays through new approaches to particle encoding and analyte reporting. The impact of these developments will be seen in basic research and clinical laboratories, as well as in drug development. PMID:16604537

  4. Micro/nano-computed tomography technology for quantitative dynamic, multi-scale imaging of morphogenesis.

    PubMed

    Gregg, Chelsea L; Recknagel, Andrew K; Butcher, Jonathan T

    2015-01-01

    Tissue morphogenesis and embryonic development are dynamic events that are challenging to quantify, especially considering the intricate events that happen simultaneously in different locations and at different times. Micro- and, more recently, nano-computed tomography (micro/nanoCT) has been used for the past 15 years to characterize large 3D fields of tortuous geometries at high spatial resolution. We and others have advanced micro/nanoCT imaging strategies for quantifying tissue- and organ-level fate changes throughout morphogenesis. Exogenous soft tissue contrast media enable visualization of vascular lumens and tissues via extravasation. Furthermore, the emergence of antigen-specific tissue contrast enables direct quantitative visualization of protein and mRNA expression. Micro-CT X-ray doses appear to be non-embryotoxic, enabling longitudinal imaging studies in live embryos. In this chapter we present established soft tissue contrast protocols for obtaining high-quality micro/nanoCT images, and the image processing techniques useful for quantifying anatomical and physiological information from the data sets.

  5. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods

    PubMed Central

    Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.

    2012-01-01

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394

  6. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  7. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods.

    PubMed

    Wells, Darren M; French, Andrew P; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein I; Bennett, Malcolm J; Pridmore, Tony P

    2012-06-05

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana.

  8. The prevalence of mind-body dualism in early China.

    PubMed

    Slingerland, Edward; Chudek, Maciej

    2011-07-01

    We present the first large-scale, quantitative examination of mind and body concepts in a set of historical sources by measuring the predictions of folk mind-body dualism against the surviving textual corpus of pre-Qin (pre-221 BCE) China. Our textual analysis found clear patterns in the historically evolving reference of the word xin (heart/heart-mind): It alone of the organs was regularly contrasted with the physical body, and during the Warring States period it became less associated with emotions and increasingly portrayed as the unique locus of "higher" cognitive abilities. We interpret this as a semantic shift toward a shared cognitive bias in response to a vast and rapid expansion of literacy. Our study helps test the proposed universality of folk dualism, adds a new quantitative approach to the methods used in the humanities, and opens up a new and valuable data source for cognitive scientists: the record of dead minds. Copyright © 2011 Cognitive Science Society, Inc.

  9. Are quantitative cultures useful in the diagnosis of hospital-acquired pneumonia?

    PubMed

    San Pedro, G

    2001-02-01

    Noninvasive and invasive tests have been developed and studied for their utility in diagnosing and guiding the treatment of hospital-acquired pneumonia, a condition with an inherently high mortality. Early empiric antibiotic treatment has been shown to reduce mortality, so delaying this treatment until test results are available is not justifiable. Furthermore, tailoring therapy based on results of either noninvasive or invasive tests has not been clearly shown to affect morbidity and mortality. This may be related to quantitative limitations of these tests or possibly to a high false-negative rate in patients who receive early antibiotic treatment and may therefore have suppressed bacterial counts. Results of these tests, however, do influence treatment. It is therefore hoped that they may ultimately provide a rational basis for making therapeutic decisions. In the future, outcomes research should be a part of large-scale clinical trials, and noninvasive and invasive tests should be incorporated into the design in an attempt to provide a better understanding of the value of such tests.

  10. Association of Parkinson Disease Risk Loci With Mild Parkinsonian Signs in Older Persons

    PubMed Central

    Shulman, Joshua M.; Yu, Lei; Buchman, Aron S.; Evans, Denis A.; Schneider, Julie A.; Bennett, David A.; De Jager, Philip L.

    2014-01-01

    IMPORTANCE Parkinsonian motor signs are common in the aging population and are associated with adverse health outcomes. Compared with Parkinson disease (PD), potential genetic risk factors for mild parkinsonian signs have been largely unexplored. OBJECTIVE To determine whether PD susceptibility loci are associated with parkinsonism or substantia nigra pathology in a large community-based cohort of older persons. DESIGN, SETTING, AND PARTICIPANTS Eighteen candidate single-nucleotide polymorphisms from PD genome-wide association studies were evaluated in a joint clinicopathologic cohort. Participants included 1698 individuals and a nested autopsy collection of 821 brains from the Religious Orders Study and the Rush Memory and Aging Project, 2 prospective community-based studies. MAIN OUTCOMES AND MEASURES The primary outcomes were a quantitative measure of global parkinsonism or component measures of bradykinesia, rigidity, tremor, and gait impairment that were based on the motor Unified Parkinson’s Disease Rating Scale. In secondary analyses, we examined associations with additional quantitative motor traits and postmortem indices, including substantia nigra Lewy bodies and neuronal loss. RESULTS Parkinson disease risk alleles in the MAPT (rs2942168; P = .0006) and CCDC62 (rs12817488; P = .004) loci were associated with global parkinsonism, and these associations remained after exclusion of patients with a PD diagnosis. Based on motor Unified Parkinson’s Disease Rating Scale subscores, MAPT (P = .0002) and CCDC62 (P = .003) were predominantly associated with bradykinesia, and we further discovered associations between SREBF1 (rs11868035; P = .005) and gait impairment, SNCA (rs356220; P = .04) and rigidity, and GAK (rs1564282; P = .03) and tremor. In the autopsy cohort, only NMD3 (rs34016896; P = .03) was related to nigral neuronal loss, and no associations were detected with Lewy bodies. CONCLUSIONS AND RELEVANCE In addition to the established link to PD susceptibility, our results support a broader role for several loci in the development of parkinsonian motor signs and nigral pathology in older persons. PMID:24514572

  11. Hypothesis exploration with visualization of variance

    PubMed Central

    2014-01-01

    Background The Consortium for Neuropsychiatric Phenomics (CNP) at UCLA was an investigation into the biological bases of traits such as memory and response inhibition phenotypes—to explore whether they are linked to syndromes including ADHD, Bipolar disorder, and Schizophrenia. An aim of the consortium was in moving from traditional categorical approaches for psychiatric syndromes towards more quantitative approaches based on large-scale analysis of the space of human variation. It represented an application of phenomics—wide-scale, systematic study of phenotypes—to neuropsychiatry research. Results This paper reports on a system for exploration of hypotheses in data obtained from the LA2K, LA3C, and LA5C studies in CNP. ViVA is a system for exploratory data analysis using novel mathematical models and methods for visualization of variance. An example of these methods is called VISOVA, a combination of visualization and analysis of variance, with the flavor of exploration associated with ANOVA in biomedical hypothesis generation. It permits visual identification of phenotype profiles—patterns of values across phenotypes—that characterize groups. Visualization enables screening and refinement of hypotheses about variance structure of sets of phenotypes. Conclusions The ViVA system was designed for exploration of neuropsychiatric hypotheses by interdisciplinary teams. Automated visualization in ViVA supports ‘natural selection’ on a pool of hypotheses, and permits deeper understanding of the statistical architecture of the data. Large-scale perspective of this kind could lead to better neuropsychiatric diagnostics. PMID:25097666

  12. Identification of susceptibility genes and genetic modifiers of human diseases

    NASA Astrophysics Data System (ADS)

    Abel, Kenneth; Kammerer, Stefan; Hoyal, Carolyn; Reneland, Rikard; Marnellos, George; Nelson, Matthew R.; Braun, Andreas

    2005-03-01

    The completion of the human genome sequence enables the discovery of genes involved in common human disorders. The successful identification of these genes depends on the availability of informative sample sets, validated marker panels, a high-throughput scoring technology, and a strategy for combining these resources. We have developed a universal platform technology based on mass spectrometry (MassARRAY) for analyzing nucleic acids with high precision and accuracy. To fuel this technology, we generated more than 100,000 validated assays for single nucleotide polymorphisms (SNPs) covering virtually all known and predicted human genes. We also established a large DNA sample bank comprising more than 50,000 consented healthy and diseased individuals. This combination of reagents and technology allows the execution of large-scale genome-wide association studies. Taking advantage of MassARRAY's capability for quantitative analysis of nucleic acids, allele frequencies are estimated in sample pools containing large numbers of individual DNAs. Comparing pools as a first-pass "filtering" step offers a tremendous advantage in throughput and cost over individual genotyping. We employed this approach in numerous genome-wide, hypothesis-free searches to identify genes associated with common complex diseases, such as breast cancer, osteoporosis, and osteoarthritis, and genes involved in quantitative traits like high-density lipoprotein cholesterol (HDL-c) levels and central fat. Access to additional well-characterized patient samples through collaborations allows us to conduct replication studies that validate true disease genes. These discoveries will expand our understanding of genetic disease predisposition and our ability to achieve early diagnosis and determination of specific disease subtype or progression stage.
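
    As a hedged illustration of why pooling works as a first-pass filter (symbols mine, not from the paper): in a pool of N individuals, the frequency of allele A can be estimated from the two allele-specific peak intensities h_A and h_B as

        \[ \hat{f}_{A} = \frac{h_{A}}{h_{A} + h_{B}}, \qquad \operatorname{Var}(\hat{f}_{A}) \gtrsim \frac{f_{A}(1 - f_{A})}{2N}, \]

    so one pair of pooled measurements (cases versus controls) replaces 2N individual genotype calls per SNP, at the cost of some extra measurement variance.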

  13. Detection of Influenza A viruses at migratory bird stopover sites in Michigan, USA.

    PubMed

    Lickfett, Todd M; Clark, Erica; Gehring, Thomas M; Alm, Elizabeth W

    2018-01-01

    Introduction: Influenza A viruses have the potential to cause devastating illness in humans and domestic poultry. Wild birds are the natural reservoirs of Influenza A viruses and migratory birds are implicated in their global dissemination. High concentrations of this virus are excreted in the faeces of infected birds and faecal contamination of shared aquatic habitats can lead to indirect transmission among birds via the faecal-oral route. The role of migratory birds in the spread of avian influenza has led to large-scale surveillance efforts of circulating avian influenza viruses through direct sampling of live and dead wild birds. Environmental monitoring of bird habitats using molecular detection methods may provide additional information on the persistence of influenza virus at migratory stopover sites distributed across large spatial scales. Materials and methods: In the current study, faecal and water samples were collected at migratory stopover sites and evaluated for Influenza A by real-time quantitative reverse transcriptase PCR. Results and Discussion: This study found that Influenza A was detected at 53% of the evaluated stopover sites, and 7% and 4.8% of the faecal and water samples, respectively, tested positive for Influenza A virus. Conclusion: Environmental monitoring detected Influenza A at stopover sites used by migratory birds.

  14. Comparative evaluation of saliva collection methods for proteome analysis.

    PubMed

    Golatowski, Claas; Salazar, Manuela Gesell; Dhople, Vishnu Mukund; Hammer, Elke; Kocher, Thomas; Jehmlich, Nico; Völker, Uwe

    2013-04-18

    Saliva collection devices are widely used for large-scale screening approaches. This study was designed to compare the suitability of three different whole-saliva collection approaches for subsequent proteome analyses. From 9 young healthy volunteers (4 women and 5 men) saliva samples were collected either unstimulated by passive drooling or stimulated using a paraffin gum or Salivette® (cotton swab). Saliva volume, protein concentration and salivary protein patterns were analyzed comparatively. Samples collected using paraffin gum showed the highest saliva volume (4.1±1.5 ml) followed by Salivette® collection (1.8±0.4 ml) and drooling (1.0±0.4 ml). Saliva protein concentrations (average 1145 μg/ml) showed no significant differences between the three sampling schemes. Each collection approach facilitated the identification of about 160 proteins (≥2 distinct peptides) per subject, but collection-method dependent variations in protein composition were observed. Passive drooling, paraffin gum and Salivette® each allows similar coverage of the whole saliva proteome, but the specific proteins observed depended on the collection approach. Thus, only one type of collection device should be used for quantitative proteome analysis in one experiment, especially when performing large-scale cross-sectional or multi-centric studies. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Integrated care pilot in north-west London: a mixed methods evaluation

    PubMed Central

    Curry, Natasha; Harris, Matthew; Gunn, Laura H.; Pappas, Yannis; Blunt, Ian; Soljak, Michael; Mastellos, Nikolaos; Holder, Holly; Smith, Judith; Majeed, Azeem; Ignatowicz, Agnieszka; Greaves, Felix; Belsi, Athina; Costin-Davis, Nicola; Jones Nielsen, Jessica D.; Greenfield, Geva; Cecil, Elizabeth; Patterson, Susan; Car, Josip; Bardsley, Martin

    2013-01-01

    Introduction This paper provides the results of a year-long evaluation of a large-scale integrated care pilot in north-west London. The pilot aimed to integrate care across primary, acute, community, mental health and social care for people with diabetes and/or those aged 75+ through care planning, multidisciplinary case reviews, information sharing and project management support. Methods The evaluation team conducted qualitative studies of change at organisational, clinician and patient levels (using interviews, focus groups and a survey); and quantitative analysis of change in service use and patient-level clinical outcomes (using patient-level datasets and a matched control study). Results The pilot had successfully engaged provider organisations, created a shared strategic vision and established governance structures. However, the engagement of clinicians was variable and there was no evidence to date of significant reductions in emergency admissions. There was some evidence of changes in care processes. Conclusion Although the pilot has demonstrated the beginnings of large-scale change, it remains in the early stages and faces significant challenges as it seeks to become sustainable for the longer term. It is critical that National Health Service managers and clinicians have realistic expectations of what can be achieved in a relatively short period of time. PMID:24167455

  16. Incorporating residual temperature and specific humidity in predicting weather-dependent warm-season electricity consumption

    NASA Astrophysics Data System (ADS)

    Guan, Huade; Beecham, Simon; Xu, Hanqiu; Ingleton, Greg

    2017-02-01

    Climate warming and increasing variability challenge the electricity supply in warm seasons. A good quantitative representation of the relationship between warm-season electricity consumption and weather conditions provides necessary information for long-term electricity planning and short-term electricity management. In this study, an extended version of cooling degree days (ECDD) is proposed for better characterization of this relationship. The ECDD includes temperature, residual temperature and specific humidity effects. The residual temperature is introduced for the first time to reflect the effect of building thermal inertia on electricity consumption. The study is based on the electricity consumption data of four multiple-street city blocks and three office buildings. It is found that the residual temperature effect is about 20% of the current-day temperature effect at the block scale, and increases, with large variation, at the building scale. Investigation of this residual temperature effect provides insight into the influence of building designs and structures on electricity consumption. The specific humidity effect appears to be more important at the building scale than at the block scale. A building with high energy performance does not necessarily have low specific humidity dependence. The new ECDD better reflects the weather dependence of electricity consumption than the conventional CDD method.
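
    The ECDD definition is not printed in the abstract; a form consistent with its description (the base values T_b and q_b and the weights \alpha and \beta are hypothetical placeholders) is

        \[ \mathrm{ECDD} = \sum_{d} \Big[ (T_{d} - T_{b})^{+} + \alpha\,(T_{d-1} - T_{b})^{+} + \beta\,(q_{d} - q_{b})^{+} \Big], \]

    where (x)^{+} = \max(x, 0), T_d is the current-day temperature, the lagged term carries the thermal-inertia (residual temperature) effect with \alpha of roughly 0.2 at the block scale per the findings above, and q_d is the specific humidity.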

  17. Comparison of spatio-temporal resolution of different flow measurement techniques for marine renewable energy applications

    NASA Astrophysics Data System (ADS)

    Lyon, Vincent; Wosnik, Martin

    2013-11-01

    Marine hydrokinetic (MHK) energy conversion devices are subject to a wide range of turbulent scales, either due to upstream bathymetry, obstacles and waves, or from wakes of upstream devices in array configurations. The commonly used, robust Acoustic Doppler Current Profilers (ADCP) are well suited for long term flow measurements in the marine environment, but are limited to low sampling rates due to their operational principle. The resulting temporal and spatial resolution is insufficient to measure all turbulence scales of interest to the device, e.g., ``blade-scale turbulence.'' The present study systematically characterizes the spatial and temporal resolution of ADCP, Acoustic Doppler Velocimetry (ADV), and Particle Image Velocimetry (PIV). Measurements were conducted in a large cross section tow tank (3.7m × 2.4m) for several benchmark cases, including low and high turbulence intensity uniform flow as well as in the wake of a cylinder, to quantitatively investigate the flow scales which each of the instruments can resolve. The purpose of the study is to supply data for mathematical modeling to improve predictions from ADCP measurements, which can help lead to higher-fidelity energy resource assessment and more accurate device evaluation, including wake measurements. Supported by NSF-CBET grant 1150797.

  18. The Relationship between Shyness and Internet Addiction: A Quantitative Study on Middle and Post Secondary School Students

    ERIC Educational Resources Information Center

    Hollingsworth, W. Craig

    2005-01-01

    This small-scale quantitative study examines the relationship between shyness and internet addiction in middle school students. The study was conducted on the premise that shyness is a possible predictor of internet addiction. To test this hypothesis, a questionnaire was created and distributed to 53 middle school students and 159 post…

  19. Quantitative psychomotor dysfunction in schizophrenia: a loss of drive, impaired movement execution or both?

    PubMed

    Docx, Lise; Sabbe, Bernard; Provinciael, Pieter; Merckx, Niel; Morrens, Manuel

    2013-01-01

    The aim of the present study was to investigate the predictive value of qualitative psychomotor performance levels and subaspects of the negative syndrome for quantitative motor activity levels in patients with schizophrenia. Twenty-seven stabilized patients with schizophrenia and 22 age- and sex-matched healthy controls were included in the study. An extensive battery of psychomotor performance tests (Finger Tapping Test, Purdue Pegboard Test, Line Copying Test, Neurological Evaluation Scale, Salpêtrière Retardation Rating Scale), clinical rating scales (Positive and Negative Syndrome Scale) and 24-hour actigraphy was administered to all participants. Correlational analyses showed that motor activity levels were associated with avolition as well as with clinically assessed psychomotor slowing. However, in a regression model, only avolition was found to be a significant predictor of motor activity levels in patients with schizophrenia; neither the psychomotor performance tests nor the severity of emotional expressivity deficits contributed to the model. Qualitative and quantitative psychomotor deficits seem to be independent phenomena in stabilized patients with schizophrenia. The reduction in motor activity in patients with schizophrenia is related to a loss of drive and not to problems in the quality of movement execution. © 2013 S. Karger AG, Basel.

  20. Scale Development and Initial Tests of the Multidimensional Complex Adaptive Leadership Scale for School Principals: An Exploratory Mixed Method Study

    ERIC Educational Resources Information Center

    Özen, Hamit; Turan, Selahattin

    2017-01-01

    This study was designed to develop the Complex Adaptive Leadership Scale for School Principals (CAL-SP) and examine its psychometric properties. This was an exploratory mixed method research design (ES-MMD). Both qualitative and quantitative methods were used to develop and assess the psychometric properties of the questionnaire. This study…

  1. Methods comparison for microsatellite marker development: Different isolation methods, different yield efficiency

    NASA Astrophysics Data System (ADS)

    Zhan, Aibin; Bao, Zhenmin; Hu, Xiaoli; Lu, Wei; Hu, Jingjie

    2009-06-01

    Microsatellite markers have become one of the most important molecular tools used in various fields of research. Large numbers of microsatellite markers are required for whole genome surveys in molecular ecology, quantitative genetics and genomics. It is therefore essential to select versatile, low-cost, efficient and time- and labor-saving methods to develop a large panel of microsatellite markers. In this study, we used the Zhikong scallop (Chlamys farreri) as the target species to compare the efficiency of five methods derived from three strategies for microsatellite marker development. The results showed that the strategy of constructing a small-insert genomic DNA library gave poor efficiency, while the microsatellite-enriched strategy greatly improved isolation efficiency. Although the public-database-mining strategy is time- and cost-saving, it is difficult to obtain a large number of microsatellite markers this way, mainly because of the limited sequence data for non-model species deposited in public databases. Based on these results, we recommend two methods, the microsatellite-enriched library construction method and the FIASCO-colony hybridization method, for large-scale microsatellite marker development. Both methods derive from the microsatellite-enriched strategy. The experimental results obtained from the Zhikong scallop also provide a reference for microsatellite marker development in other species with large genomes.

  2. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    PubMed

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production would imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and their lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Characterizing and Assessing a Large-Scale Software Maintenance Organization

    NASA Technical Reports Server (NTRS)

    Briand, Lionel; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1995-01-01

    One important component of a software process is the organizational context in which the process is enacted. This component is often missing or incomplete in current process modeling approaches. One technique for modeling this perspective is the Actor-Dependency (AD) Model. This paper reports on a case study which used this approach to analyze and assess a large software maintenance organization. Our goal was to identify the approach's strengths and weaknesses while providing practical recommendations for improvement and research directions. The AD model was found to be very useful in capturing the important properties of the organizational context of the maintenance process, and aided in the understanding of the flaws found in this process. However, a number of opportunities for extending and improving the AD model were identified. Among others, there is a need to incorporate quantitative information to complement the qualitative model.

  4. Experimental validation of a Monte-Carlo-based inversion scheme for 3D quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Buchmann, Jens; Kaplan, Bernhard A.; Prohaska, Steffen; Laufer, Jan

    2017-03-01

    Quantitative photoacoustic tomography (qPAT) aims to extract physiological parameters, such as blood oxygen saturation (sO2), from measured multi-wavelength image data sets. The challenge of this approach lies in the inherently nonlinear fluence distribution in the tissue, which has to be accounted for by using an appropriate model, and the large scale of the inverse problem. In addition, the accuracy of experimental and scanner-specific parameters, such as the wavelength dependence of the incident fluence, the acoustic detector response, the beam profile and divergence, needs to be considered. This study aims at quantitative imaging of blood sO2, as it has been shown to be a more robust parameter compared to absolute concentrations. We propose a Monte-Carlo-based inversion scheme in conjunction with a reduction in the number of variables achieved using image segmentation. The inversion scheme is experimentally validated in tissue-mimicking phantoms consisting of polymer tubes suspended in a scattering liquid. The tubes were filled with chromophore solutions at different concentration ratios. 3-D multi-spectral image data sets were acquired using a Fabry-Perot based PA scanner. A quantitative comparison of the measured data with the output of the forward model is presented. Parameter estimates of chromophore concentration ratios were found to be within 5 % of the true values.
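
    The forward model is not spelled out in the abstract; generically in qPAT the initial pressure at position r and wavelength \lambda is

        \[ p_{0}(\mathbf{r}, \lambda) = \Gamma\, \mu_{a}(\mathbf{r}, \lambda)\, \Phi(\mathbf{r}, \lambda; \mu_{a}, \mu_{s}), \]

    with Grüneisen parameter \Gamma, absorption coefficient \mu_a and fluence \Phi (computed here by Monte Carlo). The inversion is nonlinear because \Phi itself depends on the unknown \mu_a, and segmentation reduces the unknowns from per-voxel values to a few per-region chromophore concentrations.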

  5. Recurrent patterning in the daily foraging routes of hamadryas baboons (Papio hamadryas): spatial memory in large-scale versus small-scale space.

    PubMed

    Schreier, Amy L; Grove, Matt

    2014-05-01

    The benefits of spatial memory for foraging animals can be assessed on two distinct spatial scales: small-scale space (travel within patches) and large-scale space (travel between patches). While the patches themselves may be distributed at low density, resources within patches are likely to be densely distributed. We propose, therefore, that spatial memory for recalling the particular locations of previously visited feeding sites will be more advantageous during between-patch movement, where it may reduce the distances traveled by animals that possess this ability compared to those that must rely on random search. We address this hypothesis by employing descriptive statistics and spectral analyses to characterize the daily foraging routes of a band of wild hamadryas baboons in Filoha, Ethiopia. The baboons slept on two main cliffs, the Filoha cliff and the Wasaro cliff, and daily travel began and ended on a cliff; thus four daily travel routes exist: Filoha-Filoha, Filoha-Wasaro, Wasaro-Wasaro and Wasaro-Filoha. We use newly developed partial sum methods and distribution-fitting analyses to distinguish periods of area-restricted search from more extensive movements. The results indicate a single peak in travel activity in the Filoha-Filoha and Wasaro-Filoha routes, three peaks of travel activity in the Filoha-Wasaro routes, and two peaks in the Wasaro-Wasaro routes, and are consistent with on-the-ground observations of the foraging and ranging behavior of the baboons. In each of the four daily travel routes the "tipping points" identified by the partial sum analyses indicate transitions between travel in small- versus large-scale space. The correspondence between the quantitative analyses and the field observations suggests great utility in using these types of analyses to examine primate travel patterns, especially in distinguishing between movement in small- versus large-scale space. Only the distribution-fitting analyses are inconsistent with the field observations, which may be due to the scale at which these analyses were conducted. © 2013 Wiley Periodicals, Inc.
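
    A minimal sketch of a partial-sum profile of the kind used to locate such tipping points (a generic CUSUM-style variant written under my own assumptions; the authors' method may differ in detail):

        import numpy as np

        def partial_sum_profile(step_lengths):
            # Cumulative deviation of step lengths from their mean: sustained
            # downward runs indicate area-restricted search (short steps),
            # upward runs indicate extensive between-patch travel.
            x = np.asarray(step_lengths, dtype=float)
            return np.cumsum(x - x.mean())

        def main_tipping_point(step_lengths):
            # The global extremum of the profile marks the strongest single
            # transition between the two movement modes.
            d = partial_sum_profile(step_lengths)
            return int(np.argmax(np.abs(d)))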

  6. Non Linear Programming (NLP) Formulation for Quantitative Modeling of Protein Signal Transduction Pathways

    PubMed Central

    Morris, Melody K.; Saez-Rodriguez, Julio; Lauffenburger, Douglas A.; Alexopoulos, Leonidas G.

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on logic are relatively simple yet can describe how signals propagate from one protein to the next, and they have led to the construction of models that simulate the cell's response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train such models against cell-specific data, yielding quantitative pathway models of specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) a loosely constrained optimization problem, owing to the scarcity of data relative to the size of the signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem, and the latter by enhanced algorithms that pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell-type-specific pathways in normal and transformed hepatocytes using medium- and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state-of-the-art optimization algorithms. PMID:23226239

  7. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    PubMed

    Mitsos, Alexander; Melas, Ioannis N; Morris, Melody K; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Alexopoulos, Leonidas G

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on logic are relatively simple yet can describe how signals propagate from one protein to the next, and they have led to the construction of models that simulate the cell's response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train such models against cell-specific data, yielding quantitative pathway models of specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) a loosely constrained optimization problem, owing to the scarcity of data relative to the size of the signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem, and the latter by enhanced algorithms that pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell-type-specific pathways in normal and transformed hepatocytes using medium- and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state-of-the-art optimization algorithms.
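
    As a rough illustration of recasting a trained logic model as a regular NLP, the sketch below fits normalized-Hill transfer functions for a two-step toy cascade to synthetic dose-response data with a bound-constrained optimizer. The network, data, and parameter bounds are invented for the example; the paper's formulation handles far larger topologies and includes the network pre/post-processing steps.

        import numpy as np
        from scipy.optimize import minimize

        def hill(x, k, n):
            """Normalized Hill transfer function (f(0) = 0, f(1) = 1)."""
            return x**n * (1 + k**n) / (x**n + k**n)

        def simulate(stimulus, params):
            """Toy two-step cascade: stimulus -> kinase -> measured output."""
            k1, n1, k2, n2 = params
            return hill(hill(stimulus, k1, n1), k2, n2)

        # Hypothetical normalized phospho-readouts at five stimulus doses.
        doses = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
        data  = np.array([0.0, 0.30, 0.62, 0.85, 1.0])

        def sse(params):
            return np.sum((simulate(doses, params) - data) ** 2)

        # Box constraints keep the NLP well posed despite sparse data.
        res = minimize(sse, x0=[0.5, 2.0, 0.5, 2.0],
                       bounds=[(0.01, 1.0), (1.0, 5.0)] * 2)
        print(res.x, res.fun)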

  8. Development of life story experience (LSE) scales for migrant dentists in Australia: a sequential qualitative-quantitative study.

    PubMed

    Balasubramanian, M; Spencer, A J; Short, S D; Watkins, K; Chrisopoulos, S; Brennan, D S

    2016-09-01

    The integration of qualitative and quantitative approaches introduces new avenues to bridge the strengths, and address the weaknesses, of both methods. The aim was to develop measure(s) of migrant dentist experiences in Australia through a mixed-methods approach. The sequential qualitative-quantitative design involved first the harvesting of data items from a qualitative study, followed by a national survey of migrant dentists in Australia. Statements representing unique experiences in migrant dentists' life stories were deployed in the survey questionnaire using a five-point Likert scale. Factor analysis was used to examine component factors. Eighty-two statements from 51 participants were harvested from the qualitative analysis. A total of 1,022 of 1,977 migrant dentists (response rate 54.5%) returned completed questionnaires. Factor analysis supported an initial eight-factor solution; further scale development and reliability analysis led to five scales with a final list of 38 life story experience (LSE) items. Three scales were based on home-country events: health system and general lifestyle concerns (LSE1; 10 items), society and culture (LSE4; 4 items), and career development (LSE5; 4 items). Two scales covered migrant experiences in Australia: appreciation of the Australian way of life (LSE2; 13 items) and settlement concerns (LSE3; 7 items). The five life story experience scales provide the necessary conceptual clarity and empirical grounding to explore migrant dentist experiences in Australia. Because they are based on original migrant dentist narrations, these scales have the potential to offer in-depth insights for policy makers and to support future research on dentist migration. Copyright© 2016 Dennis Barber Ltd
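
    The reliability analysis mentioned above typically rests on internal-consistency statistics; a small Python sketch computing Cronbach's alpha for a hypothetical four-item Likert scale follows. The synthetic responses and the helper name cronbach_alpha are illustrative, not taken from the study.

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, n_items) array of Likert scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(200, 1))   # shared "life story" factor
        scores = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(200, 4))), 1, 5)
        print(cronbach_alpha(scores))        # high alpha -> items form a coherent scale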

  9. Flood hazard studies in Central Texas using orbital and suborbital remote sensing imagery

    NASA Technical Reports Server (NTRS)

    Baker, V. R.; Holz, R. K.; Patton, P. C.

    1975-01-01

    Central Texas is subject to infrequent, unusually intense rainstorms which cause extremely rapid runoff from drainage basins developed on the deeply dissected limestone and marl bedrock of the Edwards Plateau. One approach to flood hazard evaluation in this area is a parametric model relating flood hydrograph characteristics to quantitative geomorphic properties of the drainage basins. The preliminary model uses multiple regression techniques to predict potential peak flood discharge from basin magnitude, drainage density, and ruggedness number. After small catchment networks are mapped from remote sensing imagery, input data for the model are generated by network digitization and analysis with a computer-assisted watershed analysis routine. The study evaluated the network resolution capabilities of the following data formats: (1) large-scale (1:24,000) topographic maps, employing Strahler's "method of v's"; (2) standard low-altitude black-and-white aerial photography (1:13,000 and 1:20,000 scales); (3) NASA-generated aerial infrared photography at scales ranging from 1:48,000 to 1:123,000; and (4) Skylab Earth Resources Experiment Package S-190A and S-190B sensors (1:750,000 and 1:500,000, respectively).
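
    The parametric model described is, in essence, an ordinary least-squares regression on three geomorphic predictors. The sketch below shows the mechanics with made-up basins and discharges; the study's actual coefficients and data are not reproduced here.

        import numpy as np

        # Columns: basin magnitude, drainage density, ruggedness number (synthetic).
        X = np.array([[12, 3.1, 0.42],
                      [30, 4.8, 0.77],
                      [ 7, 2.6, 0.35],
                      [22, 4.1, 0.60],
                      [41, 5.5, 0.91]], dtype=float)
        q_peak = np.array([180., 560., 95., 390., 820.])   # hypothetical peak flows

        A = np.column_stack([np.ones(len(X)), X])          # add an intercept column
        coeffs, *_ = np.linalg.lstsq(A, q_peak, rcond=None)
        print(coeffs)                                       # intercept + three slopes
        print(A @ coeffs)                                   # fitted peak discharges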

  10. Multi-filter spectrophotometry simulations

    NASA Technical Reports Server (NTRS)

    Callaghan, Kim A. S.; Gibson, Brad K.; Hickson, Paul

    1993-01-01

    To complement both the multi-filter observations of quasar environments described in these proceedings and the proposed UBC 2.7 m Liquid Mirror Telescope (LMT) redshift survey, we have initiated a program of simulated multi-filter spectrophotometry. The goal of this work, still very much in progress, is a better quantitative assessment of the multiband technique as a viable mechanism for obtaining useful redshift and morphological class information from large-scale multi-filter surveys.

  11. JPRS Report. Soviet Union: World Economy & International Relations, No. 5, May 1987

    DTIC Science & Technology

    1987-09-22

    ... at the same time substitute for their creative assertiveness and intuition. This applies not only to the elite group of researchers and designers ... achieved such large-scale results in such a short period of time. Not only quantitative but primarily qualitative results. And not simply in the ... into A GROUP OF PERSONALITIES, each of which and all together in time acquiring growing importance in history? Today, in the era of reconstruction ...

  12. Characterization of flow in a scroll duct

    NASA Technical Reports Server (NTRS)

    Begg, E. K.; Bennett, J. C.

    1985-01-01

    A quantitative flow visualization study was made of a partially elliptic cross-section, inward-curving duct (scroll duct) with an axial outflow through a vaneless annular outlet. The working fluid was water, with a Re(d) of 40,000 at the inlet to the scroll duct, this Reynolds number being representative of the conditions in an actual gas turbine scroll. Both still and high-speed moving pictures of fluorescein dye injected into the flow and illuminated by an argon ion laser were used to document the flow. Strong secondary flow, similar to the secondary flow in a pipe bend, was found in the bottom half of the scroll within the first 180 degs of turning. The pressure field set up by the turning duct was strong enough to affect the inlet flow condition. At 90 degs downstream, the large-scale secondary flow was found to be oscillatory in nature. The exit flow was nonuniform in the annular exit. By 270 degs downstream, the flow appeared unorganized with no distinctive secondary flow pattern. Large-scale structures from the upstream core region appeared by 90 degs and continued through the duct to reenter at the inlet section.

  13. Development of a Kelp-type Structure Module in a Coastal Ocean Model to Assess the Hydrodynamic Impact of Seawater Uranium Extraction Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Taiping; Khangaonkar, Tarang; Long, Wen

    2014-02-07

    In recent years, with the rapid growth of global energy demand, interest in extracting uranium from seawater for nuclear energy has been renewed. While extracting seawater uranium is not yet commercially viable, it serves as a "backstop" to conventional uranium resources and provides an essentially unlimited supply of uranium. With recent advances in seawater uranium extraction technology, extracting uranium from seawater could become economically feasible when the extraction devices are deployed at a large scale (e.g., several hundred km2). There is concern, however, that the large-scale deployment of adsorbent farms could result in potential impacts to the hydrodynamic flow field in an oceanic setting. In this study, a kelp-type structure module was incorporated into a coastal ocean model to simulate the blockage effect of uranium extraction devices on the flow field. The module was quantitatively validated against laboratory flume experiments for both velocity and turbulence profiles. The model-data comparison showed an overall good agreement and validated the approach of applying the model to assess the potential hydrodynamic impact of uranium extraction devices or other underwater structures in coastal oceans.
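
    The record does not list the module's equations; canopy modules of this general kind typically add a quadratic drag sink to the momentum equations inside the structure. The following sketch applies such a sink, with illustrative coefficients (drag coefficient cd and frontal area density a), to a velocity profile; it is a schematic of the approach, not the module's implementation.

        import numpy as np

        def apply_canopy_drag(u, dt, cd=1.0, a=0.05):
            """u: velocity profile (m/s) inside the canopy; a: frontal area density (1/m).
            Semi-implicit update keeps the sink stable for large time steps."""
            drag = 0.5 * cd * a * np.abs(u)   # effective drag rate, 1/s
            return u / (1.0 + dt * drag)      # implicit: u_new = u - dt * drag * u_new

        u = np.linspace(0.1, 0.6, 6)          # inflow profile through the farm
        for _ in range(60):                   # one minute of 1-s steps
            u = apply_canopy_drag(u, dt=1.0)
        print(u)                              # velocities attenuated inside the canopy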

  14. Kinetic-MHD simulations of gyroresonance instability driven by CR pressure anisotropy

    NASA Astrophysics Data System (ADS)

    Lebiga, O.; Santos-Lima, R.; Yan, H.

    2018-05-01

    The transport of cosmic rays (CRs) is crucial for the understanding of almost all high-energy phenomena. Both pre-existing large-scale magnetohydrodynamic (MHD) turbulence and locally generated turbulence through plasma instabilities are important for CR propagation in astrophysical media. The potential role of the resonant instability triggered by CR pressure anisotropy in regulating the parallel spatial diffusion of low-energy CRs (≲100 GeV) in the interstellar and intracluster medium of galaxies has been shown in previous theoretical works. This work aims to study the gyroresonance instability via direct numerical simulations, in order to assess quantitatively the wave-particle scattering rates. For this, we employ a 1D PIC-MHD code to follow the growth and saturation of the gyroresonance instability. We extract from the simulations the pitch-angle diffusion coefficient Dμμ produced by the instability during the linear and saturation phases, and very good agreement (within a factor of 3) is found with the values predicted by quasi-linear theory (QLT). Our results support the applicability of the QLT for modelling the scattering of low-energy CRs by the gyroresonance instability in the complex interplay between this instability and the large-scale MHD turbulence.
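
    Extracting Dμμ from particle data usually follows the quasi-linear estimate Dμμ ≈ ⟨(Δμ)²⟩/(2Δt). The sketch below applies this estimator to synthetic pitch-angle random walks standing in for PIC-MHD output, recovering the diffusion coefficient used to generate the data; all numbers are illustrative.

        import numpy as np

        def d_mumu(mu_t, dt):
            """mu_t: (n_particles, n_steps) pitch-angle cosines; returns D_mumu."""
            dmu = np.diff(mu_t, axis=1)
            return (dmu ** 2).mean() / (2.0 * dt)

        rng = np.random.default_rng(1)
        true_d, dt, n_steps = 1e-4, 0.1, 2000
        steps = rng.normal(scale=np.sqrt(2 * true_d * dt), size=(500, n_steps))
        mu = np.clip(np.cumsum(steps, axis=1), -1, 1)   # bounded pitch-angle cosine
        print(d_mumu(mu, dt))                           # recovers ~1e-4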

  15. [Research progress and development trend of quantitative assessment techniques for urban thermal environment].

    PubMed

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has become a focus of urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. These techniques have evolved from statistical analysis of the urban-scale thermal environment based on historical weather-station data toward dynamic simulation and forecasting of the thermal environment at various scales. This study reviews the development of ground meteorological observation, thermal infrared remote sensing, and numerical simulation. Moreover, the potential advantages and disadvantages, applicability, and development trends of these techniques are summarized, aiming to add fundamental knowledge for understanding urban thermal environment assessment and optimization.

  16. Effects of Computer Programming on Students' Cognitive Performance: A Quantitative Synthesis.

    ERIC Educational Resources Information Center

    Liao, Yuen-Kuang Cliff

    A meta-analysis was performed to synthesize existing data concerning the effects of computer programming on cognitive outcomes of students. Sixty-five studies were located from three sources, and their quantitative data were transformed into a common scale--Effect Size (ES). The analysis showed that 58 (89%) of the study-weighted ESs were positive…
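
    The common-scale transformation referred to is typically Cohen's d with a pooled standard deviation. A minimal sketch with invented group statistics follows; it illustrates the computation only and uses none of the synthesized studies' numbers.

        import numpy as np

        def cohens_d(m_treat, sd_treat, n_treat, m_ctrl, sd_ctrl, n_ctrl):
            """Standardized mean difference with pooled SD."""
            pooled = np.sqrt(((n_treat - 1) * sd_treat**2 + (n_ctrl - 1) * sd_ctrl**2)
                             / (n_treat + n_ctrl - 2))
            return (m_treat - m_ctrl) / pooled

        # Hypothetical programming vs. control group on a cognitive test.
        print(cohens_d(m_treat=78.2, sd_treat=9.5, n_treat=30,
                       m_ctrl=72.6, sd_ctrl=10.1, n_ctrl=32))   # positive ES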

  17. Extreme Precipitation and High-Impact Landslides

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia; Adler, Robert; Huffman, George; Peters-Lidard, Christa

    2012-01-01

    It is well known that extreme or prolonged rainfall is the dominant trigger of landslides; however, there remain large uncertainties in characterizing the distribution of these hazards and meteorological triggers at the global scale. Researchers have evaluated the spatiotemporal distribution of extreme rainfall and landslides at local and regional scales primarily using in situ data, yet few studies have mapped rainfall-triggered landslide distribution globally due to the dearth of landslide data and consistent precipitation information. This research uses a newly developed Global Landslide Catalog (GLC) and a 13-year satellite-based precipitation record from Tropical Rainfall Measuring Mission (TRMM) data. For the first time, these two unique products provide the foundation to quantitatively evaluate the co-occurrence of precipitation and rainfall-triggered landslides globally. The GLC, available from 2007 to the present, contains information on reported rainfall-triggered landslide events around the world using online media reports, disaster databases, etc. When evaluating this database, we observed that 2010 had a large number of high-impact landslide events relative to previous years. This study considers how variations in extreme and prolonged satellite-based rainfall are related to the distribution of landslides over the same time scales for three active landslide areas: Central America, the Himalayan Arc, and central-eastern China. Several test statistics confirm that TRMM rainfall generally scales with the observed increase in landslide reports and fatal events for 2010 and previous years over each region. These findings suggest that the co-occurrence of satellite precipitation and landslide reports may serve as a valuable indicator for characterizing the spatiotemporal distribution of landslide-prone areas in order to establish a global rainfall-triggered landslide climatology. This research also considers the sources of this extreme rainfall, citing teleconnections from ENSO as likely contributors to regional precipitation variability. This work demonstrates the potential for using satellite-based precipitation estimates to identify potentially active landslide areas at the global scale in order to improve landslide cataloging and quantify landslide triggering at daily, monthly, and yearly time scales.
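
    The co-occurrence idea can be illustrated compactly: flag days whose rainfall exceeds a high percentile and compare landslide report rates on those days against the rest. The sketch below does this with synthetic daily data; the study's actual statistics, regions, and thresholds are more involved.

        import numpy as np

        rng = np.random.default_rng(2)
        rain = rng.gamma(shape=0.5, scale=8.0, size=365)   # synthetic daily rainfall (mm)
        landslides = rng.poisson(lam=0.02 + 0.01 * rain)   # reports scale with rain

        threshold = np.percentile(rain, 95)                # "extreme" rain days
        extreme = rain > threshold
        print(landslides[extreme].mean() / landslides[~extreme].mean())
        # ratio > 1: landslide reports concentrate on extreme-rain days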

  18. Nanosecond laser coloration on stainless steel surface.

    PubMed

    Lu, Yan; Shi, Xinying; Huang, Zhongjia; Li, Taohai; Zhang, Meng; Czajkowski, Jakub; Fabritius, Tapio; Huttula, Marko; Cao, Wei

    2017-08-02

    In this work, we present laser coloration of 304 stainless steel using a nanosecond laser. Surface modifications are tuned by adjusting the laser parameters of scanning speed, repetition rate, and pulse width. A comprehensive study of the physical mechanism leading to the observed appearance is presented. Microscopic patterns are measured and employed as input to simulate light-matter interference, while chemical states and crystal structures of the composites are analyzed to determine intrinsic colors. Quantitative analysis clarifies that the final colors and RGB values are combinations of structural colors and the intrinsic colors of the oxidized pigments, with the latter dominating. The engineering and scientific insights from nanosecond laser coloration therefore point toward large-scale use of the present route for colorful, resistant steels.
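
    The structural-color contribution mentioned above can be sketched with a two-beam thin-film interference model: reflectance fringes from an oxide layer shift with layer thickness. The refractive index and thicknesses below are illustrative assumptions, and the model deliberately ignores the absorption that gives the oxide pigments their dominant intrinsic color.

        import numpy as np

        def reflectance(wavelength_nm, d_nm, n_film=2.5):
            """Normal-incidence two-beam approximation; neglects absorption."""
            delta = 4 * np.pi * n_film * d_nm / wavelength_nm   # optical phase difference
            return 0.5 * (1 + np.cos(delta))                    # normalized fringe pattern

        wl = np.array([450.0, 550.0, 650.0])    # crude blue, green, red samples
        for d in (50, 100, 150):                # growing oxide thickness (nm)
            print(d, reflectance(wl, d).round(2))   # hue shifts with thickness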

  19. Quantitative evidence for the effects of multiple drivers on continental-scale amphibian declines

    USGS Publications Warehouse

    Grant, Evan H. Campbell; Miller, David A. W.; Schmidt, Benedikt R.; Adams, Michael J.; Amburgey, Staci M.; Chambert, Thierry A.; Cruickshank, Sam S.; Fisher, Robert N.; Green, David M.; Hossack, Blake R.; Johnson, Pieter T.J.; Joseph, Maxwell B.; Rittenhouse, Tracy A. G.; Ryan, Maureen E.; Waddle, J. Hardin; Walls, Susan C.; Bailey, Larissa L.; Fellers, Gary M.; Gorman, Thomas A.; Ray, Andrew M.; Pilliod, David S.; Price, Steven J.; Saenz, Daniel; Sadinski, Walt; Muths, Erin L.

    2016-01-01

    Since amphibian declines were first proposed as a global phenomenon over a quarter century ago, the conservation community has made little progress in halting or reversing these trends. The early search for a “smoking gun” was replaced with the expectation that declines are caused by multiple drivers. While field observations and experiments have identified factors leading to increased local extinction risk, evidence for effects of these drivers is lacking at large spatial scales. Here, we use observations of 389 time-series of 83 species and complexes from 61 study areas across North America to test the effects of 4 of the major hypothesized drivers of declines. While we find that local amphibian populations are being lost from metapopulations at an average rate of 3.79% per year, these declines are not related to any particular threat at the continental scale; likewise the effect of each stressor is variable at regional scales. This result - that exposure to threats varies spatially, and populations vary in their response - provides little generality in the development of conservation strategies. Greater emphasis on local solutions to this globally shared phenomenon is needed.

  20. 2D and 3D characterization of pore defects in die cast AM60

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zhuofei; Maurey, Alexandre

    2016-04-15

    The widespread application of die castings can be hampered by the potential for large-scale porosity to act as nucleation sites for fracture and fatigue. It is therefore important to develop robust approaches to the characterization of porosity, providing parameters that can be linked to the material's mechanical properties. We have tackled this problem in a study of the AM60 die-cast Mg alloy, using samples extracted from a prototype shock tower. A quantitative characterization of porosity has been undertaken, analyzing porosity both in 2D (using classical metallographic methods) and in 3D (using X-ray computed tomography (XCT)). Metallographic characterization results show that shrinkage pores and small gas pores can be distinguished based on their distinct geometrical features. Shrinkage pores are irregular with multiple arms, resulting in a form factor less than 0.4. In contrast, gas pores are generally more circular in shape, yielding form factors larger than 0.6. XCT provides deeper insight into the shape of pores, although this understanding is limited by the resolution obtainable with laboratory-based XCT. It also shows how 2D sectioning can produce artefacts, as single complex pores are sectioned into multiple small pores. Highlights:
    • Mg (e.g., AM60) die castings may contain large-scale porosity that acts as nucleation sites for fracture and fatigue.
    • Quantitative characterization of porosity by metallography (2D) and X-ray tomography (3D) is used.
    • Shrinkage pores and small gas pores can be distinguished based on their distinct geometrical features.
    • Shrinkage pores are irregular, giving a form factor < 0.4; gas pores are rounder, with form factors > 0.6.
    • XCT enables pore visualization, although limited by the resolution obtainable with laboratory-based XCT.
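
    The 2D classification rule reduces to a single shape metric, the form factor 4πA/P², which equals 1 for a circle and falls toward 0 for multi-armed outlines. The sketch below applies the abstract's 0.4 and 0.6 thresholds to toy shapes; the helper names are hypothetical.

        import numpy as np

        def form_factor(area, perimeter):
            """4*pi*A/P^2: 1 for a circle, smaller for irregular outlines."""
            return 4 * np.pi * area / perimeter**2

        def classify(ff):
            if ff < 0.4:
                return "shrinkage pore (irregular, multi-armed)"
            if ff > 0.6:
                return "gas pore (near-circular)"
            return "ambiguous"

        r = 10.0
        print(classify(form_factor(np.pi * r**2, 2 * np.pi * r)))      # circle -> gas pore
        # A star-like outline: same area, five times the perimeter.
        print(classify(form_factor(np.pi * r**2, 5 * 2 * np.pi * r)))  # -> shrinkage pore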
