Sample records for large scale quantitative

  1. The Relevancy of Large-Scale, Quantitative Methodologies in Middle Grades Education Research

    ERIC Educational Resources Information Center

    Mertens, Steven B.

    2006-01-01

    This article examines the relevancy of large-scale, quantitative methodologies in middle grades education research. Based on recommendations from national advocacy organizations, the need for more large-scale, quantitative research, combined with the application of more rigorous methodologies, is presented. Subsequent sections describe and discuss…

  2. Quantitative analysis of voids in percolating structures in two-dimensional N-body simulations

    NASA Technical Reports Server (NTRS)

    Harrington, Patrick M.; Melott, Adrian L.; Shandarin, Sergei F.

    1993-01-01

    We present in this paper a quantitative method for defining void size in large-scale structure based on percolation threshold density. Beginning with two-dimensional gravitational clustering simulations smoothed to the threshold of nonlinearity, we perform percolation analysis to determine the large scale structure. The resulting objective definition of voids has a natural scaling property, is topologically interesting, and can be applied immediately to redshift surveys.
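
    Illustration (not from the record above): one simple way to label under-dense regions of a gridded density field and measure their sizes, as a toy stand-in for the percolation-based void definition this abstract describes. The Gaussian random field and the fixed percentile threshold are assumptions for demonstration only; the paper derives its threshold from percolation analysis of N-body simulation data.

        import numpy as np
        from scipy import ndimage

        def void_sizes(density, threshold):
            # Label connected under-dense regions ("voids") and return their
            # areas in grid cells, largest first.
            labels, n_voids = ndimage.label(density < threshold)
            return np.sort(np.bincount(labels.ravel())[1:])[::-1]

        # Toy stand-in for a smoothed 2D simulation slice: a Gaussian random field.
        rng = np.random.default_rng(0)
        field = ndimage.gaussian_filter(rng.normal(size=(256, 256)), sigma=4)
        sizes = void_sizes(field, threshold=np.percentile(field, 40))
        print(f"{sizes.size} voids; the largest spans {sizes[0]} cells")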

  3. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2010-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer...research, we designed a pilot study utilizing large-scale parallel Grid computing harnessing nationwide infrastructure for medical image analysis. Also

  4. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gritsenko, Marina A.; Xu, Zhe; Liu, Tao

    Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labelling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  5. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS.

    PubMed

    Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D

    2016-01-01

    Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  6. Quantitative Large-Scale Three-Dimensional Imaging of Human Kidney Biopsies: A Bridge to Precision Medicine in Kidney Disease.

    PubMed

    Winfree, Seth; Dagher, Pierre C; Dunn, Kenneth W; Eadon, Michael T; Ferkowicz, Michael; Barwinska, Daria; Kelly, Katherine J; Sutton, Timothy A; El-Achkar, Tarek M

    2018-06-05

    Kidney biopsy remains the gold standard for uncovering the pathogenesis of acute and chronic kidney diseases. However, the ability to perform high resolution, quantitative, molecular and cellular interrogation of this precious tissue is still at a developing stage compared to other fields such as oncology. Here, we discuss recent advances in performing large-scale, three-dimensional (3D), multi-fluorescence imaging of kidney biopsies and quantitative analysis referred to as 3D tissue cytometry. This approach allows the accurate measurement of specific cell types and their spatial distribution in a thick section spanning the entire length of the biopsy. By uncovering specific disease signatures, including rare occurrences, and linking them to the biology in situ, this approach will enhance our understanding of disease pathogenesis. Furthermore, by providing accurate quantitation of cellular events, 3D cytometry may improve the accuracy of prognosticating the clinical course and response to therapy. Therefore, large-scale 3D imaging and cytometry of kidney biopsy is poised to become a bridge towards personalized medicine for patients with kidney disease. © 2018 S. Karger AG, Basel.

  7. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2009-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer...pilot study to utilize large-scale parallel Grid computing to harness the nationwide cluster infrastructure for optimization of medical image ... analysis parameters. Additionally, we investigated the use of cutting-edge data analysis/mining techniques as applied to Ultrasound, FFDM, and DCE-MRI Breast

  8. Complexity-aware simple modeling.

    PubMed

    Gómez-Schiavon, Mariana; El-Samad, Hana

    2018-02-26

    Mathematical models continue to be essential for deepening our understanding of biology. On one extreme, simple or small-scale models help delineate general biological principles. However, the parsimony of detail in these models as well as their assumption of modularity and insulation make them inaccurate for describing quantitative features. On the other extreme, large-scale and detailed models can quantitatively recapitulate a phenotype of interest, but have to rely on many unknown parameters, making them often difficult to parse mechanistically and to use for extracting general principles. We discuss some examples of a new approach, complexity-aware simple modeling, that can bridge the gap between the small-scale and large-scale approaches. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Modal Analysis of an Aircraft Fuselage Panel using Experimental and Finite-Element Techniques

    NASA Technical Reports Server (NTRS)

    Fleming, Gary A.; Buehrle, Ralph D.; Storaasli, Olaf L.

    1998-01-01

    The application of Electro-Optic Holography (EOH) for measuring the center bay vibration modes of an aircraft fuselage panel under forced excitation is presented. The requirement of free-free panel boundary conditions made the acquisition of quantitative EOH data challenging since large scale rigid body motions corrupted measurements of the high frequency vibrations of interest. Image processing routines designed to minimize effects of large scale motions were applied to successfully resurrect quantitative EOH vibrational amplitude measurements.

  10. Tools for understanding landscapes: combining large-scale surveys to characterize change. Chapter 9.

    Treesearch

    W. Keith Moser; Janine Bolliger; Don C. Bragg; Mark H. Hansen; Mark A. Hatfield; Timothy A. Nigh; Lisa A. Schulte

    2008-01-01

    All landscapes change continuously. Since change is perceived and interpreted through measures of scale, any quantitative analysis of landscapes must identify and describe the spatiotemporal mosaics shaped by large-scale structures and processes. This process is controlled by core influences, or "drivers," that shape the change and affect the outcome...

  11. Parallel and serial computing tools for testing single-locus and epistatic SNP effects of quantitative traits in genome-wide association studies

    PubMed Central

    Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang

    2008-01-01

    Background Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large scale GWAS. Results The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large scale GWAS and achieved excellent scalability for large scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large scale GWAS, and the epiSNP serial computing programs are convenient tools for epistasis analysis in small scale GWAS using commonly available computer hardware. PMID:18644146
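
    Illustration (not the EPISNP code): a minimal sketch of one of the tests the abstract lists, an additive × additive interaction between two SNPs on a quantitative trait, fit here by ordinary least squares on synthetic genotypes. The genotype coding, effect sizes, and use of statsmodels are assumptions; the actual programs implement the full extended Kempthorne model with genotypic, additive, dominance, and four epistatic components.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 2000
        g1 = rng.integers(0, 3, n)          # SNP 1 genotype coded as minor-allele count (0/1/2)
        g2 = rng.integers(0, 3, n)          # SNP 2 genotype
        # Simulated quantitative trait with an additive x additive epistatic term.
        y = 0.2 * g1 + 0.1 * g2 + 0.3 * g1 * g2 + rng.normal(size=n)

        X = sm.add_constant(np.column_stack([g1, g2, g1 * g2]))
        fit = sm.OLS(y, X).fit()
        print("additive x additive interaction p-value:", fit.pvalues[3])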

  12. Quantitative Serum Nuclear Magnetic Resonance Metabolomics in Large-Scale Epidemiology: A Primer on -Omic Technologies

    PubMed Central

    Kangas, Antti J; Soininen, Pasi; Lawlor, Debbie A; Davey Smith, George; Ala-Korpela, Mika

    2017-01-01

    Detailed metabolic profiling in large-scale epidemiologic studies has uncovered novel biomarkers for cardiometabolic diseases and clarified the molecular associations of established risk factors. A quantitative metabolomics platform based on nuclear magnetic resonance spectroscopy has found widespread use, already profiling over 400,000 blood samples. Over 200 metabolic measures are quantified per sample; in addition to many biomarkers routinely used in epidemiology, the method simultaneously provides fine-grained lipoprotein subclass profiling and quantification of circulating fatty acids, amino acids, gluconeogenesis-related metabolites, and many other molecules from multiple metabolic pathways. Here we focus on applications of magnetic resonance metabolomics for quantifying circulating biomarkers in large-scale epidemiology. We highlight the molecular characterization of risk factors, use of Mendelian randomization, and the key issues of study design and analyses of metabolic profiling for epidemiology. We also detail how integration of metabolic profiling data with genetics can enhance drug development. We discuss why quantitative metabolic profiling is becoming widespread in epidemiology and biobanking. Although large-scale applications of metabolic profiling are still novel, it seems likely that comprehensive biomarker data will contribute to etiologic understanding of various diseases and abilities to predict disease risks, with the potential to translate into multiple clinical settings. PMID:29106475

  13. Harnessing quantitative genetics and genomics for understanding and improving complex traits in crops

    USDA-ARS?s Scientific Manuscript database

    Classical quantitative genetics aids crop improvement by providing the means to estimate heritability, genetic correlations, and predicted responses to various selection schemes. Genomics has the potential to aid quantitative genetics and applied crop improvement programs via large-scale, high-thro...

  14. Quantitative Missense Variant Effect Prediction Using Large-Scale Mutagenesis Data.

    PubMed

    Gray, Vanessa E; Hause, Ronald J; Luebeck, Jens; Shendure, Jay; Fowler, Douglas M

    2018-01-24

    Large datasets describing the quantitative effects of mutations on protein function are becoming increasingly available. Here, we leverage these datasets to develop Envision, which predicts the magnitude of a missense variant's molecular effect. Envision combines 21,026 variant effect measurements from nine large-scale experimental mutagenesis datasets, a hitherto untapped training resource, with a supervised, stochastic gradient boosting learning algorithm. Envision outperforms other missense variant effect predictors both on large-scale mutagenesis data and on an independent test dataset comprising 2,312 TP53 variants whose effects were measured using a low-throughput approach. This dataset was never used for hyperparameter tuning or model training and thus serves as an independent validation set. Envision prediction accuracy is also more consistent across amino acids than other predictors. Finally, we demonstrate that Envision's performance improves as more large-scale mutagenesis data are incorporated. We precompute Envision predictions for every possible single amino acid variant in human, mouse, frog, zebrafish, fruit fly, worm, and yeast proteomes (https://envision.gs.washington.edu/). Copyright © 2017 Elsevier Inc. All rights reserved.
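
    Illustration (not the Envision pipeline): a generic supervised gradient boosting regression on synthetic per-variant features, showing the kind of model the abstract describes. The feature matrix, labels, and hyperparameters below are placeholders; Envision trains a stochastic gradient boosting model on 21,026 real variant effect measurements with its own feature set.

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in for per-variant features (e.g., substitution and structural scores).
        rng = np.random.default_rng(2)
        X = rng.normal(size=(2000, 8))
        y = X @ rng.normal(size=8) + 0.3 * rng.normal(size=2000)   # synthetic "variant effect"

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
        model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
        model.fit(X_tr, y_tr)
        print("held-out R^2:", round(model.score(X_te, y_te), 3))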

  15. Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.

    PubMed

    Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner

    2016-01-01

    Sampling limitations in electron microscopy raise the question of whether the analysis of a bulk material is representative, especially when analyzing hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images with sufficient resolution, subsequently stitched together to generate a large-area map using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large-scale maps provides representative information that can be related to the synthesis and process conditions of hierarchical materials, which moves electron microscopy analysis towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Boarding School, Academic Motivation and Engagement, and Psychological Well-Being: A Large-Scale Investigation

    ERIC Educational Resources Information Center

    Martin, Andrew J.; Papworth, Brad; Ginns, Paul; Liem, Gregory Arief D.

    2014-01-01

    Boarding school has been a feature of education systems for centuries. Minimal large-scale quantitative data have been collected to examine its association with important educational and other outcomes. The present study represents one of the largest studies into boarding school conducted to date. It investigates boarding school and students'…

  17. Image segmentation evaluation for very-large datasets

    NASA Astrophysics Data System (ADS)

    Reeves, Anthony P.; Liu, Shuang; Xie, Yiting

    2016-03-01

    With the advent of modern machine learning methods and fully automated image analysis there is a need for very large image datasets having documented segmentations for both computer algorithm training and evaluation. Current approaches of visual inspection and manual markings do not scale well to big data. We present a new approach that depends on fully automated algorithm outcomes for segmentation documentation, requires no manual marking, and provides quantitative evaluation for computer algorithms. The documentation of new image segmentations and new algorithm outcomes is achieved by visual inspection. The burden of visual inspection on large datasets is minimized by (a) customized visualizations for rapid review and (b) reducing the number of cases to be reviewed through analysis of quantitative segmentation evaluation. This method has been applied to a dataset of 7,440 whole-lung CT images for 6 different segmentation algorithms designed to fully automatically facilitate the measurement of a number of very important quantitative image biomarkers. The results indicate that we could achieve 93% to 99% successful segmentation for these algorithms on this relatively large image database. The presented evaluation method may be scaled to much larger image databases.
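
    Illustration (not the authors' pipeline): a sketch of how quantitative evaluation can shrink the set of cases requiring visual inspection, by flagging only segmentations whose derived measurement (a hypothetical whole-lung volume in mL) is implausible or a cohort outlier. The plausible range, z-score cutoff, and synthetic volumes are assumptions, not values from the paper.

        import numpy as np

        def flag_for_review(volumes_ml, plausible=(3000.0, 8000.0), z_cut=3.0):
            # Flag cases whose measured volume falls outside a plausible absolute range
            # or is a statistical outlier within the cohort; only these need inspection.
            v = np.asarray(volumes_ml, dtype=float)
            z = (v - v.mean()) / v.std()
            return np.flatnonzero((v < plausible[0]) | (v > plausible[1]) | (np.abs(z) > z_cut))

        volumes = np.random.default_rng(3).normal(5500.0, 600.0, size=7440)
        volumes[[10, 500]] = [900.0, 15000.0]        # two deliberately failed segmentations
        print("cases needing visual inspection:", flag_for_review(volumes))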

  18. Post-16 Physics and Chemistry Uptake: Combining Large-Scale Secondary Analysis with In-Depth Qualitative Methods

    ERIC Educational Resources Information Center

    Hampden-Thompson, Gillian; Lubben, Fred; Bennett, Judith

    2011-01-01

    Quantitative secondary analysis of large-scale data can be combined with in-depth qualitative methods. In this paper, we discuss the role of this combined methods approach in examining the uptake of physics and chemistry in post-compulsory schooling for students in England. The secondary data analysis of the National Pupil Database (NPD) served…

  19. A quantitative approach to the topology of large-scale structure. [for galactic clustering computation]

    NASA Technical Reports Server (NTRS)

    Gott, J. Richard, III; Weinberg, David H.; Melott, Adrian L.

    1987-01-01

    A quantitative measure of the topology of large-scale structure, the genus of density contours in a smoothed density distribution, is described and applied. For random phase (Gaussian) density fields, the mean genus per unit volume exhibits a universal dependence on threshold density, with a normalizing factor that can be calculated from the power spectrum. If large-scale structure formed from the gravitational instability of small-amplitude density fluctuations, the topology observed today on suitable scales should follow the topology in the initial conditions. The technique is illustrated by applying it to simulations of galaxy clustering in a flat universe dominated by cold dark matter. The technique is also applied to a volume-limited sample of the CfA redshift survey and to a model in which galaxies reside on the surfaces of polyhedral 'bubbles'. The topology of the evolved mass distribution and 'biased' galaxy distribution in the cold dark matter models closely matches the topology of the density fluctuations in the initial conditions. The topology of the observational sample is consistent with the random phase, cold dark matter model.
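
    Illustration: the analytic genus curve for a Gaussian (random phase) density field referred to in the abstract, g(ν) ∝ (1 − ν²) exp(−ν²/2), where ν is the density threshold in units of the field's standard deviation. The amplitude, which the paper computes from the power spectrum, is left as a free parameter here.

        import numpy as np

        def gaussian_genus_density(nu, amplitude=1.0):
            # Genus per unit volume of excursion sets of a Gaussian random field:
            # g(nu) = A * (1 - nu^2) * exp(-nu^2 / 2), with nu the density threshold
            # in units of the field's standard deviation; A is set by the power spectrum.
            return amplitude * (1.0 - nu**2) * np.exp(-(nu**2) / 2.0)

        for nu in np.linspace(-2.0, 2.0, 9):
            print(f"nu = {nu:+.1f}   genus density = {gaussian_genus_density(nu):+.3f}")
        # Positive genus near nu = 0 indicates a sponge-like topology; negative values in
        # the tails correspond to isolated clusters (high nu) or isolated voids (low nu).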

  20. Large scale anomalies in the microwave background: causation and correlation.

    PubMed

    Aslanyan, Grigor; Easther, Richard

    2013-12-27

    Most treatments of large scale anomalies in the microwave sky are a posteriori, with unquantified look-elsewhere effects. We contrast these with physical models of specific inhomogeneities in the early Universe which can generate these apparent anomalies. Physical models predict correlations between candidate anomalies and the corresponding signals in polarization and large scale structure, reducing the impact of cosmic variance. We compute the apparent spatial curvature associated with large-scale inhomogeneities and show that it is typically small, allowing for a self-consistent analysis. As an illustrative example we show that a single large plane wave inhomogeneity can contribute to low-l mode alignment and odd-even asymmetry in the power spectra and the best-fit model accounts for a significant part of the claimed odd-even asymmetry. We argue that this approach can be generalized to provide a more quantitative assessment of potential large scale anomalies in the Universe.

  1. Experimental feasibility study of the application of magnetic suspension techniques to large-scale aerodynamic test facilities

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1974-01-01

    Based on the premises that (1) magnetic suspension techniques can play a useful role in large-scale aerodynamic testing and (2) superconductor technology offers the only practical hope for building large-scale magnetic suspensions, an all-superconductor three-component magnetic suspension and balance facility was built as a prototype and was tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities have been made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.

  2. Experimental feasibility study of the application of magnetic suspension techniques to large-scale aerodynamic test facilities. [cryogenic transonic wind tunnel]

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1975-01-01

    Based on the premises that magnetic suspension techniques can play a useful role in large scale aerodynamic testing, and that superconductor technology offers the only practical hope for building large scale magnetic suspensions, an all-superconductor 3-component magnetic suspension and balance facility was built as a prototype and tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities at Langley Research Center were made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.

  3. Research-Based Recommendations for the Use of Accommodations in Large-Scale Assessments: 2012 Update. Practical Guidelines for the Education of English Language Learners. Book 4

    ERIC Educational Resources Information Center

    Kieffer, Michael J.; Rivera, Mabel; Francis, David J.

    2012-01-01

    This report presents results from a new quantitative synthesis of research on the effectiveness and validity of test accommodations for English language learners (ELLs) taking large-scale assessments. In 2006, the Center on Instruction published a review of the literature on test accommodations for ELLs titled "Practical Guidelines for the…

  4. Emergence of Multiscaling in a Random-Force Stirred Fluid

    NASA Astrophysics Data System (ADS)

    Yakhot, Victor; Donzis, Diego

    2017-07-01

    We consider the transition to strong turbulence in an infinite fluid stirred by a Gaussian random force. The transition is defined as a first appearance of anomalous scaling of normalized moments of velocity derivatives (dissipation rates) emerging from the low-Reynolds-number Gaussian background. It is shown that, due to multiscaling, strongly intermittent rare events can be quantitatively described in terms of an infinite number of different "Reynolds numbers" reflecting a multitude of anomalous scaling exponents. The theoretically predicted transition disappears at Rλ ≤ 3. The developed theory is in quantitative agreement with the outcome of large-scale numerical simulations.

  5. Response of deep and shallow tropical maritime cumuli to large-scale processes

    NASA Technical Reports Server (NTRS)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal distribution of mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the radiative cooling is compensated by the heating due to the large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  6. MsViz: A Graphical Software Tool for In-Depth Manual Validation and Quantitation of Post-translational Modifications.

    PubMed

    Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo

    2017-08-04

    Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.

  7. Solar Wind Turbulent Cascade from MHD to Sub-ion Scales: Large-size 3D Hybrid Particle-in-cell Simulations

    NASA Astrophysics Data System (ADS)

    Franci, Luca; Landi, Simone; Verdini, Andrea; Matteini, Lorenzo; Hellinger, Petr

    2018-01-01

    Properties of the turbulent cascade from fluid to kinetic scales in collisionless plasmas are investigated by means of large-size 3D hybrid (fluid electrons, kinetic protons) particle-in-cell simulations. Initially isotropic Alfvénic fluctuations rapidly develop a strongly anisotropic turbulent cascade, mainly in the direction perpendicular to the ambient magnetic field. The omnidirectional magnetic field spectrum shows a double power-law behavior over almost two decades in wavenumber, with a Kolmogorov-like index at large scales, a spectral break around ion scales, and a steepening at sub-ion scales. Power laws are also observed in the spectra of the ion bulk velocity, density, and electric field, at both magnetohydrodynamic (MHD) and kinetic scales. Despite the complex structure, the omnidirectional spectra of all fields at ion and sub-ion scales are in remarkable quantitative agreement with those of a 2D simulation with similar physical parameters. This provides a partial, a posteriori validation of the 2D approximation at kinetic scales. Conversely, at MHD scales, the spectra of the density and of the velocity (and, consequently, of the electric field) exhibit differences between the 2D and 3D cases. Although they can be partly ascribed to the lower spatial resolution, the main reason is likely the larger importance of compressible effects in the full 3D geometry. Our findings are also in remarkable quantitative agreement with solar wind observations.

  8. A high throughput geocomputing system for remote sensing quantitative retrieval and a case study

    NASA Astrophysics Data System (ADS)

    Xue, Yong; Chen, Ziqiang; Xu, Hui; Ai, Jianwen; Jiang, Shuzheng; Li, Yingjie; Wang, Ying; Guang, Jie; Mei, Linlu; Jiao, Xijuan; He, Xingwei; Hou, Tingting

    2011-12-01

    The quality and accuracy of remote sensing instruments have improved significantly; however, rapid processing of large-scale remote sensing data has become the bottleneck for remote sensing quantitative retrieval applications. Remote sensing quantitative retrieval is a data-intensive computation application, which is one of the research issues of high throughput computation. The remote sensing quantitative retrieval Grid workflow is a high-level core component of the remote sensing Grid, which is used to support the modeling, reconstruction and implementation of large-scale complex applications of remote sensing science. In this paper, we study middleware components of the remote sensing Grid, namely the dynamic Grid workflow based on the remote sensing quantitative retrieval application on a Grid platform. We designed a novel architecture for the remote sensing Grid workflow. According to this architecture, we constructed the Remote Sensing Information Service Grid Node (RSSN) with Condor. We developed graphical user interface (GUI) tools to compose remote sensing processing Grid workflows, and took aerosol optical depth (AOD) retrieval as an example. The case study showed that significant improvement in system performance could be achieved with this implementation. The results also give a perspective on the potential of applying Grid workflow practices to remote sensing quantitative retrieval problems using commodity-class PCs.

  9. The Untapped Promise of Secondary Data Sets in International and Comparative Education Policy Research

    ERIC Educational Resources Information Center

    Chudagr, Amita; Luschei, Thomas F.

    2016-01-01

    The objective of this commentary is to call attention to the feasibility and importance of large-scale, systematic, quantitative analysis in international and comparative education research. We contend that although many existing databases are under- or unutilized in quantitative international-comparative research, these resources present the…

  10. "Invisible" Bilingualism--"Invisible" Language Ideologies: Greek Teachers' Attitudes Towards Immigrant Pupils' Heritage Languages

    ERIC Educational Resources Information Center

    Gkaintartzi, Anastasia; Kiliari, Angeliki; Tsokalidou, Roula

    2015-01-01

    This paper presents data from two studies--a nationwide quantitative research and an ethnographic study--on Greek schoolteachers' attitudes towards immigrant pupils' bilingualism. The quantitative data come from a large-scale questionnaire survey, which aimed at the investigation of the needs and requirements for the implementation of a pilot…

  11. Confirmatory Factor Analytic Structure and Measurement Invariance of Quantitative Autistic Traits Measured by the Social Responsiveness Scale-2

    ERIC Educational Resources Information Center

    Frazier, Thomas W.; Ratliff, Kristin R.; Gruber, Chris; Zhang, Yi; Law, Paul A.; Constantino, John N.

    2014-01-01

    Understanding the factor structure of autistic symptomatology is critical to the discovery and interpretation of causal mechanisms in autism spectrum disorder. We applied confirmatory factor analysis and assessment of measurement invariance to a large ("N" = 9635) accumulated collection of reports on quantitative autistic traits using…

  12. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16 month period.

  13. A leap forward in geographic scale for forest ectomycorrhizal fungi

    Treesearch

    Filipa Cox; Nadia Barsoum; Martin I. Bidartondo; Isabella Børja; Erik Lilleskov; Lars O. Nilsson; Pasi Rautio; Kath Tubby; Lars Vesterdal

    2010-01-01

    In this letter we propose a first large-scale assessment of mycorrhizas with a European-wide network of intensively monitored forest plots as a research platform. This effort would create a qualitative and quantitative shift in mycorrhizal research by delivering the first continental-scale map of mycorrhizal fungi. Readers may note that several excellent detailed...

  14. [Development and application of morphological analysis method in Aspergillus niger fermentation].

    PubMed

    Tang, Wenjun; Xia, Jianye; Chu, Ju; Zhuang, Yingping; Zhang, Siliang

    2015-02-01

    Filamentous fungi are widely used in industrial fermentation. Particular fungal morphology acts as a critical index for a successful fermentation. To break the bottleneck of morphological analysis, we have developed a reliable method for fungal morphological analysis. By this method, we can prepare hundreds of pellet samples simultaneously and obtain quantitative morphological information at large scale quickly. This method can largely increase the accuracy and reliability of morphological analysis result. Based on that, the studies of Aspergillus niger morphology under different oxygen supply conditions and shear rate conditions were carried out. As a result, the morphological responding patterns of A. niger morphology to these conditions were quantitatively demonstrated, which laid a solid foundation for the further scale-up.

  15. Large-scale quantitative analysis of painting arts.

    PubMed

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.
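
    Illustration (not the authors' code): simple stand-ins for two of the three measures named in the abstract, color variety as the number of distinct quantized RGB colors and brightness roughness as a Hurst-type exponent estimated from horizontal increments. The quantization level, lag range, and random test image are assumptions; the paper's exact definitions may differ.

        import numpy as np

        def color_variety(img, bins_per_channel=16):
            # Count distinct quantized RGB colors in an H x W x 3 uint8 image.
            q = (img // (256 // bins_per_channel)).reshape(-1, 3)
            return len(np.unique(q, axis=0))

        def brightness_roughness(brightness, max_lag=64):
            # Slope of log std(horizontal increment) versus log lag: a simple
            # Hurst-type roughness estimate for the brightness field.
            lags = np.unique(np.logspace(0, np.log10(max_lag), 10).astype(int))
            sigma = [np.std(brightness[:, lag:] - brightness[:, :-lag]) for lag in lags]
            slope, _ = np.polyfit(np.log(lags), np.log(sigma), 1)
            return slope

        img = np.random.default_rng(4).integers(0, 256, size=(256, 256, 3), dtype=np.uint8)
        print("distinct quantized colors:", color_variety(img))
        print("roughness exponent:", round(brightness_roughness(img.mean(axis=2)), 2))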

  16. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    PubMed

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  17. Collection of quantitative chemical release field data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demirgian, J.; Macha, S.; Loyola Univ.

    1999-01-01

    Detection and quantitation of chemicals in the environment requires Fourier-transform infrared (FTIR) instruments that are properly calibrated and tested. This calibration and testing requires field testing using matrices that are representative of actual instrument use conditions. Three methods commonly used for developing calibration files and training sets in the field are a closed optical cell or chamber, a large-scale chemical release, and a small-scale chemical release. There is no best method. The advantages and limitations of each method should be considered in evaluating field results. Proper calibration characterizes the sensitivity of an instrument, its ability to detect a component in different matrices, and the quantitative accuracy and precision of the results.

  18. Commentary: The Observed Association between Autistic Severity Measured by the Social Responsiveness Scale (SRS) and General Psychopathology-- A Response to Hus et al.

    ERIC Educational Resources Information Center

    Constantino, John N.; Frazier, Thomas W.

    2013-01-01

    In their analysis of the accumulated data from the clinically ascertained Simons Simplex Collection (SSC), Hus et al. (2013) provide a large-scale clinical replication of previously reported associations (see Constantino, Hudziak & Todd, 2003) between quantitative autistic traits [as measured by the Social Responsiveness Scale (SRS)] and…

  19. Heritage Language Maintenance and Education in the Greek Sociolinguistic Context: Albanian Immigrant Parents' Views

    ERIC Educational Resources Information Center

    Gkaintartzi, Anastasia; Kiliari, Angeliki; Tsokalidou, Roula

    2016-01-01

    This paper presents data from two studies--a nationwide quantitative research and an ethnographic study--on immigrant parents' perspectives about heritage language maintenance and education in Greek state schools. The quantitative data come from a large-scale questionnaire survey, which aimed at the investigation of the needs and requirements for…

  20. Asynchronous adaptive time step in quantitative cellular automata modeling

    PubMed Central

    Zhu, Hao; Pang, Peter YH; Sun, Yan; Dhar, Pawan

    2004-01-01

    Background The behaviors of cells in metazoans are context dependent, thus large-scale multi-cellular modeling is often necessary, for which cellular automata are natural candidates. Two related issues are involved in cellular automata based multi-cellular modeling: how to introduce differential equation based quantitative computing to precisely describe cellular activity, and, upon it, how to solve the heavy time consumption issue in simulation. Results Based on a modified, language-based cellular automata system that we extended to allow ordinary differential equations in models, we introduce a method implementing an asynchronous adaptive time step in simulation that can considerably improve efficiency without a significant sacrifice of accuracy. An average speedup rate of 4–5 is achieved in the given example. Conclusions Strategies for reducing time consumption in simulation are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. A distributed, adaptive time step is a practical solution in a cellular automata environment. PMID:15222901
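
    Illustration (not the authors' system): a minimal sketch of the asynchronous adaptive time step idea on two independent cells, each integrating a simple ODE dy/dt = -rate * y with its own local clock and step size, so slowly varying cells take far fewer steps than fast ones. The ODE, tolerance, and step-control rule are assumptions chosen only to show the mechanism.

        import numpy as np

        def simulate(y0, rates, t_end, tol=1e-3, dt_min=1e-4, dt_max=0.1):
            # Each cell integrates dy/dt = -rate * y with its own adaptive time step:
            # the cell whose local clock lags furthest behind is advanced next, so slow
            # cells take few large steps and fast cells many small ones.
            # dt_max is kept below the explicit-Euler stability limit 2 / max(rates).
            y = np.array(y0, dtype=float)
            t = np.zeros_like(y)                 # per-cell local clocks
            dt = np.full_like(y, dt_min)
            steps = np.zeros_like(y, dtype=int)
            while np.any(t < t_end):
                i = int(np.argmin(t))            # advance the most-lagging cell
                dydt = -rates[i] * y[i]
                y[i] += dt[i] * dydt             # forward Euler step
                t[i] += dt[i]
                steps[i] += 1
                # Grow the step while the per-step change stays under tol, else shrink it.
                dt[i] = np.clip(dt[i] * (2.0 if abs(dt[i] * dydt) < tol else 0.5), dt_min, dt_max)
            return y, steps

        y, steps = simulate(y0=[1.0, 1.0], rates=[10.0, 0.01], t_end=5.0)
        print("final values:", y, "steps per cell:", steps)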

  1. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

    Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large-scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single design methodology rather than trade-offs, and the incompatibility of large-scale optimization with single-program, single-computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop. Full analysis is then performed only periodically. Problem-dependent software can be separated from the generic code using a systems programming technique; this problem-dependent code then embodies the definitions of design variables, objective function and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.

  2. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877

  3. Ascertaining Validity in the Abstract Realm of PMESII Simulation Models: An Analysis of the Peace Support Operations Model (PSOM)

    DTIC Science & Technology

    2009-06-01

    simulation is the campaign-level Peace Support Operations Model (PSOM). This thesis provides a quantitative analysis of PSOM. The results are based ... multiple potential outcomes; further development and analysis is required before the model is used for large-scale analysis.

  4. The Role of Forests in Regulating the River Flow Regime of Large Basins of the World

    NASA Astrophysics Data System (ADS)

    Salazar, J. F.; Villegas, J. C.; Mercado-Bettin, D. A.; Rodríguez, E.

    2016-12-01

    Many natural and social phenomena depend on river flow regimes that are being altered by global change. Understanding the mechanisms behind such alterations is crucial for predicting river flow regimes in a changing environment. Here we explore potential linkages between the presence of forests and the capacity of river basins for regulating river flows. Regulation is defined here as the capacity of river basins to attenuate the amplitude of the river flow regime, that is to reduce the difference between high and low flows. We first use scaling theory to show how scaling properties of observed river flows can be used to classify river basins as regulated or unregulated. This parsimonious classification is based on a physical interpretation of the scaling properties (particularly the scaling exponents) that is novel (most previous studies have focused on the interpretation of the scaling exponents for floods only), and widely-applicable to different basins (the only assumption is that river flows in a given river basin exhibit scaling properties through well-known power laws). Then we show how this scaling framework can be used to explore global-change-induced temporal variations in the regulation capacity of river basins. Finally, we propose a conceptual hypothesis (the "Forest reservoir concept") to explain how large-scale forests can exert important effects on the long-term water balance partitioning and regulation capacity of large basins of the world. Our quantitative results are based on data analysis (river flows and land cover features) from 22 large basins of the world, with emphasis in the Amazon river and its main tributaries. Collectively, our findings support the hypothesis that forest cover enhances the capacity of large river basins to maintain relatively high mean river flows, as well as to regulate (ameliorate) extreme river flows. Advancing towards this quantitative understanding of the relation between forest cover and river flow regimes is crucial for water management- and land cover-related decisions.

  5. The Role of Forests in Regulating the River Flow Regime of Large Basins of the World

    NASA Astrophysics Data System (ADS)

    Salazar, J. F.; Villegas, J. C.; Mercado-Bettin, D. A.; Rodríguez, E.

    2017-12-01

    Many natural and social phenomena depend on river flow regimes that are being altered by global change. Understanding the mechanisms behind such alterations is crucial for predicting river flow regimes in a changing environment. Here we explore potential linkages between the presence of forests and the capacity of river basins for regulating river flows. Regulation is defined here as the capacity of river basins to attenuate the amplitude of the river flow regime, that is to reduce the difference between high and low flows. We first use scaling theory to show how scaling properties of observed river flows can be used to classify river basins as regulated or unregulated. This parsimonious classification is based on a physical interpretation of the scaling properties (particularly the scaling exponents) that is novel (most previous studies have focused on the interpretation of the scaling exponents for floods only), and widely-applicable to different basins (the only assumption is that river flows in a given river basin exhibit scaling properties through well-known power laws). Then we show how this scaling framework can be used to explore global-change-induced temporal variations in the regulation capacity of river basins. Finally, we propose a conceptual hypothesis (the "Forest reservoir concept") to explain how large-scale forests can exert important effects on the long-term water balance partitioning and regulation capacity of large basins of the world. Our quantitative results are based on data analysis (river flows and land cover features) from 22 large basins of the world, with emphasis in the Amazon river and its main tributaries. Collectively, our findings support the hypothesis that forest cover enhances the capacity of large river basins to maintain relatively high mean river flows, as well as to regulate (ameliorate) extreme river flows. Advancing towards this quantitative understanding of the relation between forest cover and river flow regimes is crucial for water management- and land cover-related decisions.
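
    Illustration (not the authors' analysis): fitting power-law scaling exponents of high and low flows against drainage area on synthetic sub-basin data, the kind of scaling property the abstract uses to classify basins as regulated or unregulated. The synthetic exponents and the closing comment are assumptions for demonstration, not the paper's actual classification rule.

        import numpy as np

        def scaling_exponent(area_km2, flow_m3s):
            # Least-squares fit of log Q = theta * log A + c; returns the exponent theta.
            theta, _ = np.polyfit(np.log(area_km2), np.log(flow_m3s), 1)
            return theta

        # Synthetic sub-basins of one large basin: drainage areas with annual maximum
        # and minimum flows following power laws in area.
        rng = np.random.default_rng(5)
        area = np.logspace(3, 6, 30)                            # 1e3 to 1e6 km^2
        q_max = 0.5 * area**0.8 * rng.lognormal(0.0, 0.1, 30)   # high flows
        q_min = 0.01 * area**1.0 * rng.lognormal(0.0, 0.1, 30)  # low flows

        print("theta_max =", round(scaling_exponent(area, q_max), 2))
        print("theta_min =", round(scaling_exponent(area, q_min), 2))
        # If theta_max < theta_min, the ratio of high to low flows shrinks with area,
        # i.e., larger basins attenuate the amplitude of the flow regime.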

  6. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled application of remote sensing quantitative retrieval. The workflow hides low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support the large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To assess the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameters matching error validation.

  7. Extensions and evaluations of a general quantitative theory of forest structure and dynamics

    PubMed Central

    Enquist, Brian J.; West, Geoffrey B.; Brown, James H.

    2009-01-01

    Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161

  8. Statistical Model to Analyze Quantitative Proteomics Data Obtained by 18O/16O Labeling and Linear Ion Trap Mass Spectrometry

    PubMed Central

    Jorge, Inmaculada; Navarro, Pedro; Martínez-Acedo, Pablo; Núñez, Estefanía; Serrano, Horacio; Alfranca, Arántzazu; Redondo, Juan Miguel; Vázquez, Jesús

    2009-01-01

    Statistical models for the analysis of protein expression changes by stable isotope labeling are still poorly developed, particularly for data obtained by 16O/18O labeling. Besides, large-scale test experiments to validate the null hypothesis are lacking. Although the study of mechanisms underlying biological actions promoted by vascular endothelial growth factor (VEGF) on endothelial cells is of considerable interest, quantitative proteomics studies on this subject are scarce and have been performed after exposing cells to the factor for long periods of time. In this work we present the largest quantitative proteomics study to date on the short term effects of VEGF on human umbilical vein endothelial cells by 18O/16O labeling. Current statistical models based on normality and variance homogeneity were found unsuitable to describe the null hypothesis in a large scale test experiment performed on these cells, producing false expression changes. A random effects model was developed including four different sources of variance at the spectrum-fitting, scan, peptide, and protein levels. With the new model the number of outliers at scan and peptide levels was negligible in three large scale experiments, and only one false protein expression change was observed in the test experiment among more than 1000 proteins. The new model allowed the detection of significant protein expression changes upon VEGF stimulation for 4 and 8 h. The consistency of the changes observed at 4 h was confirmed by a replica at a smaller scale and further validated by Western blot analysis of some proteins. Most of the observed changes have not been described previously and are consistent with a pattern of protein expression that dynamically changes over time following the evolution of the angiogenic response. With this statistical model the 18O labeling approach emerges as a very promising and robust alternative to perform quantitative proteomics studies at a depth of several thousand proteins. PMID:19181660

  9. Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.

    PubMed

    Hardt, Marah J; Flett, Keith; Howell, Colleen J

    2017-08-01

    Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.

  10. Measuring the topology of large-scale structure in the universe

    NASA Technical Reports Server (NTRS)

    Gott, J. Richard, III

    1988-01-01

    An algorithm for quantitatively measuring the topology of large-scale structure has now been applied to a large number of observational data sets. The present paper summarizes and provides an overview of some of these observational results. On scales significantly larger than the correlation length, larger than about 1200 km/s, the cluster and galaxy data are fully consistent with a sponge-like random phase topology. At a smoothing length of about 600 km/s, however, the observed genus curves show a small shift in the direction of a meatball topology. Cold dark matter (CDM) models show similar shifts at these scales but not generally as large as those seen in the data. Bubble models, with voids completely surrounded on all sides by walls of galaxies, show shifts in the opposite direction. The CDM model is overall the most successful in explaining the data.

  11. Measuring the topology of large-scale structure in the universe

    NASA Astrophysics Data System (ADS)

    Gott, J. Richard, III

    1988-11-01

    An algorithm for quantitatively measuring the topology of large-scale structure has now been applied to a large number of observational data sets. The present paper summarizes and provides an overview of some of these observational results. On scales significantly larger than the correlation length, larger than about 1200 km/s, the cluster and galaxy data are fully consistent with a sponge-like random phase topology. At a smoothing length of about 600 km/s, however, the observed genus curves show a small shift in the direction of a meatball topology. Cold dark matter (CDM) models show similar shifts at these scales but not generally as large as those seen in the data. Bubble models, with voids completely surrounded on all sides by walls of galaxies, show shifts in the opposite direction. The CDM model is overall the most successful in explaining the data.

  12. A behavioral-level HDL description of SFQ logic circuits for quantitative performance analysis of large-scale SFQ digital systems

    NASA Astrophysics Data System (ADS)

    Matsuzaki, F.; Yoshikawa, N.; Tanaka, M.; Fujimaki, A.; Takai, Y.

    2003-10-01

    Recently many single flux quantum (SFQ) logic circuits containing several thousands of Josephson junctions have been designed successfully by using digital domain simulation based on the hardware description language (HDL). In the present HDL-based design of SFQ circuits, a structure-level HDL description has been used, where circuits are made up of basic gate cells. However, in order to analyze large-scale SFQ digital systems, such as a microprocessor, a higher level of circuit abstraction is necessary to reduce the circuit simulation time. In this paper we have investigated how to describe the functionality of large-scale SFQ digital circuits by a behavioral-level HDL description. In this method, the functionality and the timing of each circuit block are defined directly by describing their behavior in the HDL. Using this method, we can dramatically reduce the simulation time of large-scale SFQ digital circuits.

  13. TheCellMap.org: A Web-Accessible Database for Visualizing and Mining the Global Yeast Genetic Interaction Network

    PubMed Central

    Usaj, Matej; Tan, Yizhao; Wang, Wen; VanderSluis, Benjamin; Zou, Albert; Myers, Chad L.; Costanzo, Michael; Andrews, Brenda; Boone, Charles

    2017-01-01

    Providing access to quantitative genomic data is key to ensuring large-scale data validation and promoting new discoveries. TheCellMap.org serves as a central repository for storing and analyzing quantitative genetic interaction data produced by genome-scale Synthetic Genetic Array (SGA) experiments with the budding yeast Saccharomyces cerevisiae. In particular, TheCellMap.org allows users to easily access, visualize, explore, and functionally annotate genetic interactions, or to extract and reorganize subnetworks, using data-driven network layouts in an intuitive and interactive manner. PMID:28325812

  14. TheCellMap.org: A Web-Accessible Database for Visualizing and Mining the Global Yeast Genetic Interaction Network.

    PubMed

    Usaj, Matej; Tan, Yizhao; Wang, Wen; VanderSluis, Benjamin; Zou, Albert; Myers, Chad L; Costanzo, Michael; Andrews, Brenda; Boone, Charles

    2017-05-05

    Providing access to quantitative genomic data is key to ensuring large-scale data validation and promoting new discoveries. TheCellMap.org serves as a central repository for storing and analyzing quantitative genetic interaction data produced by genome-scale Synthetic Genetic Array (SGA) experiments with the budding yeast Saccharomyces cerevisiae. In particular, TheCellMap.org allows users to easily access, visualize, explore, and functionally annotate genetic interactions, or to extract and reorganize subnetworks, using data-driven network layouts in an intuitive and interactive manner. Copyright © 2017 Usaj et al.

  15. Large-Scale Diffraction Patterns from Circular Objects

    ERIC Educational Resources Information Center

    Rinard, Phillip M.

    1976-01-01

    Quantitatively investigates the diffraction of light by a U.S. penny and an aperture of the same size. Differences noted between theory and measurements are discussed, with probable causes indicated. (Author/CP)

  16. The Timing of Teacher Hires and Teacher Qualifications: Is There an Association?

    ERIC Educational Resources Information Center

    Engel, Mimi

    2012-01-01

    Background: Case studies suggest that late hiring timelines are common in large urban school districts and result in the loss of qualified teachers to surrounding suburbs. To date, however, there has been no large-scale quantitative investigation of the relationship between the timing of teacher hires and teacher qualifications. Purpose: This…

  17. Development and Evaluation of a Parallel Reaction Monitoring Strategy for Large-Scale Targeted Metabolomics Quantification.

    PubMed

    Zhou, Juntuo; Liu, Huiying; Liu, Yang; Liu, Jia; Zhao, Xuyang; Yin, Yuxin

    2016-04-19

    Recent advances in mass spectrometers, which have yielded higher resolution and faster scanning speeds, have expanded their application to metabolomics of diverse diseases. Using a quadrupole-Orbitrap LC-MS system, we developed an efficient large-scale quantitative method targeting 237 metabolites involved in various metabolic pathways using scheduled parallel reaction monitoring (PRM). We assessed the dynamic range, linearity, reproducibility, and system suitability of the PRM assay by measuring concentration curves, biological samples, and clinical serum samples. The quantification performance of the PRM and MS1-based assays on the Q-Exactive was compared, along with that of the MRM assay on the QTRAP 6500. The PRM assay monitoring 237 polar metabolites showed greater reproducibility and quantitative accuracy than MS1-based quantification, and also showed greater flexibility in postacquisition assay refinement than the MRM assay on the QTRAP 6500. We present a workflow for convenient PRM data processing using Skyline software, which is free of charge. In this study we have established a reliable PRM methodology on a quadrupole-Orbitrap platform for large-scale targeted metabolomics, which provides a new choice for basic and clinical metabolomics studies.
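
    To make the scheduling idea concrete, a minimal sketch of building a scheduled PRM inclusion list follows; the metabolite names, m/z values, retention times, and window width are illustrative assumptions rather than values from the paper (the actual assay targets 237 metabolites and is processed in Skyline).

        # Minimal sketch: scheduled PRM inclusion list (illustrative values only).
        targets = [
            {"name": "metabolite_A", "mz": 148.0604, "rt_min": 2.1},
            {"name": "metabolite_B", "mz": 191.0197, "rt_min": 3.4},
            {"name": "metabolite_C", "mz": 89.0244,  "rt_min": 5.8},
        ]

        RT_WINDOW_MIN = 1.0  # acquire each target only within +/- 0.5 min of its expected RT

        def scheduled_inclusion_list(targets, window=RT_WINDOW_MIN):
            """Return (name, m/z, start, end) rows; scheduling limits how many targets
            are monitored concurrently, preserving enough MS/MS points across each peak."""
            rows = []
            for t in targets:
                half = window / 2.0
                rows.append((t["name"], t["mz"], t["rt_min"] - half, t["rt_min"] + half))
            return rows

        for name, mz, start, end in scheduled_inclusion_list(targets):
            print(f"{name}\t{mz:.4f}\t{start:.2f}-{end:.2f} min")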

  18. Lunar terrain mapping and relative-roughness analysis

    NASA Technical Reports Server (NTRS)

    Rowan, L. C.; Mccauley, J. F.; Holm, E. A.

    1971-01-01

    Terrain maps of the equatorial zone were prepared at scales of 1:2,000,000 and 1:1,000,000 to classify lunar terrain with respect to roughness and to provide a basis for selecting sites for Surveyor and Apollo landings, as well as for Ranger and Lunar Orbiter photographs. Lunar terrain was described by qualitative and quantitative methods and divided into four fundamental classes: maria, terrae, craters, and linear features. Some 35 subdivisions were defined and mapped throughout the equatorial zone, and, in addition, most of the map units were illustrated by photographs. The terrain types were analyzed quantitatively to characterize and order their relative roughness characteristics. For some morphologically homogeneous mare areas, relative roughness can be extrapolated to the large scales from measurements at small scales.

  19. Large-scale protein-protein interaction analysis in Arabidopsis mesophyll protoplasts by split firefly luciferase complementation.

    PubMed

    Li, Jian-Feng; Bush, Jenifer; Xiong, Yan; Li, Lei; McCormack, Matthew

    2011-01-01

    Protein-protein interactions (PPIs) constitute the regulatory network that coordinates diverse cellular functions. There is a growing need in plant research to create protein interaction maps behind complex cellular processes and at a systems biology level. However, only a few approaches have been successfully used for large-scale surveys of PPIs in plants, each having advantages and disadvantages. Here we present split firefly luciferase complementation (SFLC) as a highly sensitive and noninvasive technique for in planta PPI investigation. In this assay, the separate halves of a firefly luciferase can come into close proximity and transiently restore its catalytic activity only when their fusion partners, namely the two proteins of interest, interact with each other. The assay becomes quantitative and amenable to high throughput when the Arabidopsis mesophyll protoplast system and a microplate luminometer are employed for protein expression and luciferase measurement, respectively. Using the SFLC assay, we could monitor the dynamics of rapamycin-induced and ascomycin-disrupted interaction between Arabidopsis FRB and human FKBP proteins in a near real-time manner. As a proof of concept for a large-scale PPI survey, we further applied the SFLC assay to testing 132 binary PPIs among 8 auxin response factors (ARFs) and 12 Aux/IAA proteins from Arabidopsis. Our results demonstrate that the SFLC assay is ideal for in vivo quantitative PPI analysis in plant cells and is particularly powerful for large-scale binary PPI screens.

  20. Embarking on large-scale qualitative research: reaping the benefits of mixed methods in studying youth, clubs and drugs

    PubMed Central

    Hunt, Geoffrey; Moloney, Molly; Fazio, Adam

    2012-01-01

    Qualitative research is often conceptualized as inherently small-scale research, primarily conducted by a lone researcher enmeshed in extensive and long-term fieldwork or involving in-depth interviews with a small sample of 20 to 30 participants. In the study of illicit drugs, traditionally this has often been in the form of ethnographies of drug-using subcultures. Such small-scale projects have produced important interpretive scholarship that focuses on the culture and meaning of drug use in situated, embodied contexts. Larger-scale projects are often assumed to be solely the domain of quantitative researchers, using formalistic survey methods and descriptive or explanatory models. In this paper, however, we will discuss qualitative research done on a comparatively larger scale—with in-depth qualitative interviews with hundreds of young drug users. Although this work incorporates some quantitative elements into the design, data collection, and analysis, the qualitative dimension and approach has nevertheless remained central. Larger-scale qualitative research shares some of the challenges and promises of smaller-scale qualitative work including understanding drug consumption from an emic perspective, locating hard-to-reach populations, developing rapport with respondents, generating thick descriptions and a rich analysis, and examining the wider socio-cultural context as a central feature. However, there are additional challenges specific to the scale of qualitative research, which include data management, data overload and problems of handling large-scale data sets, time constraints in coding and analyzing data, and personnel issues including training, organizing and mentoring large research teams. Yet large samples can prove to be essential for enabling researchers to conduct comparative research, whether that be cross-national research within a wider European perspective undertaken by different teams or cross-cultural research looking at internal divisions and differences within diverse communities and cultures. PMID:22308079

  1. Water balance model for Kings Creek

    NASA Technical Reports Server (NTRS)

    Wood, Eric F.

    1990-01-01

    Particular attention is given to the spatial variability that affects the representation of water balance at the catchment scale in the context of macroscale water-balance modeling. Remotely sensed data are employed for parameterization, and the resulting model is developed so that subgrid spatial variability is preserved and therefore influences the grid-scale fluxes of the model. The model permits the quantitative evaluation of the surface-atmospheric interactions related to the large-scale hydrologic water balance.
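
    The catchment-scale balance underlying such a model is, in its standard form (stated here for orientation, not quoted from the report),

        P = ET + Q + \Delta S,

    where P is precipitation, ET evapotranspiration, Q runoff, and \Delta S the change in soil and groundwater storage; the grid-scale fluxes are then obtained by integrating this balance over the subgrid variability that the model preserves.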

  2. Racking Response of Reinforced Concrete Cut and Cover Tunnel

    DOT National Transportation Integrated Search

    2016-01-01

    Currently, the knowledge base and quantitative data sets concerning cut and cover tunnel seismic response are scarce. In this report, a large-scale experimental program is conducted to assess: i) stiffness, capacity, and potential seismically-induced...

  3. Bioregional monitoring design and occupancy estimation for two Sierra Nevadan amphibian taxa

    EPA Science Inventory

    Land-management agencies need quantitative, statistically rigorous monitoring data, often at large spatial and temporal scales, to support resource-management decisions. Monitoring designs typically must accommodate multiple ecological, logistical, political, and economic objec...

  4. Topology of large-scale structure. IV - Topology in two dimensions

    NASA Technical Reports Server (NTRS)

    Melott, Adrian L.; Cohen, Alexander P.; Hamilton, Andrew J. S.; Gott, J. Richard, III; Weinberg, David H.

    1989-01-01

    In a recent series of papers, an algorithm was developed for quantitatively measuring the topology of the large-scale structure of the universe and this algorithm was applied to numerical models and to three-dimensional observational data sets. In this paper, it is shown that topological information can be derived from a two-dimensional cross section of a density field, and analytic expressions are given for a Gaussian random field. The application of a two-dimensional numerical algorithm for measuring topology to cross sections of three-dimensional models is demonstrated.
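
    For orientation, the analytic forms referred to are the standard Gaussian random field results (textbook expressions, not transcriptions from the paper): up to an amplitude set by the power spectrum, the genus curves as a function of the threshold level \nu (in standard deviations) are

        g_{2D}(\nu) \propto \nu \, e^{-\nu^2/2}, \qquad
        g_{3D}(\nu) \propto (1 - \nu^2) \, e^{-\nu^2/2},

    so the two-dimensional curve is odd in \nu while the three-dimensional curve is even, which is what makes cross-sectional topology a useful consistency check on the full three-dimensional analysis.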

  5. Jovian meteorology: Large-scale moist convection without a lower boundary

    NASA Technical Reports Server (NTRS)

    Gierasch, P. J.

    1975-01-01

    It is proposed that Jupiter's cloud bands represent large scale convection whose character is determined by the phase change of water at a level where the temperature is about 275K. It is argued that there are three important layers in the atmosphere: a tropopause layer where emission to space occurs; an intermediate layer between the tropopause and the water cloud base; and the deep layer below the water cloud. All arguments are only semi-quantitative. It is pointed out that these ingredients are essential to Jovian meteorology.

  6. The detection of large deletions or duplications in genomic DNA.

    PubMed

    Armour, J A L; Barton, D E; Cockburn, D J; Taylor, G R

    2002-11-01

    While methods for the detection of point mutations and small insertions or deletions in genomic DNA are well established, the detection of larger (>100 bp) genomic duplications or deletions can be more difficult. Most mutation scanning methods use PCR as a first step, but the subsequent analyses are usually qualitative rather than quantitative. Gene dosage methods based on PCR need to be quantitative (i.e., they should report molar quantities of starting material) or semi-quantitative (i.e., they should report gene dosage relative to an internal standard). Without some sort of quantitation, heterozygous deletions and duplications may be overlooked and therefore be under-ascertained. Gene dosage methods provide the additional benefit of reporting allele drop-out in the PCR. This could affect SNP surveys, where large-scale genotyping may miss null alleles. Here we review recent developments in techniques for the detection of this type of mutation and compare their relative strengths and weaknesses. We emphasize that comprehensive mutation analysis should include scanning for large insertions, deletions, and duplications. Copyright 2002 Wiley-Liss, Inc.
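
    As an illustration of semi-quantitative gene dosage relative to an internal standard, a minimal sketch follows; the two-copy control normalization and the cutoff values are assumptions chosen for illustration, not recommendations from the review.

        # Sketch: dosage quotient of a target amplicon relative to an internal
        # reference amplicon, normalized to a control sample with two copies.
        def dosage_quotient(target_test, reference_test, target_control, reference_control):
            return (target_test / reference_test) / (target_control / reference_control)

        def call_copy_number(dq, del_cutoff=0.65, dup_cutoff=1.35):
            # Heterozygous deletion is expected near 0.5 and duplication near 1.5;
            # the cutoffs here are illustrative only.
            if dq < del_cutoff:
                return "possible heterozygous deletion"
            if dq > dup_cutoff:
                return "possible duplication"
            return "normal dosage (two copies)"

        dq = dosage_quotient(target_test=4200, reference_test=8100,
                             target_control=8300, reference_control=8000)
        print(round(dq, 2), call_copy_number(dq))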

  7. MEAN-FIELD MODELING OF AN α² DYNAMO COUPLED WITH DIRECT NUMERICAL SIMULATIONS OF RIGIDLY ROTATING CONVECTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masada, Youhei; Sano, Takayoshi, E-mail: ymasada@harbor.kobe-u.ac.jp, E-mail: sano@ile.osaka-u.ac.jp

    2014-10-10

    The mechanism of large-scale dynamos in rigidly rotating stratified convection is explored by direct numerical simulations (DNS) in Cartesian geometry. A mean-field dynamo model is also constructed using turbulent velocity profiles consistently extracted from the corresponding DNS results. By quantitative comparison between the DNS and our mean-field model, it is demonstrated that the oscillatory α² dynamo wave, excited and sustained in the convection zone, is responsible for large-scale magnetic activities such as cyclic polarity reversal and spatiotemporal migration. The results provide strong evidence that a nonuniformity of the α-effect, which is a natural outcome of rotating stratified convection, can be an important prerequisite for large-scale stellar dynamos, even without the Ω-effect.
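
    For orientation, the mean-field formalism behind an α² dynamo is the standard induction equation for the mean field (a textbook form, not an equation transcribed from this paper),

        \frac{\partial \langle \mathbf{B} \rangle}{\partial t}
          = \nabla \times \left[ \alpha \langle \mathbf{B} \rangle
          - (\eta + \eta_t)\, \nabla \times \langle \mathbf{B} \rangle \right],

    where \alpha and the turbulent diffusivity \eta_t are derived from the small-scale convective turbulence (here extracted from the DNS); an α² dynamo is driven by the \alpha term alone, without a mean shear (Ω) term.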

  8. Inquiring Minds Want to Know: Progress Report on SCALE-UP Physics at Penn State Erie

    NASA Astrophysics Data System (ADS)

    Hall, Jonathan

    2008-03-01

    SCALE-UP (Student Centered Activities for Large Enrollment University Programs) is a "studio" approach to learning developed by Bob Beichner at North Carolina State University. SCALE-UP was adapted for teaching and learning in the introductory calculus-based mechanics course at Penn State Erie, The Behrend College, starting in Spring 2007. We are presently doing quantitative and qualitative research on using inquiry-based learning with first-year college students, in particular how it affects female students and students from groups that are traditionally under-represented in STEM fields. Drawing on field notes from class observations, focus groups, and the collection of quantitative data, the feedback generated by the research is also being used to improve the delivery of the course and in planning the adoption of SCALE-UP for the second-semester course on electromagnetism in the Fall 2008 semester.

  9. SLIDE - a web-based tool for interactive visualization of large-scale -omics data.

    PubMed

    Ghosh, Soumita; Datta, Abhik; Tan, Kaisen; Choi, Hyungwon

    2018-06-28

    Data visualization is often regarded as a post hoc step for verifying statistically significant results in the analysis of high-throughput data sets. This common practice leaves a large amount of raw data behind, from which more information can be extracted. However, existing solutions do not provide capabilities to explore large-scale raw datasets using biologically sensible queries, nor do they allow real-time, user-driven customization of graphics. To address these drawbacks, we have designed an open-source, web-based tool called Systems-Level Interactive Data Exploration, or SLIDE, to visualize large-scale -omics data interactively. SLIDE's interface makes it easier for scientists to explore quantitative expression data at multiple resolutions in a single screen. SLIDE is publicly available under the BSD license both as an online version and as a stand-alone version at https://github.com/soumitag/SLIDE. Supplementary information is available at Bioinformatics online.

  10. Impact of Chromosome 4p- Syndrome on Communication and Expressive Language Skills: A Preliminary Investigation

    ERIC Educational Resources Information Center

    Marshall, Althea T.

    2010-01-01

    Purpose: The purpose of this investigation was to examine the impact of Chromosome 4p- syndrome on the communication and expressive language phenotype of a large cross-cultural population of children, adolescents, and adults. Method: A large-scale survey study was conducted and a descriptive research design was used to analyze quantitative and…

  11. Selfing results in inbreeding depression of growth but not of gas exchange of surviving adult black spruce trees

    Treesearch

    Kurt Johnsen; John E. Major; Chris A. Maier

    2003-01-01

    In most tree species, inbreeding greatly reduces seed production, seed viability, survival and growth. In a previous large-scale quantitative analysis of a black spruce (Picea mariana (Mill.) B.S.P.) diallel experiment, selfing had large deleterious effects on growth but no impact on stable carbon isotope discrimination (an...

  12. Quantitative stem cell biology: the threat and the glory.

    PubMed

    Pollard, Steven M

    2016-11-15

    Major technological innovations over the past decade have transformed our ability to extract quantitative data from biological systems at an unprecedented scale and resolution. These quantitative methods and associated large datasets should lead to an exciting new phase of discovery across many areas of biology. However, there is a clear threat: will we drown in these rivers of data? On 18th July 2016, stem cell biologists gathered in Cambridge for the 5th annual Cambridge Stem Cell Symposium to discuss 'Quantitative stem cell biology: from molecules to models'. This Meeting Review provides a summary of the data presented by each speaker, with a focus on quantitative techniques and the new biological insights that are emerging. © 2016. Published by The Company of Biologists Ltd.

  13. Stable isotope dimethyl labelling for quantitative proteomics and beyond

    PubMed Central

    Hsu, Jue-Liang; Chen, Shu-Hui

    2016-01-01

    Stable-isotope reductive dimethylation, a cost-effective, simple, robust, reliable and easy-to-multiplex labelling method, is widely applied to quantitative proteomics using liquid chromatography-mass spectrometry. This review focuses on biological applications of stable-isotope dimethyl labelling for a large-scale comparative analysis of protein expression and post-translational modifications, based on the unique properties of its labelling chemistry. Some other applications of the labelling method for sample preparation and mass spectrometry-based protein identification and characterization are also summarized. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644970

  14. Reverse Fluorescence Enhancement and Colorimetric Bimodal Signal Readout Immunochromatography Test Strip for Ultrasensitive Large-Scale Screening and Postoperative Monitoring.

    PubMed

    Yao, Yingyi; Guo, Weisheng; Zhang, Jian; Wu, Yudong; Fu, Weihua; Liu, Tingting; Wu, Xiaoli; Wang, Hanjie; Gong, Xiaoqun; Liang, Xing-Jie; Chang, Jin

    2016-09-07

    Ultrasensitive and quantitative fast screening of cancer biomarkers by immunochromatography test strip (ICTS) is still challenging in the clinic. Gold nanoparticle (NP)-based ICTS with colorimetric readout enables quick spectrum screening but suffers from nonquantitative performance; although ICTS with fluorescence readout (FICTS) allows quantitative detection, its sensitivity still deserves more effort and attention. In this work, by taking advantage of colorimetric ICTS and FICTS, we describe a reverse fluorescence enhancement ICTS (rFICTS) with bimodal signal readout for ultrasensitive and quantitative fast screening of carcinoembryonic antigen (CEA). In the presence of target, gold NP aggregation in the T line induces a colorimetric readout, allowing on-the-spot spectrum screening in 10 min by the naked eye. Meanwhile, the reverse fluorescence enhancement signal enables more accurate quantitative detection with better sensitivity (5.89 pg/mL for CEA), which is more than 2 orders of magnitude lower than that of the conventional FICTS. The accuracy and stability of the rFICTS were investigated with more than 100 clinical serum samples for large-scale screening. Furthermore, this rFICTS also realized postoperative monitoring by detecting CEA in a patient with colon cancer and comparing the results with CT imaging diagnosis. These results indicate that this rFICTS is particularly suitable for point-of-care (POC) diagnostics in both resource-rich and resource-limited settings.

  15. An invariability-area relationship sheds new light on the spatial scaling of ecological stability.

    PubMed

    Wang, Shaopeng; Loreau, Michel; Arnoldi, Jean-Francois; Fang, Jingyun; Rahman, K Abd; Tao, Shengli; de Mazancourt, Claire

    2017-05-19

    The spatial scaling of stability is key to understanding ecological sustainability across scales and the sensitivity of ecosystems to habitat destruction. Here we propose the invariability-area relationship (IAR) as a novel approach to investigate the spatial scaling of stability. The shape and slope of IAR are largely determined by patterns of spatial synchrony across scales. When synchrony decays exponentially with distance, IARs exhibit three phases, characterized by steeper increases in invariability at both small and large scales. Such triphasic IARs are observed for primary productivity from plot to continental scales. When synchrony decays as a power law with distance, IARs are quasilinear on a log-log scale. Such quasilinear IARs are observed for North American bird biomass at both species and community levels. The IAR provides a quantitative tool to predict the effects of habitat loss on population and ecosystem stability and to detect regime shifts in spatial ecological systems, which are goals of relevance to conservation and policy.

  16. Light sheet theta microscopy for rapid high-resolution imaging of large biological samples.

    PubMed

    Migliori, Bianca; Datta, Malika S; Dupre, Christophe; Apak, Mehmet C; Asano, Shoh; Gao, Ruixuan; Boyden, Edward S; Hermanson, Ola; Yuste, Rafael; Tomer, Raju

    2018-05-29

    Advances in tissue clearing and molecular labeling methods are enabling unprecedented optical access to large intact biological systems. These developments fuel the need for high-speed microscopy approaches to image large samples quantitatively and at high resolution. While light sheet microscopy (LSM), with its high planar imaging speed and low photo-bleaching, can be effective, scaling up to larger imaging volumes has been hindered by the use of orthogonal light sheet illumination. To address this fundamental limitation, we have developed light sheet theta microscopy (LSTM), which uniformly illuminates samples from the same side as the detection objective, thereby eliminating limits on lateral dimensions without sacrificing the imaging resolution, depth, and speed. We present a detailed characterization of LSTM, and demonstrate its complementary advantages over LSM for rapid high-resolution quantitative imaging of large intact samples with high uniform quality. The reported LSTM approach is a significant step for the rapid high-resolution quantitative mapping of the structure and function of very large biological systems, such as a clarified thick coronal slab of human brain and uniformly expanded tissues, and also for rapid volumetric calcium imaging of highly motile animals, such as Hydra, undergoing non-isomorphic body shape changes.

  17. Complex Genetics of Behavior: BXDs in the Automated Home-Cage.

    PubMed

    Loos, Maarten; Verhage, Matthijs; Spijker, Sabine; Smit, August B

    2017-01-01

    This chapter describes a use case for the genetic dissection and automated analysis of complex behavioral traits using the genetically diverse panel of BXD mouse recombinant inbred strains. Strains of the BXD resource differ widely in terms of gene and protein expression in the brain, as well as in their behavioral repertoire. A large mouse resource opens the possibility of gene-finding studies underlying distinct behavioral phenotypes; however, such a resource poses a challenge for behavioral phenotyping. To address the specifics of large-scale screening, we describe (1) how to assess mouse behavior systematically across a large genetic cohort, (2) how to dissect automation-derived longitudinal mouse behavior into quantitative parameters, and (3) how to map these quantitative traits to the genome, deriving loci underlying aspects of behavior.

  18. Multifractal spectrum and lacunarity as measures of complexity of osseointegration.

    PubMed

    de Souza Santos, Daniel; Dos Santos, Leonardo Cavalcanti Bezerra; de Albuquerque Tavares Carvalho, Alessandra; Leão, Jair Carneiro; Delrieux, Claudio; Stosic, Tatijana; Stosic, Borko

    2016-07-01

    The goal of this study is to contribute to a better quantitative description of the early stages of osseointegration through the application of fractal, multifractal, and lacunarity analysis. Fractal, multifractal, and lacunarity analyses are performed on scanning electron microscopy (SEM) images of titanium implants that were first subjected to different treatment combinations of i) sand blasting, ii) acid etching, and iii) exposure to calcium phosphate, and were then submersed in a simulated body fluid (SBF) for 30 days. All three numerical techniques are applied to the implant SEM images before and after SBF immersion, in order to provide a comprehensive set of common quantitative descriptors. It is found that implants subjected to different physicochemical treatments before submersion in SBF exhibit a rather similar level of complexity, while the great variety of crystal forms after SBF submersion yields rather different quantitative measures (reflecting complexity) for different treatments. In particular, it is found that acid treatment, in most combinations with the other considered treatments, leads to a higher fractal dimension (more uniform distribution of crystals), lower lacunarity (lesser variation in gap sizes), and narrowing of the multifractal spectrum (smaller fluctuations on different scales). The current quantitative description has shown the capacity to capture the main features of complex images of implant surfaces for several different treatments. Such quantitative description should provide a fundamental tool for future large-scale systematic studies, considering the large variety of possible implant treatments and their combinations. Quantitative description of the early stages of osseointegration on titanium implants with different treatments should help develop a better understanding of this phenomenon in general, and provide a basis for further systematic experimental studies. Clinical practice should benefit from such studies in the long term, by more ready access to implants of higher quality.
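
    A minimal sketch of the box-counting estimate of fractal dimension on a binarized image is given below (NumPy-based; the SEM preprocessing and the multifractal and lacunarity extensions used in the study are not reproduced here, and the test image is synthetic).

        import numpy as np

        def box_count_dimension(binary_img, box_sizes=(2, 4, 8, 16, 32)):
            """Estimate fractal dimension as the slope of log N(s) versus log(1/s),
            where N(s) counts the s-by-s boxes containing foreground pixels."""
            img = np.asarray(binary_img, dtype=bool)
            counts = []
            for s in box_sizes:
                h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
                blocks = img[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(max(int(blocks.any(axis=(1, 3)).sum()), 1))
            slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(0)
        demo = rng.random((256, 256)) > 0.7   # synthetic stand-in for a thresholded SEM image
        print(round(box_count_dimension(demo), 2))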

  19. [Quantitative estimation of vegetation cover and management factor in USLE and RUSLE models by using remote sensing data: a review].

    PubMed

    Wu, Chang-Guang; Li, Sheng; Ren, Hua-Dong; Yao, Xiao-Hua; Huang, Zi-Jie

    2012-06-01

    Soil loss prediction models such as the universal soil loss equation (USLE) and the revised universal soil loss equation (RUSLE) are useful tools for risk assessment of soil erosion and planning of soil conservation at the regional scale. A rational estimate of the vegetation cover and management factor, one of the most important parameters in USLE and RUSLE, is particularly important for the accurate prediction of soil erosion. The traditional estimation based on field survey and measurement is time-consuming, laborious, and costly, and cannot rapidly extract the vegetation cover and management factor at the macro-scale. In recent years, the development of remote sensing technology has provided both data and methods for the estimation of vegetation cover and management factor over broad geographic areas. This paper summarizes the research findings on the quantitative estimation of the vegetation cover and management factor by using remote sensing data, and analyzes the advantages and disadvantages of the various methods, aiming to provide a reference for further research and the quantitative estimation of the vegetation cover and management factor at large scale.
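
    For context, the soil-loss equation in which this factor appears has the standard form (common to USLE and RUSLE; stated here for orientation, not quoted from the review)

        A = R \cdot K \cdot L \cdot S \cdot C \cdot P,

    where A is the predicted annual soil loss, R the rainfall erosivity, K the soil erodibility, L and S the slope length and steepness factors, C the vegetation cover and management factor that remote sensing is used to estimate here, and P the support practice factor.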

  20. Quantitative characterization of conformational-specific protein-DNA binding using a dual-spectral interferometric imaging biosensor.

    PubMed

    Zhang, Xirui; Daaboul, George G; Spuhler, Philipp S; Dröge, Peter; Ünlü, M Selim

    2016-03-14

    DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformational specific protein-DNA interactions.

  1. Science, marketing and wishful thinking in quantitative proteomics.

    PubMed

    Hackett, Murray

    2008-11-01

    In a recent editorial (J. Proteome Res. 2007, 6, 1633) and elsewhere questions have been raised regarding the lack of attention paid to good analytical practice with respect to the reporting of quantitative results in proteomics. Using those comments as a starting point, several issues are discussed that relate to the challenges involved in achieving adequate sampling with MS-based methods in order to generate valid data for large-scale studies. The discussion touches on the relationships that connect sampling depth and the power to detect protein abundance change, conflict of interest, and strategies to overcome bureaucratic obstacles that impede the use of peer-to-peer technologies for transfer and storage of large data files generated in such experiments.

  2. Infrared Multiphoton Dissociation for Quantitative Shotgun Proteomics

    PubMed Central

    Ledvina, Aaron R.; Lee, M. Violet; McAlister, Graeme C.; Westphall, Michael S.; Coon, Joshua J.

    2012-01-01

    We modified a dual-cell linear ion trap mass spectrometer to perform infrared multiphoton dissociation (IRMPD) in the low pressure trap of a dual-cell quadrupole linear ion trap (dual cell QLT) and perform large-scale IRMPD analyses of complex peptide mixtures. Upon optimization of activation parameters (precursor q-value, irradiation time, and photon flux), IRMPD subtly, but significantly outperforms resonant excitation CAD for peptides identified at a 1% false-discovery rate (FDR) from a yeast tryptic digest (95% confidence, p = 0.019). We further demonstrate that IRMPD is compatible with the analysis of isobaric-tagged peptides. Using fixed QLT RF amplitude allows for the consistent retention of reporter ions, but necessitates the use of variable IRMPD irradiation times, dependent upon precursor mass-to-charge (m/z). We show that IRMPD activation parameters can be tuned to allow for effective peptide identification and quantitation simultaneously. We thus conclude that IRMPD performed in a dual-cell ion trap is an effective option for the large-scale analysis of both unmodified and isobaric-tagged peptides. PMID:22480380

  3. Simulation of FRET dyes allows quantitative comparison against experimental data

    NASA Astrophysics Data System (ADS)

    Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander

    2018-03-01

    Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.
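
    The link from simulated dye separation to a measurable quantity is the standard Förster relation (a textbook expression, not a formula specific to this paper),

        E = \frac{1}{1 + (r / R_0)^6},

    where r is the inter-dye distance and R_0 the Förster radius of the dye pair; efficiencies evaluated along the coarse-grained trajectories are averaged over the conformational ensemble before comparison with the single-molecule measurements.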

  4. Quantifying streamflow change caused by forest disturbance at a large spatial scale: A single watershed study

    NASA Astrophysics Data System (ADS)

    Wei, Xiaohua; Zhang, Mingfang

    2010-12-01

    Climatic variability and forest disturbance are commonly recognized as two major drivers influencing streamflow change in large-scale forested watersheds. The greatest challenge in evaluating quantitative hydrological effects of forest disturbance is the removal of climatic effect on hydrology. In this paper, a method was designed to quantify respective contributions of large-scale forest disturbance and climatic variability on streamflow using the Willow River watershed (2860 km2) located in the central part of British Columbia, Canada. Long-term (>50 years) data on hydrology, climate, and timber harvesting history represented by equivalent clear-cutting area (ECA) were available to discern climatic and forestry influences on streamflow by three steps. First, effective precipitation, an integrated climatic index, was generated by subtracting evapotranspiration from precipitation. Second, modified double mass curves were developed by plotting accumulated annual streamflow against annual effective precipitation, which presented a much clearer picture of the cumulative effects of forest disturbance on streamflow following removal of climatic influence. The average annual streamflow changes that were attributed to forest disturbances and climatic variability were then estimated to be +58.7 and -72.4 mm, respectively. The positive (increasing) and negative (decreasing) values in streamflow change indicated opposite change directions, which suggest an offsetting effect between forest disturbance and climatic variability in the study watershed. Finally, a multivariate Autoregressive Integrated Moving Average (ARIMA) model was generated to establish quantitative relationships between accumulated annual streamflow deviation attributed to forest disturbances and annual ECA. The model was then used to project streamflow change under various timber harvesting scenarios. The methodology can be effectively applied to any large-scale single watershed where long-term data (>50 years) are available.
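
    A minimal sketch of the first two steps described above (effective precipitation and the modified double mass curve) is given below; the annual series and the choice of a pre-disturbance reference period are made-up illustrations, not the Willow River data.

        import numpy as np

        # Hypothetical annual series (mm), stand-ins for the watershed records.
        precipitation      = np.array([620., 580., 640., 600., 610., 590.])
        evapotranspiration = np.array([400., 390., 410., 405., 395., 400.])
        streamflow         = np.array([230., 200., 245., 260., 275., 250.])

        # Step 1: effective precipitation as an integrated climatic index.
        effective_precip = precipitation - evapotranspiration

        # Step 2: modified double mass curve -- accumulated streamflow against
        # accumulated effective precipitation; a persistent departure from the
        # pre-disturbance slope is attributed to forest disturbance.
        cum_flow = np.cumsum(streamflow)
        cum_eff = np.cumsum(effective_precip)

        fit = np.polyfit(cum_eff[:3], cum_flow[:3], 1)   # assume first 3 years undisturbed
        deviation = cum_flow - np.polyval(fit, cum_eff)  # accumulated deviation (mm)
        print(deviation.round(1))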

  5. Quantitative Analysis of Tissue Samples by Combining iTRAQ Isobaric Labeling with Selected/Multiple Reaction Monitoring (SRM/MRM).

    PubMed

    Narumi, Ryohei; Tomonaga, Takeshi

    2016-01-01

    Mass spectrometry-based phosphoproteomics is an indispensable technique used in the discovery and quantification of phosphorylation events on proteins in biological samples. The application of this technique to tissue samples is especially useful for the discovery of biomarkers as well as for biological studies. We herein describe the application of a large-scale phosphoproteome analysis and SRM/MRM-based quantitation to develop a strategy for the systematic discovery and validation of biomarkers using tissue samples.

  6. EmbryoMiner: A new framework for interactive knowledge discovery in large-scale cell tracking data of developing embryos.

    PubMed

    Schott, Benjamin; Traub, Manuel; Schlagenhauf, Cornelia; Takamiya, Masanari; Antritter, Thomas; Bartschat, Andreas; Löffler, Katharina; Blessing, Denis; Otte, Jens C; Kobitski, Andrei Y; Nienhaus, G Ulrich; Strähle, Uwe; Mikut, Ralf; Stegmaier, Johannes

    2018-04-01

    State-of-the-art light-sheet and confocal microscopes allow recording of entire embryos in 3D and over time (3D+t) for many hours. Fluorescently labeled structures can be segmented and tracked automatically in these terabyte-scale 3D+t images, resulting in thousands of cell migration trajectories that provide detailed insights into large-scale tissue reorganization at the cellular level. Here we present EmbryoMiner, a new interactive open-source framework suitable for in-depth analyses and comparisons of entire embryos, including an extensive set of trajectory features. Starting at the whole-embryo level, the framework can be used to iteratively focus on a region of interest within the embryo, to investigate and test specific trajectory-based hypotheses and to extract quantitative features from the isolated trajectories. Thus, the new framework provides a valuable new way to quantitatively compare corresponding anatomical regions in different embryos that were manually selected based on biological prior knowledge. As a proof of concept, we analyzed 3D+t light-sheet microscopy images of zebrafish embryos, showcasing potential user applications that can be performed using the new framework.

  7. Scalable metadata environments (MDE): artistically impelled immersive environments for large-scale data exploration

    NASA Astrophysics Data System (ADS)

    West, Ruth G.; Margolis, Todd; Prudhomme, Andrew; Schulze, Jürgen P.; Mostafavi, Iman; Lewis, J. P.; Gossmann, Joachim; Singh, Rajvikram

    2014-02-01

    Scalable Metadata Environments (MDEs) are an artistic approach for designing immersive environments for large scale data exploration in which users interact with data by forming multiscale patterns that they alternately disrupt and reform. Developed and prototyped as part of an art-science research collaboration, we define an MDE as a 4D virtual environment structured by quantitative and qualitative metadata describing multidimensional data collections. Entire data sets (e.g., 10s of millions of records) can be visualized and sonified at multiple scales and at different levels of detail so they can be explored interactively in real-time within MDEs. They are designed to reflect similarities and differences in the underlying data or metadata such that patterns can be visually/aurally sorted in an exploratory fashion by an observer who is not familiar with the details of the mapping from data to visual, auditory or dynamic attributes. While many approaches for visual and auditory data mining exist, MDEs are distinct in that they utilize qualitative and quantitative data and metadata to construct multiple interrelated conceptual coordinate systems. These "regions" function as conceptual lattices for scalable auditory and visual representations within virtual environments computationally driven by multi-GPU CUDA-enabled fluid dynamics systems.

  8. Liquidity crises on different time scales

    NASA Astrophysics Data System (ADS)

    Corradi, Francesco; Zaccaria, Andrea; Pietronero, Luciano

    2015-12-01

    We present an empirical analysis of the microstructure of financial markets and, in particular, of the static and dynamic properties of liquidity. We find that on relatively large time scales (15 min) large price fluctuations are connected to the failure of the subtle mechanism of compensation between the flows of market and limit orders: in other words, the missed revelation of the latent order book breaks the dynamical equilibrium between the flows, triggering the large price jumps. On smaller time scales (30 s), instead, the static depletion of the limit order book is an indicator of an intrinsic fragility of the system, which is related to a strongly nonlinear enhancement of the response. In order to quantify this phenomenon we introduce a measure of the liquidity imbalance present in the book and we show that it is correlated to both the sign and the magnitude of the next price movement. These findings provide a quantitative definition of the effective liquidity, which proves to be strongly dependent on the considered time scales.
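
    A hedged sketch of a book-imbalance measure of the kind described (the paper's exact definition may differ; the volumes below are illustrative):

        def book_imbalance(bid_volumes, ask_volumes):
            """Signed liquidity imbalance in [-1, 1]: positive when the bid side of
            the limit order book is deeper than the ask side near the best quotes."""
            bid, ask = sum(bid_volumes), sum(ask_volumes)
            return (bid - ask) / (bid + ask) if (bid + ask) else 0.0

        # Illustrative volumes on the first few price levels of each side of the book;
        # a strongly positive (negative) value would precede an upward (downward) move.
        print(book_imbalance(bid_volumes=[500, 300, 250], ask_volumes=[150, 120, 100]))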

  9. Liquidity crises on different time scales.

    PubMed

    Corradi, Francesco; Zaccaria, Andrea; Pietronero, Luciano

    2015-12-01

    We present an empirical analysis of the microstructure of financial markets and, in particular, of the static and dynamic properties of liquidity. We find that on relatively large time scales (15 min) large price fluctuations are connected to the failure of the subtle mechanism of compensation between the flows of market and limit orders: in other words, the missed revelation of the latent order book breaks the dynamical equilibrium between the flows, triggering the large price jumps. On smaller time scales (30 s), instead, the static depletion of the limit order book is an indicator of an intrinsic fragility of the system, which is related to a strongly nonlinear enhancement of the response. In order to quantify this phenomenon we introduce a measure of the liquidity imbalance present in the book and we show that it is correlated to both the sign and the magnitude of the next price movement. These findings provide a quantitative definition of the effective liquidity, which proves to be strongly dependent on the considered time scales.

  10. Environmental impacts of large-scale CSP plants in northwestern China.

    PubMed

    Wu, Zhiyong; Hou, Anping; Chang, Chun; Huang, Xiang; Shi, Duoqi; Wang, Zhifeng

    2014-01-01

    Several concentrated solar power demonstration plants are being constructed, and a few commercial plants have been announced in northwestern China. However, the mutual impacts between the concentrated solar power plants and their surrounding environments have not yet been addressed comprehensively in the literature by the parties involved in these projects. In China, these projects are especially important as an increasing amount of low carbon electricity needs to be generated in order to maintain the current economic growth while simultaneously lessening pollution. In this study, the authors assess the potential environmental impacts of large-scale concentrated solar power plants. Specifically, the water use intensity, soil erosion and soil temperature are quantitatively examined. It was found that some of the impacts are favorable, while some impacts are negative in relation to traditional power generation techniques and some need further research before they can be reasonably appraised. In quantitative terms, concentrated solar power plants consume about 4000 L MW⁻¹ h⁻¹ of water if wet cooling technology is used, and the collectors lead to soil temperature changes of between 0.5 and 4 °C; however, it was found that soil erosion is dramatically alleviated. The results of this study are helpful to decision-makers in concentrated solar power site selection and regional planning. Some conclusions of this study are also valid for large-scale photovoltaic plants.

  11. Comparisons of ionospheric electron density distributions reconstructed by GPS computerized tomography, backscatter ionograms, and vertical ionograms

    NASA Astrophysics Data System (ADS)

    Zhou, Chen; Lei, Yong; Li, Bofeng; An, Jiachun; Zhu, Peng; Jiang, Chunhua; Zhao, Zhengyu; Zhang, Yuannong; Ni, Binbin; Wang, Zemin; Zhou, Xuhua

    2015-12-01

    Global Positioning System (GPS) computerized ionosphere tomography (CIT) and ionospheric sky wave ground backscatter radar are both capable of measuring the large-scale, two-dimensional (2-D) distributions of ionospheric electron density (IED). Here we report the spatial and temporal electron density results obtained by GPS CIT and backscatter ionogram (BSI) inversion for three individual experiments. Both the GPS CIT and BSI inversion techniques demonstrate the capability and the consistency of reconstructing large-scale IED distributions. To validate the results, electron density profiles obtained from GPS CIT and BSI inversion are quantitatively compared to the vertical ionosonde data, which clearly shows that both methods provide accurate information on ionospheric electron density and thereby offer reliable approaches to ionospheric sounding. Our study can improve current understanding of the capability and insufficiency of these two methods for large-scale IED reconstruction.

  12. Climate Change and Macro-Economic Cycles in Pre-Industrial Europe

    PubMed Central

    Pei, Qing; Zhang, David D.; Lee, Harry F.; Li, Guodong

    2014-01-01

    Climate change has been proven to be the ultimate cause of social crisis in pre-industrial Europe at a large scale. However, detailed analyses on climate change and macro-economic cycles in the pre-industrial era remain lacking, especially within different temporal scales. Therefore, fine-grained, paleo-climate, and economic data were employed with statistical methods to quantitatively assess the relations between climate change and agrarian economy in Europe during AD 1500 to 1800. In the study, the Butterworth filter was adopted to filter the data series into a long-term trend (low-frequency) and short-term fluctuations (high-frequency). Granger Causality Analysis was conducted to scrutinize the associations between climate change and macro-economic cycle at different frequency bands. Based on quantitative results, climate change can only show significant effects on the macro-economic cycle within the long-term. In terms of the short-term effects, society can relieve the influences from climate variations by social adaptation methods and self-adjustment mechanism. On a large spatial scale, temperature holds higher importance for the European agrarian economy than precipitation. By examining the supply-demand mechanism in the grain market, population during the study period acted as the producer in the long term, whereas as the consumer in the short term. These findings merely reflect the general interactions between climate change and macro-economic cycles at the large spatial region with a long-term study period. The findings neither illustrate individual incidents that can temporarily distort the agrarian economy nor explain some specific cases. In the study, the scale thinking in the analysis is raised as an essential methodological issue for the first time to interpret the associations between climatic impact and macro-economy in the past agrarian society within different temporal scales. PMID:24516601
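
    A minimal sketch of the filtering step is given below (SciPy-based; the cutoff period, filter order, and synthetic series are illustrative choices, not those of the study).

        import numpy as np
        from scipy.signal import butter, filtfilt

        def split_trend_and_fluctuations(series, cutoff_period_years=40, order=4):
            """Split an annual series into a low-frequency trend and high-frequency
            fluctuations using a zero-phase Butterworth low-pass filter."""
            # Normalized cutoff: (1/cutoff_period cycles per year) relative to Nyquist (0.5).
            b, a = butter(order, (1.0 / cutoff_period_years) / 0.5, btype="low")
            trend = filtfilt(b, a, series)
            return trend, series - trend

        years = np.arange(1500, 1801)
        rng = np.random.default_rng(1)
        temperature = 0.002 * (years - 1650) + 0.1 * rng.standard_normal(years.size)  # synthetic
        trend, fluctuations = split_trend_and_fluctuations(temperature)
        print(trend[:3].round(3), fluctuations[:3].round(3))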

  13. Climate change and macro-economic cycles in pre-industrial Europe.

    PubMed

    Pei, Qing; Zhang, David D; Lee, Harry F; Li, Guodong

    2014-01-01

    Climate change has been proven to be the ultimate cause of social crisis in pre-industrial Europe at a large scale. However, detailed analyses on climate change and macro-economic cycles in the pre-industrial era remain lacking, especially within different temporal scales. Therefore, fine-grained, paleo-climate, and economic data were employed with statistical methods to quantitatively assess the relations between climate change and agrarian economy in Europe during AD 1500 to 1800. In the study, the Butterworth filter was adopted to filter the data series into a long-term trend (low-frequency) and short-term fluctuations (high-frequency). Granger Causality Analysis was conducted to scrutinize the associations between climate change and macro-economic cycle at different frequency bands. Based on quantitative results, climate change can only show significant effects on the macro-economic cycle within the long-term. In terms of the short-term effects, society can relieve the influences from climate variations by social adaptation methods and self-adjustment mechanism. On a large spatial scale, temperature holds higher importance for the European agrarian economy than precipitation. By examining the supply-demand mechanism in the grain market, population during the study period acted as the producer in the long term, whereas as the consumer in the short term. These findings merely reflect the general interactions between climate change and macro-economic cycles at the large spatial region with a long-term study period. The findings neither illustrate individual incidents that can temporarily distort the agrarian economy nor explain some specific cases. In the study, the scale thinking in the analysis is raised as an essential methodological issue for the first time to interpret the associations between climatic impact and macro-economy in the past agrarian society within different temporal scales.

  14. COMPARISON OF THE SINK CHARACTERISTICS OF THREE FULL-SCALE ENVIRONMENTAL CHAMBERS

    EPA Science Inventory

    The paper gives results of an investigation of the interaction of vapor-phase organic compounds with the interior surfaces of three large dynamic test chambers. A pattern of adsorption and reemission of the test compounds was observed in all three chambers. Quantitative compari...

  15. A pilot rating scale for evaluating failure transients in electronic flight control systems

    NASA Technical Reports Server (NTRS)

    Hindson, William S.; Schroeder, Jeffery A.; Eshow, Michelle M.

    1990-01-01

    A pilot rating scale was developed to describe the effects of transients in helicopter flight-control systems on safety-of-flight and on pilot recovery action. The scale was applied to the evaluation of hardovers that could potentially occur in the digital flight-control system being designed for a variable-stability UH-60A research helicopter. Tests were conducted in a large moving-base simulator and in flight. The results of the investigation were combined with existing airworthiness criteria to determine quantitative reliability design goals for the control system.

  16. Multi-scale modeling of diffusion-controlled reactions in polymers: renormalisation of reactivity parameters.

    PubMed

    Everaers, Ralf; Rosa, Angelo

    2012-01-07

    The quantitative description of polymeric systems requires hierarchical modeling schemes, which bridge the gap between the atomic scale, relevant to chemical or biomolecular reactions, and the macromolecular scale, where the longest relaxation modes occur. Here, we use the formalism for diffusion-controlled reactions in polymers developed by Wilemski, Fixman, and Doi to discuss the renormalisation of the reactivity parameters in polymer models with varying spatial resolution. In particular, we show that the adjustments are independent of chain length. As a consequence, it is possible to match reaction times between descriptions with different resolution for relatively short reference chains and to use the coarse-grained model to make quantitative predictions for longer chains. We illustrate our results by a detailed discussion of the classical problem of chain cyclization in the Rouse model, which offers the simplest example of a multi-scale description, if we consider differently discretized Rouse models for the same physical system. Moreover, we are able to explore different combinations of compact and non-compact diffusion in the local and large-scale dynamics by varying the embedding dimension.

  17. Large perceptual distortions of locomotor action space occur in ground-based coordinates: Angular expansion and the large-scale horizontal-vertical illusion.

    PubMed

    Klein, Brennan J; Li, Zhi; Durgin, Frank H

    2016-04-01

    What is the natural reference frame for seeing large-scale spatial scenes in locomotor action space? Prior studies indicate an asymmetric angular expansion in perceived direction in large-scale environments: Angular elevation relative to the horizon is perceptually exaggerated by a factor of 1.5, whereas azimuthal direction is exaggerated by a factor of about 1.25. Here participants made angular and spatial judgments when upright or on their sides to dissociate egocentric from allocentric reference frames. In Experiment 1, it was found that body orientation did not affect the magnitude of the up-down exaggeration of direction, suggesting that the relevant orientation reference frame for this directional bias is allocentric rather than egocentric. In Experiment 2, the comparison of large-scale horizontal and vertical extents was somewhat affected by viewer orientation, but only to the extent necessitated by the classic (5%) horizontal-vertical illusion (HVI) that is known to be retinotopic. Large-scale vertical extents continued to appear much larger than horizontal ground extents when observers lay sideways. When the visual world was reoriented in Experiment 3, the bias remained tied to the ground-based allocentric reference frame. The allocentric HVI is quantitatively consistent with differential angular exaggerations previously measured for elevation and azimuth in locomotor space. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  18. Large perceptual distortions of locomotor action space occur in ground-based coordinates: Angular expansion and the large-scale horizontal-vertical illusion

    PubMed Central

    Klein, Brennan J.; Li, Zhi; Durgin, Frank H.

    2015-01-01

    What is the natural reference frame for seeing large-scale spatial scenes in locomotor action space? Prior studies indicate an asymmetric angular expansion in perceived direction in large-scale environments: Angular elevation relative to the horizon is perceptually exaggerated by a factor of 1.5, whereas azimuthal direction is exaggerated by a factor of about 1.25. Here participants made angular and spatial judgments when upright or on their sides in order to dissociate egocentric from allocentric reference frames. In Experiment 1 it was found that body orientation did not affect the magnitude of the up-down exaggeration of direction, suggesting that the relevant orientation reference frame for this directional bias is allocentric rather than egocentric. In Experiment 2, the comparison of large-scale horizontal and vertical extents was somewhat affected by viewer orientation, but only to the extent necessitated by the classic (5%) horizontal-vertical illusion (HVI) that is known to be retinotopic. Large-scale vertical extents continued to appear much larger than horizontal ground extents when observers lay sideways. When the visual world was reoriented in Experiment 3, the bias remained tied to the ground-based allocentric reference frame. The allocentric HVI is quantitatively consistent with differential angular exaggerations previously measured for elevation and azimuth in locomotor space. PMID:26594884

  19. Large-scale label-free quantitative proteomics of the pea aphid-Buchnera symbiosis.

    PubMed

    Poliakov, Anton; Russell, Calum W; Ponnala, Lalit; Hoops, Harold J; Sun, Qi; Douglas, Angela E; van Wijk, Klaas J

    2011-06-01

    Many insects are nutritionally dependent on symbiotic microorganisms that have tiny genomes and are housed in specialized host cells called bacteriocytes. The obligate symbiosis between the pea aphid Acyrthosiphon pisum and the γ-proteobacterium Buchnera aphidicola (only 584 predicted proteins) is particularly amenable for molecular analysis because the genomes of both partners have been sequenced. To better define the symbiotic relationship between this aphid and Buchnera, we used large-scale, high accuracy tandem mass spectrometry (nanoLC-LTQ-Orbitrap) to identify aphid and Buchnera proteins in the whole aphid body, purified bacteriocytes, isolated Buchnera cells and the residual bacteriocyte fraction. More than 1900 aphid and 400 Buchnera proteins were identified. All enzymes in amino acid metabolism annotated in the Buchnera genome were detected, reflecting the high (68%) coverage of the proteome and supporting the core function of Buchnera in the aphid symbiosis. Transporters mediating the transport of predicted metabolites were present in the bacteriocyte. Label-free spectral counting combined with hierarchical clustering allowed us to define the quantitative distribution of a subset of these proteins across both symbiotic partners, yielding no evidence for selective transfer of proteins between the partners in either direction. This is the first quantitative proteome analysis of bacteriocyte symbiosis, providing a wealth of information about the molecular function of both the host cell and the bacterial symbiont.
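
    A minimal sketch of the label-free spectral-counting step described above (normalized spectral abundance factors followed by hierarchical clustering of abundance profiles); the protein identifiers, lengths, and counts are hypothetical placeholders, not data from the study:

      # Sketch: NSAF-style label-free spectral counting followed by hierarchical
      # clustering of proteins across fractions. All identifiers and numbers are
      # hypothetical.
      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      proteins = ["AphidP1", "BuchneraP1", "BuchneraP2"]   # hypothetical IDs
      lengths = np.array([450, 300, 520])                  # residues per protein
      # spectral counts per fraction: whole body, bacteriocyte, Buchnera cell
      counts = np.array([[120.0, 40.0, 2.0],
                         [10.0, 90.0, 150.0],
                         [5.0, 60.0, 110.0]])

      saf = counts / lengths[:, None]        # spectral counts per residue
      nsaf = saf / saf.sum(axis=0)           # normalize within each fraction

      # cluster proteins by their abundance profiles across the three fractions
      tree = linkage(nsaf, method="average", metric="euclidean")
      for name, cluster in zip(proteins, fcluster(tree, t=2, criterion="maxclust")):
          print(name, cluster)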

  20. Quantitation of 87 Proteins by nLC-MRM/MS in Human Plasma: Workflow for Large-Scale Analysis of Biobank Samples.

    PubMed

    Rezeli, Melinda; Sjödin, Karin; Lindberg, Henrik; Gidlöf, Olof; Lindahl, Bertil; Jernberg, Tomas; Spaak, Jonas; Erlinge, David; Marko-Varga, György

    2017-09-01

    A multiple reaction monitoring (MRM) assay was developed for precise quantitation of 87 plasma proteins, including the three isoforms of apolipoprotein E (APOE) associated with cardiovascular diseases, using nanoscale liquid chromatography separation and a stable isotope dilution strategy. The analytical performance of the assay was evaluated, and we found an average technical variation of 4.7% across a dynamic range of 4-5 orders of magnitude (≈0.2 mg/L to 4.5 g/L) in whole plasma digests. Here, we report a complete workflow, including sample processing adapted to a 96-well plate format and a normalization strategy for large-scale studies. To further validate the MS-based quantitation, the amounts of six selected proteins were also measured by routinely used clinical chemistry assays; the two methods showed excellent correlation with high significance (p-value < 10e-5) for the six proteins, as well as for the cardiovascular predictor, the APOB:APOA1 ratio (r = 0.969, p-value < 10e-5). Moreover, we used the developed assay to screen biobank samples from patients with myocardial infarction and performed a comparative analysis of patient groups with STEMI (ST-segment elevation myocardial infarction), NSTEMI (non-ST-segment elevation myocardial infarction) and type-2 AMI (type-2 acute myocardial infarction).
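
    A minimal sketch of the stable isotope dilution arithmetic underlying this kind of MRM assay: the endogenous ("light") peptide is quantified against a known spike of heavy-labeled standard from the light-to-heavy peak-area ratio. The peptide, peak areas, and spike amount below are hypothetical, not values from the assay:

      # Sketch: stable isotope dilution quantitation for one MRM transition.
      # All numbers are hypothetical placeholders.
      def endogenous_conc(light_area: float, heavy_area: float,
                          heavy_spike_fmol: float, plasma_volume_ul: float) -> float:
          """Return the endogenous peptide concentration in fmol per uL of plasma."""
          ratio = light_area / heavy_area         # light/heavy peak-area ratio
          light_fmol = ratio * heavy_spike_fmol   # endogenous amount in the digest
          return light_fmol / plasma_volume_ul

      # hypothetical APOB proteotypic peptide measured from 1 uL of plasma
      print(endogenous_conc(light_area=8.4e5, heavy_area=2.1e5,
                            heavy_spike_fmol=100.0, plasma_volume_ul=1.0))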

  1. The Small College Administrative Environment.

    ERIC Educational Resources Information Center

    Buzza, Bonnie Wilson

    Environmental differences for speech departments at large and small colleges are not simply of scale; there are qualitative as well as quantitative differences. At small colleges, faculty are hired as teachers, rather than as researchers. Because speech teachers at small colleges must be generalists, and because it is often difficult to replace…

  2. Secondary Students' Stable and Unstable Optics Conceptions Using Contextualized Questions

    ERIC Educational Resources Information Center

    Chu, Hye-Eun; Treagust, David F.

    2014-01-01

    This study focuses on elucidating and explaining reasons for the stability of and interrelationships between students' conceptions about "Light Propagation" and "Visibility of Objects" using contextualized questions across 3 years of secondary schooling from Years 7 to 9. In a large-scale quantitative study involving 1,233…

  3. Potential Large-Scale Production of Conjugated Soybean Oil Catalyzed by Photolyzed Iodine in Hexanes

    USDA-ARS?s Scientific Manuscript database

    A laboratory apparatus is described for the production of conjugated soybean oil (SBO) in pound quantities via irradiation with visible-light. Under our reaction conditions, quantitative conversions (determined by NMR spectroscopy) of SBO to conjugated SBO, in hexanes at reflux temperatures, were a...

  4. Motivation to Read among Rural Adolescents

    ERIC Educational Resources Information Center

    Belken, Gloria

    2013-01-01

    This study used quantitative methods to investigate motivation to read among high school students in a tenth-grade English course at a rural high school in the Midwestern USA. Data were collected and analyzed to replicate previous studies. In this study, when compared to large-scale surveys, respondents showed more positive attitudes toward…

  5. Reframing Approaches to Narrating Young People's Conceptualisations of Citizenship in Education Research

    ERIC Educational Resources Information Center

    Akar, Bassel

    2018-01-01

    Large-scale quantitative studies on citizenship and citizenship education research have advanced an international and comparative field of democratic citizenship education. Their instruments, however, informed by theoretical variables constructed in Western Europe and North America mostly measure young people's understandings of a predefined…

  6. Scale dependence of the alignment between strain rate and rotation in turbulent shear flow

    NASA Astrophysics Data System (ADS)

    Fiscaletti, D.; Elsinga, G. E.; Attili, A.; Bisetti, F.; Buxton, O. R. H.

    2016-10-01

    The scale dependence of the statistical alignment tendencies of the eigenvectors of the strain-rate tensor e_i, with the vorticity vector ω, is examined in the self-preserving region of a planar turbulent mixing layer. Data from a direct numerical simulation are filtered at various length scales and the probability density functions of the magnitude of the alignment cosines between the two unit vectors, |e_i · ω̂|, are examined. It is observed that the alignment tendencies are insensitive to the concurrent large-scale velocity fluctuations, but are quantitatively affected by the nature of the concurrent large-scale velocity-gradient fluctuations. It is confirmed that the small-scale (local) vorticity vector is preferentially aligned in parallel with the large-scale (background) extensive strain-rate eigenvector e_1, in contrast to the global tendency for ω to be aligned in parallel with the intermediate strain-rate eigenvector [Hamlington et al., Phys. Fluids 20, 111703 (2008), 10.1063/1.3021055]. When only data from regions of the flow that exhibit strong swirling are included, the so-called high-enstrophy worms, the alignment tendencies are exaggerated with respect to the global picture. These findings support the notion that the production of enstrophy, responsible for a net cascade of turbulent kinetic energy from large scales to small scales, is driven by vorticity stretching due to the preferential parallel alignment between ω and the nonlocal e_1 and that the strongly swirling worms are kinematically significant to this process.

  7. Impact of spatially correlated pore-scale heterogeneity on drying porous media

    NASA Astrophysics Data System (ADS)

    Borgman, Oshri; Fantinel, Paolo; Lühder, Wieland; Goehring, Lucas; Holtzman, Ran

    2017-07-01

    We study the effect of spatially correlated heterogeneity on the isothermal drying of porous media. We combine a minimal pore-scale model with microfluidic experiments that share the same pore geometry. Our simulated drying behavior compares favorably with experiments, considering the large sensitivity of the emergent behavior to the uncertainty associated with even small manufacturing errors. We show that increasing the correlation length in particle sizes promotes preferential drying of clusters of large pores, prolonging liquid connectivity and surface wetness and thus sustaining higher drying rates for longer periods. Our findings improve our quantitative understanding of how pore-scale heterogeneity impacts drying, which plays a role in a wide range of processes ranging from fuel cells to the curing of paints and cements to global budgets of energy, water and solutes in soils.

  8. Planar isotropy of passive scalar turbulent mixing with a mean perpendicular gradient.

    PubMed

    Danaila, L; Dusek, J; Le Gal, P; Anselmet, F; Brun, C; Pumir, A

    1999-08-01

    A recently proposed evolution equation [Vaienti et al., Physica D 85, 405 (1994)] for the probability density functions (PDF's) of turbulent passive scalar increments obtained under the assumptions of fully three-dimensional homogeneity and isotropy is submitted to validation using direct numerical simulation (DNS) results of the mixing of a passive scalar with a nonzero mean gradient by a homogeneous and isotropic turbulent velocity field. It is shown that this approach leads to a quantitatively correct balance between the different terms of the equation, in a plane perpendicular to the mean gradient, at small scales and at large Péclet number. A weaker assumption of homogeneity and isotropy restricted to the plane normal to the mean gradient is then considered to derive an equation describing the evolution of the PDF's as a function of the spatial scale and the scalar increments. A very good agreement between the theory and the DNS data is obtained at all scales. As a particular case of the theory, we derive a generalized form for the well-known Yaglom equation (the isotropic relation between the second-order moments for temperature increments and the third-order velocity-temperature mixed moments). This approach allows us to determine quantitatively how the integral scale properties influence the properties of mixing throughout the whole range of scales. In the simple configuration considered here, the PDF's of the scalar increments perpendicular to the mean gradient can be theoretically described once the sources of inhomogeneity and anisotropy at large scales are correctly taken into account.

  9. Structured Qualitative Research: Organizing “Mountains of Words” for Data Analysis, both Qualitative and Quantitative

    PubMed Central

    Johnson, Bruce D.; Dunlap, Eloise; Benoit, Ellen

    2008-01-01

    Qualitative research creates mountains of words. U.S. federal funding supports mostly structured qualitative research, which is designed to test hypotheses using semi-quantitative coding and analysis. The authors have 30 years of experience in designing and completing major qualitative research projects, mainly funded by the US National Institute on Drug Abuse [NIDA]. This article reports on strategies for planning, organizing, collecting, managing, storing, retrieving, analyzing, and writing about qualitative data so as to most efficiently manage the mountains of words collected in large-scale ethnographic projects. Multiple benefits accrue from this approach. Several different staff members can contribute to the data collection, even when working from remote locations. Field expenditures are linked to units of work so productivity is measured, many staff in various locations have access to use and analyze the data, quantitative data can be derived from data that are primarily qualitative, and resources are used more efficiently. The major difficulties involve a need for staff who can program and manage large databases, and who can be skillful analysts of both qualitative and quantitative data. PMID:20222777

  10. Quantitative characterization of conformational-specific protein-DNA binding using a dual-spectral interferometric imaging biosensor

    NASA Astrophysics Data System (ADS)

    Zhang, Xirui; Daaboul, George G.; Spuhler, Philipp S.; Dröge, Peter; Ünlü, M. Selim

    2016-03-01

    DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformational specific protein-DNA interactions.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mezyk, Stephen P.; Mincher, Bruce J.; Nilsson, Mikael

    This document is the final report for the Nuclear Energy Universities Program (NEUP) grant 10-910 (DE-AC07-05ID14517) “Alpha Radiolysis of Nuclear Solvent Extraction Ligands used for An(III) and Ln(III) Separations”. The goal of this work was to obtain a quantitative understanding of the impacts of both low Linear Energy Transfer (LET, gamma-rays) and high LET (alpha particles) radiation chemistry occurring in future large-scale separations processes. This quantitative understanding of the major radiation effects on diluents and ligands is essential for optimal process implementation, and could result in significant cost savings in the future.

  12. Simulation Of Combat With An Expert System

    NASA Technical Reports Server (NTRS)

    Provenzano, J. P.

    1989-01-01

    Proposed expert system predicts outcomes of combat situations. Called "COBRA" (Combat Outcome Based on Rules for Attrition), the system selects rules for mathematical modeling of losses and discrete events in combat according to previous experiences. It is used with another software module known as the "Game". The Game/COBRA software system, consisting of the Game and COBRA modules, provides for both quantitative and qualitative aspects in simulations of battles. Although COBRA is intended for simulation of large-scale military exercises, the concepts embodied in it have much broader applicability. In industrial research, such a knowledge-based system enables qualitative as well as quantitative simulations.

  13. Quantitation of heparosan with heparin lyase III and spectrophotometry.

    PubMed

    Huang, Haichan; Zhao, Yingying; Lv, Shencong; Zhong, Weihong; Zhang, Fuming; Linhardt, Robert J

    2014-02-15

    Heparosan is the Escherichia coli K5 capsular polysaccharide and the key precursor for preparing bioengineered heparin. A rapid and effective quantitative method for detecting heparosan is important in the large-scale production of heparosan. Heparin lyase III (Hep III) effectively catalyzes heparosan depolymerization, forming unsaturated disaccharides that are measurable with a spectrophotometer at 232 nm. We report a new method for the quantitative detection of heparosan with heparin lyase III and spectrophotometry that is safer and more specific than the traditional carbazole assay. In an optimized detection system, heparosan at a minimum concentration of 0.60 g/L in fermentation broth can be detected. Copyright © 2013 Elsevier Inc. All rights reserved.
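
    A minimal sketch of the spectrophotometric calculation implied above: exhaustive heparin lyase III digestion yields unsaturated disaccharides whose absorbance at 232 nm follows the Beer-Lambert law. The extinction coefficient and average disaccharide mass used here are assumed typical values, not figures from the cited study:

      # Sketch: converting background-corrected A232 after complete heparin lyase
      # III digestion into a heparosan concentration via the Beer-Lambert law.
      # The constants below are assumed typical values, not from the cited paper.
      EPSILON_232 = 5500.0            # M^-1 cm^-1, assumed for unsaturated disaccharides
      PATH_CM = 1.0                   # cuvette path length in cm
      DISACCHARIDE_G_PER_MOL = 379.0  # assumed mass of the GlcA-GlcNAc repeat unit

      def heparosan_g_per_l(a232: float, dilution: float = 1.0) -> float:
          """Estimate heparosan concentration (g/L) from absorbance at 232 nm."""
          molar_disaccharide = a232 / (EPSILON_232 * PATH_CM)   # mol/L of product
          return molar_disaccharide * DISACCHARIDE_G_PER_MOL * dilution

      print(heparosan_g_per_l(a232=0.45, dilution=20.0))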

  14. Enabling Interactive Measurements from Large Coverage Microscopy

    PubMed Central

    Bajcsy, Peter; Vandecreme, Antoine; Amelot, Julien; Chalfoun, Joe; Majurski, Michael; Brady, Mary

    2017-01-01

    Microscopy could be an important tool for characterizing stem cell products if quantitative measurements could be collected over multiple spatial and temporal scales. The cells change state over time and are several orders of magnitude smaller than the cell products, and modern microscopes are already capable of imaging large spatial areas, repeating imaging over time, and acquiring images over several spectra. However, characterizing stem cell products from such large image collections is challenging because of data size, required computations, and the lack of interactive quantitative measurements needed to determine release criteria. We present a measurement web system consisting of available algorithms, extensions to a client-server framework using Deep Zoom, and the configuration know-how to provide the information needed for inspecting the quality of a cell product. The cell and other data sets are accessible via the prototype web-based system at http://isg.nist.gov/deepzoomweb. PMID:28663600

  15. Universality and predictability in molecular quantitative genetics.

    PubMed

    Nourmohammad, Armita; Held, Torsten; Lässig, Michael

    2013-12-01

    Molecular traits, such as gene expression levels or protein binding affinities, are increasingly accessible to quantitative measurement by modern high-throughput techniques. Such traits measure molecular functions and, from an evolutionary point of view, are important as targets of natural selection. We review recent developments in evolutionary theory and experiments that are expected to become building blocks of a quantitative genetics of molecular traits. We focus on universal evolutionary characteristics: these are largely independent of a trait's genetic basis, which is often at least partially unknown. We show that universal measurements can be used to infer selection on a quantitative trait, which determines its evolutionary mode of conservation or adaptation. Furthermore, universality is closely linked to predictability of trait evolution across lineages. We argue that universal trait statistics extends over a range of cellular scales and opens new avenues of quantitative evolutionary systems biology. Copyright © 2013. Published by Elsevier Ltd.

  16. Large-scale impacts of herbivores on the structural diversity of African savannas

    PubMed Central

    Asner, Gregory P.; Levick, Shaun R.; Kennedy-Bowdoin, Ty; Knapp, David E.; Emerson, Ruth; Jacobson, James; Colgan, Matthew S.; Martin, Roberta E.

    2009-01-01

    African savannas are undergoing management intensification, and decision makers are increasingly challenged to balance the needs of large herbivore populations with the maintenance of vegetation and ecosystem diversity. Ensuring the sustainability of Africa's natural protected areas requires information on the efficacy of management decisions at large spatial scales, but often neither experimental treatments nor large-scale responses are available for analysis. Using a new airborne remote sensing system, we mapped the three-dimensional (3-D) structure of vegetation at a spatial resolution of 56 cm throughout 1640 ha of savanna after 6-, 22-, 35-, and 41-year exclusions of herbivores, as well as in unprotected areas, across Kruger National Park in South Africa. Areas in which herbivores were excluded over the short term (6 years) contained 38%–80% less bare ground compared with those that were exposed to mammalian herbivory. In the longer-term (> 22 years), the 3-D structure of woody vegetation differed significantly between protected and accessible landscapes, with up to 11-fold greater woody canopy cover in the areas without herbivores. Our maps revealed 2 scales of ecosystem response to herbivore consumption, one broadly mediated by geologic substrate and the other mediated by hillslope-scale variation in soil nutrient availability and moisture conditions. Our results are the first to quantitatively illustrate the extent to which herbivores can affect the 3-D structural diversity of vegetation across large savanna landscapes. PMID:19258457

  17. Photogrammetric portrayal of Mars topography.

    USGS Publications Warehouse

    Wu, S.S.C.

    1979-01-01

    Special photogrammetric techniques have been developed to portray Mars topography, using Mariner and Viking imaging and nonimaging topographic information and earth-based radar data. Topography is represented by the compilation of maps at three scales: global, intermediate, and very large scale. The global map is a synthesis of topographic information obtained from Mariner 9 and earth-based radar, compiled at a scale of 1:25,000,000 with a contour interval of 1 km; it gives a broad quantitative view of the planet. At intermediate scales, Viking Orbiter photographs of various resolutions are used to compile detailed contour maps of a broad spectrum of prominent geologic features; a contour interval as small as 20 m has been obtained from very high resolution orbital photography. Imagery from the Viking lander facsimile cameras permits construction of detailed, very large scale (1:10) topographic maps of the terrain surrounding the two landers; these maps have a contour interval of 1 cm. This paper presents several new detailed topographic maps of Mars.

  18. Photogrammetric portrayal of Mars topography

    NASA Technical Reports Server (NTRS)

    Wu, S. S. C.

    1979-01-01

    Special photogrammetric techniques have been developed to portray Mars topography, using Mariner and Viking imaging and nonimaging topographic information and earth-based radar data. Topography is represented by the compilation of maps at three scales: global, intermediate, and very large scale. The global map is a synthesis of topographic information obtained from Mariner 9 and earth-based radar, compiled at a scale of 1:25,000,000 with a contour interval of 1 km; it gives a broad quantitative view of the planet. At intermediate scales, Viking Orbiter photographs of various resolutions are used to compile detailed contour maps of a broad spectrum of prominent geologic features; a contour interval as small as 20 m has been obtained from very high resolution orbital photography. Imagery from the Viking lander facsimile cameras permits construction of detailed, very large scale (1:10) topographic maps of the terrain surrounding the two landers; these maps have a contour interval of 1 cm. This paper presents several new detailed topographic maps of Mars.

  19. Climate, Water, and Human Health: Large Scale Hydroclimatic Controls in Forecasting Cholera Epidemics

    NASA Astrophysics Data System (ADS)

    Akanda, A. S.; Jutla, A. S.; Islam, S.

    2009-12-01

    Although cholera has ravaged the continents through seven global pandemics in past centuries, the seasonal and interannual variability of its outbreaks remains a mystery. Previous studies have focused on the role of various environmental and climatic factors, but provided little or no predictive capability. Recent findings suggest a more prominent role of large-scale hydroclimatic extremes - droughts and floods - and attempt to explain the seasonality and the unique dual cholera peaks in the Bengal Delta region of South Asia. We investigate the seasonal and interannual nature of cholera epidemiology in three geographically distinct locations within the region to identify the larger-scale hydroclimatic controls that can set the ecological and environmental ‘stage’ for outbreaks and that have significant memory on a seasonal scale. Here we show that two distinctly different cholera transmission mechanisms, one pre-monsoon and one post-monsoon, related to large-scale climatic controls prevail in the region. An implication of our findings is that extreme climatic events such as prolonged droughts, record floods, and major cyclones may cause major disruptions in the ecosystem and trigger large epidemics. We postulate that a quantitative understanding of the large-scale hydroclimatic controls and dominant processes with significant system memory will form the basis for forecasting such epidemic outbreaks. A multivariate regression method using these predictor variables to develop probabilistic forecasts of cholera outbreaks will be explored. Forecasts from such a system with a seasonal lead time are likely to have a measurable impact on early cholera detection and prevention efforts in endemic regions.
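
    As a sketch of the multivariate probabilistic forecasting proposed in the closing sentences, a logistic regression on seasonal hydroclimatic predictors could look like the following; the predictors, data, and model choice are hypothetical illustrations, not the authors' system:

      # Hypothetical sketch of a probabilistic cholera-outbreak forecast from
      # seasonal hydroclimatic predictors (e.g., dry-season discharge anomaly,
      # monsoon flood-extent anomaly). Data and variable names are placeholders.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n_years = 30
      X = np.column_stack([
          rng.normal(0, 1, n_years),   # standardized dry-season discharge anomaly
          rng.normal(0, 1, n_years),   # standardized flood-extent anomaly
      ])
      # synthetic outbreak labels loosely tied to the predictors
      y = (0.8 * X[:, 1] - 0.6 * X[:, 0] + rng.normal(0, 0.5, n_years) > 0).astype(int)

      model = LogisticRegression().fit(X, y)
      next_season = np.array([[-1.2, 0.9]])       # hypothetical forecast inputs
      print("outbreak probability:", model.predict_proba(next_season)[0, 1])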

  20. Collaboration in the Humanities, Arts and Social Sciences in Australia

    ERIC Educational Resources Information Center

    Haddow, Gaby; Xia, Jianhong; Willson, Michele

    2017-01-01

    This paper reports on the first large-scale quantitative investigation into collaboration, demonstrated in co-authorship, by Australian humanities, arts and social sciences (HASS) researchers. Web of Science data were extracted for Australian HASS publications, with a focus on the softer social sciences, over the period 2004-2013. The findings…

  1. Doing Disability Research in a Southern Context: Challenges and Possibilities

    ERIC Educational Resources Information Center

    Singal, Nidhi

    2010-01-01

    Research on disability issues in countries of the South is primarily dominated by a focus on generating large scale quantitative data sets. This paper discusses the many challenges, opportunities and dilemmas faced in designing and undertaking a qualitative research study in one district in India. The Disability, Education and Poverty Project…

  2. Quality Control Charts in Large-Scale Assessment Programs

    ERIC Educational Resources Information Center

    Schafer, William D.; Coverdale, Bradley J.; Luxenberg, Harlan; Jin, Ying

    2011-01-01

    There are relatively few examples of quantitative approaches to quality control in educational assessment and accountability contexts. Among the several techniques that are used in other fields, Shewhart charts have been found in a few instances to be applicable in educational settings. This paper describes Shewhart charts and gives examples of how…
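
    A minimal sketch of the Shewhart-chart logic referred to above: a current assessment statistic is flagged when it falls outside three-sigma control limits estimated from prior administrations. The scores below are hypothetical:

      # Minimal Shewhart-chart sketch: flag test administrations whose mean scaled
      # score falls outside mean +/- 3*sigma limits from prior years (hypothetical data).
      import statistics

      history = [500.2, 498.7, 501.1, 499.5, 500.8, 499.9, 500.4]  # prior-year means
      center = statistics.mean(history)
      sigma = statistics.stdev(history)
      ucl, lcl = center + 3 * sigma, center - 3 * sigma

      current = 503.9
      status = "in control" if lcl <= current <= ucl else "out of control"
      print(f"limits: [{lcl:.1f}, {ucl:.1f}] -> current {current}: {status}")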

  3. Evaluating Change in Medical School Curricula: How Did We Know Where We Were Going?

    ERIC Educational Resources Information Center

    Mahaffy, John; Gerrity, Martha S.

    1998-01-01

    Compares and contrasts the primary outcomes and methods used to evaluate curricular changes at eight medical schools participating in a large-scale medical curriculum development project. Describes how the evaluative data, both quantitative and qualitative, were collected, and how evaluation drove curricular change. Although the evaluations were…

  4. Landscape pattern metrics and regional assessment

    Treesearch

    Robert V. O' Neill; Kurt H. Riitters; J.D. Wickham; Bruce K. Jones

    1999-01-01

    The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop and interpret quantitative measures of spatial pattern: the landscape indices. This article reviews what is known about...

  5. Invasive Russian knapweed (Acroptilon repens) creates large patches almost entirely by rhizomic growth

    USDA-ARS?s Scientific Manuscript database

    Russian knapweed is an outcrossing perennial invasive weed in North America that can spread by both seed and horizontal rhizome growth leading to new shoots. The predominant mode of spread at the local and long-distance scales has not been quantitatively researched. We used Amplified Fragment Length...

  6. Advantages of Social Network Analysis in Educational Research

    ERIC Educational Resources Information Center

    Ushakov, K. M.; Kukso, K. N.

    2015-01-01

    Currently one of the main tools for large-scale studies of schools is statistical analysis. Although it is the most common method and offers the greatest opportunities for analysis, there are other quantitative methods for studying schools, such as network analysis. We discuss the potential advantages that network analysis has for educational…

  7. Using Association Mapping in Teosinte (Zea Mays ssp Parviglumis) to Investigate the Function of Selection-Candidate Genes

    USDA-ARS?s Scientific Manuscript database

    Large-scale screens of the maize genome identified 48 genes that show the putative signature of artificial selection during maize domestication or improvement. These selection-candidate genes may act as quantitative trait loci (QTL) that control the phenotypic differences between maize and its proge...

  8. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    ERIC Educational Resources Information Center

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  9. An Account of Studies of Organizational Development in Schools.

    ERIC Educational Resources Information Center

    Runkel, Philip J.; Schmuck, Richard A.

    Most organizational development (OD) projects in schools are never reported in the literature. This paper discusses benefits, outcomes, and success factors disclosed by the first large-scale quantitative survey of OD in schools conducted by Fullan, Miles, and Taylor in 1978. The paper also explores other relevant studies published through early…

  10. Intra- and Inter-Individual Variation in Self-Reported Code-Switching Patterns of Adult Multilinguals

    ERIC Educational Resources Information Center

    Dewaele, Jean-Marc; Li, Wei

    2014-01-01

    The present study is a large-scale quantitative analysis of intra-individual variation (linked to type of interlocutor) and inter-individual variation (linked to multilingualism, sociobiographical variables and three personality traits) in self-reported frequency of code-switching (CS) among 2116 multilinguals. We found a significant effect of…

  11. Predicting Southern Appalachian overstory vegetation with digital terrain data

    Treesearch

    Paul V. Bolstad; Wayne Swank; James Vose

    1998-01-01

    Vegetation in mountainous regions responds to small-scale variation in terrain, largely due to effects on both temperature and soil moisture. However, there are few studies of quantitative, terrain-based methods for predicting vegetation composition. This study investigated relationships between forest composition, elevation, and a derived index of terrain shape, and...

  12. A Quantitative Mass Spectrometry-based Approach for Identifying Protein Kinase-Clients and Quantifying Kinase Activity

    USDA-ARS?s Scientific Manuscript database

    The Homo sapiens and Arabidopsis thaliana genomes are believed to encode >500 and >1,000 protein kinases, respectively. Despite this abundance, few bona fide kinase-client relationships have been described in detail. Mass spectrometry (MS)-based approaches have been integral to the large-scale mapp...

  13. A large scale joint analysis of flowering time reveals independent temperate adaptations in maize

    USDA-ARS?s Scientific Manuscript database

    Modulating days to flowering is a key mechanism in plants for adapting to new environments, and variation in days to flowering drives population structure by limiting mating. To elucidate the genetic architecture of flowering across maize, a quantitative trait, we mapped flowering in five global pop...

  14. Factors influencing stream fish recovery following a large-scale disturbance

    Treesearch

    William E. Ensign; Angermeier Leftwich; C. Andrew Dolloff

    1997-01-01

    The authors examined fish distribution and abundance in erosional habitat units in South Fork Roanoke River, VA, following a fish kill by using a reachwide sampling approach for 3 species and a representative-reach sampling approach for 10 species. Qualitative (presence-absence) and quantitative (relative abundance) estimates of distribution and abundance provided...

  15. Reconciling Rigour and Impact by Collaborative Research Design: Study of Teacher Agency

    ERIC Educational Resources Information Center

    Pantic, Nataša

    2017-01-01

    This paper illustrates a new way of working collaboratively on the development of a methodology for studying teacher agency for social justice. Increasing emphasis of impact on change as a purpose of social research raises questions about appropriate research designs. Large-scale quantitative research framed within externally set parameters has…

  16. Automated microscopy for high-content RNAi screening

    PubMed Central

    2010-01-01

    Fluorescence microscopy is one of the most powerful tools to investigate complex cellular processes such as cell division, cell motility, or intracellular trafficking. The availability of RNA interference (RNAi) technology and automated microscopy has opened the possibility to perform cellular imaging in functional genomics and other large-scale applications. Although imaging often dramatically increases the content of a screening assay, it poses new challenges to achieve accurate quantitative annotation and therefore needs to be carefully adjusted to the specific needs of individual screening applications. In this review, we discuss principles of assay design, large-scale RNAi, microscope automation, and computational data analysis. We highlight strategies for imaging-based RNAi screening adapted to different library and assay designs. PMID:20176920

  17. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics

    PubMed Central

    Deutsch, Eric W.; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L.

    2015-01-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include mass spectrometry to define protein sequence, protein:protein interactions, and protein post-translational modifications. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative mass spectrometry proteomics. It supports all major operating systems and instrument vendors via open data formats. Here we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of tandem mass spectrometry datasets, as well as some major upcoming features. PMID:25631240

  18. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics.

    PubMed

    Deutsch, Eric W; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L

    2015-08-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include MS to define protein sequence, protein:protein interactions, and protein PTMs. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative MS proteomics. It supports all major operating systems and instrument vendors via open data formats. Here, we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of MS/MS datasets, as well as some major upcoming features. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. An Alternative to the Search for Single Polymorphisms: Toward Molecular Personality Scales for the Five-Factor Model

    PubMed Central

    McCrae, Robert R.; Scally, Matthew; Terracciano, Antonio; Abecasis, Gonçalo R.; Costa, Paul T.

    2011-01-01

    There is growing evidence that personality traits are affected by many genes, all of which have very small effects. As an alternative to the largely unsuccessful search for individual polymorphisms associated with personality traits, we identified large sets of potentially related single nucleotide polymorphisms (SNPs) and summed them to form molecular personality scales (MPSs) with from 4 to 2,497 SNPs. Scales were derived from two-thirds of a large (N = 3,972) sample of individuals from Sardinia who completed the Revised NEO Personality Inventory and were assessed in a genome-wide association scan. When MPSs were correlated with the phenotype in the remaining third of the sample, very small but significant associations were found for four of the five personality factors when the longest scales were examined. These data suggest that MPSs for Neuroticism, Openness to Experience, Agreeableness, and Conscientiousness (but not Extraversion) contain genetic information that can be refined in future studies, and the procedures described here should be applicable to other quantitative traits. PMID:21114353

  20. Quantitative measurement of the growth rate of the PHA-producing photosynthetic bacterium Rhodocyclus gelatinous CBS-2 [PolyHydroxyAlkanoate]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfrum, E.J.; Weaver, P.F.

    Researchers at the National Renewable Energy Laboratory (NREL) have been investigating the use of model photosynthetic microorganisms that use sunlight and two-carbon organic substrates (e.g., ethanol, acetate) to produce biodegradable polyhydroxyalkanoate (PHA) copolymers as carbon storage compounds. Use of these biological PHAs in single-use plastics applications, followed by their post-consumer composting or anaerobic digestion, could impact petroleum consumption as well as the overloading of landfills. The large-scale production of PHA polymers by photosynthetic bacteria will require large-scale reactor systems utilizing either sunlight or artificial illumination. The first step in the scale-up process is to quantify the microbial growth rates and the PHA production rates as a function of reaction conditions such as nutrient concentration, temperature, and light quality and intensity.

  1. Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis

    NASA Astrophysics Data System (ADS)

    Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi

    2017-03-01

    Digital histopathology images with more than 1 gigapixel are drawing more and more attention in the clinical, biomedical research, and computer vision fields. Among the multiple observable features spanning multiple scales in the pathology images, nuclear morphology is one of the central criteria for diagnosis and grading. As a result, it is also the most studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report a human-verified segmentation with thousands of nuclei, whereas a single whole-slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large-scale image synthesis. This could facilitate a more quantitatively validated study for the current and future histopathology image analysis field.
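
    One way to make such a synthesis-based evaluation concrete is to score a candidate nucleus segmentation against the synthetic ground-truth mask with standard overlap metrics; the sketch below assumes binary masks and is an illustration, not the authors' pipeline:

      # Sketch: scoring a nucleus segmentation against a synthetic ground-truth
      # mask with Dice and Jaccard overlap. Masks are binary and of equal shape;
      # the arrays below are synthetic stand-ins, not real pathology data.
      import numpy as np

      def dice_jaccard(pred: np.ndarray, truth: np.ndarray) -> tuple:
          pred, truth = pred.astype(bool), truth.astype(bool)
          inter = np.logical_and(pred, truth).sum()
          union = np.logical_or(pred, truth).sum()
          dice = 2.0 * inter / (pred.sum() + truth.sum())
          return dice, inter / union

      truth = np.zeros((64, 64), dtype=bool); truth[20:40, 20:40] = True
      pred = np.zeros((64, 64), dtype=bool); pred[22:42, 18:38] = True
      print(dice_jaccard(pred, truth))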

  2. The topology of large-scale structure. I - Topology and the random phase hypothesis. [galactic formation models

    NASA Technical Reports Server (NTRS)

    Weinberg, David H.; Gott, J. Richard, III; Melott, Adrian L.

    1987-01-01

    Many models for the formation of galaxies and large-scale structure assume a spectrum of random phase (Gaussian), small-amplitude density fluctuations as initial conditions. In such scenarios, the topology of the galaxy distribution on large scales relates directly to the topology of the initial density fluctuations. Here a quantitative measure of topology - the genus of contours in a smoothed density distribution - is described and applied to numerical simulations of galaxy clustering, to a variety of three-dimensional toy models, and to a volume-limited sample of the CfA redshift survey. For random phase distributions the genus of density contours exhibits a universal dependence on threshold density. The clustering simulations show that a smoothing length of 2-3 times the mass correlation length is sufficient to recover the topology of the initial fluctuations from the evolved galaxy distribution. Cold dark matter and white noise models retain a random phase topology at shorter smoothing lengths, but massive neutrino models develop a cellular topology.
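
    The "universal dependence on threshold density" for random-phase (Gaussian) fields mentioned above is commonly written as the genus curve below, where ν is the density threshold in units of the standard deviation of the smoothed field and the amplitude A is set by the power spectrum (stated here from general knowledge of the method, not quoted from the paper):

      g(\nu) = A\,(1 - \nu^{2})\,e^{-\nu^{2}/2}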

  3. Proteomics wants cRacker: automated standardized data analysis of LC-MS derived proteomic data.

    PubMed

    Zauber, Henrik; Schulze, Waltraud X

    2012-11-02

    The large-scale analysis of thousands of proteins under various experimental conditions or in mutant lines has gained more and more importance in hypothesis-driven scientific research and systems biology in the past years. Quantitative analysis by large-scale proteomics using modern mass spectrometry usually results in long lists of peptide ion intensities. The main interest for most researchers, however, is to draw conclusions on the protein level. Postprocessing and combining peptide intensities of a proteomic data set requires expert knowledge, and the often repetitive and standardized manual calculations can be time-consuming. The analysis of complex samples can result in very large data sets (lists with several thousand to 100,000 entries for different peptides) that cannot easily be analyzed using standard spreadsheet programs. To improve the speed and consistency of the data analysis of LC-MS derived proteomic data, we developed cRacker. cRacker is an R-based program for automated downstream proteomic data analysis including data normalization strategies for metabolic labeling and label-free quantitation. In addition, cRacker includes basic statistical analysis, such as clustering of data, or ANOVA and t tests for comparison between treatments. Results are presented in editable graphic formats and in list files.
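
    A minimal sketch of the downstream steps such a pipeline automates: per-sample normalization of peptide ion intensities, aggregation to protein level, and a t-test between treatments. The column names and values are hypothetical, and this is not the cRacker code itself:

      # Sketch of label-free postprocessing of the kind automated by cRacker:
      # per-sample median normalization, peptide-to-protein aggregation, and a
      # t-test between treatments. Data and column layout are hypothetical.
      import pandas as pd
      from scipy import stats

      peptides = pd.DataFrame({
          "protein": ["P1", "P1", "P2", "P2", "P2"],
          "ctrl_1": [1.2e6, 0.8e6, 3.0e5, 2.5e5, 2.8e5],
          "ctrl_2": [1.1e6, 0.9e6, 3.2e5, 2.4e5, 2.6e5],
          "trt_1": [2.3e6, 1.9e6, 1.6e5, 1.3e5, 1.5e5],
          "trt_2": [2.1e6, 2.0e6, 1.4e5, 1.2e5, 1.6e5],
      })

      cols = ["ctrl_1", "ctrl_2", "trt_1", "trt_2"]
      norm = peptides[cols] / peptides[cols].median()          # per-sample scaling
      proteins = pd.concat([peptides["protein"], norm], axis=1).groupby("protein").sum()

      t, p = stats.ttest_ind(proteins[["trt_1", "trt_2"]],
                             proteins[["ctrl_1", "ctrl_2"]], axis=1)
      print(pd.DataFrame({"t": t, "p": p}, index=proteins.index))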

  4. Advances in the Quantitative Characterization of the Shape of Ash-Sized Pyroclast Populations: Fractal Analyses Coupled to Micro- and Nano-Computed Tomography Techniques

    NASA Astrophysics Data System (ADS)

    Rausch, J.; Vonlanthen, P.; Grobety, B. H.

    2014-12-01

    The quantification of shape parameters in pyroclasts is fundamental to infer the dominant type of magma fragmentation (magmatic vs. phreatomagmatic), as well as the behavior of volcanic plumes and clouds in the atmosphere. In a case study aiming at reconstructing the fragmentation mechanisms triggering maar eruptions in two geologically and compositionally distinctive volcanic fields (West and East Eifel, Germany), the shapes of a large number of ash particle contours obtained from SEM images were analyzed by a dilation-based fractal method. Volcanic particle contours are pseudo-fractals showing mostly two distinct slopes in Richardson plots related to the fractal dimensions D1 (small-scale "textural" dimension) and D2 (large-scale "morphological" dimension). The validity of the data obtained from 2D sections was tested by analysing SEM micro-CT slices of one particle cut in different orientations and positions. Results for West Eifel maar particles yield large D1 values (> 1.023), resembling typical values of magmatic particles, which are characterized by a complex shape, especially at small scales. In contrast, the D1 values of ash particles from one East Eifel maar deposit are much smaller, coinciding with the fractal dimensions obtained from phreatomagmatic end-member particles. These quantitative morphological analyses suggest that the studied maar eruptions were triggered by two different fragmentation processes: phreatomagmatic in the East Eifel and magmatic in the West Eifel. The application of fractal analysis to quantitatively characterize the shape of pyroclasts and the linking of fractal dimensions to specific fragmentation processes has turned out to be a very promising tool for studying the fragmentation history of any volcanic eruption. The next step is to extend morphological analysis of volcanic particles to three dimensions. SEM micro-CT, already applied in this study, offers the required resolution, but is not suitable for the analysis of a large number of particles. Newly released nano CT-scanners, however, allow the simultaneous analysis of a statistically relevant number of particles (in the hundreds range). Preliminary results of a first trial will be presented.
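
    A minimal sketch of a dilation-based (Minkowski) estimate of an outline's fractal dimension: the contour is dilated with increasing radii and the slope of the resulting Richardson-type plot gives D. The synthetic outline below stands in for a segmented ash-particle contour:

      # Sketch of a dilation-based (Minkowski) fractal dimension estimate for a
      # particle outline. The binary outline here is synthetic; a real analysis
      # would use segmented ash-particle contours.
      import numpy as np
      from scipy import ndimage

      # synthetic "outline": a wavy circle drawn into a binary image
      yy, xx = np.mgrid[0:256, 0:256]
      theta = np.arctan2(yy - 128, xx - 128)
      r = np.hypot(yy - 128, xx - 128)
      outline = np.abs(r - (80 + 6 * np.sin(9 * theta))) < 1.0

      radii = np.array([1, 2, 4, 8, 16])
      areas = []
      for rad in radii:
          dilated = ndimage.binary_dilation(outline, iterations=int(rad))
          areas.append(dilated.sum())

      # Minkowski relation for a curve in 2D: A(rad) ~ rad^(2 - D)  =>  D = 2 - slope
      slope = np.polyfit(np.log(radii), np.log(areas), 1)[0]
      print("estimated fractal dimension ~", 2 - slope)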

  5. Deformation and Failure Mechanisms of Shape Memory Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daly, Samantha Hayes

    2015-04-15

    The goal of this research was to understand the fundamental mechanics that drive the deformation and failure of shape memory alloys (SMAs). SMAs are difficult materials to characterize because of the complex phase transformations that give rise to their unique properties, including shape memory and superelasticity. These phase transformations occur across multiple length scales (one example being the martensite-austenite twinning that underlies macroscopic strain localization) and result in a large hysteresis. In order to optimize the use of this hysteretic behavior in energy storage and damping applications, we must first have a quantitative understanding of this transformation behavior. Prior results on shape memory alloys have been largely qualitative (i.e., mapping phase transformations through cracked oxide coatings or surface morphology). The PI developed and utilized new approaches to provide a quantitative, full-field characterization of phase transformation, conducting a comprehensive suite of experiments across multiple length scales and tying these results to theoretical and computational analysis. The research funded by this award utilized new combinations of scanning electron microscopy, diffraction, digital image correlation, and custom testing equipment and procedures to study phase transformation processes at a wide range of length scales, with a focus at small length scales with spatial resolution on the order of 1 nanometer. These experiments probe the basic connections between length scales during phase transformation. In addition to the insights gained on the fundamental mechanisms driving transformations in shape memory alloys, the unique experimental methodologies developed under this award are applicable to a wide range of solid-to-solid phase transformations and other strain localization mechanisms.

  6. Investigation on the Size Effect in Large-Scale Beta-Processed Ti-17 Disks Based on Quantitative Metallography

    NASA Astrophysics Data System (ADS)

    Zhang, Saifei; Zeng, Weidong; Gao, Xiongxiong; Zhao, Xingdong; Li, Siqing

    2017-10-01

    The present study investigates the mechanical properties of large-scale beta-processed Ti-17 forgings because of the increasing interest in the beta thermal-mechanical processing method for fabricating compressor disks or blisks in aero-engines due to its advantage in damage tolerance performance. Three Ti-17 disks with different weights of 57, 250 and 400 kg were prepared by beta processing techniques for comparative study. The results reveal a significant 'size effect' in beta-processed Ti-17 disks, i.e., dependences of high cycle fatigue, tensile properties and fracture toughness of beta-processed Ti-17 disks on disk size (or weight). With increasing disk weight from 57 to 400 kg, the fatigue limit (fatigue strength at 10^7 cycles, R = -1) was reduced from 583 to 495 MPa, tensile yield strength dropped from 1073 to 1030 MPa, while fracture toughness (K_IC) rose from 70.9 to 95.5 MPa·m^1/2. Quantitative metallography analysis shows that the 'size effect' of mechanical properties can be attributed to evident differences between microstructures of the three disk forgings. With increasing disk size, nearly all microstructural components in the basket-weave microstructure, including prior β grain, α layers at β grain boundaries (GB-α) and α lamellas at the interior of the grains, get coarsened to different degrees. Further, the microstructural difference between the beta-processed disks is proved to be the consequence of longer pre-forging soaking time and lower post-forging cooling rate for large disks than small ones. Finally, suggestions are made from the perspective of microstructural control on how to improve mechanical properties of large-scale beta-processed Ti-17 forgings.

  7. Void probability as a function of the void's shape and scale-invariant models

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1991-01-01

    The dependence of counts in cells on the shape of the cell for the large-scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is higher for some elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.
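
    For reference, the void probability of the standard negative binomial count-in-cells model that the extension above builds on can be written as below, where N̄ is the mean count in a cell and ξ̄ is the volume-averaged two-point correlation function (stated from general knowledge of the model, not quoted from the paper):

      P_{0} = \left(1 + \bar{N}\,\bar{\xi}\right)^{-1/\bar{\xi}}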

  8. Size and structure of Chlorella zofingiensis /FeCl 3 flocs in a shear flow: Algae Floc Structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wyatt, Nicholas B.; O'Hern, Timothy J.; Shelden, Bion

    Flocculation is a promising method to overcome the economic hurdle to separation of algae from its growth medium in large-scale operations. However, understanding the floc structure and the effects of shear on it is crucial to the large-scale implementation of this technique. The floc structure is important because it determines, in large part, the density and settling behavior of the algae. Freshwater algae floc size distributions and fractal dimensions are presented as a function of applied shear rate in a Couette cell using ferric chloride as a flocculant. Comparisons are made with measurements for a polystyrene microparticle model system taken here, as well as with reported literature results. The algae floc size distributions are found to be self-preserving with respect to shear rate, consistent with literature data for polystyrene. Moreover, three fractal dimensions are calculated which quantitatively characterize the complexity of the floc structure. Low shear rates result in large, relatively densely packed flocs which elongate and fracture as the shear rate is increased. Our results provide crucial information for economically implementing flocculation as a large-scale algae harvesting strategy.

  9. Segmentation and Quantitative Analysis of Epithelial Tissues.

    PubMed

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis have largely prevented tissue-scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  10. Analysis of the ability of large-scale reanalysis data to define Siberian fire danger in preparation for future fire prediction

    NASA Astrophysics Data System (ADS)

    Soja, Amber; Westberg, David; Stackhouse, Paul, Jr.; McRae, Douglas; Jin, Ji-Zhong; Sukhinin, Anatoly

    2010-05-01

    Fire is the dominant disturbance that precipitates ecosystem change in boreal regions, and fire is largely under the control of weather and climate. Fire frequency, fire severity, area burned and fire season length are predicted to increase in boreal regions under current climate change scenarios. Therefore, changes in fire regimes have the potential to compel ecological change, moving ecosystems more quickly towards equilibrium with a new climate. The ultimate goal of this research is to assess the viability of large-scale (1°) data for defining fire weather danger and fire regimes, so that large-scale fire weather data, like those available from current Intergovernmental Panel on Climate Change (IPCC) climate change scenarios, can be used with confidence to predict future fire regimes. In this talk, we intend to: (1) evaluate Fire Weather Indices (FWI) derived using reanalysis and interpolated station data; (2) discuss the advantages and disadvantages of using these distinct data sources; and (3) highlight established relationships between large-scale fire weather data, area burned, active fires and ecosystems burned. Specifically, the Canadian Forestry Service (CFS) Fire Weather Index (FWI) will be derived using: (1) NASA Goddard Earth Observing System version 4 (GEOS-4) large-scale reanalysis and NASA Global Precipitation Climatology Project (GPCP) data; and (2) National Climatic Data Center (NCDC) surface station-interpolated data. The FWI requires local-noon surface-level air temperature, relative humidity, wind speed, and daily (noon-to-noon) rainfall. GEOS-4 reanalysis and NCDC station-interpolated fire weather indices are generally consistent spatially, temporally and quantitatively. Additionally, increased fire activity coincides with increased FWI ratings in both data products. Relationships have been established between the large-scale FWI and area burned, fire frequency, and ecosystem types, and these can be used to estimate historic and future fire regimes.

  11. An Exploratory Analysis of the Longitudinal Impact of Principal Change on Elementary School Achievement

    ERIC Educational Resources Information Center

    Hochbein, Craig; Cunningham, Brittany C.

    2013-01-01

    Recent reform initiatives, such as the Title I School Improvement Grants and Race to the Top, recommended a principal change to jump-start school turnaround. Yet, few educational researchers have examined principal change as way to improve schools in a state of systematic reform; furthermore, no large-scale quantitative study has determined the…

  12. Development of multitissue microfluidic dynamic array for assessing changes in gene expression associated with channel catfish appetite, growth, metabolism, and intestinal health

    USDA-ARS?s Scientific Manuscript database

    Large-scale gene expression methods allow for high-throughput analysis of physiological pathways at a fraction of the cost of individual gene expression analysis. Systems such as the Fluidigm quantitative PCR array described here can provide powerful assessments of the effects of diet, environme...

  13. Acquiring a Variable Structure: An Interlanguage Analysis of Second Language Mood Use in Spanish

    ERIC Educational Resources Information Center

    Gudmestad, Aarnes

    2012-01-01

    This investigation connects issues in second language (L2) acquisition to topics in quantitative sociolinguistics by exploring the relationship between native-speaker (NS) and L2 variation. It is the first large-scale analysis of L2 mood use (the subjunctive-indicative contrast) in Spanish. It applies variationist findings on the range of…

  14. Considerations for interpreting probabilistic estimates of uncertainty of forest carbon

    Treesearch

    James E. Smith; Linda S. Heath

    2000-01-01

    Quantitative estimates of carbon inventories are needed as part of nationwide attempts to reduce net release of greenhouse gases and the associated climate forcing. Naturally, an appreciable amount of uncertainty is inherent in such large-scale assessments, especially since both science and policy issues are still evolving. Decision makers need an idea of the...

  15. The Relationship between English Language Learners' Language Proficiency and Standardized Test Scores

    ERIC Educational Resources Information Center

    Thakkar, Darshan

    2013-01-01

    It is generally theorized that English Language Learner (ELL) students do not succeed on state standardized tests because ELL students lack the cognitive academic language skills necessary to function on the large scale content assessments. The purpose of this dissertation was to test that theory. Through the use of quantitative methodology, ELL…

  16. Implementation of School Choice Policy: Interpretation and Response by Parents of Students with Special Educational Needs.

    ERIC Educational Resources Information Center

    Bagley, Carl; Woods, Philip A.; Woods, Glenys

    2001-01-01

    Provides empirically based insights into preferences, perceptions, and responses of parents of students with special education needs to the 1990s restructured school system in England. Uses analyses of quantitative/qualitative data generated by a large-scale research study on school choice. Reveals depth and range of problems encountered by these…

  17. Evaluating Comprehensive School Reform Models at Scale: Focus on Implementation

    ERIC Educational Resources Information Center

    Vernez, Georges; Karam, Rita; Mariano, Louis T.; DeMartini, Christine

    2006-01-01

    This study was designed to fill the "implementation measurement" gap. A methodology to quantitatively measure the level of Comprehensive School Reform (CSR) implementation that can be used across a variety of CSR models was developed, and then applied to measure actual implementation of four different CSR models in a large number of schools. The…

  18. Researching Returns Emanating from Participation in Adult Education Courses: A Quantitative Approach

    ERIC Educational Resources Information Center

    Panitsides, Eugenia

    2013-01-01

    Throughout contemporary literature, participants in adult education courses have been reported to acquire knowledge and skills, develop understanding and enhance self-confidence, parameters that induce changes in their personal lives, while enabling them to play a more active role in their family, community or work. In this vein, a large-scale,…

  19. Advancing effects analysis for integrated, large-scale wildfire risk assessment

    Treesearch

    Matthew P. Thompson; David E. Calkin; Julie W. Gilbertson-Day; Alan A. Ager

    2011-01-01

    In this article, we describe the design and development of a quantitative, geospatial risk assessment tool intended to facilitate monitoring trends in wildfire risk over time and to provide information useful in prioritizing fuels treatments and mitigation measures. The research effort is designed to develop, from a strategic view, a first approximation of how both...

  20. Women in Engineering in Turkey--A Large Scale Quantitative and Qualitative Examination

    ERIC Educational Resources Information Center

    Smith, Alice E.; Dengiz, Berna

    2010-01-01

    The underrepresentation of women in engineering is well known and unresolved. However, Turkey has witnessed a shift in trend from virtually no female participation in engineering to across-the-board proportions that dominate other industrialised countries within the 76 years of the founding of the Turkish Republic. This paper describes the largest…

  1. Assessing Student Status and Progress in Science Reasoning and Quantitative Literacy at a Very Large Undergraduate Institution

    NASA Astrophysics Data System (ADS)

    Donahue, Megan; Kaplan, J.; Ebert-May, D.; Ording, G.; Melfi, V.; Gilliland, D.; Sikorski, A.; Johnson, N.

    2009-01-01

    The typical large liberal-arts, tier-one research university requires all of its graduates to achieve some minimal standard of quantitative literacy and scientific reasoning skills. But how do we know that what we are doing, as instructors and as a university, is working the way we think it should? At Michigan State University, a cross-disciplinary team of scientists, statisticians, and teacher-education experts has begun a large-scale investigation of student mastery of quantitative and scientific skills, beginning with an assessment of 3,000 freshmen before they start their university careers. We describe the process we used for developing and testing an instrument and for expanding faculty involvement and input on high-level goals. In this presentation we limit the discussion mainly to the scientific reasoning perspective, but we briefly mention some intriguing observations regarding quantitative literacy as well. This project represents the beginning of long-term, longitudinal tracking of the progress of students at our institution. We discuss preliminary results from our 2008 assessment of incoming freshmen at Michigan State, and where we plan to go from here. We acknowledge local support from the Quality Fund of the Office of the Provost at MSU. We also acknowledge the Center for Assessment at James Madison University and the NSF for their support at the very beginning of our work.

  2. A genome-wide linkage scan for quantitative trait loci underlying obesity related phenotypes in 434 Caucasian families.

    PubMed

    Zhao, Lan-Juan; Xiao, Peng; Liu, Yong-Jun; Xiong, Dong-Hai; Shen, Hui; Recker, Robert R; Deng, Hong-Wen

    2007-03-01

    To identify quantitative trait loci (QTLs) that contribute to obesity, we performed a large-scale whole genome linkage scan (WGS) involving 4,102 individuals from 434 Caucasian families. The most pronounced linkage evidence was found at the genomic region 20p11-12 for fat mass (LOD = 3.31) and percentage fat mass (PFM) (LOD = 2.92). We also identified several regions showing suggestive linkage signals (threshold LOD = 1.9) for obesity phenotypes, including 5q35, 8q13, 10p12, and 17q11.
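
    For readers outside statistical genetics, the LOD values quoted above follow the standard base-10 likelihood-ratio definition (stated generically here, not in the exact variance-components form used by the study); a LOD of 3.31 means the data are about 10^3.31, roughly 2,000 times, more likely under linkage at that locus than under no linkage:

        \mathrm{LOD} = \log_{10} \frac{L(\text{data} \mid \text{linkage at the locus})}{L(\text{data} \mid \text{no linkage})}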

  3. Species diversity of edaphic mites (Acari: Oribatida) and effects of topography, soil properties and litter gradients on their qualitative and quantitative composition in 64 km² of forest in Amazonia.

    PubMed

    de Moraes, Jamile; Franklin, Elizabeth; de Morais, José Wellington; de Souza, Jorge Luiz Pereira

    2011-09-01

    The small-scale spatial distribution of oribatid mites has been investigated in Amazonia. In addition, medium- and large-scale studies are needed to establish the utility of these mites in detecting natural environmental variability, and to distinguish this variability from anthropogenic impacts. We expand the knowledge of oribatid mites in a wet upland forest reserve and investigate whether a standardized, integrated protocol is an efficient way to assess the effects of environmental variables on their qualitative and quantitative composition at a large spatial scale inside an ecological reserve in Central Amazonia, Brazil. Samples for Berlese-Tullgren extraction were taken in 72 plots of 250 × 6 m distributed over 64 km². In total, 3,182 adult individuals from 82 species and 79 morphospecies were recorded, expanding the number of species known in the reserve from 149 to 254. Galumna, Rostrozetes and Scheloribates were the most speciose genera, and 57 species were rare. Rostrozetes ovulum, Pergalumna passimpuctata and Archegozetes longisetosus were the most abundant species, and the first two were the most frequent. Species number and abundance were not correlated with clay content, slope, pH or litter quantity. However, Principal Coordinate Analysis indicated that as clay content, litter quantity and pH changed, the qualitative and quantitative composition of the oribatid mite assemblage also changed. The standardized protocol effectively captured the diversity, as we collected one of the largest records of oribatid mite species for Amazonia. Moreover, biological and ecological data were integrated to capture the effects of the environmental variables accounting for their diversity and abundance.

  4. Finite-Size Scaling of a First-Order Dynamical Phase Transition: Adaptive Population Dynamics and an Effective Model

    NASA Astrophysics Data System (ADS)

    Nemoto, Takahiro; Jack, Robert L.; Lecomte, Vivien

    2017-03-01

    We analyze large deviations of the time-averaged activity in the one-dimensional Fredrickson-Andersen model, both numerically and analytically. The model exhibits a dynamical phase transition, which appears as a singularity in the large deviation function. We analyze the finite-size scaling of this phase transition numerically, by generalizing an existing cloning algorithm to include a multicanonical feedback control: this significantly improves the computational efficiency. Motivated by these numerical results, we formulate an effective theory for the model in the vicinity of the phase transition, which accounts quantitatively for the observed behavior. We discuss potential applications of the numerical method and the effective theory in a range of more general contexts.

  5. Identifying Coherent Structures in a 3-Stream Supersonic Jet Flow using Time-Resolved Schlieren Imaging

    NASA Astrophysics Data System (ADS)

    Tenney, Andrew; Coleman, Thomas; Berry, Matthew; Magstadt, Andy; Gogineni, Sivaram; Kiel, Barry

    2015-11-01

    Shock cells and large-scale structures present in a three-stream non-axisymmetric jet are studied both qualitatively and quantitatively. Large Eddy Simulation is utilized first to gain an understanding of the underlying physics of the flow and to direct the focus of the physical experiment. The flow in the experiment is visualized using long-exposure Schlieren photography, with time-resolved Schlieren photography also a possibility. Velocity derivative diagnostics calculated from the grey-scale Schlieren images are analyzed using continuous wavelet transforms. Pressure signals are also captured in the near field of the jet to correlate with the velocity derivative diagnostics and assist in unraveling this complex flow. We acknowledge the support of AFRL through an SBIR grant.

  6. A real-time interferometer technique for compressible flow research

    NASA Technical Reports Server (NTRS)

    Bachalo, W. D.; Houser, M. J.

    1984-01-01

    Strengths and shortcomings in the application of interferometric techniques to transonic flow fields are examined and an improved method is elaborated. Such applications have demonstrated the value of interferometry in obtaining data for compressible flow research. With holographic techniques, interferometry may be applied in large-scale facilities without the use of expensive optics or elaborate vibration isolation equipment. Results obtained using holographic interferometry and other methods demonstrate that reliable qualitative and quantitative data can be acquired. Nevertheless, the conventional method can be difficult to set up and apply, and it cannot produce real-time data. A new interferometry technique is investigated that promises to be easier to apply and can provide real-time information. This single-beam technique has the necessary insensitivity to vibration for large-scale wind tunnel operations. Capabilities of the method and preliminary tests on some laboratory-scale flow fields are described.

  7. Control factors and scale analysis of annual river water, sediments and carbon transport in China.

    PubMed

    Song, Chunlin; Wang, Genxu; Sun, Xiangyang; Chang, Ruiying; Mao, Tianxu

    2016-05-11

    In the context of dramatic human disturbance of river systems, the processes that control the transport of water, sediment, and carbon from river basins to coastal seas are not completely understood. Here we performed a quantitative synthesis for 121 sites across China to identify the factors controlling annual river exports (Rc: runoff coefficient; TSSC: total suspended sediment concentration; TSSL: total suspended sediment loads; TOCL: total organic carbon loads) at different spatial scales. The results indicated that human activities such as dam construction and vegetation restoration might have a greater influence than climate on the transport of river sediment and carbon, although climate was a major driver of Rc. Multiple spatial scale analyses indicated that Rc increased from the small to the medium scale by 20% and then decreased at the sizeable scale by 20%. TSSC decreased from the small to the sizeable scale but increased from the sizeable to the large scale; however, TSSL significantly decreased from small (768 g·m^-2·a^-1) to medium spatial scale basins (258 g·m^-2·a^-1), and TOCL decreased from the medium to the large scale. Our results will improve the understanding of water, sediment and carbon transport processes and contribute to better water and land resource management strategies at different spatial scales.
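
    For readers unfamiliar with the exported quantities, a minimal sketch of how such basin-scale indicators are typically defined (generic definitions and made-up numbers, not the authors' data pipeline): the runoff coefficient is annual runoff depth over annual precipitation, and the area-specific loads (TSSL, TOCL) are annual mass fluxes divided by basin area.

        def runoff_coefficient(runoff_mm, precipitation_mm):
            """Rc: annual runoff depth divided by annual precipitation (dimensionless)."""
            return runoff_mm / precipitation_mm

        def specific_load(annual_mass_g, basin_area_m2):
            """Area-specific annual load in g·m^-2·a^-1, as used for TSSL and TOCL."""
            return annual_mass_g / basin_area_m2

        # Illustrative numbers only
        print(runoff_coefficient(450.0, 900.0))    # 0.5
        print(specific_load(2.58e14, 1.0e12))      # 258.0 g·m^-2·a^-1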

  8. Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.

    PubMed

    He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan

    2009-07-01

    Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the local binary patterns (LBPs) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. Support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrates its accuracy and efficiency.
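
    A hedged sketch of the distance used to compare the two sides of the face (one common definition of the resistor-average distance, applied here to toy histograms rather than real LBP features):

        import numpy as np
        from scipy.stats import entropy

        def resistor_average_distance(p, q, eps=1e-12):
            """Resistor-average of the two KL divergences between histograms p and q."""
            p = np.asarray(p, float) + eps
            q = np.asarray(q, float) + eps
            p, q = p / p.sum(), q / q.sum()
            d_pq = entropy(p, q)   # KL(p || q)
            d_qp = entropy(q, p)   # KL(q || p)
            if d_pq == 0.0 or d_qp == 0.0:
                return 0.0         # identical histograms -> zero distance
            return 1.0 / (1.0 / d_pq + 1.0 / d_qp)

        # Toy LBP-like histograms from the two sides of the face
        left = [10, 30, 25, 35]
        right = [12, 28, 30, 30]
        print(resistor_average_distance(left, right))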

  9. Algorithm and Application of Gcp-Independent Block Adjustment for Super Large-Scale Domestic High Resolution Optical Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Sun, Y. S.; Zhang, L.; Xu, B.; Zhang, Y.

    2018-04-01

    Accurate positioning of optical satellite imagery without ground control is a precondition for remote sensing applications and small/medium-scale mapping over large areas abroad or with large volumes of imagery. In this paper, considering the geometric characteristics of optical satellite images, and building on a widely used optimization method for constrained problems, the Alternating Direction Method of Multipliers (ADMM), together with RFM least-squares block adjustment, we propose a GCP-independent block adjustment method for large-scale domestic high resolution optical satellite imagery - GISIBA (GCP-Independent Satellite Imagery Block Adjustment) - which is easy to parallelize and highly efficient. In this method, virtual "average" control points are constructed to solve the rank-defect problem, and qualitative and quantitative analyses of block adjustment without ground control are carried out. The test results show that the horizontal and vertical accuracies of multi-covered and multi-temporal satellite images are better than 10 m and 6 m, respectively. Meanwhile, the mosaicking problem between adjacent areas in large-area DOM production can be solved if public geographic information data are introduced as horizontal and vertical constraints in the block adjustment. Finally, experiments using GF-1 and ZY-3 satellite images over several typical test areas demonstrate the reliability, accuracy, and performance of the developed procedure.
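
    The satellite block adjustment itself is far more involved, but as a hedged illustration of the ADMM splitting idea the method leans on (a textbook lasso toy problem, not the paper's RFM adjustment), the characteristic three-step iteration looks like this:

        import numpy as np

        def soft_threshold(v, k):
            return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

        def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=200):
            """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by splitting x = z."""
            n = A.shape[1]
            x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
            AtA = A.T @ A + rho * np.eye(n)
            Atb = A.T @ b
            for _ in range(n_iter):
                x = np.linalg.solve(AtA, Atb + rho * (z - u))   # x-update: regularized least squares
                z = soft_threshold(x + u, lam / rho)            # z-update: proximal (shrinkage) step
                u = u + x - z                                   # dual update enforcing x = z
            return z

        rng = np.random.default_rng(0)
        A = rng.normal(size=(50, 10))
        x_true = np.zeros(10); x_true[:3] = [2.0, -1.0, 0.5]
        b = A @ x_true + 0.01 * rng.normal(size=50)
        print(admm_lasso(A, b))    # approximately recovers the three nonzero coefficients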

  10. Determination of functional collective motions in a protein at atomic resolution using coherent neutron scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Liang; Jain, Nitin; Cheng, Xiaolin

    Protein function often depends on global, collective internal motions. However, the simultaneous quantitative experimental determination of the forms, amplitudes, and time scales of these motions has remained elusive. We demonstrate that a complete description of these large-scale dynamic modes can be obtained using coherent neutron-scattering experiments on perdeuterated samples. With this approach, a microscopic relationship between the structure, dynamics, and function in a protein, cytochrome P450cam, is established. The approach developed here should be of general applicability to protein systems.

  11. Determination of functional collective motions in a protein at atomic resolution using coherent neutron scattering

    DOE PAGES

    Hong, Liang; Jain, Nitin; Cheng, Xiaolin; ...

    2016-10-14

    Protein function often depends on global, collective internal motions. However, the simultaneous quantitative experimental determination of the forms, amplitudes, and time scales of these motions has remained elusive. We demonstrate that a complete description of these large-scale dynamic modes can be obtained using coherent neutron-scattering experiments on perdeuterated samples. With this approach, a microscopic relationship between the structure, dynamics, and function in a protein, cytochrome P450cam, is established. The approach developed here should be of general applicability to protein systems.

  12. Fast inertial particle manipulation in oscillating flows

    NASA Astrophysics Data System (ADS)

    Thameem, Raqeeb; Rallabandi, Bhargav; Hilgenfeldt, Sascha

    2017-05-01

    It is demonstrated that micron-sized particles suspended in fluid near oscillating interfaces experience strong inertial displacements above and beyond the fluid streaming. Experiments with oscillating bubbles show rectified particle lift over extraordinarily short (millisecond) times. A quantitative model on both the oscillatory and the steady time scales describes the particle displacement relative to the fluid motion. The formalism yields analytical predictions confirming the observed scaling behavior with particle size and experimental control parameters. It applies to a large class of oscillatory flows with applications from particle trapping to size sorting.

  13. A simple model to quantitatively account for periodic outbreaks of the measles in the Dutch Bible Belt

    NASA Astrophysics Data System (ADS)

    Bier, Martin; Brak, Bastiaan

    2015-04-01

    In the Netherlands there has been nationwide vaccination against the measles since 1976. However, in small clustered communities of orthodox Protestants there is widespread refusal of the vaccine. After 1976, three large outbreaks with about 3000 reported cases of the measles have occurred among these orthodox Protestants. The outbreaks appear to occur about every twelve years. We show how a simple Kermack-McKendrick-like model can quantitatively account for the periodic outbreaks. Approximate analytic formulae connecting the period, size, and outbreak duration are derived. With an enhanced model we take the latency period into account. We also expand the model to follow how different age groups are affected. Like other researchers using other methods, we conclude that large-scale underreporting of the disease must occur.
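
    A minimal Kermack-McKendrick-style sketch (generic SIR equations with illustrative parameters, not the authors' calibrated model with latency and age structure) shows the basic machinery: an outbreak depletes the susceptible pool, unvaccinated births slowly refill it, and in this purely deterministic form the recurrences damp toward an endemic state.

        import numpy as np
        from scipy.integrate import solve_ivp

        def sir(t, y, beta, gamma, births):
            """Basic SIR with a constant inflow of susceptibles (unvaccinated births)."""
            s, i, r = y
            return [births - beta * s * i,
                    beta * s * i - gamma * i,
                    gamma * i]

        beta, gamma, births = 2e-5, 1.0 / 14.0, 1.0   # per-day units, illustrative only
        y0 = [6000.0, 1.0, 0.0]                       # small under-vaccinated community
        sol = solve_ivp(sir, (0.0, 365.0 * 40.0), y0, args=(beta, gamma, births),
                        dense_output=True, max_step=1.0)
        t = np.linspace(0.0, 365.0 * 40.0, 4000)
        s, i, r = sol.sol(t)
        print("peak number infected:", i.max())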

  14. Old wine in new bottles: decanting systemic family process research in the era of evidence-based practice.

    PubMed

    Rohrbaugh, Michael J

    2014-09-01

    Social cybernetic (systemic) ideas from the early Family Process era, though emanating from qualitative clinical observation, have underappreciated heuristic potential for guiding quantitative empirical research on problem maintenance and change. The old conceptual wines we have attempted to repackage in new, science-friendly bottles include ironic processes (when "solutions" maintain problems), symptom-system fit (when problems stabilize relationships), and communal coping (when we-ness helps people change). Both self-report and observational quantitative methods have been useful in tracking these phenomena, and together the three constructs inform a team-based family consultation approach to working with difficult health and behavior problems. In addition, a large-scale, quantitatively focused effectiveness trial of family therapy for adolescent drug abuse highlights the importance of treatment fidelity and qualitative approaches to examining it. In this sense, echoing the history of family therapy research, our experience with juxtaposing quantitative and qualitative methods has gone full circle-from qualitative to quantitative observation and back again. © 2014 FPI, Inc.

  15. Old Wine in New Bottles: Decanting Systemic Family Process Research in the Era of Evidence-Based Practice†

    PubMed Central

    Rohrbaugh, Michael J.

    2015-01-01

    Social cybernetic (systemic) ideas from the early Family Process era, though emanating from qualitative clinical observation, have underappreciated heuristic potential for guiding quantitative empirical research on problem maintenance and change. The old conceptual wines we have attempted to repackage in new, science-friendly bottles include ironic processes (when “solutions” maintain problems), symptom-system fit (when problems stabilize relationships), and communal coping (when we-ness helps people change). Both self-report and observational quantitative methods have been useful in tracking these phenomena, and together the three constructs inform a team-based family consultation (FAMCON) approach to working with difficult health and behavior problems. In addition, a large-scale, quantitatively focused effectiveness trial of family therapy for adolescent drug abuse highlights the importance of treatment fidelity and qualitative approaches to examining it. In this sense, echoing the history of family therapy research, our experience with juxtaposing quantitative and qualitative methods has gone full circle – from qualitative to quantitative observation and back again. PMID:24905101

  16. A Method for Label-Free, Differential Top-Down Proteomics.

    PubMed

    Ntai, Ioanna; Toby, Timothy K; LeDuc, Richard D; Kelleher, Neil L

    2016-01-01

    Biomarker discovery in translational research has relied heavily on labeled and label-free quantitative bottom-up proteomics. Here, we describe a new approach to biomarker studies that utilizes high-throughput top-down proteomics and is the first to offer whole-protein characterization and relative quantitation within the same experiment. Using yeast as a model, we report procedures for a label-free approach to quantify the relative abundance of intact proteins ranging from 0 to 30 kDa in two different states. In this chapter, we describe the integrated methodology for the large-scale profiling and quantitation of the intact proteome by liquid chromatography-mass spectrometry (LC-MS) without the need for metabolic or chemical labeling. This recent advance for quantitative top-down proteomics is best implemented with a robust and highly controlled sample preparation workflow before data acquisition on a high-resolution mass spectrometer, and the application of a hierarchical linear statistical model to account for the multiple levels of variance contained in quantitative proteomic comparisons of samples for basic and clinical research.

  17. A comparison of working in small-scale and large-scale nursing homes: A systematic review of quantitative and qualitative evidence.

    PubMed

    Vermeerbergen, Lander; Van Hootegem, Geert; Benders, Jos

    2017-02-01

    Ongoing shortages of care workers, together with an ageing population, make it of utmost importance to increase the quality of working life in nursing homes. Since the 1970s, normalised and small-scale nursing homes have been increasingly introduced to provide care in a family and homelike environment, potentially providing a richer work life for care workers as well as improved living conditions for residents. 'Normalised' refers to the opportunities given to residents to live in a manner as close as possible to the everyday life of persons not needing care. The study purpose is to provide a synthesis and overview of empirical research comparing the quality of working life - together with related work and health outcomes - of professional care workers in normalised small-scale nursing homes as compared to conventional large-scale ones. A systematic review of qualitative and quantitative studies. A systematic literature search (April 2015) was performed using the electronic databases Pubmed, Embase, PsycInfo, CINAHL and Web of Science. References and citations were tracked to identify additional, relevant studies. We identified 825 studies in the selected databases. After checking the inclusion and exclusion criteria, nine studies were selected for review. Two additional studies were selected after reference and citation tracking. Three studies were excluded after requesting more information on the research setting. The findings from the individual studies suggest that levels of job control and job demands (all but "time pressure") are higher in normalised small-scale homes than in conventional large-scale nursing homes. Additionally, some studies suggested that social support and work motivation are higher, while risks of burnout and mental strain are lower, in normalised small-scale nursing homes. Other studies found no differences or even opposing findings. The studies reviewed showed that these inconclusive findings can be attributed to care workers in some normalised small-scale homes experiencing isolation and too high job demands in their work roles. This systematic review suggests that normalised small-scale homes are a good starting point for creating a higher quality of working life in the nursing home sector. Higher job control enables care workers to manage higher job demands in normalised small-scale homes. However, some jobs would benefit from interventions to address care workers' perceptions of too low social support and of too high job demands. More research is needed to examine strategies to enhance these working life issues in normalised small-scale settings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Representation matters: quantitative behavioral variation in wild worm strains

    NASA Astrophysics Data System (ADS)

    Brown, Andre

    Natural genetic variation in populations is the basis of genome-wide association studies, an approach that has been applied in large studies of humans to study the genetic architecture of complex traits including disease risk. Of course, the traits you choose to measure determine which associated genes you discover (or miss). In large-scale human studies, the measured traits are usually taken as a given during the association step because they are expensive to collect and standardize. Working with the nematode worm C. elegans, we do not have the same constraints. In this talk I will describe how large-scale imaging of worm behavior allows us to develop alternative representations of behavior that vary differently across wild populations. The alternative representations yield novel traits that can be used for genome-wide association studies and may reveal basic properties of the genotype-phenotype map that are obscured if only a small set of fixed traits are used.

  19. Wind power for the electric-utility industry: Policy incentives for fuel conservation

    NASA Astrophysics Data System (ADS)

    March, F.; Dlott, E. H.; Korn, D. H.; Madio, F. R.; McArthur, R. C.; Vachon, W. A.

    1982-06-01

    A systematic method for evaluating the economics of solar-electric/conservation technologies as fuel-savings investments for electric utilities in the presence of changing federal incentive policies is presented. The focus is on wind energy conversion systems (WECS) as the solar technology closest to near-term large scale implementation. Commercially available large WECS are described, along with computer models to calculate the economic impact of the inclusion of WECS as 10% of the base-load generating capacity on a grid. A guide to legal structures and relationships which impinge on large-scale WECS utilization is developed, together with a quantitative examination of the installation of 1000 MWe of WECS capacity by a utility in the northeast states. Engineering and financial analyses were performed, with results indicating government policy changes necessary to encourage the entrance of utilities into the field of windpower utilization.

  20. Large-scale structure non-Gaussianities with modal methods

    NASA Astrophysics Data System (ADS)

    Schmittfull, Marcel

    2016-10-01

    Relying on a separable modal expansion of the bispectrum, the implementation of a fast estimator for the full bispectrum of a 3d particle distribution is presented. The computational cost of accurate bispectrum estimation is negligible relative to simulation evolution, so the bispectrum can be used as a standard diagnostic whenever the power spectrum is evaluated. As an application, the time evolution of gravitational and primordial dark matter bispectra was measured in a large suite of N-body simulations. The bispectrum shape changes characteristically when the cosmic web becomes dominated by filaments and halos, therefore providing a quantitative probe of 3d structure formation. Our measured bispectra are determined by ~ 50 coefficients, which can be used as fitting formulae in the nonlinear regime and for non-Gaussian initial conditions. We also compare the measured bispectra with predictions from the Effective Field Theory of Large Scale Structures (EFTofLSS).

  1. An alternative to the search for single polymorphisms: toward molecular personality scales for the five-factor model.

    PubMed

    McCrae, Robert R; Scally, Matthew; Terracciano, Antonio; Abecasis, Gonçalo R; Costa, Paul T

    2010-12-01

    There is growing evidence that personality traits are affected by many genes, all of which have very small effects. As an alternative to the largely unsuccessful search for individual polymorphisms associated with personality traits, the authors identified large sets of potentially related single nucleotide polymorphisms (SNPs) and summed them to form molecular personality scales (MPSs) with from 4 to 2,497 SNPs. Scales were derived from two thirds of a large (N = 3,972) sample of individuals from Sardinia who completed the Revised NEO Personality Inventory (P. T. Costa, Jr., & R. R. McCrae, 1992) and were assessed in a genomewide association scan. When MPSs were correlated with the phenotype in the remaining one third of the sample, very small but significant associations were found for 4 of the 5 personality factors when the longest scales were examined. These data suggest that MPSs for Neuroticism, Openness to Experience, Agreeableness, and Conscientiousness (but not Extraversion) contain genetic information that can be refined in future studies, and the procedures described here should be applicable to other quantitative traits. PsycINFO Database Record (c) 2010 APA, all rights reserved.
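
    A hedged sketch of the scale-construction logic (synthetic data and an arbitrary selection threshold; the study's actual pipeline selects SNPs from a genome-wide association scan in two thirds of the sample): sum the trait-aligned allele counts of many small-effect SNPs into a single score, then correlate that score with the phenotype in the held-out third.

        import numpy as np

        rng = np.random.default_rng(1)
        n_people, n_snps = 3972, 2497
        genotypes = rng.binomial(2, 0.3, size=(n_people, n_snps)).astype(float)

        # Synthetic phenotype: many tiny SNP effects plus a large non-genetic component
        true_effects = rng.normal(0.0, 0.02, n_snps)
        phenotype = genotypes @ true_effects + rng.normal(0.0, 1.0, n_people)

        # "Discovery" two thirds: pick SNPs and the sign of their association
        split = 2 * n_people // 3
        train_g, train_p = genotypes[:split], phenotype[:split]
        test_g, test_p = genotypes[split:], phenotype[split:]
        r = np.array([np.corrcoef(train_g[:, j], train_p)[0, 1] for j in range(n_snps)])
        weights = np.sign(r) * (np.abs(r) > 0.02)     # +1 / -1 / 0 per SNP, loose threshold

        # Molecular personality scale = signed sum of allele counts, checked on held-out third
        mps = test_g @ weights
        print("held-out correlation:", np.corrcoef(mps, test_p)[0, 1])   # small but positive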

  2. Weighing trees with lasers: advances, challenges and opportunities

    PubMed Central

    Boni Vicari, M.; Burt, A.; Calders, K.; Lewis, S. L.; Raumonen, P.; Wilkes, P.

    2018-01-01

    Terrestrial laser scanning (TLS) is providing exciting new ways to quantify tree and forest structure, particularly above-ground biomass (AGB). We show how TLS can address some of the key uncertainties and limitations of current approaches to estimating AGB based on empirical allometric scaling equations (ASEs) that underpin all large-scale estimates of AGB. TLS provides extremely detailed non-destructive measurements of tree form independent of tree size and shape. We show examples of three-dimensional (3D) TLS measurements from various tropical and temperate forests and describe how the resulting TLS point clouds can be used to produce quantitative 3D models of branch and trunk size, shape and distribution. These models can drastically improve estimates of AGB, provide new, improved large-scale ASEs, and deliver insights into a range of fundamental tree properties related to structure. Large quantities of detailed measurements of individual 3D tree structure also have the potential to open new and exciting avenues of research in areas where difficulties of measurement have until now prevented statistical approaches to detecting and understanding underlying patterns of scaling, form and function. We discuss these opportunities and some of the challenges that remain to be overcome to enable wider adoption of TLS methods. PMID:29503726
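
    A minimal, hedged illustration of the contrast drawn above (generic formulas; the allometric coefficients follow the form of published pan-tropical models but are treated as illustrative here): an allometric scaling equation predicts above-ground biomass from a few field measurements, whereas the TLS route multiplies the reconstructed woody volume of a quantitative structure model by basic wood density.

        def agb_allometric(dbh_cm, height_m, wood_density_g_cm3, a=0.0673, b=0.976):
            """ASE-style estimate: AGB [kg] = a * (rho * D^2 * H)^b (coefficients illustrative)."""
            return a * (wood_density_g_cm3 * dbh_cm**2 * height_m) ** b

        def agb_tls(qsm_volume_m3, wood_density_kg_m3):
            """TLS route: AGB [kg] = reconstructed woody volume * basic wood density."""
            return qsm_volume_m3 * wood_density_kg_m3

        # Illustrative tree: 60 cm DBH, 35 m tall, wood density 0.6 g/cm^3 (600 kg/m^3)
        print(agb_allometric(60.0, 35.0, 0.6))   # ~3.9e3 kg from allometry
        print(agb_tls(5.2, 600.0))               # ~3.1e3 kg from a hypothetical QSM volume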

  3. A Unified Theory of Impact Crises and Mass Extinctions: Quantitative Tests

    NASA Technical Reports Server (NTRS)

    Rampino, Michael R.; Haggerty, Bruce M.; Pagano, Thomas C.

    1997-01-01

    Several quantitative tests of a general hypothesis linking impacts of large asteroids and comets with mass extinctions of life are possible based on astronomical data, impact dynamics, and geological information. The waiting times of large-body impacts on the Earth, derived from the flux of Earth-crossing asteroids and comets and the estimated sizes of impacts capable of causing large-scale environmental disasters, predict that impacts of objects greater than or equal to 5 km in diameter (greater than or equal to 10^7 Mt TNT equivalent) could be sufficient to explain the record of approximately 25 extinction pulses in the last 540 Myr, with the 5 recorded major mass extinctions related to impacts of the largest objects of greater than or equal to 10 km in diameter (greater than or equal to 10^8 Mt events). Smaller impacts (approximately 10^6 Mt), with significant regional environmental effects, could be responsible for the lesser boundaries in the geologic record.
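
    A back-of-envelope check using only the numbers quoted in the abstract (the flux model itself is the paper's): about 25 pulses in 540 Myr implies a mean interval near 22 Myr for the greater than or equal to 5 km impactors, about 108 Myr for the greater than or equal to 10 km impactors, and a simple Poisson count makes the bookkeeping explicit.

        import math

        span_myr = 540.0
        pulses = 25      # extinction pulses attributed to >= 5 km (>= 10^7 Mt) impacts
        majors = 5       # major mass extinctions attributed to >= 10 km (>= 10^8 Mt) impacts

        mean_interval_all = span_myr / pulses      # ~21.6 Myr
        mean_interval_major = span_myr / majors    # 108 Myr

        # Probability of exactly `majors` events in span_myr at the implied mean rate
        lam = span_myr / mean_interval_major
        p_exact = math.exp(-lam) * lam**majors / math.factorial(majors)
        print(mean_interval_all, mean_interval_major, round(p_exact, 3))   # ~21.6 108.0 ~0.175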

  4. Interdigital athlete's foot. The interaction of dermatophytes and resident bacteria.

    PubMed

    Leyden, J J; Kligman, A M

    1978-10-01

    Quantitative cultures in 140 cases of interdigital "athlete's foot" established the following clinical-microbiological correlations. In 39 cases of the mild, scaling, relatively asymptomatic variety, fungi were recovered in 84% of cases. As the disease progressed to maceration, hyperkeratosis, and increased symptoms, recovery of fungi fell to 55% in moderately symptomatic and to 36% in severe cases. Symptomatic cases had increasing numbers of resident aerobic organisms, particularly large-colony diphtheroids. Experimental manipulations of the interspace microflora in volunteers, monitored with quantitative cultures, demonstrated that the symptomatic, macerated, hyperkeratotic process results from an overgrowth of resident organisms if the stratum corneum barrier is damaged by preexisting fungi, while overgrowth of the same organisms in normal, fungus-free interspaces does not produce lesions. These experiments support the conclusion that athlete's foot represents a continuum from a relatively asymptomatic, scaling eruption produced by fungi to a symptomatic, macerated, hyperkeratotic variety that is caused by an overgrowth of bacteria.

  5. Advances in imaging and quantification of electrical properties at the nanoscale using Scanning Microwave Impedance Microscopy (sMIM)

    NASA Astrophysics Data System (ADS)

    Friedman, Stuart; Stanke, Fred; Yang, Yongliang; Amster, Oskar

    Scanning Microwave Impedance Microscopy (sMIM) is a mode for Atomic Force Microscopy (AFM) enabling imaging of unique contrast mechanisms and measurement of local permittivity and conductivity at the 10's of nm length scale. sMIM has been applied to a variety of systems including nanotubes, nanowires, 2D materials, photovoltaics and semiconductor devices. Early results were largely semi-quantitative. This talk will focus on techniques for extracting quantitative physical parameters such as permittivity, conductivity, doping concentrations and thin film properties from sMIM data. Particular attention will be paid to non-linear materials where sMIM has been used to acquire nano-scale capacitance-voltage curves. These curves can be used to identify the dopant type (n vs p) and doping level in doped semiconductors, both bulk samples and devices. Supported in part by DOE-SBIR DE-SC0009856.

  6. Quantum and classical ripples in graphene

    NASA Astrophysics Data System (ADS)

    Hašík, Juraj; Tosatti, Erio; Martoňák, Roman

    2018-04-01

    Thermal ripples of graphene are well understood at room temperature, but their quantum counterparts at low temperatures are in need of a realistic quantitative description. Here we present atomistic path-integral Monte Carlo simulations of freestanding graphene, which show upon cooling a striking classical-quantum evolution of height and angular fluctuations. The crossover takes place at ever-decreasing temperatures for ever-increasing wavelengths so that a completely quantum regime is never attained. Zero-temperature quantum graphene is flatter and smoother than classical graphene at large scales yet rougher at short scales. The angular fluctuation distribution of the normals can be quantitatively described by the coexistence of two Gaussians: one classical and strongly T-dependent, and one quantum, about 2° wide, of zero-point character. The quantum evolution of ripple-induced height and angular spread should be observable in electron diffraction in graphene and other two-dimensional materials, such as MoS2, bilayer graphene, boron nitride, etc.

  7. Using occupancy estimation to assess the effectiveness of a regional multiple-species conservation plan: bats in the Pacific Northwest

    Treesearch

    Theodore Weller

    2008-01-01

    Regional conservation plans are increasingly used to plan for and protect biodiversity at large spatial scales; however, the means of quantitatively evaluating their effectiveness are rarely specified. Multiple-species approaches, particularly those that employ site-occupancy estimation, have been proposed as robust and efficient alternatives for assessing the status of...
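
    A hedged sketch of why site-occupancy estimation matters here (textbook single-season logic with made-up numbers, not the specific models used for the conservation plan): with an imperfect per-visit detection probability p and K repeat surveys, an occupied site is detected at least once with probability 1 - (1 - p)^K, and ignoring that inflates apparent absences.

        def detection_at_least_once(p, k):
            """Probability an occupied site yields at least one detection in k surveys."""
            return 1.0 - (1.0 - p) ** k

        def corrected_occupancy(naive_occupancy, p, k):
            """Moment-style correction of naive occupancy for imperfect detection."""
            return naive_occupancy / detection_at_least_once(p, k)

        # Illustrative: bats detected at 40% of sites, per-visit detectability 0.3, 4 visits
        print(detection_at_least_once(0.3, 4))     # ~0.76
        print(corrected_occupancy(0.40, 0.3, 4))   # ~0.53 estimated true occupancy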

  8. Understanding Loan Aversion in Education: Evidence from High School Seniors, Community College Students, and Adults. CEPA Working Paper No. 16-15

    ERIC Educational Resources Information Center

    Boatman, Angela; Evans, Brent; Soliz, Adela

    2016-01-01

    Student loans are a crucial aspect of financing a college education for millions of Americans, yet we have surprisingly little empirical evidence concerning individuals' unwillingness to borrow money for educational purposes. This study provides the first large-scale quantitative evidence of levels of loan aversion in the United States. Using…

  9. Continental-scale simulation of burn probabilities, flame lengths, and fire size distribution for the United States

    Treesearch

    Mark A. Finney; Charles W. McHugh; Isaac Grenfell; Karin L. Riley

    2010-01-01

    Components of a quantitative risk assessment were produced by simulation of burn probabilities and fire behavior variation for 134 fire planning units (FPUs) across the continental U.S. The system uses fire growth simulation of ignitions modeled from relationships between large fire occurrence and the fire danger index Energy Release Component (ERC). Simulations of 10,...

  10. A Generational Divide in the Academic Profession: A Mixed Quantitative and Qualitative Approach to the Polish Case

    ERIC Educational Resources Information Center

    Kwiek, Marek

    2017-01-01

    In a recently changing Polish academic environment--following the large-scale higher education reforms of 2009-2012--different academic generations have to cope with different challenges. Polish academics have been strongly divided generationally, not only in terms of what they think and how they work but also in terms of what is academically…

  11. "I Was Just so Different": The Experiences of Women Diagnosed with an Autism Spectrum Disorder in Adulthood in Relation to Gender and Social Relationships

    ERIC Educational Resources Information Center

    Kanfiszer, Lucie; Davies, Fran; Collins, Suzanne

    2017-01-01

    Existing literature exploring autism spectrum disorders within female populations predominantly utilises quantitative methodology. A limited number of small-scale, qualitative studies have explored the experiences of adolescent girls with autism spectrum disorder, but adult women have remained largely unheard. This study aims to broaden the…

  12. Void probability as a function of the void's shape and scale-invariant models. [in studies of spacial galactic distribution

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1992-01-01

    The dependence of counts in cells on the shape of the cell for the large-scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is higher for certain elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.

  13. Quantitative Resistance: More Than Just Perception of a Pathogen

    PubMed Central

    2017-01-01

    Molecular plant pathology has focused on studying large-effect qualitative resistance loci that predominantly function in detecting pathogens and/or transmitting signals resulting from pathogen detection. By contrast, less is known about quantitative resistance loci, particularly the molecular mechanisms controlling variation in quantitative resistance. Recent studies have provided insight into these mechanisms, showing that genetic variation at hundreds of causal genes may underpin quantitative resistance. Loci controlling quantitative resistance contain some of the same causal genes that mediate qualitative resistance, but the predominant mechanisms of quantitative resistance extend beyond pathogen recognition. Indeed, most causal genes for quantitative resistance encode specific defense-related outputs such as strengthening of the cell wall or defense compound biosynthesis. Extending previous work on qualitative resistance to focus on the mechanisms of quantitative resistance, such as the link between perception of microbe-associated molecular patterns and growth, has shown that the mechanisms underlying these defense outputs are also highly polygenic. Studies that include genetic variation in the pathogen have begun to highlight a potential need to rethink how the field considers broad-spectrum resistance and how it is affected by genetic variation within pathogen species and between pathogen species. These studies are broadening our understanding of quantitative resistance and highlighting the potentially vast scale of the genetic basis of quantitative resistance. PMID:28302676

  14. High-throughput method for the quantitation of metabolites and co-factors from homocysteine-methionine cycle for nutritional status assessment.

    PubMed

    Da Silva, Laeticia; Collino, Sebastiano; Cominetti, Ornella; Martin, Francois-Pierre; Montoliu, Ivan; Moreno, Sergio Oller; Corthesy, John; Kaput, Jim; Kussmann, Martin; Monteiro, Jacqueline Pontes; Guiraud, Seu Ping

    2016-09-01

    There is increasing interest in the profiling and quantitation of methionine pathway metabolites for health management research. Currently, several analytical approaches are required to cover metabolites and co-factors. We report the development and the validation of a method for the simultaneous detection and quantitation of 13 metabolites in red blood cells. The method, validated in a cohort of healthy human volunteers, shows a high level of accuracy and reproducibility. This high-throughput protocol provides a robust coverage of central metabolites and co-factors in one single analysis and in a high-throughput fashion. In large-scale clinical settings, the use of such an approach will significantly advance the field of nutritional research in health and disease.

  15. Establishing the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS): Operationalizing Community-based Research in a Large National Quantitative Study.

    PubMed

    Loutfy, Mona; Greene, Saara; Kennedy, V Logan; Lewis, Johanna; Thomas-Pavanel, Jamie; Conway, Tracey; de Pokomandy, Alexandra; O'Brien, Nadia; Carter, Allison; Tharao, Wangari; Nicholson, Valerie; Beaver, Kerrigan; Dubuc, Danièle; Gahagan, Jacqueline; Proulx-Boucher, Karène; Hogg, Robert S; Kaida, Angela

    2016-08-19

    Community-based research has gained increasing recognition in health research over the last two decades. Such participatory research approaches are lauded for their ability to anchor research in lived experiences, ensuring cultural appropriateness, accessing local knowledge, reaching marginalized communities, building capacity, and facilitating research-to-action. While having these positive attributes, the community-based health research literature is predominantly composed of small projects, using qualitative methods, and set within geographically limited communities. Its use in larger health studies, including clinical trials and cohorts, is limited. We present the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS), a large-scale, multi-site, national, longitudinal quantitative study that has operationalized community-based research in all steps of the research process. Successes, challenges and further considerations are offered. Through the integration of community-based research principles, we have been successful in: facilitating a two-year long formative phase for this study; developing a novel survey instrument with national involvement; training 39 Peer Research Associates (PRAs); offering ongoing comprehensive support to PRAs; and engaging in an ongoing iterative community-based research process. Our community-based research approach within CHIWOS demanded that we be cognizant of challenges managing a large national team, inherent power imbalances and challenges with communication, compensation and volunteering considerations, and extensive delays in institutional processes. It is important to consider the iterative nature of community-based research and to work through tensions that emerge given the diverse perspectives of numerous team members. Community-based research, as an approach to large-scale quantitative health research projects, is an increasingly viable methodological option. Community-based research has several advantages that go hand-in-hand with its obstacles. We offer guidance on implementing this approach, such that the process can be better planned and result in success.

  16. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE PAGES

    Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...

    2017-02-16

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  17. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.; Halsey, William; Dehoff, Ryan

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  18. HFSB-seeding for large-scale tomographic PIV in wind tunnels

    NASA Astrophysics Data System (ADS)

    Caridi, Giuseppe Carlo Alp; Ragni, Daniele; Sciacchitano, Andrea; Scarano, Fulvio

    2016-12-01

    A new system for large-scale tomographic particle image velocimetry in low-speed wind tunnels is presented. The system relies upon the use of sub-millimetre helium-filled soap bubbles as flow tracers, which scatter light with intensity several orders of magnitude higher than micron-sized droplets. With respect to a single bubble generator, the system increases the rate of bubbles emission by means of transient accumulation and rapid release. The governing parameters of the system are identified and discussed, namely the bubbles production rate, the accumulation and release times, the size of the bubble injector and its location with respect to the wind tunnel contraction. The relations between the above parameters, the resulting spatial concentration of tracers and measurement of dynamic spatial range are obtained and discussed. Large-scale experiments are carried out in a large low-speed wind tunnel with 2.85 × 2.85 m² test section, where a vertical axis wind turbine of 1 m diameter is operated. Time-resolved tomographic PIV measurements are taken over a measurement volume of 40 × 20 × 15 cm³, allowing the quantitative analysis of the tip-vortex structure and dynamical evolution.
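
    An order-of-magnitude sketch of the seeding bookkeeping described above (generic relations and illustrative numbers, not the paper's operating point): bubbles accumulated at production rate q over a time t_acc and released over t_rel into a stream of velocity U through the seeded cross-section set the attainable tracer concentration and the mean inter-particle spacing.

        def tracer_concentration(production_rate, t_accumulate, t_release, velocity, seeded_area):
            """Bubbles per m^3 in the seeded streamtube during the release phase."""
            release_rate = production_rate * t_accumulate / t_release   # bubbles/s while releasing
            return release_rate / (velocity * seeded_area)              # divide by volume flow rate

        # Illustrative numbers: 5e4 bubbles/s, 30 s accumulation, 2 s release,
        # 5 m/s free stream, 0.4 m x 0.2 m seeded cross-section
        c = tracer_concentration(5e4, 30.0, 2.0, 5.0, 0.4 * 0.2)
        print(c, "bubbles/m^3; mean spacing ~", round(c ** (-1.0 / 3.0) * 1e3, 1), "mm")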

  19. Large-scale multiplex absolute protein quantification of drug-metabolizing enzymes and transporters in human intestine, liver, and kidney microsomes by SWATH-MS: Comparison with MRM/SRM and HR-MRM/PRM.

    PubMed

    Nakamura, Kenji; Hirayama-Kurogi, Mio; Ito, Shingo; Kuno, Takuya; Yoneyama, Toshihiro; Obuchi, Wataru; Terasaki, Tetsuya; Ohtsuki, Sumio

    2016-08-01

    The purpose of the present study was to examine simultaneously the absolute protein amounts of 152 membrane and membrane-associated proteins, including 30 metabolizing enzymes and 107 transporters, in pooled microsomal fractions of human liver, kidney, and intestine by means of SWATH-MS with stable isotope-labeled internal standard peptides, and to compare the results with those obtained by MRM/SRM and high resolution (HR)-MRM/PRM. The protein expression levels of 27 metabolizing enzymes, 54 transporters, and six other membrane proteins were quantitated by SWATH-MS; other targets were below the lower limits of quantitation. Most of the values determined by SWATH-MS differed by less than 50% from those obtained by MRM/SRM or HR-MRM/PRM. Various metabolizing enzymes were expressed in liver microsomes more abundantly than in other microsomes. Ten, 13, and 8 transporters listed as important for drugs by the International Transporter Consortium were quantified in liver, kidney, and intestinal microsomes, respectively. Our results indicate that SWATH-MS enables large-scale multiplex absolute protein quantification while retaining similar quantitative capability to MRM/SRM or HR-MRM/PRM. SWATH-MS is expected to be a useful methodology in the context of drug development for elucidating the molecular mechanisms of drug absorption, metabolism, and excretion in the human body based on protein profile information. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Microwave Remote Sensing and the Cold Land Processes Field Experiment

    NASA Technical Reports Server (NTRS)

    Kim, Edward J.; Cline, Don; Davis, Bert; Hildebrand, Peter H. (Technical Monitor)

    2001-01-01

    The Cold Land Processes Field Experiment (CLPX) has been designed to advance our understanding of the terrestrial cryosphere. Developing a more complete understanding of fluxes, storage, and transformations of water and energy in cold land areas is a critical focus of the NASA Earth Science Enterprise Research Strategy, the NASA Global Water and Energy Cycle (GWEC) Initiative, the Global Energy and Water Cycle Experiment (GEWEX), and the GEWEX Americas Prediction Project (GAPP). The movement of water and energy through cold regions in turn plays a large role in ecological activity and biogeochemical cycles. Quantitative understanding of cold land processes over large areas will require synergistic advancements in 1) understanding how cold land processes, most comprehensively understood at local or hillslope scales, extend to larger scales, 2) improved representation of cold land processes in coupled and uncoupled land-surface models, and 3) a breakthrough in large-scale observation of hydrologic properties, including snow characteristics, soil moisture, the extent of frozen soils, and the transition between frozen and thawed soil conditions. The CLPX Plan has been developed through the efforts of over 60 interested scientists who have participated in the NASA Cold Land Processes Working Group (CLPWG). This group is charged with the task of assessing, planning, and implementing the required background science, technology, and application infrastructure to support successful land surface hydrology remote sensing space missions. A major product of the experiment will be a comprehensive, legacy data set that will energize many aspects of cold land processes research. The CLPX will focus on developing the quantitative understanding, models, and measurements necessary to extend our local-scale understanding of water fluxes, storage, and transformations to regional and global scales. The experiment will particularly emphasize developing a strong synergism between process-oriented understanding, land surface models, and microwave remote sensing. The experimental design is a multi-sensor, multi-scale (1 ha to 160,000 km²) approach to providing the comprehensive data set necessary to address several experiment objectives. A description focusing on the microwave remote sensing components (ground, airborne, and spaceborne) of the experiment will be presented.

  1. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  2. Molecular Imaging of Kerogen and Minerals in Shale Rocks across Micro- and Nano- Scales

    NASA Astrophysics Data System (ADS)

    Hao, Z.; Bechtel, H.; Sannibale, F.; Kneafsey, T. J.; Gilbert, B.; Nico, P. S.

    2016-12-01

    Fourier transform infrared (FTIR) spectroscopy is a reliable and non-destructive quantitative method to evaluate mineralogy and kerogen content/maturity of shale rocks, although it is traditionally difficult to assess the organic and mineralogical heterogeneity at micrometer and nanometer scales due to the diffraction limit of the infrared light. However, it is at these scales that the kerogen and mineral content, and their formation in shale rocks, determine the quality of a shale gas reserve, the gas flow mechanisms, and the gas production. Therefore, it is necessary to develop new approaches which can image across both micro- and nanoscales. In this presentation, we will describe two new molecular imaging approaches to obtain kerogen and mineral information in shale rocks at unprecedented spatial resolution, and a cross-scale quantitative multivariate analysis method to provide rapid geochemical characterization of large samples. The two imaging approaches are enhanced in the near field, respectively, by a Ge-hemisphere (GE) and by a metallic scanning probe (SINS). The GE method is a modified microscopic attenuated total reflectance (ATR) method which rapidly captures a chemical image of the shale rock surface at 1 to 5 micrometer resolution with a large field of view of 600 × 600 micrometers, while the SINS probes the surface at 20 nm resolution, providing a chemically "deconvoluted" map at the nano-pore level. The detailed geochemical distribution at the nanoscale is then used to build a machine learning model to generate a self-calibrated chemical distribution map at the micrometer scale with the input of the GE images. A number of geochemical contents across these two important scales are observed and analyzed, including the minerals (oxides, carbonates, sulphides), the organics (carbohydrates, aromatics), and the absorbed gases. These approaches are self-calibrated, optics friendly, and non-destructive, so they hold the potential to monitor shale gas flow in real time inside the micro- or nano-pore network, which is of great interest for optimizing shale gas extraction.

  3. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  4. The role of the airline transportation network in the prediction and predictability of global epidemics.

    PubMed

    Colizza, Vittoria; Barrat, Alain; Barthélemy, Marc; Vespignani, Alessandro

    2006-02-14

    The systematic study of large-scale networks has unveiled the ubiquitous presence of connectivity patterns characterized by large-scale heterogeneities and unbounded statistical fluctuations. These features dramatically affect the behavior of the diffusion processes occurring on networks, determining the ensuing statistical properties of their evolution pattern and dynamics. In this article, we present a stochastic computational framework for the forecast of global epidemics that considers the complete worldwide air travel infrastructure complemented with census population data. We address two basic issues in global epidemic modeling: (i) we study the role of the large-scale properties of the airline transportation network in determining the global diffusion pattern of emerging diseases; and (ii) we evaluate the reliability of forecasts and outbreak scenarios with respect to the intrinsic stochasticity of disease transmission and traffic flows. To address these issues we define a set of quantitative measures able to characterize the level of heterogeneity and predictability of the epidemic pattern. These measures may be used for the analysis of containment policies and epidemic risk assessment.
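
    As an illustration of the kind of stochastic metapopulation machinery such a framework builds on, the sketch below advances an SIR epidemic coupled through a travel-flux matrix by one time step. This is a generic sketch, not the authors' model: beta, gamma, dt, and the example matrix W are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def sir_travel_step(S, I, R, W, beta=0.3, gamma=0.1, dt=1.0):
            """One day of stochastic SIR dynamics plus mean-field travel on flux matrix W."""
            N = np.maximum(S + I + R, 1)
            # Local epidemic dynamics: binomial draws keep the model stochastic.
            new_inf = rng.binomial(S, 1.0 - np.exp(-beta * I / N * dt))
            new_rec = rng.binomial(I, 1.0 - np.exp(-gamma * dt))
            S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
            # Travel: move the expected number of travellers of each compartment
            # according to the origin-destination flux W[i, j] (passengers per day,
            # zero diagonal); rounding makes this only approximately conservative.
            out_frac = np.minimum(W.sum(axis=1) * dt / N, 1.0)
            dest_prob = W / np.maximum(W.sum(axis=1, keepdims=True), 1.0)
            for X in (S, I, R):
                leaving = np.floor(X * out_frac).astype(X.dtype)
                X -= leaving
                X += np.floor(leaving @ dest_prob).astype(X.dtype)
            return S, I, R

        # Hypothetical three-city example:
        # S = np.array([9990, 5000, 2000]); I = np.array([10, 0, 0]); R = np.zeros(3, int)
        # W = np.array([[0, 500, 100], [500, 0, 50], [100, 50, 0]])
        # for _ in range(100): S, I, R = sir_travel_step(S, I, R, W)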

  5. Natural disasters and population mobility in Bangladesh.

    PubMed

    Gray, Clark L; Mueller, Valerie

    2012-04-17

    The consequences of environmental change for human migration have gained increasing attention in the context of climate change and recent large-scale natural disasters, but as yet relatively few large-scale and quantitative studies have addressed this issue. We investigate the consequences of climate-related natural disasters for long-term population mobility in rural Bangladesh, a region particularly vulnerable to environmental change, using longitudinal survey data from 1,700 households spanning a 15-y period. Multivariate event history models are used to estimate the effects of flooding and crop failures on local population mobility and long-distance migration while controlling for a large set of potential confounders at various scales. The results indicate that flooding has modest effects on mobility that are most visible at moderate intensities and for women and the poor. However, crop failures unrelated to flooding have strong effects on mobility in which households that are not directly affected but live in severely affected areas are the most likely to move. These results point toward an alternate paradigm of disaster-induced mobility that recognizes the significant barriers to migration for vulnerable households as well as their substantial local adaptive capacity.

  6. Quantitative phylogenetic assessment of microbial communities in diverse environments.

    PubMed

    von Mering, C; Hugenholtz, P; Raes, J; Tringe, S G; Doerks, T; Jensen, L J; Ward, N; Bork, P

    2007-02-23

    The taxonomic composition of environmental communities is an important indicator of their ecology and function. We used a set of protein-coding marker genes, extracted from large-scale environmental shotgun sequencing data, to provide a more direct, quantitative, and accurate picture of community composition than that provided by traditional ribosomal RNA-based approaches depending on the polymerase chain reaction. Mapping marker genes from four diverse environmental data sets onto a reference species phylogeny shows that certain communities evolve faster than others. The method also enables determination of preferred habitats for entire microbial clades and provides evidence that such habitat preferences are often remarkably stable over time.

  7. Determination of plasma parameters from soft X-ray images for coronal holes /open magnetic field configurations/ and coronal large-scale structures /extended closed-field configurations/

    NASA Technical Reports Server (NTRS)

    Maxson, C. W.; Vaiana, G. S.

    1977-01-01

    In connection with high-quality solar soft X-ray images, the 'quiet' features of the inner corona have been separated into two sharply different components, including the strongly reduced emission areas or coronal holes (CH) and the extended regions of looplike emission features or large-scale structures (LSS). Particular central meridian passage observations of the prominent CH 1 on August 21, 1973, are selected for a quantitative study. Histogram photographic density distributions for full-disk images at other central meridian passages of CH 1 are also presented, and the techniques of converting low photographic density data to deposited energy are discussed, with particular emphasis on the problems associated with the CH data.

  8. Quantitative study of the violation of k⊥ factorization in hadroproduction of quarks at collider energies.

    PubMed

    Fujii, Hirotsugu; Gelis, François; Venugopalan, Raju

    2005-10-14

    We demonstrate the violation of k⊥ factorization for quark production in high-energy hadronic collisions. This violation is quantified in the color glass condensate framework and studied as a function of the quark mass, the quark transverse momentum, and the saturation scale Q_s, which is a measure of large parton densities. At x values where parton densities are large but leading-twist shadowing effects are still small, violations of k⊥ factorization can be significant, especially for lighter quarks. At very small x, where leading-twist shadowing is large, we show that violations of k⊥ factorization are relatively weaker.

  9. Large-visual-angle microstructure inspired from quantitative design of Morpho butterflies' lamellae deviation using the FDTD/PSO method.

    PubMed

    Wang, Wanlin; Zhang, Wang; Chen, Weixin; Gu, Jiajun; Liu, Qinglei; Deng, Tao; Zhang, Di

    2013-01-15

    The wide angular range of the treelike structure in Morpho butterfly scales was investigated by finite-difference time-domain (FDTD)/particle-swarm-optimization (PSO) analysis. Using the FDTD method, different parameters in the Morpho butterflies' treelike structure were studied and their contributions to the angular dependence were analyzed. A wide angular range was then realized with the PSO method by quantitatively designing the lamellae deviation (Δy), a crucial parameter for the angular range. The field map of the wide-range reflection in a large area was given to confirm the wide angular range. The tristimulus values and corresponding color coordinates for various viewing directions were calculated to confirm the blue color at different observation angles. The wide angular range realized by the FDTD/PSO method will assist us in understanding the scientific principles involved and also in designing artificial optical materials.

  10. Three Advantages of Cross-National Comparative Ethnography--Methodological Reflections from a Study of Migrants and Minority Ethnic Youth in English and Spanish Schools

    ERIC Educational Resources Information Center

    Jørgensen, Clara Rübner

    2015-01-01

    This paper discusses the strengths of using ethnographic research methods in cross-national comparative research. It focuses particularly on the potential of applying such methods to the study of migrants and minority ethnic youth in education, where large-scale quantitative studies or single-sited ethnographies are currently dominant. By linking…

  11. Annual runoff and evapotranspiration of forestlands and non-forestlands in selected basins of the Loess Plateau of China.

    Treesearch

    Yanhui Wang; Pengtao Yu; Karl-Heinz Feger; Xiaohua Wei; Ge Sun; et al

    2011-01-01

    Large-scale forestation has been undertaken over decades principally to control the serious soil erosion in the Loess Plateau of China. A quantitative assessment of the hydrological effects of forestation, especially on basin water yield, is critical for the sustainable forestry development within this dry region. In this study, we constructed the multi-annual water...

  12. Engineering Digestion: Multiscale Processes of Food Digestion.

    PubMed

    Bornhorst, Gail M; Gouseti, Ourania; Wickham, Martin S J; Bakalis, Serafim

    2016-03-01

    Food digestion is a complex, multiscale process that has recently become of interest to the food industry due to the developing links between food and health or disease. Food digestion can be studied by using either in vitro or in vivo models, each having certain advantages or disadvantages. The recent interest in food digestion has resulted in a large number of studies in this area, yet few have provided an in-depth, quantitative description of digestion processes. To provide a framework to develop these quantitative comparisons, a summary is given here of the parallels between digestion processes and unit operations in the food and chemical industry. Characterization parameters and phenomena are suggested for each step of digestion. In addition to the quantitative characterization of digestion processes, the multiscale aspect of digestion must also be considered. In both food systems and the gastrointestinal tract, multiple length scales are involved in food breakdown, mixing, and absorption. These different length scales influence digestion processes independently as well as through interrelated mechanisms. To facilitate optimized development of functional food products, a multiscale, engineering approach may be taken to describe food digestion processes. A framework for this approach is described in this review, along with examples that demonstrate the importance of process characterization and of the multiple, interrelated length scales in the digestion process. © 2016 Institute of Food Technologists®

  13. A Census of Large-scale (≥10 PC), Velocity-coherent, Dense Filaments in the Northern Galactic Plane: Automated Identification Using Minimum Spanning Tree

    NASA Astrophysics Data System (ADS)

    Wang, Ke; Testi, Leonardo; Burkert, Andreas; Walmsley, C. Malcolm; Beuther, Henrik; Henning, Thomas

    2016-09-01

    Large-scale gaseous filaments with lengths up to the order of 100 pc are on the upper end of the filamentary hierarchy of the Galactic interstellar medium (ISM). Their association with Galactic structure and their role in Galactic star formation are of great interest from both an observational and theoretical point of view. Previous “by-eye” searches, taken together, have started to uncover the Galactic distribution of large filaments, yet inherent bias and small sample size prevent conclusive statistical results from being drawn. Here, we present (1) a new, automated method for identifying large-scale velocity-coherent dense filaments, and (2) the first statistics and the Galactic distribution of these filaments. We use a customized minimum spanning tree algorithm to identify filaments by connecting voxels in position-position-velocity space, using the Bolocam Galactic Plane Survey spectroscopic catalog. In the range 7.5° ≤ l ≤ 194°, we have identified 54 large-scale filaments and derived their mass (~10³-10⁵ M⊙), length (10-276 pc), linear mass density (54-8625 M⊙ pc⁻¹), aspect ratio, linearity, velocity gradient, temperature, fragmentation, Galactic location, and orientation angle. The filaments concentrate along major spiral arms. They are widely distributed across the Galactic disk, with 50% located within ±20 pc of the Galactic mid-plane and 27% running in the centers of spiral arms. On the order of 1% of the molecular ISM is confined in large filaments. Massive star formation is more favorable in large filaments than elsewhere. This is the first comprehensive catalog of large filaments that can be useful for a quantitative comparison with spiral structures and numerical simulations.
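
    As a rough illustration of the identification step described above (a generic sketch, not the authors' pipeline: the velocity scaling, edge-length cut, and minimum membership are illustrative assumptions, and a dense distance matrix is only practical for modest catalog sizes), a minimum spanning tree over catalog entries in position-position-velocity space can be built and pruned as follows.

        import numpy as np
        from scipy.spatial.distance import pdist, squareform
        from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

        def find_filaments(l_deg, b_deg, v_kms, v_scale=1.0, max_edge=0.5, min_members=10):
            """Group PPV catalog entries into candidate filaments via a pruned MST."""
            # Stack coordinates; the velocity axis is rescaled so that a difference of
            # v_scale km/s "costs" as much as one degree on the sky.
            pts = np.column_stack([l_deg, b_deg, np.asarray(v_kms) / v_scale])
            dist = squareform(pdist(pts))                 # dense pairwise distances
            mst = minimum_spanning_tree(dist).toarray()   # MST of the complete graph
            mst[mst > max_edge] = 0.0                     # cut edges longer than the threshold
            n_comp, labels = connected_components(mst + mst.T, directed=False)
            # Keep only components with enough members to count as a filament candidate.
            return [np.where(labels == k)[0] for k in range(n_comp)
                    if np.sum(labels == k) >= min_members]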

  14. Large-scale geographic variation in distribution and abundance of Australian deep-water kelp forests.

    PubMed

    Marzinelli, Ezequiel M; Williams, Stefan B; Babcock, Russell C; Barrett, Neville S; Johnson, Craig R; Jordan, Alan; Kendrick, Gary A; Pizarro, Oscar R; Smale, Dan A; Steinberg, Peter D

    2015-01-01

    Despite the significance of marine habitat-forming organisms, little is known about their large-scale distribution and abundance in deeper waters, where they are difficult to access. Such information is necessary to develop sound conservation and management strategies. Kelps are main habitat-formers in temperate reefs worldwide; however, these habitats are highly sensitive to environmental change. The kelp Ecklonia radiata is the major habitat-forming organism on subtidal reefs in temperate Australia. Here, we provide large-scale ecological data encompassing the latitudinal distribution of these kelp forests along the continent, which is a necessary first step towards quantitative inferences about the effects of climatic change and other stressors on these valuable habitats. We used the Autonomous Underwater Vehicle (AUV) facility of Australia's Integrated Marine Observing System (IMOS) to survey 157,000 m² of seabed, of which ca. 13,000 m² were used to quantify kelp covers at multiple spatial scales (10-100 m to 100-1,000 km) and depths (15-60 m) across several regions ca. 2-6° of latitude apart along the East and West coasts of Australia. We investigated the large-scale geographic variation in distribution and abundance of deep-water kelp (>15 m depth) and their relationships with physical variables. Kelp cover generally increased with latitude despite great variability at smaller spatial scales. The maximum depth of kelp occurrence was 40-50 m. Kelp latitudinal distribution along the continent was most strongly related to water temperature and substratum availability. These extensive survey data, coupled with ongoing AUV missions, will allow for the detection of long-term shifts in the distribution and abundance of habitat-forming kelp and the organisms they support on a continental scale, and provide information necessary for successful implementation and management of conservation reserves.

  15. Scaling properties of European research units

    PubMed Central

    Jamtveit, Bjørn; Jettestuen, Espen; Mathiesen, Joachim

    2009-01-01

    A quantitative characterization of the scale-dependent features of research units may provide important insight into how such units are organized and how they grow. The relative importance of top-down versus bottom-up controls on their growth may be revealed by their scaling properties. Here we show that the number of support staff in Scandinavian research units, ranging in size from 20 to 7,800 staff members, is related to the number of academic staff by a power law. The scaling exponent of ≈1.30 is broadly consistent with a simple hierarchical model of the university organization. Similar scaling behavior between small and large research units with a wide range of ambitions and strategies argues against top-down control of the growth. Top-down effects, and externally imposed effects from changing political environments, can be observed as fluctuations around the main trend. The observed scaling law implies that cost-benefit arguments for merging research institutions into larger and larger units may have limited validity unless the productivity per academic staff and/or the quality of the products are considerably higher in larger institutions. Despite the hierarchical structure of most large-scale research units in Europe, the network structures represented by the academic component of such units are strongly antihierarchical and suboptimal for efficient communication within individual units. PMID:19625626
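
    Since the reported relationship is a power law, support ≈ c × (academic)^1.30, the exponent is simply the slope of a straight-line fit in log-log space. The sketch below illustrates this with made-up unit sizes; the numbers are not data from the study.

        import numpy as np

        academic = np.array([15, 40, 120, 500, 2000, 6000])  # hypothetical academic staff counts
        support = np.array([6, 20, 80, 450, 2300, 8100])     # hypothetical support staff counts

        # Ordinary least squares on log-transformed values gives the scaling exponent.
        slope, intercept = np.polyfit(np.log(academic), np.log(support), 1)
        print(f"scaling exponent ~ {slope:.2f}, prefactor ~ {np.exp(intercept):.2f}")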

  16. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195

  17. The topology of large-scale structure. III - Analysis of observations

    NASA Astrophysics Data System (ADS)

    Gott, J. Richard, III; Miller, John; Thuan, Trinh X.; Schneider, Stephen E.; Weinberg, David H.; Gammie, Charles; Polk, Kevin; Vogeley, Michael; Jeffrey, Scott; Bhavsar, Suketu P.; Melott, Adrian L.; Giovanelli, Riccardo; Haynes, Martha P.; Tully, R. Brent; Hamilton, Andrew J. S.

    1989-05-01

    A recently developed algorithm for quantitatively measuring the topology of large-scale structures in the universe was applied to a number of important observational data sets. The data sets included an Abell (1958) cluster sample out to Vmax = 22,600 km/sec, the Giovanelli and Haynes (1985) sample out to Vmax = 11,800 km/sec, the CfA sample out to Vmax = 5000 km/sec, the Thuan and Schneider (1988) dwarf sample out to Vmax = 3000 km/sec, and the Tully (1987) sample out to Vmax = 3000 km/sec. It was found that, when the topology is studied on smoothing scales significantly larger than the correlation length (i.e., smoothing length, lambda, not below 1200 km/sec), the topology is spongelike and is consistent with the standard model in which the structure seen today has grown from small fluctuations caused by random noise in the early universe. When the topology is studied on the scale of lambda of about 600 km/sec, a small shift is observed in the genus curve in the direction of a 'meatball' topology.
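
    For reference, the standard benchmark in such genus-curve analyses (general background, not a formula quoted from this record) is the analytic genus curve of a Gaussian random field, against which the observed curves are compared. With ν the density threshold in units of the standard deviation, and an amplitude A that depends on the power spectrum and the smoothing length, the genus per unit volume takes the form

        g(\nu) = A \, (1 - \nu^{2}) \, e^{-\nu^{2}/2} .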

  18. The topology of large-scale structure. III - Analysis of observations. [in universe

    NASA Technical Reports Server (NTRS)

    Gott, J. Richard, III; Weinberg, David H.; Miller, John; Thuan, Trinh X.; Schneider, Stephen E.

    1989-01-01

    A recently developed algorithm for quantitatively measuring the topology of large-scale structures in the universe was applied to a number of important observational data sets. The data sets included an Abell (1958) cluster sample out to Vmax = 22,600 km/sec, the Giovanelli and Haynes (1985) sample out to Vmax = 11,800 km/sec, the CfA sample out to Vmax = 5000 km/sec, the Thuan and Schneider (1988) dwarf sample out to Vmax = 3000 km/sec, and the Tully (1987) sample out to Vmax = 3000 km/sec. It was found that, when the topology is studied on smoothing scales significantly larger than the correlation length (i.e., smoothing length, lambda, not below 1200 km/sec), the topology is spongelike and is consistent with the standard model in which the structure seen today has grown from small fluctuations caused by random noise in the early universe. When the topology is studied on the scale of lambda of about 600 km/sec, a small shift is observed in the genus curve in the direction of a 'meatball' topology.

  19. Quantitative Resistance: More Than Just Perception of a Pathogen.

    PubMed

    Corwin, Jason A; Kliebenstein, Daniel J

    2017-04-01

    Molecular plant pathology has focused on studying large-effect qualitative resistance loci that predominantly function in detecting pathogens and/or transmitting signals resulting from pathogen detection. By contrast, less is known about quantitative resistance loci, particularly the molecular mechanisms controlling variation in quantitative resistance. Recent studies have provided insight into these mechanisms, showing that genetic variation at hundreds of causal genes may underpin quantitative resistance. Loci controlling quantitative resistance contain some of the same causal genes that mediate qualitative resistance, but the predominant mechanisms of quantitative resistance extend beyond pathogen recognition. Indeed, most causal genes for quantitative resistance encode specific defense-related outputs such as strengthening of the cell wall or defense compound biosynthesis. Extending previous work on qualitative resistance to focus on the mechanisms of quantitative resistance, such as the link between perception of microbe-associated molecular patterns and growth, has shown that the mechanisms underlying these defense outputs are also highly polygenic. Studies that include genetic variation in the pathogen have begun to highlight a potential need to rethink how the field considers broad-spectrum resistance and how it is affected by genetic variation within pathogen species and between pathogen species. These studies are broadening our understanding of quantitative resistance and highlighting the potentially vast scale of the genetic basis of quantitative resistance. © 2017 American Society of Plant Biologists. All rights reserved.

  20. Probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caillouet, Laurie; Vidal, Jean -Philippe; Sauquet, Eric

    This work proposes a daily high-resolution probabilistic reconstruction of precipitation and temperature fields in France over the 1871–2012 period built on the NOAA Twentieth Century global extended atmospheric reanalysis (20CR). The objective is to fill in the spatial and temporal data gaps in surface observations in order to improve our knowledge of local-scale climate variability from the late nineteenth century onwards. The SANDHY (Stepwise ANalogue Downscaling method for HYdrology) statistical downscaling method, initially developed for quantitative precipitation forecasting, is used here to bridge the scale gap between large-scale 20CR predictors and local-scale predictands from the Safran high-resolution near-surface reanalysis, available from 1958 onwards only. SANDHY provides a daily ensemble of 125 analogue dates over the 1871–2012 period for 608 climatically homogeneous zones paving France. Large precipitation biases in the intermediate seasons are shown to occur in regions with high seasonal asymmetry like the Mediterranean. Moreover, winter and summer temperatures are respectively over- and under-estimated over the whole of France. Two analogue subselection methods are therefore developed with the aim of keeping the structure of the SANDHY method unchanged while reducing those seasonal biases. The calendar selection keeps the analogues closest to the target calendar day. The stepwise selection applies two new analogy steps based on similarity of the sea surface temperature (SST) and the large-scale 2 m temperature (T). Comparisons to the Safran reanalysis over 1959–2007 and to homogenized series over the whole twentieth century show that biases in the interannual cycle of precipitation and temperature are reduced with both methods. The stepwise subselection moreover leads to a large improvement of interannual correlation and a reduction of errors in seasonal temperature time series. While the calendar subselection is an easily applicable method suitable in a quantitative precipitation forecasting context, the stepwise subselection allows for potential season shifts and SST trends and is therefore better suited for climate reconstructions and climate change studies. Furthermore, the probabilistic downscaling of 20CR over the period 1871–2012 with SANDHY combined with the stepwise subselection thus constitutes a perfect framework for assessing recent observed meteorological events as well as future events projected by climate change impact studies, and for putting them in a historical perspective.

  1. Probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France

    DOE PAGES

    Caillouet, Laurie; Vidal, Jean -Philippe; Sauquet, Eric; ...

    2016-03-16

    This work proposes a daily high-resolution probabilistic reconstruction of precipitation and temperature fields in France over the 1871–2012 period built on the NOAA Twentieth Century global extended atmospheric reanalysis (20CR). The objective is to fill in the spatial and temporal data gaps in surface observations in order to improve our knowledge of local-scale climate variability from the late nineteenth century onwards. The SANDHY (Stepwise ANalogue Downscaling method for HYdrology) statistical downscaling method, initially developed for quantitative precipitation forecasting, is used here to bridge the scale gap between large-scale 20CR predictors and local-scale predictands from the Safran high-resolution near-surface reanalysis, available from 1958 onwards only. SANDHY provides a daily ensemble of 125 analogue dates over the 1871–2012 period for 608 climatically homogeneous zones paving France. Large precipitation biases in the intermediate seasons are shown to occur in regions with high seasonal asymmetry like the Mediterranean. Moreover, winter and summer temperatures are respectively over- and under-estimated over the whole of France. Two analogue subselection methods are therefore developed with the aim of keeping the structure of the SANDHY method unchanged while reducing those seasonal biases. The calendar selection keeps the analogues closest to the target calendar day. The stepwise selection applies two new analogy steps based on similarity of the sea surface temperature (SST) and the large-scale 2 m temperature (T). Comparisons to the Safran reanalysis over 1959–2007 and to homogenized series over the whole twentieth century show that biases in the interannual cycle of precipitation and temperature are reduced with both methods. The stepwise subselection moreover leads to a large improvement of interannual correlation and a reduction of errors in seasonal temperature time series. While the calendar subselection is an easily applicable method suitable in a quantitative precipitation forecasting context, the stepwise subselection allows for potential season shifts and SST trends and is therefore better suited for climate reconstructions and climate change studies. Furthermore, the probabilistic downscaling of 20CR over the period 1871–2012 with SANDHY combined with the stepwise subselection thus constitutes a perfect framework for assessing recent observed meteorological events as well as future events projected by climate change impact studies, and for putting them in a historical perspective.
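
    A minimal sketch of the calendar subselection idea described in the two records above (not the SANDHY code; the number of retained analogues is an illustrative assumption): from the pool of analogue dates for a target day, keep those closest in day of year, treating the calendar as circular.

        from datetime import date

        def calendar_subselect(target, analogues, n_keep=25):
            """Keep the n_keep analogue dates closest to the target calendar day."""
            def doy_gap(d):
                gap = abs(d.timetuple().tm_yday - target.timetuple().tm_yday)
                return min(gap, 365 - gap)  # circular day-of-year distance
            return sorted(analogues, key=doy_gap)[:n_keep]

        # Example: keep the 25 analogues nearest in the calendar to 15 January 1871.
        # picks = calendar_subselect(date(1871, 1, 15), analogue_dates)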

  2. In-depth Qualitative and Quantitative Profiling of Tyrosine Phosphorylation Using a Combination of Phosphopeptide Immunoaffinity Purification and Stable Isotope Dimethyl Labeling*

    PubMed Central

    Boersema, Paul J.; Foong, Leong Yan; Ding, Vanessa M. Y.; Lemeer, Simone; van Breukelen, Bas; Philp, Robin; Boekhorst, Jos; Snel, Berend; den Hertog, Jeroen; Choo, Andre B. H.; Heck, Albert J. R.

    2010-01-01

    Several mass spectrometry-based assays have emerged for the quantitative profiling of cellular tyrosine phosphorylation. Ideally, these methods should reveal the exact sites of tyrosine phosphorylation, be quantitative, and not be cost-prohibitive. The latter is often an issue as typically several milligrams of (stable isotope-labeled) starting protein material are required to enable the detection of low abundance phosphotyrosine peptides. Here, we adopted and refined a peptide-centric immunoaffinity purification approach for the quantitative analysis of tyrosine phosphorylation by combining it with a cost-effective stable isotope dimethyl labeling method. We were able to identify by mass spectrometry, using just two LC-MS/MS runs, more than 1100 unique non-redundant phosphopeptides in HeLa cells from about 4 mg of starting material without requiring any further affinity enrichment, as close to 80% of the identified peptides were tyrosine phosphorylated peptides. Stable isotope dimethyl labeling could be incorporated prior to the immunoaffinity purification, even for the large quantities (mg) of peptide material used, enabling the quantification of differences in tyrosine phosphorylation upon pervanadate treatment or epidermal growth factor stimulation. Analysis of the epidermal growth factor-stimulated HeLa cells, a frequently used model system for tyrosine phosphorylation, resulted in the quantification of 73 regulated unique phosphotyrosine peptides. The quantitative data were found to be exceptionally consistent with the literature, evidencing that such a targeted quantitative phosphoproteomics approach can provide reproducible results. In general, the combination of immunoaffinity purification of tyrosine phosphorylated peptides with large-scale stable isotope dimethyl labeling provides a cost-effective approach that can alleviate variation in sample preparation and analysis as samples can be combined early on. Using this approach, a rather complete qualitative and quantitative picture of tyrosine phosphorylation signaling events can be generated. PMID:19770167

  3. Earthquakes in the Laboratory: Continuum-Granular Interactions

    NASA Astrophysics Data System (ADS)

    Ecke, Robert; Geller, Drew; Ward, Carl; Backhaus, Scott

    2013-03-01

    Earthquakes in nature feature large tectonic plate motion at large scales of 10-100 km and local properties of the earth on the scale of the rupture width, of the order of meters. Fault gouge often fills the gap between the large slipping plates and may play an important role in the nature and dynamics of earthquake events. We have constructed a laboratory-scale experiment that represents a similitude scale model of this general earthquake description. Two photo-elastic plates (50 cm x 25 cm x 1 cm) confine approximately 3000 bi-disperse nylon rods (diameters 0.12 and 0.16 cm, height 1 cm) in a gap of approximately 1 cm. The plates are held rigidly along their outer edges, with one edge held fixed while the other is driven at constant speed over a range of about 5 cm. The local stresses exerted on the plates are measured using their photo-elastic response, the local relative motions of the plates, i.e., the local strains, are determined by the relative motion of small ball bearings attached to the top surface, and the configurations of the nylon rods are investigated using particle tracking tools. We find that this system has properties similar to real earthquakes, and we are exploring these "lab-quake" events with the quantitative tools we have developed.

  4. Text mixing shapes the anatomy of rank-frequency distributions

    NASA Astrophysics Data System (ADS)

    Williams, Jake Ryland; Bagrow, James P.; Danforth, Christopher M.; Dodds, Peter Sheridan

    2015-05-01

    Natural languages are full of rules and exceptions. One of the most famous quantitative rules is Zipf's law, which states that the frequency of occurrence of a word is approximately inversely proportional to its rank. Though this "law" of ranks has been found to hold across disparate texts and forms of data, analyses of increasingly large corpora since the late 1990s have revealed the existence of two scaling regimes. These regimes have thus far been explained by a hypothesis suggesting a separability of languages into core and noncore lexica. Here we present and defend an alternative hypothesis that the two scaling regimes result from the act of aggregating texts. We observe that text mixing leads to an effective decay of word introduction, which we show provides accurate predictions of the location and severity of breaks in scaling. Upon examining large corpora from 10 languages in the Project Gutenberg eBooks collection, we find emphatic empirical support for the universality of our claim.
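
    For readers who want to reproduce the basic rank-frequency analysis behind this discussion (a generic sketch, not the authors' code; the number of top ranks used in the fit is an arbitrary choice), the exponent of the upper scaling regime can be estimated from any plain-text corpus as follows.

        import re
        from collections import Counter

        import numpy as np

        def rank_frequency(text, fit_top=1000):
            """Return ranks, frequencies, and the log-log slope over the top fit_top ranks."""
            words = re.findall(r"[a-z']+", text.lower())
            freqs = np.array(sorted(Counter(words).values(), reverse=True), dtype=float)
            ranks = np.arange(1, len(freqs) + 1)
            n = min(fit_top, len(freqs))
            slope, _ = np.polyfit(np.log(ranks[:n]), np.log(freqs[:n]), 1)
            return ranks, freqs, slope  # Zipf's law predicts a slope close to -1

        # Example: ranks, freqs, alpha = rank_frequency(open("book.txt").read())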

  5. String-like collective motion in the α- and β-relaxation of a coarse-grained polymer melt

    NASA Astrophysics Data System (ADS)

    Pazmiño Betancourt, Beatriz A.; Starr, Francis W.; Douglas, Jack F.

    2018-03-01

    Relaxation in glass-forming liquids occurs as a multi-stage hierarchical process involving cooperative molecular motion. First, there is a "fast" relaxation process dominated by the inertial motion of the molecules whose amplitude grows upon heating, followed by a longer time α-relaxation process involving both large-scale diffusive molecular motion and momentum diffusion. Our molecular dynamics simulations of a coarse-grained glass-forming polymer melt indicate that the fast, collective motion becomes progressively suppressed upon cooling, necessitating large-scale collective motion by molecular diffusion for the material to relax approaching the glass-transition. In each relaxation regime, the decay of the collective intermediate scattering function occurs through collective particle exchange motions having a similar geometrical form, and quantitative relationships are derived relating the fast "stringlet" collective motion to the larger scale string-like collective motion at longer times, which governs the temperature-dependent activation energies associated with both thermally activated molecular diffusion and momentum diffusion.

  6. How much does a tokamak reactor cost?

    NASA Astrophysics Data System (ADS)

    Freidberg, J.; Cerfon, A.; Ballinger, S.; Barber, J.; Dogra, A.; McCarthy, W.; Milanese, L.; Mouratidis, T.; Redman, W.; Sandberg, A.; Segal, D.; Simpson, R.; Sorensen, C.; Zhou, M.

    2017-10-01

    The cost of a fusion reactor is of critical importance to its ultimate acceptability as a commercial source of electricity. While there are general rules of thumb for scaling both the overnight cost and the levelized cost of electricity, the corresponding relations are not very accurate or universally agreed upon. We have carried out a series of scaling studies of tokamak reactor costs based on reasonably sophisticated plasma and engineering models. The analysis is largely analytic, requiring only a simple numerical code, thus allowing a very large number of designs. Importantly, the studies are aimed at plasma physicists rather than fusion engineers. The goals are to assess the pros and cons of steady-state burning plasma experiments and reactors. One specific set of results discusses the benefits of higher magnetic fields, now possible because of the recent development of high-temperature rare-earth superconductors (REBCO); with this goal in mind, we calculate quantitative expressions, including both scaling and multiplicative constants, for cost and major radius as a function of central magnetic field.

  7. Large-scale assembly bias of dark matter halos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lazeyras, Titouan; Musso, Marcello; Schmidt, Fabian, E-mail: titouan@mpa-garching.mpg.de, E-mail: mmusso@sas.upenn.edu, E-mail: fabians@mpa-garching.mpg.de

    We present precise measurements of the assembly bias of dark matter halos, i.e. the dependence of halo bias on properties other than the mass, using curved 'separate universe' N-body simulations which effectively incorporate an infinite-wavelength matter overdensity into the background density. This method measures the LIMD (local-in-matter-density) bias parameters b_n in the large-scale limit. We focus on the dependence of the first two Eulerian biases b_1^E and b_2^E on four halo properties: the concentration, spin, mass accretion rate, and ellipticity. We quantitatively compare our results with previous works in which assembly bias was measured on fairly small scales. Despite this difference, our findings are in good agreement with previous results. We also look at the joint dependence of bias on two halo properties in addition to the mass. Finally, using the excursion set peaks model, we attempt to shed new light on how assembly bias arises in this analytical model.
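
    For orientation, in the standard LIMD expansion (generic notation, not copied from this record) the bias parameters b_n are the coefficients relating the mean halo overdensity to a long-wavelength matter overdensity δ, which is precisely the response that separate-universe simulations measure:

        \delta_h = \sum_{n \ge 1} \frac{b_n}{n!} \, \delta^{n}
                 = b_1^{E} \, \delta + \frac{b_2^{E}}{2} \, \delta^{2} + \dots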

  8. Text mixing shapes the anatomy of rank-frequency distributions.

    PubMed

    Williams, Jake Ryland; Bagrow, James P; Danforth, Christopher M; Dodds, Peter Sheridan

    2015-05-01

    Natural languages are full of rules and exceptions. One of the most famous quantitative rules is Zipf's law, which states that the frequency of occurrence of a word is approximately inversely proportional to its rank. Though this "law" of ranks has been found to hold across disparate texts and forms of data, analyses of increasingly large corpora since the late 1990s have revealed the existence of two scaling regimes. These regimes have thus far been explained by a hypothesis suggesting a separability of languages into core and noncore lexica. Here we present and defend an alternative hypothesis that the two scaling regimes result from the act of aggregating texts. We observe that text mixing leads to an effective decay of word introduction, which we show provides accurate predictions of the location and severity of breaks in scaling. Upon examining large corpora from 10 languages in the Project Gutenberg eBooks collection, we find emphatic empirical support for the universality of our claim.

  9. Current trends in quantitative proteomics - an update.

    PubMed

    Li, H; Han, J; Pan, J; Liu, T; Parker, C E; Borchers, C H

    2017-05-01

    Proteins can provide insights into biological processes at the functional level, so they are very promising biomarker candidates. The quantification of proteins in biological samples has been routinely used for the diagnosis of diseases and monitoring the treatment. Although large-scale protein quantification in complex samples is still a challenging task, a great amount of effort has been made to advance the technologies that enable quantitative proteomics. Seven years ago, in 2009, we wrote an article about the current trends in quantitative proteomics. In writing this current paper, we realized that, today, we have an even wider selection of potential tools for quantitative proteomics. These tools include new derivatization reagents, novel sampling formats, new types of analyzers and scanning techniques, and recently developed software to assist in assay development and data analysis. In this review article, we will discuss these innovative methods, and their current and potential applications in proteomics. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Transcriptional analysis of the Arabidopsis ovule by massively parallel signature sequencing

    PubMed Central

    Sánchez-León, Nidia; Arteaga-Vázquez, Mario; Alvarez-Mejía, César; Mendiola-Soto, Javier; Durán-Figueroa, Noé; Rodríguez-Leal, Daniel; Rodríguez-Arévalo, Isaac; García-Campayo, Vicenta; García-Aguilar, Marcelina; Olmedo-Monfil, Vianey; Arteaga-Sánchez, Mario; Martínez de la Vega, Octavio; Nobuta, Kan; Vemaraju, Kalyan; Meyers, Blake C.; Vielle-Calzada, Jean-Philippe

    2012-01-01

    The life cycle of flowering plants alternates between a predominant sporophytic (diploid) and an ephemeral gametophytic (haploid) generation that only occurs in reproductive organs. In Arabidopsis thaliana, the female gametophyte is deeply embedded within the ovule, complicating the study of the genetic and molecular interactions involved in the sporophytic to gametophytic transition. Massively parallel signature sequencing (MPSS) was used to conduct a quantitative large-scale transcriptional analysis of the fully differentiated Arabidopsis ovule prior to fertilization. The expression of 9775 genes was quantified in wild-type ovules, additionally detecting >2200 new transcripts mapping to antisense or intergenic regions. A quantitative comparison of global expression in wild-type and sporocyteless (spl) individuals resulted in 1301 genes showing 25-fold reduced or null activity in ovules lacking a female gametophyte, including those encoding 92 signalling proteins, 75 transcription factors, and 72 RNA-binding proteins not reported in previous studies based on microarray profiling. A combination of independent genetic and molecular strategies confirmed the differential expression of 28 of them, showing that they are either preferentially active in the female gametophyte, or dependent on the presence of a female gametophyte to be expressed in sporophytic cells of the ovule. Among 18 genes encoding pentatricopeptide-repeat proteins (PPRs) that show transcriptional activity in wild-type but not spl ovules, CIHUATEOTL (At4g38150) is specifically expressed in the female gametophyte and necessary for female gametogenesis. These results expand the nature of the transcriptional universe present in the ovule of Arabidopsis, and offer a large-scale quantitative reference of global expression for future genomic and developmental studies. PMID:22442422

  11. Transcriptional analysis of the Arabidopsis ovule by massively parallel signature sequencing.

    PubMed

    Sánchez-León, Nidia; Arteaga-Vázquez, Mario; Alvarez-Mejía, César; Mendiola-Soto, Javier; Durán-Figueroa, Noé; Rodríguez-Leal, Daniel; Rodríguez-Arévalo, Isaac; García-Campayo, Vicenta; García-Aguilar, Marcelina; Olmedo-Monfil, Vianey; Arteaga-Sánchez, Mario; de la Vega, Octavio Martínez; Nobuta, Kan; Vemaraju, Kalyan; Meyers, Blake C; Vielle-Calzada, Jean-Philippe

    2012-06-01

    The life cycle of flowering plants alternates between a predominant sporophytic (diploid) and an ephemeral gametophytic (haploid) generation that only occurs in reproductive organs. In Arabidopsis thaliana, the female gametophyte is deeply embedded within the ovule, complicating the study of the genetic and molecular interactions involved in the sporophytic to gametophytic transition. Massively parallel signature sequencing (MPSS) was used to conduct a quantitative large-scale transcriptional analysis of the fully differentiated Arabidopsis ovule prior to fertilization. The expression of 9775 genes was quantified in wild-type ovules, additionally detecting >2200 new transcripts mapping to antisense or intergenic regions. A quantitative comparison of global expression in wild-type and sporocyteless (spl) individuals resulted in 1301 genes showing 25-fold reduced or null activity in ovules lacking a female gametophyte, including those encoding 92 signalling proteins, 75 transcription factors, and 72 RNA-binding proteins not reported in previous studies based on microarray profiling. A combination of independent genetic and molecular strategies confirmed the differential expression of 28 of them, showing that they are either preferentially active in the female gametophyte, or dependent on the presence of a female gametophyte to be expressed in sporophytic cells of the ovule. Among 18 genes encoding pentatricopeptide-repeat proteins (PPRs) that show transcriptional activity in wild-type but not spl ovules, CIHUATEOTL (At4g38150) is specifically expressed in the female gametophyte and necessary for female gametogenesis. These results expand the nature of the transcriptional universe present in the ovule of Arabidopsis, and offer a large-scale quantitative reference of global expression for future genomic and developmental studies.

  12. Primary production in the Delta: Then and now

    USGS Publications Warehouse

    Cloern, James E.; Robinson, April; Richey, Amy; Grenier, Letitia; Grossinger, Robin; Boyer, Katharyn E.; Burau, Jon; Canuel, Elizabeth A.; DeGeorge, John F.; Drexler, Judith Z.; Enright, Chris; Howe, Emily R.; Kneib, Ronald; Mueller-Solger, Anke; Naiman, Robert J.; Pinckney, James L.; Safran, Samuel M.; Schoellhamer, David H.; Simenstad, Charles A.

    2016-01-01

    To evaluate the role of restoration in the recovery of the Delta ecosystem, we need to have clear targets and performance measures that directly assess ecosystem function. Primary production is a crucial ecosystem process, which directly limits the quality and quantity of food available for secondary consumers such as invertebrates and fish. The Delta has a low rate of primary production, but it is unclear whether this was always the case. Recent analyses from the Historical Ecology Team and Delta Landscapes Project provide quantitative comparisons of the areal extent of 14 habitat types in the modern Delta versus the historical Delta (pre-1850). Here we describe an approach for using these metrics of land use change to: (1) produce the first quantitative estimates of how Delta primary production and the relative contributions from five different producer groups have been altered by large-scale drainage and conversion to agriculture; (2) convert these production estimates into a common currency so the contributions of each producer group reflect their food quality and efficiency of transfer to consumers; and (3) use simple models to discover how tidal exchange between marshes and open water influences primary production and its consumption. Application of this approach could inform Delta management in two ways. First, it would provide a quantitative estimate of how large-scale conversion to agriculture has altered the Delta's capacity to produce food for native biota. Second, it would provide restoration practitioners with a new approach—based on ecosystem function—to evaluate the success of restoration projects and gauge the trajectory of ecological recovery in the Delta region.
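
    Schematically, steps (1) and (2) of the proposed approach amount to a weighted sum over habitat types. The symbols below are illustrative rather than taken from the source: A_g is the areal extent of habitat type g, p_g the areal primary production rate of the associated producer group, and q_g a weight combining food quality and trophic transfer efficiency that converts production into a common, consumer-available currency.

        P_{available} = \sum_{g} A_{g} \, p_{g} \, q_{g}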

  13. Global Relative Quantification with Liquid Chromatography–Matrix-assisted Laser Desorption Ionization Time-of-flight (LC-MALDI-TOF)—Cross–validation with LTQ-Orbitrap Proves Reliability and Reveals Complementary Ionization Preferences*

    PubMed Central

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-01-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC–electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments. PMID:23788530

  14. Global relative quantification with liquid chromatography-matrix-assisted laser desorption ionization time-of-flight (LC-MALDI-TOF)--cross-validation with LTQ-Orbitrap proves reliability and reveals complementary ionization preferences.

    PubMed

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-10-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC-electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments.

  15. Advancing the speed, sensitivity and accuracy of biomolecular detection using multi-length-scale engineering

    PubMed Central

    Kelley, Shana O.; Mirkin, Chad A.; Walt, David R.; Ismagilov, Rustem F.; Toner, Mehmet; Sargent, Edward H.

    2015-01-01

    Rapid progress in identifying disease biomarkers has increased the importance of creating high-performance detection technologies. Over the last decade, the design of many detection platforms has focused on either the nano or micro length scale. Here, we review recent strategies that combine nano- and microscale materials and devices to produce large improvements in detection sensitivity, speed and accuracy, allowing previously undetectable biomarkers to be identified in clinical samples. Microsensors that incorporate nanoscale features can now rapidly detect disease-related nucleic acids expressed in patient samples. New microdevices that separate large clinical samples into nanocompartments allow precise quantitation of analytes, and microfluidic systems that utilize nanoscale binding events can detect rare cancer cells in the bloodstream more accurately than before. These advances will lead to faster and more reliable clinical diagnostic devices. PMID:25466541

  16. Advancing the speed, sensitivity and accuracy of biomolecular detection using multi-length-scale engineering

    NASA Astrophysics Data System (ADS)

    Kelley, Shana O.; Mirkin, Chad A.; Walt, David R.; Ismagilov, Rustem F.; Toner, Mehmet; Sargent, Edward H.

    2014-12-01

    Rapid progress in identifying disease biomarkers has increased the importance of creating high-performance detection technologies. Over the last decade, the design of many detection platforms has focused on either the nano or micro length scale. Here, we review recent strategies that combine nano- and microscale materials and devices to produce large improvements in detection sensitivity, speed and accuracy, allowing previously undetectable biomarkers to be identified in clinical samples. Microsensors that incorporate nanoscale features can now rapidly detect disease-related nucleic acids expressed in patient samples. New microdevices that separate large clinical samples into nanocompartments allow precise quantitation of analytes, and microfluidic systems that utilize nanoscale binding events can detect rare cancer cells in the bloodstream more accurately than before. These advances will lead to faster and more reliable clinical diagnostic devices.

  17. Fiber networks amplify active stress

    PubMed Central

    Ronceray, Pierre; Broedersz, Chase P.

    2016-01-01

    Large-scale force generation is essential for biological functions such as cell motility, embryonic development, and muscle contraction. In these processes, forces generated at the molecular level by motor proteins are transmitted by disordered fiber networks, resulting in large-scale active stresses. Although these fiber networks are well characterized macroscopically, this stress generation by microscopic active units is not well understood. Here we theoretically study force transmission in these networks. We find that collective fiber buckling in the vicinity of a local active unit results in a rectification of stress towards strongly amplified isotropic contraction. This stress amplification is reinforced by the networks’ disordered nature, but saturates for high densities of active units. Our predictions are quantitatively consistent with experiments on reconstituted tissues and actomyosin networks and shed light on the role of the network microstructure in shaping active stresses in cells and tissue. PMID:26921325

  18. Multiscale/multiresolution landslides susceptibility mapping

    NASA Astrophysics Data System (ADS)

    Grozavu, Adrian; Cătălin Stanga, Iulian; Valeriu Patriche, Cristian; Toader Juravle, Doru

    2014-05-01

    Within European strategies, landslides are considered an important threat that requires detailed studies to identify areas where these processes could occur in the future and to design scientific and technical plans for landslide risk mitigation. With this in mind, assessing and mapping landslide susceptibility is an important preliminary step. Generally, landslide susceptibility at small scale (for large regions) can be assessed through a qualitative approach (expert judgement) based on a few variables, while studies at medium and large scale require a quantitative approach (e.g. multivariate statistics), a larger set of variables and, necessarily, a landslide inventory. Obviously, the results vary from one scale to another, depending on the available input data but also on the applied methodology. Since it is almost impossible to have a complete landslide inventory for large regions (e.g. at continental level), it is very important to verify the compatibility and validity of results obtained at different scales, identifying the differences and fixing the inherent errors. This paper aims at assessing and mapping landslide susceptibility at regional level through a multiscale, multiresolution approach, from small scale and low resolution to large scale and high resolution of data and results, comparing the compatibility of the results. While the former could be used for studies at European and national level, the latter allow validation of the results, including through field surveys. The test area, the Barlad Plateau (more than 9000 sq. km), is located in Eastern Romania and covers a region where both the natural environment and the human factor create a causal context that favours these processes. The landslide predictors were initially derived from various databases available at pan-European level and progressively completed and/or enhanced as the scale and resolution increased: topography (from SRTM at 90 m to digital elevation models based on topographical maps at 1:25,000 and 1:5,000), lithology (from geological maps, 1:200,000), land cover and land use (from CLC 2006 to maps derived from orthorectified aerial images at 0.5 m resolution), rainfall (from Worldclim and ECAD to our own data), seismicity (the seismic zonation of Romania), etc. The landslide inventory was created as polygonal data based on aerial images (0.5 m resolution), with the information considered at county level (NUTS 3) and, eventually, at communal level (LAU2). The methodological framework is based on logistic regression as a quantitative method and the analytic hierarchy process as a semi-qualitative method, both applied once identically for all scales and once recalibrated for each scale and resolution (from 1:1,000,000 with 1 km pixel resolution to 1:25,000 with 10 m resolution). The predictive performance of the two models was assessed using the ROC (Receiver Operating Characteristic) curve and the AUC (Area Under the Curve) parameter, and the results indicate a good correspondence between the susceptibility estimated for the test samples (0.855-0.890) and for the validation samples (0.830-0.865). Finally, the results were compared in pairs in order to fix the errors at small scale and low resolution and to optimize the methodology for landslide susceptibility mapping over large areas.
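
    The quantitative side of this workflow pairs a statistical susceptibility model with ROC/AUC validation. A minimal sketch of that pairing follows; it is not the study's code, and the per-cell predictors, labels, and values are invented for illustration.

    ```python
    # Hedged sketch: logistic-regression landslide susceptibility scored with AUC,
    # assuming predictors have already been sampled per grid cell into a table.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 5000
    # Hypothetical predictor stack: slope (deg), relative relief (m), land-cover class code
    X = np.column_stack([
        rng.uniform(0, 35, n),
        rng.uniform(0, 200, n),
        rng.integers(0, 5, n),
    ])
    # Hypothetical inventory labels (1 = cell intersects a mapped landslide polygon)
    y = (0.08 * X[:, 0] + 0.01 * X[:, 1] + rng.normal(0, 1, n) > 3.5).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Susceptibility = predicted probability of the landslide class
    susceptibility = model.predict_proba(X_test)[:, 1]
    print("test AUC:", round(roc_auc_score(y_test, susceptibility), 3))
    ```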

  19. The Effectiveness of a Mixed-Mode Survey on Domestic Violence in Curaçao: Response and Data Quality

    ERIC Educational Resources Information Center

    van Wijk, Nikil; de Leeuw, Edith; de Bruijn, Jeanne

    2015-01-01

    To collect reliable statistical data on domestic violence in Curaçao, we conducted a large-scale quantitative study (n = 816). To meet with the special needs of the population and topic, we designed a tailored mixed-mode survey to assess the prevalence of domestic violence in Curaçao and its health consequences. Great care was taken to reduce…

  20. Extra dimension searches at hadron colliders to next-to-leading order-QCD

    NASA Astrophysics Data System (ADS)

    Kumar, M. C.; Mathews, Prakash; Ravindran, V.

    2007-11-01

    The quantitative impact of NLO-QCD corrections on searches for large and warped extra dimensions at hadron colliders is investigated for the Drell-Yan process. K-factors for various observables at hadron colliders are presented. The factorisation and renormalisation scale dependence, as well as uncertainties due to various parton distribution functions, are studied. Uncertainties arising from the error on experimental data are estimated using the MRST parton distribution functions.

  1. Scaling Up, "Writ Small": Using an Assessment for Learning Audit Instrument to Stimulate Site-Based Professional Development, One School at a Time

    ERIC Educational Resources Information Center

    Lysaght, Zita; O'Leary, Michael

    2017-01-01

    Exploiting the potential that Assessment for Learning (AfL) offers to optimise student learning is contingent on both teachers' knowledge and use of AfL and the fidelity with which this translates into their daily classroom practices. Quantitative data derived from the use of an Assessment for Learning Audit Instrument (AfLAI) with a large sample…

  2. Kinetic method for the large-scale analysis of the binding mechanism of histone deacetylase inhibitors.

    PubMed

    Meyners, Christian; Baud, Matthias G J; Fuchter, Matthew J; Meyer-Almes, Franz-Josef

    2014-09-01

    Performing kinetic studies on protein-ligand interactions provides important information on complex formation and dissociation. Besides kinetic parameters such as association rates and residence times, kinetic experiments also reveal insights into reaction mechanisms. Exploiting intrinsic tryptophan fluorescence, a parallelized high-throughput Förster resonance energy transfer (FRET)-based reporter displacement assay with very low protein consumption was developed to enable the large-scale kinetic characterization of the binding of ligands to recombinant human histone deacetylases (HDACs) and a bacterial histone deacetylase-like amidohydrolase (HDAH) from Bordetella/Alcaligenes. For the binding of trichostatin A (TSA), suberoylanilide hydroxamic acid (SAHA), and two other SAHA derivatives to HDAH, two different modes of action, simple one-step binding and a two-step mechanism comprising initial binding and induced fit, were verified. In contrast to HDAH, all compounds bound to human HDAC1, HDAC6, and HDAC8 through a two-step mechanism. A quantitative view of the inhibitor-HDAC systems revealed two types of interaction, fast binding and slow dissociation. We provide arguments for the thesis that the relationship between quantitative kinetic and mechanistic information and chemical structures of compounds will serve as a valuable tool for drug optimization. Copyright © 2014 Elsevier Inc. All rights reserved.
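
    The one-step versus two-step (induced-fit) distinction above is typically diagnosed from how the observed rate constant k_obs varies with inhibitor concentration: roughly linear for simple one-step binding, hyperbolic (saturating) for an induced fit. The sketch below illustrates only that diagnostic; the rate constants and noise level are made-up values, not the paper's data.

    ```python
    # Hedged sketch: simulate k_obs from a two-step (induced-fit) system and compare
    # how well a linear (one-step) and a hyperbolic (induced-fit) model describe it.
    import numpy as np
    from scipy.optimize import curve_fit

    L = np.linspace(0.1, 50.0, 25)          # inhibitor concentration (arbitrary uM values)

    def one_step(L, k_on, k_off):
        return k_on * L + k_off             # k_obs = k_on*[L] + k_off (linear)

    def induced_fit(L, K1, k3, k4):
        return k4 + k3 * L / (K1 + L)       # k_obs saturates at k3 + k4 (hyperbolic)

    rng = np.random.default_rng(0)
    k_obs = induced_fit(L, K1=5.0, k3=0.8, k4=0.05) + rng.normal(0, 0.01, L.size)

    p_lin, _ = curve_fit(one_step, L, k_obs, p0=[0.01, 0.05])
    p_ind, _ = curve_fit(induced_fit, L, k_obs, p0=[1.0, 1.0, 0.1])

    rss_lin = np.sum((k_obs - one_step(L, *p_lin)) ** 2)
    rss_ind = np.sum((k_obs - induced_fit(L, *p_ind)) ** 2)
    print(f"RSS one-step: {rss_lin:.4f}, RSS induced-fit: {rss_ind:.4f}")
    ```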

  3. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
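
    As an informal illustration of the kind of digital trait extraction described above (this is not Image Harvest code; the greenness rule, margin value, and synthetic image are assumptions made for the example), plant pixels can be segmented with a simple colour threshold and summarized as projected area and bounding-box dimensions.

    ```python
    # Hedged sketch: segment plant pixels by a crude greenness threshold and report
    # simple digital traits (projected shoot area, bounding-box height and width).
    import numpy as np

    def extract_traits(rgb, green_margin=20):
        """rgb: H x W x 3 uint8 array from a hypothetical phenotyping image."""
        r = rgb[..., 0].astype(int)
        g = rgb[..., 1].astype(int)
        b = rgb[..., 2].astype(int)
        plant = (g > r + green_margin) & (g > b + green_margin)   # crude greenness mask
        if not plant.any():
            return {"area_px": 0, "height_px": 0, "width_px": 0}
        rows, cols = np.nonzero(plant)
        return {
            "area_px": int(plant.sum()),                  # projected shoot area
            "height_px": int(rows.max() - rows.min() + 1),
            "width_px": int(cols.max() - cols.min() + 1),
        }

    # Usage with a synthetic image: a green rectangle on a grey background
    img = np.full((100, 100, 3), 120, dtype=np.uint8)
    img[20:80, 40:60] = (40, 180, 40)
    print(extract_traits(img))
    ```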

  4. Launching of Active Galactic Nuclei Jets

    NASA Astrophysics Data System (ADS)

    Tchekhovskoy, Alexander

    As black holes accrete gas, they often produce relativistic, collimated outflows, or jets. Jets are expected to form in the vicinity of a black hole, making them powerful probes of strong-field gravity. However, how jet properties (e.g., jet power) connect to those of the accretion flow (e.g., mass accretion rate) and the black hole (e.g., black hole spin) remains an area of active research. This is because what determines a crucial parameter that controls jet properties—the strength of large-scale magnetic flux threading the black hole—remains largely unknown. First-principles computer simulations show that due to this, even if black hole spin and mass accretion rate are held constant, the simulated jet powers span a wide range, with no clear winner. This limits our ability to use jets as a quantitative diagnostic tool of accreting black holes. Recent advances in computer simulations demonstrated that accretion disks can accumulate large-scale magnetic flux on the black hole, until the magnetic flux becomes so strong that it obstructs gas infall and leads to a magnetically-arrested disk (MAD). Recent evidence suggests that central black holes in jetted active galactic nuclei and tidal disruptions are surrounded by MADs. Since in MADs both the black hole magnetic flux and the jet power are at their maximum, well-defined values, this opens up a new vista in the measurements of black hole masses and spins and quantitative tests of accretion and jet theory.

  5. Machine learning for large-scale wearable sensor data in Parkinson's disease: Concepts, promises, pitfalls, and futures.

    PubMed

    Kubota, Ken J; Chen, Jason A; Little, Max A

    2016-09-01

    For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, "wearable," sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that "learn" from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses implications of this technology and a practical road map for realizing the full potential of this technology in PD research and practice. © 2016 International Parkinson and Movement Disorder Society.

  6. Should fatty acid signature proportions sum to 1 for diet estimation?

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Budge, Suzanne M.; Thiemann, Gregory W.

    2016-01-01

    Knowledge of predator diets, including how diets might change through time or differ among predators, provides essential insights into their ecology. Diet estimation therefore remains an active area of research within quantitative ecology. Quantitative fatty acid signature analysis (QFASA) is an increasingly common method of diet estimation. QFASA is based on a data library of prey signatures, which are vectors of proportions summarizing the fatty acid composition of lipids, and diet is estimated as the mixture of prey signatures that most closely approximates a predator’s signature. Diets are typically estimated using proportions from a subset of all fatty acids that are known to be solely or largely influenced by diet. Given the subset of fatty acids selected, the current practice is to scale their proportions to sum to 1.0. However, scaling signature proportions has the potential to distort the structural relationships within a prey library and between predators and prey. To investigate that possibility, we compared the practice of scaling proportions with two alternatives and found that the traditional scaling can meaningfully bias diet estimators under some conditions. Two aspects of the prey types that contributed to a predator’s diet influenced the magnitude of the bias: the degree to which the sums of unscaled proportions differed among prey types and the identifiability of prey types within the prey library. We caution investigators against the routine scaling of signature proportions in QFASA.
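
    The estimation step at the core of QFASA can be written as a constrained least-squares problem: find non-negative prey proportions summing to 1 whose mixed signature most closely matches the predator's. The sketch below shows only that step, not the scaling question examined in the paper; the four-fatty-acid prey library and predator signature are invented, constructed so the answer is roughly (0.5, 0.3, 0.2).

    ```python
    # Hedged sketch of a QFASA-style diet estimate on a tiny invented prey library.
    import numpy as np
    from scipy.optimize import minimize

    # Rows = prey types, columns = fatty-acid proportions (hypothetical 4-FA signatures)
    prey = np.array([
        [0.40, 0.30, 0.20, 0.10],   # prey type A
        [0.10, 0.20, 0.30, 0.40],   # prey type B
        [0.50, 0.20, 0.20, 0.10],   # prey type C
    ])
    # Hypothetical predator signature (= 0.5*A + 0.3*B + 0.2*C)
    predator = np.array([0.33, 0.25, 0.23, 0.19])

    def objective(pi):
        mixture = pi @ prey
        return np.sum((mixture - predator) ** 2)   # squared distance to predator signature

    n_prey = prey.shape[0]
    constraints = {"type": "eq", "fun": lambda pi: pi.sum() - 1.0}   # proportions sum to 1
    bounds = [(0.0, 1.0)] * n_prey                                   # non-negative
    result = minimize(objective, x0=np.full(n_prey, 1.0 / n_prey),
                      bounds=bounds, constraints=constraints)
    print("estimated diet proportions:", np.round(result.x, 3))
    ```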

  7. NeuroCa: integrated framework for systematic analysis of spatiotemporal neuronal activity patterns from large-scale optical recording data

    PubMed Central

    Jang, Min Jee; Nam, Yoonkey

    2015-01-01

    Abstract. Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of ∼1000 neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements. PMID:26229973
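
    NeuroCa itself is a MATLAB toolbox; purely to illustrate the spike-extraction idea described above, the sketch below computes ΔF/F0 from a synthetic fluorescence trace and flags transients with a simple threshold rule. The baseline estimate, threshold, and trace are assumptions for the example, not the toolbox's algorithms.

    ```python
    # Hedged sketch: dF/F0 and threshold-crossing "spike" detection on a synthetic trace.
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 10.0                                   # sampling rate (Hz)
    t = np.arange(0, 60, 1 / fs)
    trace = 100 + rng.normal(0, 1.0, t.size)    # baseline fluorescence + noise
    for onset in (50, 200, 400):                # add three synthetic calcium transients
        decay = np.exp(-np.arange(0, 50) / 15.0)
        trace[onset:onset + 50] += 20 * decay

    f0 = np.percentile(trace, 20)               # robust baseline estimate
    dff = (trace - f0) / f0                     # dF/F0

    threshold = dff.mean() + 3 * dff.std()      # simple 3-sigma rule
    above = dff > threshold
    onsets = np.flatnonzero(above & ~np.roll(above, 1))   # rising edges only
    print("detected transient onsets (s):", t[onsets])
    ```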

  8. Effect of Logarithmic and Linear Frequency Scales on Parametric Modelling of Tissue Dielectric Data.

    PubMed

    Salahuddin, Saqib; Porter, Emily; Meaney, Paul M; O'Halloran, Martin

    2017-02-01

    The dielectric properties of biological tissues have been studied widely over the past half-century. These properties are used in a vast array of applications, from determining the safety of wireless telecommunication devices to the design and optimisation of medical devices. The frequency-dependent dielectric properties are represented in closed-form parametric models, such as the Cole-Cole model, for use in numerical simulations which examine the interaction of electromagnetic (EM) fields with the human body. In general, the accuracy of EM simulations depends upon the accuracy of the tissue dielectric models. Typically, dielectric properties are measured using a linear frequency scale; however, use of the logarithmic scale has been suggested historically to be more biologically descriptive. Thus, the aim of this paper is to quantitatively compare the Cole-Cole fitting of broadband tissue dielectric measurements collected with both linear and logarithmic frequency scales. In this way, we can determine if appropriate choice of scale can minimise the fit error and thus reduce the overall error in simulations. Using a well-established fundamental statistical framework, the results of the fitting for both scales are quantified. It is found that commonly used performance metrics, such as the average fractional error, are unable to examine the effect of frequency scale on the fitting results due to the averaging effect that obscures large localised errors. This work demonstrates that the broadband fit for these tissues is quantitatively improved when the given data is measured with a logarithmic frequency scale rather than a linear scale, underscoring the importance of frequency scale selection in accurate wideband dielectric modelling of human tissues.

  9. Effect of Logarithmic and Linear Frequency Scales on Parametric Modelling of Tissue Dielectric Data

    PubMed Central

    Salahuddin, Saqib; Porter, Emily; Meaney, Paul M.; O’Halloran, Martin

    2016-01-01

    The dielectric properties of biological tissues have been studied widely over the past half-century. These properties are used in a vast array of applications, from determining the safety of wireless telecommunication devices to the design and optimisation of medical devices. The frequency-dependent dielectric properties are represented in closed-form parametric models, such as the Cole-Cole model, for use in numerical simulations which examine the interaction of electromagnetic (EM) fields with the human body. In general, the accuracy of EM simulations depends upon the accuracy of the tissue dielectric models. Typically, dielectric properties are measured using a linear frequency scale; however, use of the logarithmic scale has been suggested historically to be more biologically descriptive. Thus, the aim of this paper is to quantitatively compare the Cole-Cole fitting of broadband tissue dielectric measurements collected with both linear and logarithmic frequency scales. In this way, we can determine if appropriate choice of scale can minimise the fit error and thus reduce the overall error in simulations. Using a well-established fundamental statistical framework, the results of the fitting for both scales are quantified. It is found that commonly used performance metrics, such as the average fractional error, are unable to examine the effect of frequency scale on the fitting results due to the averaging effect that obscures large localised errors. This work demonstrates that the broadband fit for these tissues is quantitatively improved when the given data is measured with a logarithmic frequency scale rather than a linear scale, underscoring the importance of frequency scale selection in accurate wideband dielectric modelling of human tissues. PMID:28191324
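
    The comparison in the two records above hinges on fitting the Cole-Cole relaxation model, ε(ω) = ε∞ + Δε / (1 + (iωτ)^(1 - α)), to data sampled on linear versus logarithmic frequency grids. A minimal sketch of such a comparison on synthetic data follows; the parameter values, noise level, and frequency spans are assumptions, not the authors' measurements.

    ```python
    # Hedged sketch: fit a single-pole Cole-Cole model to synthetic complex-permittivity
    # data on a linear vs. a logarithmic frequency grid and compare the residuals.
    import numpy as np
    from scipy.optimize import least_squares

    def cole_cole(f, eps_inf, d_eps, log10_tau, alpha):
        w = 2 * np.pi * f
        tau = 10.0 ** log10_tau
        return eps_inf + d_eps / (1 + (1j * w * tau) ** (1 - alpha))

    true = (4.0, 50.0, -8.0, 0.1)               # invented tissue-like parameters

    def residuals(p, f, data):
        diff = cole_cole(f, *p) - data
        return np.concatenate([diff.real, diff.imag])

    def fit_on(f):
        rng = np.random.default_rng(1)
        data = cole_cole(f, *true) + rng.normal(0, 0.2, f.size) + 1j * rng.normal(0, 0.2, f.size)
        sol = least_squares(residuals, x0=(5.0, 40.0, -8.5, 0.2), args=(f, data),
                            bounds=([1, 1, -12, 0], [100, 1000, -3, 1]))
        return np.mean(sol.fun ** 2)

    f_lin = np.linspace(1e6, 1e10, 101)         # linear grid, 1 MHz - 10 GHz
    f_log = np.logspace(6, 10, 101)             # logarithmic grid, same span
    print("mean squared residual, linear grid:", fit_on(f_lin))
    print("mean squared residual, log grid:   ", fit_on(f_log))
    ```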

  10. Organizing "mountains of words" for data analysis, both qualitative and quantitative.

    PubMed

    Johnson, Bruce D; Dunlap, Eloise; Benoit, Ellen

    2010-04-01

    Qualitative research creates mountains of words. U.S. federal funding supports mostly structured qualitative research, which is designed to test hypotheses using semiquantitative coding and analysis. This article reports on strategies for planning, organizing, collecting, managing, storing, retrieving, analyzing, and writing about qualitative data so as to most efficiently manage the mountains of words collected in large-scale ethnographic projects. Multiple benefits accrue from this approach. Field expenditures are linked to units of work so productivity is measured, many staff in various locations have access to use and analyze the data, quantitative data can be derived from data that is primarily qualitative, and improved efficiencies of resources are developed.

  11. Scaling laws in the dynamics of crime growth rate

    NASA Astrophysics Data System (ADS)

    Alves, Luiz G. A.; Ribeiro, Haroldo V.; Mendes, Renio S.

    2013-06-01

    The increasing number of crimes in areas with large concentrations of people has made cities one of the main sources of violence. Understanding how crime rates grow and how they relate to city size goes beyond an academic question; it is a central issue for contemporary society. Here, we characterize and analyze quantitative aspects of murders in Brazilian cities in the period from 1980 to 2009. We find that the distributions of the annual, biannual and triannual logarithmic homicide growth rates exhibit the same functional form at distinct scales, that is, scale-invariant behavior. We also identify asymptotic power-law decay relations between the standard deviations of these three growth rates and the initial size. Further, we discuss similarities with complex organizations.
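
    The scaling relation mentioned above, a power-law decay of the growth-rate spread with initial size, sigma(S0) ~ S0^(-beta), is typically estimated by grouping units by initial size and fitting a straight line in log-log space. The sketch below does this on simulated data, not on the Brazilian homicide records.

    ```python
    # Hedged sketch: recover a power-law exponent beta from synthetic growth-rate data.
    import numpy as np

    rng = np.random.default_rng(0)
    n_cities = 2000
    s0 = rng.lognormal(mean=3.0, sigma=1.5, size=n_cities)   # initial sizes (synthetic)
    beta_true = 0.25
    growth = rng.normal(0, s0 ** (-beta_true))                # log growth rates, spread ~ s0^-beta

    # Bin cities by initial size and measure the spread of growth rates per bin
    bins = np.logspace(np.log10(s0.min()), np.log10(s0.max()), 12)
    centers, sigmas = [], []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (s0 >= lo) & (s0 < hi)
        if mask.sum() > 20:
            centers.append(np.sqrt(lo * hi))
            sigmas.append(growth[mask].std())

    # Power-law exponent from a straight-line fit in log-log space
    slope, intercept = np.polyfit(np.log(centers), np.log(sigmas), 1)
    print(f"estimated beta: {-slope:.2f} (true value {beta_true})")
    ```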

  12. Analysis on the restriction factors of the green building scale promotion based on DEMATEL

    NASA Astrophysics Data System (ADS)

    Wenxia, Hong; Zhenyao, Jiang; Zhao, Yang

    2017-03-01

    In order to promote the large-scale development of green building in China, the DEMATEL method was used to classify the factors restricting green building development into three groups: the green building market, green technology, and the macro economy. Through the DEMATEL model, the interaction mechanism of each group was analysed. The degree of mutual influence among the barrier factors affecting green building promotion was quantitatively analysed, and the key factors for the development of green building in China were identified. In addition, several implementation strategies for promoting the large-scale development of green building were put forward. This research provides a practical reference for policies aimed at promoting green building.
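
    For readers unfamiliar with DEMATEL, its core computation normalizes a direct-influence matrix A into N, forms the total-relation matrix T = N(I - N)^(-1), and ranks factors by prominence (row sum plus column sum) and net cause/effect (row sum minus column sum). The sketch below uses an invented 3 x 3 direct-influence matrix, not the study's survey data.

    ```python
    # Hedged sketch of the DEMATEL computation on a made-up direct-influence matrix.
    import numpy as np

    # A[i, j] = how strongly factor i influences factor j (0-4 survey scores, invented)
    A = np.array([
        [0, 3, 2],    # e.g. green-building market
        [1, 0, 3],    # e.g. green technology
        [2, 1, 0],    # e.g. macro economy
    ], dtype=float)

    # Normalise by the largest row sum, then total-relation matrix T = N (I - N)^-1
    N = A / A.sum(axis=1).max()
    T = N @ np.linalg.inv(np.eye(len(A)) - N)

    R = T.sum(axis=1)     # influence given by each factor
    C = T.sum(axis=0)     # influence received by each factor
    print("prominence (R + C):", np.round(R + C, 3))   # overall importance
    print("relation   (R - C):", np.round(R - C, 3))   # net cause (+) vs effect (-)
    ```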

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banerjee, Arka; Dalal, Neal, E-mail: abanerj6@illinois.edu, E-mail: dalaln@illinois.edu

    We present a new method for simulating cosmologies that contain massive particles with thermal free streaming motion, such as massive neutrinos or warm/hot dark matter. This method combines particle and fluid descriptions of the thermal species to eliminate the shot noise known to plague conventional N-body simulations. We describe this method in detail, along with results for a number of test cases to validate our method, and check its range of applicability. Using this method, we demonstrate that massive neutrinos can produce a significant scale-dependence in the large-scale biasing of deep voids in the matter field. We show that this scale-dependence may be quantitatively understood using an extremely simple spherical expansion model which reproduces the behavior of the void bias for different neutrino parameters.

  14. The nature of micro CMEs within coronal holes

    NASA Astrophysics Data System (ADS)

    Bothmer, Volker; Nistico, Giuseppe; Zimbardo, Gaetano; Patsourakos, Spiros; Bosman, Eckhard

    Whilst investigating the origin and characteristics of coronal jets and large-scale CMEs identified in data from the SECCHI (Sun Earth Connection Coronal and Heliospheric Investigation) instrument suites on board the two STEREO satellites, we discovered transient events that originated in the low corona with a morphology resembling that of typical three-part structured coronal mass ejections (CMEs). However, the CMEs occurred on considerably smaller spatial scales. In this presentation we show evidence for the existence of small-scale CMEs from inside coronal holes and present quantitative estimates of their speeds and masses. We interpret the origin and evolution of micro CMEs as a natural consequence of the emergence of small-scale magnetic bipoles related to the Sun's ever-changing photospheric magnetic flux on various scales and their interactions with the ambient plasma and magnetic field. The analysis of CMEs is performed within the framework of the EU Erasmus and FP7 SOTERIA projects.

  15. Parameterizing atmosphere-land surface exchange for climate models with satellite data: A case study for the Southern Great Plains CART site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, W.

    High-resolution satellite data provide detailed, quantitative descriptions of land surface characteristics over large areas so that objective scale linkage becomes feasible. With the aid of satellite data, Sellers et al. and Wood and Lakshmi examined the linearity of processes scaled up from 30 m to 15 km. If the phenomenon is scale invariant, then the aggregated value of a function or flux is equivalent to the function computed from aggregated values of controlling variables. The linear relation may be realistic for limited land areas having no large surface contrasts to cause significant horizontal exchange. However, for areas with sharp surface contrasts, horizontal exchange and different dynamics in the atmospheric boundary may induce nonlinear interactions, such as at interfaces of land-water, forest-farm land, and irrigated crops-desert steppe. The linear approach, however, represents the simplest scenario, and is useful for developing an effective scheme for incorporating subgrid land surface processes into large-scale models. Our studies focus on coupling satellite data and ground measurements with a satellite-data-driven land surface model to parameterize surface fluxes for large-scale climate models. In this case study, we used surface spectral reflectance data from satellite remote sensing to characterize spatial and temporal changes in vegetation and associated surface parameters in an area of about 350 × 400 km covering the southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site of the US Department of Energy's Atmospheric Radiation Measurement (ARM) Program.

  16. Sex Differences in Magical Ideation: A Community-Based Twin Study

    PubMed Central

    Karcher, Nicole R.; Slutske, Wendy S.; Kerns, John G.; Piasecki, Thomas M.; Martin, Nicholas G.

    2014-01-01

    Two questions regarding sex differences in magical ideation were investigated in this study: (1) whether there are mean level sex differences on the Magical Ideation Scale (MIS), and (2) whether there are quantitative and/or qualitative sex differences in the genetic contributions to variation on this scale. These questions were evaluated using data obtained from a large community sample of adult Australian twins (N=4,355) that included opposite-sex pairs. Participants completed a modified 15-item version of the MIS within a larger assessment battery. Women reported both higher means and variability on the MIS than men; this was also observed within families (in opposite-sex twin pairs). Biometric modeling indicated that the proportion of variation in MIS scores due to genetic influences (indicating quantitative sex differences) and the specific latent genetic contributions to this variation (indicating qualitative sex differences) were the same in men and women. These findings clarify the nature of sex differences in magical ideation and point to avenues for future research. PMID:24364500

  17. Quantifying adsorption-induced deformation of nanoporous materials on different length scales

    PubMed Central

    Morak, Roland; Braxmeier, Stephan; Ludescher, Lukas; Hüsing, Nicola; Reichenauer, Gudrung

    2017-01-01

    A new in situ setup combining small-angle neutron scattering (SANS) and dilatometry was used to measure water-adsorption-induced deformation of a monolithic silica sample with hierarchical porosity. The sample exhibits a disordered framework consisting of macropores and struts containing two-dimensional hexagonally ordered cylindrical mesopores. The use of an H2O/D2O water mixture with zero scattering length density as an adsorptive allows a quantitative determination of the pore lattice strain from the shift of the corresponding diffraction peak. This radial strut deformation is compared with the simultaneously measured macroscopic length change of the sample with dilatometry, and differences between the two quantities are discussed on the basis of the deformation mechanisms effective at the different length scales. It is demonstrated that the SANS data also provide a facile way to quantitatively determine the adsorption isotherm of the material by evaluating the incoherent scattering contribution of H2O at large scattering vectors. PMID:29021735

  18. The “unreasonable effectiveness” of stratigraphic and geomorphic experiments

    NASA Astrophysics Data System (ADS)

    Paola, Chris; Straub, Kyle; Mohrig, David; Reinhardt, Liam

    2009-12-01

    The growth of quantitative analysis and prediction in Earth-surface science has been accompanied by growth in experimental stratigraphy and geomorphology. Experimenters have grown increasingly bold in targeting landscape elements from channel reaches up to the entire erosional networks and depositional basins, often using very small facilities. The experiments produce spatial structure and kinematics that, although imperfect, compare well with natural systems despite differences of spatial scale, time scale, material properties, and number of active processes. Experiments have been particularly useful in studying a wide range of forms of self-organized (autogenic) complexity that occur in morphodynamic systems. Autogenic dynamics creates much of the spatial structure we see in the landscape and in preserved strata, and is strongly associated with sediment storage and release. The observed consistency between experimental and field systems despite large differences in governing dimensionless numbers is what we mean by "unreasonable effectiveness". We suggest that unreasonable experimental effectiveness arises from natural scale independence. We generalize existing ideas to relate internal similarity, in which a small part of a system is similar to the larger system, to external similarity, in which a small copy of a system is similar to the larger system. We propose that internal similarity implies external similarity, though not the converse. The external similarity of landscape experiments to natural landscapes suggests that natural scale independence may be even more characteristic of morphodynamics than it is of better studied cases such as turbulence. We urge a shift in emphasis in experimental stratigraphy and geomorphology away from classical dynamical scaling and towards a quantitative understanding of the origins and limits of scale independence. Other research areas with strong growth potential in experimental surface dynamics include physical-biotic interactions, cohesive effects, stochastic processes, the interplay of structural and geomorphic self-organization, extraction of quantitative process information from landscape and stratigraphic records, and closer interaction between experimentation and theory.

  19. MRMPROBS: a data assessment and metabolite identification tool for large-scale multiple reaction monitoring based widely targeted metabolomics.

    PubMed

    Tsugawa, Hiroshi; Arita, Masanori; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Bamba, Takeshi; Fukusaki, Eiichiro

    2013-05-21

    We developed a new software program, MRMPROBS, for widely targeted metabolomics by using the large-scale multiple reaction monitoring (MRM) mode. The strategy became increasingly popular for the simultaneous analysis of up to several hundred metabolites at high sensitivity, selectivity, and quantitative capability. However, the traditional method of assessing measured metabolomics data without probabilistic criteria is not only time-consuming but is often subjective and makeshift work. Our program overcomes these problems by detecting and identifying metabolites automatically, by separating isomeric metabolites, and by removing background noise using a probabilistic score defined as the odds ratio from an optimized multivariate logistic regression model. Our software program also provides a user-friendly graphical interface to curate and organize data matrices and to apply principal component analyses and statistical tests. For a demonstration, we conducted a widely targeted metabolome analysis (152 metabolites) of propagating Saccharomyces cerevisiae measured at 15 time points by gas and liquid chromatography coupled to triple quadrupole mass spectrometry. MRMPROBS is a useful and practical tool for the assessment of large-scale MRM data available to any instrument or any experimental condition.
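
    The probabilistic filtering step described above amounts to scoring each candidate peak with a multivariate logistic regression and keeping peaks whose odds exceed a cutoff. The sketch below is a rough analogue, not the MRMPROBS implementation; the feature set, training labels, and candidate values are invented for illustration.

    ```python
    # Hedged sketch: a logistic-regression odds score for accepting candidate MRM peaks,
    # trained on hypothetical features and manual annotations.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 400
    # Hypothetical features: retention-time error (min), transition co-elution
    # similarity (0-1), log10 signal-to-noise ratio
    X = np.column_stack([
        np.abs(rng.normal(0, 0.3, n)),
        rng.uniform(0, 1, n),
        rng.normal(1.0, 0.5, n),
    ])
    # Hypothetical manual annotations: 1 = true metabolite peak, 0 = background noise
    y = ((X[:, 1] > 0.5) & (X[:, 0] < 0.4) & (X[:, 2] > 0.7)).astype(int)

    model = LogisticRegression(max_iter=1000).fit(X, y)

    candidate = np.array([[0.1, 0.9, 1.5]])        # one candidate peak's features
    p = model.predict_proba(candidate)[0, 1]
    print(f"posterior probability {p:.3f}, odds score {p / (1 - p):.1f}")
    ```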

  20. Natural disasters and population mobility in Bangladesh

    PubMed Central

    Gray, Clark L.; Mueller, Valerie

    2012-01-01

    The consequences of environmental change for human migration have gained increasing attention in the context of climate change and recent large-scale natural disasters, but as yet relatively few large-scale and quantitative studies have addressed this issue. We investigate the consequences of climate-related natural disasters for long-term population mobility in rural Bangladesh, a region particularly vulnerable to environmental change, using longitudinal survey data from 1,700 households spanning a 15-y period. Multivariate event history models are used to estimate the effects of flooding and crop failures on local population mobility and long-distance migration while controlling for a large set of potential confounders at various scales. The results indicate that flooding has modest effects on mobility that are most visible at moderate intensities and for women and the poor. However, crop failures unrelated to flooding have strong effects on mobility in which households that are not directly affected but live in severely affected areas are the most likely to move. These results point toward an alternate paradigm of disaster-induced mobility that recognizes the significant barriers to migration for vulnerable households as well their substantial local adaptive capacity. PMID:22474361

  1. Computerized image analysis for quantitative neuronal phenotyping in zebrafish.

    PubMed

    Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C

    2006-06-15

    An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.

  2. A stochastic two-scale model for pressure-driven flow between rough surfaces

    PubMed Central

    Larsson, Roland; Lundström, Staffan; Wall, Peter; Almqvist, Andreas

    2016-01-01

    Seal surface topography typically consists of global-scale geometric features as well as local-scale roughness details and homogenization-based approaches are, therefore, readily applied. These provide for resolving the global scale (large domain) with a relatively coarse mesh, while resolving the local scale (small domain) in high detail. As the total flow decreases, however, the flow pattern becomes tortuous and this requires a larger local-scale domain to obtain a converged solution. Therefore, a classical homogenization-based approach might not be feasible for simulation of very small flows. In order to study small flows, a model allowing feasibly-sized local domains, for really small flow rates, is developed. Realization was made possible by coupling the two scales with a stochastic element. Results from numerical experiments, show that the present model is in better agreement with the direct deterministic one than the conventional homogenization type of model, both quantitatively in terms of flow rate and qualitatively in reflecting the flow pattern. PMID:27436975

  3. A robust quantitative near infrared modeling approach for blend monitoring.

    PubMed

    Mohan, Shikhar; Momose, Wataru; Katz, Jeffrey M; Hossain, Md Nayeem; Velez, Natasha; Drennen, James K; Anderson, Carl A

    2018-01-30

    This study demonstrates a material sparing Near-Infrared modeling approach for powder blend monitoring. In this new approach, gram scale powder mixtures are subjected to compression loads to simulate the effect of scale using an Instron universal testing system. Models prepared by the new method development approach (small-scale method) and by a traditional method development (blender-scale method) were compared by simultaneously monitoring a 1kg batch size blend run. Both models demonstrated similar model performance. The small-scale method strategy significantly reduces the total resources expended to develop Near-Infrared calibration models for on-line blend monitoring. Further, this development approach does not require the actual equipment (i.e., blender) to which the method will be applied, only a similar optical interface. Thus, a robust on-line blend monitoring method can be fully developed before any large-scale blending experiment is viable, allowing the blend method to be used during scale-up and blend development trials. Copyright © 2017. Published by Elsevier B.V.

  4. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.

  5. Quantitative patterns of stylistic influence in the evolution of literature.

    PubMed

    Hughes, James M; Foti, Nicholas J; Krakauer, David C; Rockmore, Daniel N

    2012-05-15

    Literature is a form of expression whose temporal structure, both in content and style, provides a historical record of the evolution of culture. In this work we take on a quantitative analysis of literary style and conduct the first large-scale temporal stylometric study of literature by using the vast holdings in the Project Gutenberg Digital Library corpus. We find temporal stylistic localization among authors through the analysis of the similarity structure in feature vectors derived from content-free word usage, nonhomogeneous decay rates of stylistic influence, and an accelerating rate of decay of influence among modern authors. Within a given time period we also find evidence for stylistic coherence with a given literary topic, such that writers in different fields adopt different literary styles. This study gives quantitative support to the notion of a literary "style of a time" with a strong trend toward increasingly contemporaneous stylistic influence.

  6. On the buckling of an elastic holey column

    PubMed Central

    Hazel, A. L.; Pihler-Puzović, D.

    2017-01-01

    We report the results of a numerical and theoretical study of buckling in elastic columns containing a line of holes. Buckling is a common failure mode of elastic columns under compression, found over scales ranging from metres in buildings and aircraft to tens of nanometers in DNA. This failure usually occurs through lateral buckling, described for slender columns by Euler’s theory. When the column is perforated with a regular line of holes, a new buckling mode arises, in which adjacent holes collapse in orthogonal directions. In this paper, we firstly elucidate how this alternate hole buckling mode coexists and interacts with classical Euler buckling modes, using finite-element numerical calculations with bifurcation tracking. We show how the preferred buckling mode is selected by the geometry, and discuss the roles of localized (hole-scale) and global (column-scale) buckling. Secondly, we develop a novel predictive model for the buckling of columns perforated with large holes. This model is derived without arbitrary fitting parameters, and quantitatively predicts the critical strain for buckling. We extend the model to sheets perforated with a regular array of circular holes and use it to provide quantitative predictions of their buckling. PMID:29225498
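
    For reference alongside the abstract above: the classical Euler result it builds on gives, for a slender pinned-pinned column of length L, Young's modulus E, cross-sectional area A, and second moment of area I, the critical load and strain below (a standard textbook result, not a quantity taken from the paper).

    ```latex
    % Classical Euler buckling of a slender pinned-pinned column (standard result)
    P_{\mathrm{cr}} = \frac{\pi^{2} E I}{L^{2}},
    \qquad
    \varepsilon_{\mathrm{cr}} = \frac{P_{\mathrm{cr}}}{E A} = \frac{\pi^{2} I}{A L^{2}}
    ```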

  7. Electrical properties of 0.4 cm long single walled nanotubes

    NASA Astrophysics Data System (ADS)

    Yu, Zhen

    2005-03-01

    Centimeter scale aligned carbon nanotube arrays are grown from nanoparticle/metal catalyst pads[1]. We find the nanotubes grow both with and ``against the wind.'' A metal underlayer provides in-situ electrical contact to these long nanotubes with no post growth processing needed. Using the electrically contacted nanotubes, we study electrical transport of 0.4 cm long nanotubes[2]. Using this data, we are able to determine the resistance of a nanotube as a function of length quantitatively, since the contact resistance is negligible in these long nanotubes. The source drain I-V curves are quantitatively described by a classical, diffusive model. Our measurements show that the outstanding transport properties of nanotubes can be extended to the cm scale and open the door to large scale integrated nanotube circuits with macroscopic dimensions. These are the longest electrically contacted single walled nanotubes measured to date. [1] Zhen Yu, Shengdong Li, Peter J. Burke, ``Synthesis of Aligned Arrays of Millimeter Long, Straight Single-Walled Carbon Nanotubes,'' Chemistry of Materials, 16(18), 3414-3416 (2004). [2] Shengdong Li, Zhen Yu, Christopher Rutherglen, Peter J. Burke, ``Electrical properties of 0.4 cm long single-walled carbon nanotubes'' Nano Letters, 4(10), 2003-2007 (2004).

  8. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    ERIC Educational Resources Information Center

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  9. Experimental design and quantitative analysis of microbial community multiomics.

    PubMed

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  10. Phenazines affect biofilm formation by Pseudomonas aeruginosa in similar ways at various scales

    PubMed Central

    Ramos, Itzel; Dietrich, Lars E. P.; Price-Whelan, Alexa; Newman, Dianne K.

    2010-01-01

    Pseudomonads produce phenazines, a group of small, redox-active compounds with diverse physiological functions. In this study, we compared the phenotypes of Pseudomonas aeruginosa strain PA14 and a mutant unable to synthesize phenazines in flow cell and colony biofilms quantitatively. Although phenazine production does not impact the ability of PA14 to attach to surfaces, as has been shown for Pseudomonas chlororaphis (Maddula, 2006; Maddula, 2008), it influences swarming motility and the surface-to-volume ratio of mature biofilms. These results indicate that phenazines affect biofilm development across a large range of scales, but in unique ways for different Pseudomonas species. PMID:20123017

  11. Conducting pilot and feasibility studies.

    PubMed

    Cope, Diane G

    2015-03-01

    Planning a well-designed research study can be tedious and laborious work. However, this process is critical and ultimately can produce valid, reliable study findings. Designing a large-scale randomized, controlled trial (RCT)-the gold standard in quantitative research-can be even more challenging. Even the most well-planned study potentially can result in issues with research procedures and design, such as recruitment, retention, or methodology. One strategy that may facilitate sound study design is the completion of a pilot or feasibility study prior to the initiation of a larger-scale trial. This article will discuss pilot and feasibility studies, their advantages and disadvantages, and implications for oncology nursing research. 

  12. Recurrent patterning in the daily foraging routes of hamadryas baboons (Papio hamadryas): spatial memory in large-scale versus small-scale space.

    PubMed

    Schreier, Amy L; Grove, Matt

    2014-05-01

    The benefits of spatial memory for foraging animals can be assessed on two distinct spatial scales: small-scale space (travel within patches) and large-scale space (travel between patches). While the patches themselves may be distributed at low density, within patches resources are likely densely distributed. We propose, therefore, that spatial memory for recalling the particular locations of previously visited feeding sites will be more advantageous during between-patch movement, where it may reduce the distances traveled by animals that possess this ability compared to those that must rely on random search. We address this hypothesis by employing descriptive statistics and spectral analyses to characterize the daily foraging routes of a band of wild hamadryas baboons in Filoha, Ethiopia. The baboons slept on two main cliffs--the Filoha cliff and the Wasaro cliff--and daily travel began and ended on a cliff; thus four daily travel routes exist: Filoha-Filoha, Filoha-Wasaro, Wasaro-Wasaro, Wasaro-Filoha. We use newly developed partial sum methods and distribution-fitting analyses to distinguish periods of area-restricted search from more extensive movements. The results indicate a single peak in travel activity in the Filoha-Filoha and Wasaro-Filoha routes, three peaks of travel activity in the Filoha-Wasaro routes, and two peaks in the Wasaro-Wasaro routes; and are consistent with on-the-ground observations of foraging and ranging behavior of the baboons. In each of the four daily travel routes the "tipping points" identified by the partial sum analyses indicate transitions between travel in small- versus large-scale space. The correspondence between the quantitative analyses and the field observations suggests great utility for using these types of analyses to examine primate travel patterns and especially in distinguishing between movement in small- versus large-scale space. Only the distribution-fitting analyses are inconsistent with the field observations, which may be due to the scale at which these analyses were conducted. © 2013 Wiley Periodicals, Inc.

  13. Multi-filter spectrophotometry simulations

    NASA Technical Reports Server (NTRS)

    Callaghan, Kim A. S.; Gibson, Brad K.; Hickson, Paul

    1993-01-01

    To complement both the multi-filter observations of quasar environments described in these proceedings, as well as the proposed UBC 2.7 m Liquid Mirror Telescope (LMT) redshift survey, we have initiated a program of simulated multi-filter spectrophotometry. The goal of this work, still very much in progress, is a better quantitative assessment of the multiband technique as a viable mechanism for obtaining useful redshift and morphological class information from large scale multi-filter surveys.

  14. JPRS Report. Soviet Union: World Economy & International Relations, No. 5, May 1987

    DTIC Science & Technology

    1987-09-22

    at the same time substitute for their creative assertiveness and intuition. This applies not only to the elite group of researchers and designers ...achieved such large-scale results in such a short period of time. Not only quantitative but primarily qualitative results. And not simply in the...into A GROUP OF PERSONALITIES, each of which and all together in time acquiring growing importance in history? Today, in the era of reconstruction

  15. Statistical mechanics of neocortical interactions: A scaling paradigm applied to electroencephalography

    NASA Astrophysics Data System (ADS)

    Ingber, Lester

    1991-09-01

    A series of papers has developed a statistical mechanics of neocortical interactions (SMNI), deriving aggregate behavior of experimentally observed columns of neurons from statistical electrical-chemical properties of synaptic interactions. While not useful to yield insights at the single-neuron level, SMNI has demonstrated its capability in describing large-scale properties of short-term memory and electroencephalographic (EEG) systematics. The necessity of including nonlinear and stochastic structures in this development has been stressed. In this paper, a more stringent test is placed on SMNI: The algebraic and numerical algorithms previously developed in this and similar systems are brought to bear to fit large sets of EEG and evoked-potential data being collected to investigate genetic predispositions to alcoholism and to extract brain "signatures" of short-term memory. Using the numerical algorithm of very fast simulated reannealing, it is demonstrated that SMNI can indeed fit these data within experimentally observed ranges of its underlying neuronal-synaptic parameters, and the quantitative modeling results are used to examine physical neocortical mechanisms to discriminate high-risk and low-risk populations genetically predisposed to alcoholism. Since this study is a control to span relatively long time epochs, similar to earlier attempts to establish such correlations, this discrimination is inconclusive because of other neuronal activity which can mask such effects. However, the SMNI model is shown to be consistent with EEG data during selective attention tasks and with neocortical mechanisms describing short-term memory previously published using this approach. This paper explicitly identifies similar nonlinear stochastic mechanisms of interaction at the microscopic-neuronal, mesoscopic-columnar, and macroscopic-regional scales of neocortical interactions. These results give strong quantitative support for an accurate intuitive picture, portraying neocortical interactions as having common algebraic or physics mechanisms that scale across quite disparate spatial scales and functional or behavioral phenomena, i.e., describing interactions among neurons, columns of neurons, and regional masses of neurons.

  16. In the eye of the beholder: the effect of rater variability and different rating scales on QTL mapping.

    PubMed

    Poland, Jesse A; Nelson, Rebecca J

    2011-02-01

    The agronomic importance of developing durably resistant cultivars has led to substantial research in the field of quantitative disease resistance (QDR) and, in particular, mapping quantitative trait loci (QTL) for disease resistance. The assessment of QDR is typically conducted by visual estimation of disease severity, which raises concern over the accuracy and precision of visual estimates. Although previous studies have examined the factors affecting the accuracy and precision of visual disease assessment in relation to the true value of disease severity, the impact of this variability on the identification of disease resistance QTL has not been assessed. In this study, the effects of rater variability and rating scales on mapping QTL for northern leaf blight resistance in maize were evaluated in a recombinant inbred line population grown under field conditions. The population of 191 lines was evaluated by 22 different raters using a direct percentage estimate, a 0-to-9 ordinal rating scale, or both. It was found that more experienced raters had higher precision and that using a direct percentage estimation of diseased leaf area produced higher precision than using an ordinal scale. QTL mapping was then conducted using the disease estimates from each rater using stepwise general linear model selection (GLM) and inclusive composite interval mapping (ICIM). For GLM, the same QTL were largely found across raters, though some QTL were only identified by a subset of raters. The magnitudes of estimated allele effects at identified QTL varied drastically, sometimes by as much as threefold. ICIM produced highly consistent results across raters and for the different rating scales in identifying the location of QTL. We conclude that, despite variability between raters, the identification of QTL was largely consistent among raters, particularly when using ICIM. However, care should be taken in estimating QTL allele effects, because this was highly variable and rater dependent.

  17. A study to explore the use of orbital remote sensing to determine native arid plant distribution. [Arizona

    NASA Technical Reports Server (NTRS)

    Mcginnies, W. G. (Principal Investigator); Conn, J. S.; Haase, E. F.; Lepley, L. K.; Musick, H. B.; Foster, K. E.

    1975-01-01

    The author has identified the following significant results. Research results include a method for determining the reflectivities of natural areas from ERTS data taking into account sun angle and atmospheric effects on the radiance seen by the satellite sensor. Ground truth spectral signature data for various types of scenes, including ground with and without annuals, and various shrubs were collected. Large areas of varnished desert pavement are visible and mappable on ERTS and high altitude aircraft imagery. A large scale and a small scale vegetation pattern were found to be correlated with presence of desert pavement. A comparison of radiometric data with video recordings shows quantitatively that for most areas of desert vegetation, soils are the most influential factor in determining the signature of a scene. Additive and subtractive image processing techniques were applied in the dark room to enhance vegetational aspects of ERTS.

  18. The causality analysis of climate change and large-scale human crisis

    PubMed Central

    Zhang, David D.; Lee, Harry F.; Wang, Cong; Li, Baosheng; Pei, Qing; Zhang, Jane; An, Yulun

    2011-01-01

    Recent studies have shown strong temporal correlations between past climate changes and societal crises. However, the specific causal mechanisms underlying this relation have not been addressed. We explored quantitative responses of 14 fine-grained agro-ecological, socioeconomic, and demographic variables to climate fluctuations from A.D. 1500–1800 in Europe. Results show that cooling from A.D. 1560–1660 caused successive agro-ecological, socioeconomic, and demographic catastrophes, leading to the General Crisis of the Seventeenth Century. We identified a set of causal linkages between climate change and human crisis. Using temperature data and climate-driven economic variables, we simulated the alternation of defined “golden” and “dark” ages in Europe and the Northern Hemisphere during the past millennium. Our findings indicate that climate change was the ultimate cause, and climate-driven economic downturn was the direct cause, of large-scale human crises in preindustrial Europe and the Northern Hemisphere. PMID:21969578

  19. The causality analysis of climate change and large-scale human crisis.

    PubMed

    Zhang, David D; Lee, Harry F; Wang, Cong; Li, Baosheng; Pei, Qing; Zhang, Jane; An, Yulun

    2011-10-18

    Recent studies have shown strong temporal correlations between past climate changes and societal crises. However, the specific causal mechanisms underlying this relation have not been addressed. We explored quantitative responses of 14 fine-grained agro-ecological, socioeconomic, and demographic variables to climate fluctuations from A.D. 1500-1800 in Europe. Results show that cooling from A.D. 1560-1660 caused successive agro-ecological, socioeconomic, and demographic catastrophes, leading to the General Crisis of the Seventeenth Century. We identified a set of causal linkages between climate change and human crisis. Using temperature data and climate-driven economic variables, we simulated the alternation of defined "golden" and "dark" ages in Europe and the Northern Hemisphere during the past millennium. Our findings indicate that climate change was the ultimate cause, and climate-driven economic downturn was the direct cause, of large-scale human crises in preindustrial Europe and the Northern Hemisphere.

  20. Unfolding large-scale online collaborative human dynamics

    PubMed Central

    Zha, Yilong; Zhou, Tao; Zhou, Changsong

    2016-01-01

    Large-scale interacting human activities underlie all social and economic phenomena, but quantitative understanding of regular patterns and mechanisms is very challenging and still rare. Self-organized online collaborative activities with a precise record of event timing provide an unprecedented opportunity. Our empirical analysis of the history of millions of updates in Wikipedia shows a universal double–power-law distribution of time intervals between consecutive updates of an article. We then propose a generic model to unfold collaborative human activities into three modules: (i) individual behavior characterized by Poissonian initiation of an action, (ii) human interaction captured by a cascading response to previous actions with a power-law waiting time, and (iii) population growth due to the increasing number of interacting individuals. This unfolding allows us to obtain an analytical formula that is fully supported by the universal patterns in empirical data. Our modeling approaches reveal “simplicity” beyond complex interacting human activities. PMID:27911766
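
    The three-module decomposition lends itself to a toy simulation. The sketch below generates event times from Poissonian initiations, each followed by a cascade of responses with Pareto-distributed waiting times; module (iii), population growth, is omitted, and all parameter values are illustrative assumptions rather than those fitted in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_updates(t_max=1e5, lam=0.01, alpha=1.5, p_respond=0.8):
        """Toy event-time generator: (i) Poissonian initiation of actions at
        rate lam, (ii) each action triggers a further response with probability
        p_respond after a power-law (Pareto) waiting time with exponent alpha."""
        events, t = [], 0.0
        while t < t_max:
            t += rng.exponential(1.0 / lam)      # (i) initiation
            s = t
            events.append(s)
            while rng.random() < p_respond:      # (ii) cascading responses
                s += rng.pareto(alpha) + 1.0     # heavy-tailed waiting time
                events.append(s)
        return np.sort(np.array(events))

    intervals = np.diff(simulate_updates())
    # Two distinct slopes on a log-log histogram indicate a double power law.
    density, edges = np.histogram(intervals, bins=np.logspace(-1, 4, 40), density=True)
    ```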

  1. Large-Scale Brain Simulation and Disorders of Consciousness. Mapping Technical and Conceptual Issues.

    PubMed

    Farisco, Michele; Kotaleski, Jeanette H; Evers, Kathinka

    2018-01-01

    Modeling and simulations have gained a leading position in contemporary attempts to describe, explain, and quantitatively predict the human brain's operations. Computer models are highly sophisticated tools developed to achieve an integrated knowledge of the brain with the aim of overcoming the actual fragmentation resulting from different neuroscientific approaches. In this paper we investigate the plausibility of simulation technologies for emulation of consciousness and the potential clinical impact of large-scale brain simulation on the assessment and care of disorders of consciousness (DOCs), e.g., Coma, Vegetative State/Unresponsive Wakefulness Syndrome, Minimally Conscious State. Notwithstanding their technical limitations, we suggest that simulation technologies may offer new solutions to old practical problems, particularly in clinical contexts. We take DOCs as an illustrative case, arguing that the simulation of neural correlates of consciousness is potentially useful for improving treatments of patients with DOCs.

  2. A Priori Analysis of Subgrid-Scale Models for Large Eddy Simulations of Supercritical Binary-Species Mixing Layers

    NASA Technical Reports Server (NTRS)

    Okong'o, Nora; Bellan, Josette

    2005-01-01

    Models for large eddy simulation (LES) are assessed on a database obtained from direct numerical simulations (DNS) of supercritical binary-species temporal mixing layers. The analysis is performed at the DNS transitional states for heptane/nitrogen, oxygen/hydrogen and oxygen/helium mixing layers. The incorporation of simplifying assumptions that are validated on the DNS database leads to a set of LES equations that requires only models for the subgrid scale (SGS) fluxes, which arise from filtering the convective terms in the DNS equations. Constant-coefficient versions of three different models for the SGS fluxes are assessed and calibrated. The Smagorinsky SGS-flux model shows poor correlations with the SGS fluxes, while the Gradient and Similarity models have high correlations, as well as good quantitative agreement with the SGS fluxes when the calibrated coefficients are used.
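
    For reference, the constant-coefficient forms of the three SGS-flux closures named above can be written in standard textbook notation; these are the generic incompressible-flow expressions, not necessarily the exact compressible, binary-species formulation assessed in the paper.

    ```latex
    % Filtered SGS flux and the three constant-coefficient closures (generic forms)
    \begin{align*}
    \tau_{ij} &= \overline{u_i u_j} - \bar{u}_i \bar{u}_j, \\
    \text{Smagorinsky:}\quad
    \tau_{ij} - \tfrac{1}{3}\delta_{ij}\tau_{kk} &= -2\, C_S\, \bar{\Delta}^2 \lvert\bar{S}\rvert\, \bar{S}_{ij}, \\
    \text{Gradient:}\quad
    \tau_{ij} &= C_G\, \bar{\Delta}^2\, \frac{\partial \bar{u}_i}{\partial x_k}\,\frac{\partial \bar{u}_j}{\partial x_k}, \\
    \text{Similarity:}\quad
    \tau_{ij} &= C_{\mathrm{sim}}\!\left(\widehat{\bar{u}_i \bar{u}_j} - \hat{\bar{u}}_i\, \hat{\bar{u}}_j\right),
    \end{align*}
    ```

    where the overbar denotes the LES filter, the hat a coarser test filter, $\bar{\Delta}$ the filter width, and $\bar{S}_{ij}$ the filtered strain-rate tensor.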

  3. Transcriptome sequencing and annotation of the halophytic microalga Dunaliella salina

    PubMed Central

    Hong, Ling; Liu, Jun-li; Midoun, Samira Z.; Miller, Philip C.

    2017-01-01

    The unicellular green alga Dunaliella salina is well adapted to salt stress and contains compounds (including β-carotene and vitamins) with potential commercial value. A large transcriptome database of D. salina during the adjustment, exponential and stationary growth phases was generated using a high throughput sequencing platform. We characterized the metabolic processes in D. salina with a focus on valuable metabolites, with the aim of manipulating D. salina to achieve greater economic value in large-scale production through a bioengineering strategy. Gene expression profiles under salt stress verified using quantitative polymerase chain reaction (qPCR) implied that salt can regulate the expression of key genes. This study generated a substantial fraction of D. salina transcriptional sequences for the entire growth cycle, providing a basis for the discovery of novel genes. This first full-scale transcriptome study of D. salina establishes a foundation for further comparative genomic studies. PMID:28990374

  4. Dynamical evolution of domain walls in an expanding universe

    NASA Technical Reports Server (NTRS)

    Press, William H.; Ryden, Barbara S.; Spergel, David N.

    1989-01-01

    Whenever the potential of a scalar field has two or more separated, degenerate minima, domain walls form as the universe cools. The evolution of the resulting network of domain walls is calculated for the case of two potential minima in two and three dimensions, including wall annihilation, crossing, and reconnection effects. The nature of the evolution is found to be largely independent of the rate at which the universe expands. Wall annihilation and reconnection occur almost as fast as causality allows, so that the horizon volume is 'swept clean' and contains, at any time, only about one, fairly smooth, wall. Quantitative statistics are given. The total area of wall per volume decreases as the first power of time. The relative slowness of the decrease and the smoothness of the wall on the horizon scale make it impossible for walls to both generate large-scale structure and be consistent with quadrupole microwave background anisotropy limits.

  5. Large-Scale Brain Simulation and Disorders of Consciousness. Mapping Technical and Conceptual Issues

    PubMed Central

    Farisco, Michele; Kotaleski, Jeanette H.; Evers, Kathinka

    2018-01-01

    Modeling and simulations have gained a leading position in contemporary attempts to describe, explain, and quantitatively predict the human brain’s operations. Computer models are highly sophisticated tools developed to achieve an integrated knowledge of the brain with the aim of overcoming the actual fragmentation resulting from different neuroscientific approaches. In this paper we investigate the plausibility of simulation technologies for emulation of consciousness and the potential clinical impact of large-scale brain simulation on the assessment and care of disorders of consciousness (DOCs), e.g., Coma, Vegetative State/Unresponsive Wakefulness Syndrome, Minimally Conscious State. Notwithstanding their technical limitations, we suggest that simulation technologies may offer new solutions to old practical problems, particularly in clinical contexts. We take DOCs as an illustrative case, arguing that the simulation of neural correlates of consciousness is potentially useful for improving treatments of patients with DOCs. PMID:29740372

  6. Floods, floodplains, delta plains — A satellite imaging approach

    NASA Astrophysics Data System (ADS)

    Syvitski, James P. M.; Overeem, Irina; Brakenridge, G. Robert; Hannon, Mark

    2012-08-01

    Thirty-three lowland floodplains and their associated delta plains are characterized with data from three remote sensing systems (AMSR-E, SRTM and MODIS). These data provide new quantitative information to characterize Late Quaternary floodplain landscapes and their penchant for flooding over the last decade. Daily proxy records for discharge since 2002 and for each of the 33 river systems can be derived with novel Advanced Microwave Scanning Radiometer (AMSR-E) methods. A descriptive framework based on analysis of Shuttle Radar Topography Mission (SRTM) data is used to capture the major landscape-scale floodplain elements or zones: 1) container valleys with their long and narrow pathways of largely sediment transit and bypass, 2) floodplain depressions that act as loci for frequent flooding and sediment storage, 3) zones of nodal avulsions common to many continental scale rivers, and often located seaward of container valleys, and 4) coastal floodplains and delta plains that offer both sediment bypass and storage but under the influence of marine processes. The SRTM data allow mapping of smaller-scale architectural elements in an unprecedented systematic manner. Floodplain depressions were found to play a major role, which may largely be overlooked in conceptual floodplain models. Lastly, MODIS data (independently and combined with AMSR-E) allows the tracking of flood hydrographs and pathways and sedimentation patterns on a near-daily timescale worldwide. These remote-sensing data show that 85% of the studied major river systems experienced extensive flooding in the last decade. A new quantitative paradigm of floodplain processes, honoring the frequency and extent of floods, can be developed by careful analysis of these new remotely sensed data.

  7. A Small-scale Physical Model of the Lower Mississippi River for Studying the Potential of Medium- and Large-scale Diversions

    NASA Astrophysics Data System (ADS)

    Willson, C. S.

    2011-12-01

    Over the past several thousand years the Mississippi River has formed one of the world's largest deltas and much of the Louisiana coast. However, in the last 100 years or so, anthropogenic controls have been placed on the system to maintain important navigation routes and for flood control, resulting in the loss of the natural channel shifting necessary for replenishment of the deltaic coast with fresh sediment and resources. In addition, the high relative sea level rise in the lowermost portion of the river is causing a change in the distributary flow patterns of the river and deposition center. River and sediment diversions are being proposed as a way to re-create some of the historical distribution of river water and sediments into the delta region. In response to a need for improving the understanding of the potential for medium- and large-scale river and sediment diversions, the state of Louisiana funded the construction of a small-scale physical model (SSPM) of the lower ~76 river miles (RM). The SSPM is a 1:12,000 horizontal, 1:500 vertical, highly-distorted, movable bed physical model designed to provide qualitative and semi-quantitative results regarding bulk noncohesive sediment transport characteristics in the river and through medium- and large-scale diversion structures. The SSPM was designed based on Froude similarity for the hydraulics and Shields similarity for sand transport and has a sediment time scale of 1 year prototype to 30 minutes model, allowing for decadal-length studies of the land building potential of diversions. Annual flow and sediment hydrographs were developed from historical records and a uniform relative sea level rise of 3 feet in 100 years is used to account for the combined effects of eustatic sea level rise and subsidence. Data collected during the experiments include river stages, dredging amounts, high-resolution video of transport patterns within the main channel, and photographs of the sand deposition patterns in the diversion receiving areas. First, the similarity analysis that went into the model design will be presented, along with a discussion of the resulting limitations. Next, calibration and validation results will be shown demonstrating the ability of the SSPM to capture the general lower Mississippi River sediment transport trends and deposition patterns. Third, results from a series of diversion experiments will be presented to semi-quantitatively show the effectiveness of diversion locations, sizes, and operating strategies on the quantities of sand diverted from the main river and the changes in main channel dredging volumes. These results are then correlated with recent field and numerical studies of the study area. This talk will then close with a brief discussion of a new and improved physical model that will cover a larger domain and be designed to provide more quantitative results.
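
    As a rough worked example of what the quoted scales imply (standard relations for a distorted, Froude-scaled model; the numbers below are back-of-the-envelope illustrations, and the sediment time scale of 1 year to 30 minutes quoted above is set separately by the Shields-based sand-transport scaling):

    ```latex
    % Distortion and Froude-similarity time scale for the quoted model ratios
    \begin{align*}
    \text{vertical exaggeration} &= \frac{L_r}{h_r} = \frac{12\,000}{500} = 24, \\
    V_r &= \sqrt{h_r} = \sqrt{500} \approx 22.4
        && \text{(velocity ratio, Froude similarity on depth)}, \\
    T_r &= \frac{L_r}{V_r} = \frac{12\,000}{22.4} \approx 536
        && \text{(hydraulic time ratio)},
    \end{align*}
    ```

    so one prototype day corresponds to roughly $86\,400\,\mathrm{s}/536 \approx 160\,\mathrm{s}$, i.e. about 2.7 minutes of model run time for the hydraulics.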

  8. Automated selected reaction monitoring software for accurate label-free protein quantification.

    PubMed

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.

  9. Improving quantitative structure-activity relationship models using Artificial Neural Networks trained with dropout.

    PubMed

    Mendenhall, Jeffrey; Meiler, Jens

    2016-02-01

    Dropout is an Artificial Neural Network (ANN) training technique that has been shown to improve ANN performance across canonical machine learning (ML) datasets. Quantitative Structure Activity Relationship (QSAR) datasets used to relate chemical structure to biological activity in Ligand-Based Computer-Aided Drug Discovery pose unique challenges for ML techniques, such as heavily biased dataset composition, and relatively large number of descriptors relative to the number of actives. To test the hypothesis that dropout also improves QSAR ANNs, we conduct a benchmark on nine large QSAR datasets. Use of dropout improved both enrichment false positive rate and log-scaled area under the receiver-operating characteristic curve (logAUC) by 22-46 % over conventional ANN implementations. Optimal dropout rates are found to be a function of the signal-to-noise ratio of the descriptor set, and relatively independent of the dataset. Dropout ANNs with 2D and 3D autocorrelation descriptors outperform conventional ANNs as well as optimized fingerprint similarity search methods.
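
    As a generic illustration of the technique being benchmarked (not necessarily the authors' own implementation), a feed-forward QSAR classifier with dropout on the input descriptors and on the hidden layer might look like the following sketch; the descriptor count, layer width, and dropout rates are assumptions chosen for illustration.

    ```python
    import torch
    import torch.nn as nn

    class QSARNet(nn.Module):
        """Feed-forward ANN with dropout applied to the input descriptors and
        to the hidden layer, in the spirit of the benchmark described above.
        Descriptor count, layer width and dropout rates are illustrative."""
        def __init__(self, n_descriptors=1024, n_hidden=128,
                     p_input=0.1, p_hidden=0.5):
            super().__init__()
            self.net = nn.Sequential(
                nn.Dropout(p_input),              # randomly zero noisy descriptors
                nn.Linear(n_descriptors, n_hidden),
                nn.ReLU(),
                nn.Dropout(p_hidden),             # randomly zero hidden units
                nn.Linear(n_hidden, 1),           # logit for "active"
            )

        def forward(self, x):
            return self.net(x)

    model = QSARNet()
    x = torch.randn(32, 1024)                     # dummy batch of descriptor vectors
    y = torch.randint(0, 2, (32,)).float()        # dummy active/inactive labels
    loss = nn.BCEWithLogitsLoss()(model(x).squeeze(1), y)
    loss.backward()                               # dropout is active in training mode
    ```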

  10. Improving Quantitative Structure-Activity Relationship Models using Artificial Neural Networks Trained with Dropout

    PubMed Central

    Mendenhall, Jeffrey; Meiler, Jens

    2016-01-01

    Dropout is an Artificial Neural Network (ANN) training technique that has been shown to improve ANN performance across canonical machine learning (ML) datasets. Quantitative Structure Activity Relationship (QSAR) datasets used to relate chemical structure to biological activity in Ligand-Based Computer-Aided Drug Discovery (LB-CADD) pose unique challenges for ML techniques, such as heavily biased dataset composition, and relatively large number of descriptors relative to the number of actives. To test the hypothesis that dropout also improves QSAR ANNs, we conduct a benchmark on nine large QSAR datasets. Use of dropout improved both Enrichment false positive rate (FPR) and log-scaled area under the receiver-operating characteristic curve (logAUC) by 22–46% over conventional ANN implementations. Optimal dropout rates are found to be a function of the signal-to-noise ratio of the descriptor set, and relatively independent of the dataset. Dropout ANNs with 2D and 3D autocorrelation descriptors outperform conventional ANNs as well as optimized fingerprint similarity search methods. PMID:26830599

  11. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    NASA Astrophysics Data System (ADS)

    Han, Xue; Sandels, Claes; Zhu, Kun; Nordström, Lars

    2013-08-01

    It has often been claimed that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent power system operation will be changed by various parameters of DERs. This article proposes a modelling framework for an overview analysis of the correlation between DERs and distribution grid operation. Furthermore, to validate the framework, the authors describe reference models of different categories of DERs with their unique characteristics, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of current and envisioned DER deployment scenarios proposed for Sweden. Simulations were performed in two typical distribution network models for four seasons. The simulation results show that, in general, DER deployment makes it possible to reduce power losses and voltage drops by compensating power from local generation and optimizing local load profiles.

  12. Neural circuits. Labeling of active neural circuits in vivo with designed calcium integrators.

    PubMed

    Fosque, Benjamin F; Sun, Yi; Dana, Hod; Yang, Chao-Tsung; Ohyama, Tomoko; Tadross, Michael R; Patel, Ronak; Zlatic, Marta; Kim, Douglas S; Ahrens, Misha B; Jayaraman, Vivek; Looger, Loren L; Schreiter, Eric R

    2015-02-13

    The identification of active neurons and circuits in vivo is a fundamental challenge in understanding the neural basis of behavior. Genetically encoded calcium (Ca(2+)) indicators (GECIs) enable quantitative monitoring of cellular-resolution activity during behavior. However, such indicators require online monitoring within a limited field of view. Alternatively, post hoc staining of immediate early genes (IEGs) indicates highly active cells within the entire brain, albeit with poor temporal resolution. We designed a fluorescent sensor, CaMPARI, that combines the genetic targetability and quantitative link to neural activity of GECIs with the permanent, large-scale labeling of IEGs, allowing a temporally precise "activity snapshot" of a large tissue volume. CaMPARI undergoes efficient and irreversible green-to-red conversion only when elevated intracellular Ca(2+) and experimenter-controlled illumination coincide. We demonstrate the utility of CaMPARI in freely moving larvae of zebrafish and flies, and in head-fixed mice and adult flies. Copyright © 2015, American Association for the Advancement of Science.

  13. Chromatin as active matter

    NASA Astrophysics Data System (ADS)

    Agrawal, Ankit; Ganai, Nirmalendu; Sengupta, Surajit; Menon, Gautam I.

    2017-01-01

    Active matter models describe a number of biophysical phenomena at the cell and tissue scale. Such models explore the macroscopic consequences of driving specific soft condensed matter systems of biological relevance out of equilibrium through ‘active’ processes. Here, we describe how active matter models can be used to study the large-scale properties of chromosomes contained within the nuclei of human cells in interphase. We show that polymer models for chromosomes that incorporate inhomogeneous activity reproduce many general, yet little understood, features of large-scale nuclear architecture. These include: (i) the spatial separation of gene-rich, low-density euchromatin, predominantly found towards the centre of the nucleus, vis-à-vis gene-poor, denser heterochromatin, typically enriched in proximity to the nuclear periphery, (ii) the differential positioning of individual gene-rich and gene-poor chromosomes, (iii) the formation of chromosome territories, as well as (iv) the weak size-dependence of the positions of individual chromosome centres-of-mass relative to the nuclear centre that is seen in some cell types. Such structuring is induced purely by the combination of activity and confinement and is absent in thermal equilibrium. We systematically explore active matter models for chromosomes, discussing how our model can be generalized to study variations in chromosome positioning across different cell types. The approach and model we outline here represent a preliminary attempt towards a quantitative, first-principles description of the large-scale architecture of the cell nucleus.

  14. Large-scale precipitation estimation using Kalpana-1 IR measurements and its validation using GPCP and GPCC data

    NASA Astrophysics Data System (ADS)

    Prakash, Satya; Mahesh, C.; Gairola, Rakesh M.

    2011-12-01

    Large-scale precipitation estimation is very important for climate science because precipitation is a major component of the earth's water and energy cycles. In the present study, the GOES precipitation index technique has been applied to three-hourly Kalpana-1 satellite infrared (IR) images (0000, 0300, 0600, …, 2100 hours UTC) for rainfall estimation, in preparation for INSAT-3D. Once the brightness temperatures of all the pixels in a grid are known, they are binned to generate a three-hourly 24-class histogram of IR (10.5-12.5 μm) brightness temperatures for each 1.0° × 1.0° latitude/longitude box. Daily, monthly, and seasonal rainfall have been estimated from these three-hourly rain estimates for the entire south-west monsoon period of 2009. To investigate the potential of these rainfall estimates, the monthly and seasonal estimates have been validated against Global Precipitation Climatology Project and Global Precipitation Climatology Centre data. The validation results show that the present technique works very well for large-scale precipitation estimation, both qualitatively and quantitatively. The results also suggest that this simple IR-based technique can be used to estimate rainfall over tropical areas at longer temporal scales for climatological applications.
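
    A minimal sketch of the GOES precipitation index (GPI) calculation for one grid box follows, using the canonical threshold of about 235 K and conditional rain rate of about 3 mm per hour; the exact coefficients and the 24-class histogram bookkeeping used with the Kalpana-1 data may differ.

    ```python
    import numpy as np

    def gpi_rainfall(brightness_temps_k, hours=3.0, t_thresh=235.0, rate=3.0):
        """GOES precipitation index for one 1.0 deg x 1.0 deg grid box:
        rainfall (mm) = rate * fraction of pixels colder than t_thresh * hours.
        Threshold and rate are the canonical GPI values; the study's own
        calibration for Kalpana-1 IR data may differ."""
        temps = np.asarray(brightness_temps_k, dtype=float)
        cold_fraction = np.mean(temps < t_thresh)
        return rate * cold_fraction * hours

    # Hypothetical 3-hourly IR scene: 40% of pixels colder than 235 K -> 3.6 mm.
    pixels = np.concatenate([np.full(40, 220.0), np.full(60, 270.0)])
    print(gpi_rainfall(pixels))
    ```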

  15. LOD significance thresholds for QTL analysis in experimental populations of diploid species

    PubMed

    Van Ooijen JW

    1999-11-01

    Linkage analysis with molecular genetic markers is a very powerful tool in the biological research of quantitative traits. The lack of an easy way to know what areas of the genome can be designated as statistically significant for containing a gene affecting the quantitative trait of interest hampers the important prediction of the rate of false positives. In this paper four tables, obtained by large-scale simulations, are presented that can be used with a simple formula to obtain the false-positive rate for analyses of the standard types of experimental populations with diploid species with any size of genome. A new definition of the term 'suggestive linkage' is proposed that allows a more objective comparison of results across species.

  16. Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.

    PubMed

    Li, Zitong; Sillanpää, Mikko J

    2015-12-01

    Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Forest Connectivity Regions of Canada Using Circuit Theory and Image Analysis

    PubMed Central

    Pelletier, David; Lapointe, Marc-Élie; Wulder, Michael A.; White, Joanne C.; Cardille, Jeffrey A.

    2017-01-01

    Ecological processes are increasingly well understood over smaller areas, yet information regarding interconnections and the hierarchical nature of ecosystems remains less studied and understood. Information on connectivity over large areas with high resolution source information provides for both local detail and regional context. The emerging capacity to apply circuit theory to create maps of omnidirectional connectivity provides an opportunity for improved and quantitative depictions of forest connectivity, supporting the formation and testing of hypotheses about the density of animal movement, ecosystem structure, and related links to natural and anthropogenic forces. In this research, our goal was to delineate regions where connectivity regimes are similar across the boreal region of Canada using new quantitative analyses for characterizing connectivity over large areas (e.g., millions of hectares). Utilizing the Earth Observation for Sustainable Development of forests (EOSD) circa 2000 Landsat-derived land-cover map, we created and analyzed a national-scale map of omnidirectional forest connectivity at 25m resolution over 10000 tiles of 625 km2 each, spanning the forested regions of Canada. Using image recognition software to detect corridors, pinch points, and barriers to movements at multiple spatial scales in each tile, we developed a simple measure of the structural complexity of connectivity patterns in omnidirectional connectivity maps. We then mapped the Circuitscape resistance distance measure and used it in conjunction with the complexity data to study connectivity characteristics in each forested ecozone. Ecozone boundaries masked substantial systematic patterns in connectivity characteristics that are uncovered using a new classification of connectivity patterns that revealed six clear groups of forest connectivity patterns found in Canada. The resulting maps allow exploration of omnidirectional forest connectivity patterns at full resolution while permitting quantitative analyses of connectivity over broad areas, informing modeling, planning and monitoring efforts. PMID:28146573

  18. Strain localisation in the continental lithosphere, a scale-dependent process

    NASA Astrophysics Data System (ADS)

    Jolivet, Laurent; Burov, Evguenii

    2013-04-01

    Strain localisation in continents is a general question tackled by specialists of various disciplines in Earth Sciences. Field geologists working at regional scale are able to describe the succession of events leading to the formation of large strain zones that accommodate large displacement within plate boundaries. On the other end of the spectrum, laboratory experiments provide numbers that quantitatively describe the rheology of rock material at the scale of a few mm and at deformation rates up to 8-10 orders of magnitude faster than in nature. Extrapolating from the scale of the experiment to the scale of the continental lithosphere is a considerable leap across 8-10 orders of magnitude both in space and time. It is however quite obvious that different processes are at work for each scale considered. At the scale of a grain aggregate diffusion within individual grains, dislocation or grain boundary sliding, depending on temperature and fluid conditions, are of primary importance. But at the scale of a mountain belt, a major detachment or a strike-slip shear zone that have accommodated tens or hundreds of kilometres of relative displacement, other parameters will take over such as structural softening and the heterogeneity of the crust inherited from past tectonic events that have juxtaposed rock units of very different compositions and induced a strong orientation of rocks. Once the deformation is localised along major shear zones, grain size reduction, interaction between rocks and fluids and metamorphic reactions and other small-scale processes tend to further localise the strain. Because the crust is colder and more lithologically complex this heterogeneity is likely much more prominent in the crust than in the mantle and then the relative importance of "small-scale" and "large-scale" parameters will be very different in the crust and in the mantle. Thus, depending upon the relative thickness of the crust and mantle in the deforming lithosphere, the role of each mechanism will have more or less important consequences on strain localisation. This complexity sometimes leads to disregard of experimental parameters in large-scale thermo-mechanical models and to use instead ad hoc "large-scale" numbers that better fit the observed geological history. The goal of the ERC RHEOLITH project is to associate to each tectonic process the relevant rheological parameters depending upon the scale considered, in an attempt to elaborate a generalized "Preliminary Rheology Model Set for Lithosphere" (PReMSL), which will cover the entire time and spatial scale range of deformation.

  19. A statics-dynamics equivalence through the fluctuation–dissipation ratio provides a window into the spin-glass phase from nonequilibrium measurements

    PubMed Central

    Baity-Jesi, Marco; Calore, Enrico; Cruz, Andres; Fernandez, Luis Antonio; Gil-Narvión, José Miguel; Gordillo-Guerrero, Antonio; Iñiguez, David; Maiorano, Andrea; Marinari, Enzo; Martin-Mayor, Victor; Monforte-Garcia, Jorge; Muñoz Sudupe, Antonio; Navarro, Denis; Parisi, Giorgio; Perez-Gaviro, Sergio; Ricci-Tersenghi, Federico; Ruiz-Lorenzo, Juan Jesus; Schifano, Sebastiano Fabio; Tarancón, Alfonso; Tripiccione, Raffaele; Yllanes, David

    2017-01-01

    We have performed a very accurate computation of the nonequilibrium fluctuation–dissipation ratio for the 3D Edwards–Anderson Ising spin glass, by means of large-scale simulations on the special-purpose computers Janus and Janus II. This ratio (computed for finite times on very large, effectively infinite, systems) is compared with the equilibrium probability distribution of the spin overlap for finite sizes. Our main result is a quantitative statics-dynamics dictionary, which could allow the experimental exploration of important features of the spin-glass phase without requiring uncontrollable extrapolations to infinite times or system sizes. PMID:28174274

  20. Quantitative controls on submarine slope failure morphology

    USGS Publications Warehouse

    Lee, H.J.; Schwab, W.C.; Edwards, B.D.; Kayen, R.E.

    1991-01-01

    The concept of the steady-state of deformation can be applied to predicting the ultimate form a landslide will take. The steady-state condition, defined by a line in void ratio-effective stress space, exists at large levels of strain and remolding. Conceptually, if sediment initially exists with void ratio-effective stress conditions above the steady-state line, the sediment shear strength will decrease during a transient loading event, such as an earthquake or storm. If the reduced shear strength existing at the steady state is less than the downslope shear stress induced by gravity, then large-scale internal deformation, disintegration, and flow will occur. -from Authors
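
    In equation form, the comparison described above can be written as follows (an illustrative infinite-slope statement of the criterion, not necessarily the authors' exact formulation): disintegration and flow are expected when the steady-state (remolded) undrained strength falls below the gravity-induced shear stress on the failure plane,

    ```latex
    % Flow-failure criterion: steady-state strength vs. gravity-induced stress
    S_{u,ss} < \tau_{g} = \gamma'\, z\, \sin\alpha \cos\alpha ,
    ```

    where $S_{u,ss}$ is the undrained shear strength at the steady state of deformation, $\gamma'$ the buoyant unit weight of the sediment, $z$ the depth below the seafloor, and $\alpha$ the slope angle.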

  1. Big Hydrophobic Capillary Fluidics; Basically Water Ping Pong in Space

    NASA Astrophysics Data System (ADS)

    Weislogel, Mark; Attari, Babak; Wollman, Andrew; Cardin, Karl; Geile, John; Lindner, Thomas

    2016-11-01

    Capillary surfaces can be enormous in environments where the effects of gravity are small. In this presentation we review a number of interesting examples from demonstrative experiments performed in drop towers and aboard the International Space Station. The topic then focuses on large length scale hydrophobic phenomena including puddle jumping, spontaneous particle ejections, and large drop rebounds akin to water ping pong in space. Unseen footage of NASA Astronaut Scott Kelly playing water ping pong in space will be shown. Quantitative and qualitative results are offered to assist in the design of experiments for ongoing research. NASA NNX12A047A.

  2. Origin of the cosmic network in ΛCDM: Nature vs nurture

    NASA Astrophysics Data System (ADS)

    Shandarin, Sergei; Habib, Salman; Heitmann, Katrin

    2010-05-01

    The large-scale structure of the Universe, as traced by the distribution of galaxies, is now being revealed by large-volume cosmological surveys. The structure is characterized by galaxies distributed along filaments, the filaments connecting in turn to form a percolating network. Our objective here is to quantitatively specify the underlying mechanisms that drive the formation of the cosmic network: By combining percolation-based analyses with N-body simulations of gravitational structure formation, we elucidate how the network has its origin in the properties of the initial density field (nature) and how its contrast is then amplified by the nonlinear mapping induced by the gravitational instability (nurture).

  3. Magnetic flux concentrations from dynamo-generated fields

    NASA Astrophysics Data System (ADS)

    Jabbari, S.; Brandenburg, A.; Losada, I. R.; Kleeorin, N.; Rogachevskii, I.

    2014-08-01

    Context. The mean-field theory of magnetized stellar convection gives rise to two distinct instabilities: the large-scale dynamo instability, operating in the bulk of the convection zone and a negative effective magnetic pressure instability (NEMPI) operating in the strongly stratified surface layers. The latter might be important in connection with magnetic spot formation. However, as follows from theoretical analysis, the growth rate of NEMPI is suppressed with increasing rotation rates. On the other hand, recent direct numerical simulations (DNS) have shown a subsequent increase in the growth rate. Aims: We examine quantitatively whether this increase in the growth rate of NEMPI can be explained by an α2 mean-field dynamo, and whether both NEMPI and the dynamo instability can operate at the same time. Methods: We use both DNS and mean-field simulations (MFS) to solve the underlying equations numerically either with or without an imposed horizontal field. We use the test-field method to compute relevant dynamo coefficients. Results: DNS show that magnetic flux concentrations are still possible up to rotation rates above which the large-scale dynamo effect produces mean magnetic fields. The resulting DNS growth rates are quantitatively reproduced with MFS. As expected for weak or vanishing rotation, the growth rate of NEMPI increases with increasing gravity, but there is a correction term for strong gravity and large turbulent magnetic diffusivity. Conclusions: Magnetic flux concentrations are still possible for rotation rates above which dynamo action takes over. For the solar rotation rate, the corresponding turbulent turnover time is about 5 h, with dynamo action commencing in the layers beneath.

  4. Large-Scale Geographic Variation in Distribution and Abundance of Australian Deep-Water Kelp Forests

    PubMed Central

    Marzinelli, Ezequiel M.; Williams, Stefan B.; Babcock, Russell C.; Barrett, Neville S.; Johnson, Craig R.; Jordan, Alan; Kendrick, Gary A.; Pizarro, Oscar R.; Smale, Dan A.; Steinberg, Peter D.

    2015-01-01

    Despite the significance of marine habitat-forming organisms, little is known about their large-scale distribution and abundance in deeper waters, where they are difficult to access. Such information is necessary to develop sound conservation and management strategies. Kelps are main habitat-formers in temperate reefs worldwide; however, these habitats are highly sensitive to environmental change. The kelp Ecklonia radiata is the major habitat-forming organism on subtidal reefs in temperate Australia. Here, we provide large-scale ecological data encompassing the latitudinal distribution along the continent of these kelp forests, which is a necessary first step towards quantitative inferences about the effects of climatic change and other stressors on these valuable habitats. We used the Autonomous Underwater Vehicle (AUV) facility of Australia’s Integrated Marine Observing System (IMOS) to survey 157,000 m2 of seabed, of which ca 13,000 m2 were used to quantify kelp covers at multiple spatial scales (10–100 m to 100–1,000 km) and depths (15–60 m) across several regions ca 2–6° latitude apart along the East and West coast of Australia. We investigated the large-scale geographic variation in distribution and abundance of deep-water kelp (>15 m depth) and their relationships with physical variables. Kelp cover generally increased with latitude despite great variability at smaller spatial scales. Maximum depth of kelp occurrence was 40–50 m. Kelp latitudinal distribution along the continent was most strongly related to water temperature and substratum availability. These extensive survey data, coupled with ongoing AUV missions, will allow for the detection of long-term shifts in the distribution and abundance of habitat-forming kelp and the organisms they support on a continental scale, and provide information necessary for successful implementation and management of conservation reserves. PMID:25693066

  5. Application of stakeholder-based and modelling approaches for supporting robust adaptation decision making under future climatic uncertainty and changing urban-agricultural water demand

    NASA Astrophysics Data System (ADS)

    Bhave, Ajay; Dessai, Suraje; Conway, Declan; Stainforth, David

    2016-04-01

    Deep uncertainty in future climate change and socio-economic conditions necessitates the use of assess-risk-of-policy approaches over predict-then-act approaches for adaptation decision making. Robust Decision Making (RDM) approaches embody this principle and help evaluate the ability of adaptation options to satisfy stakeholder preferences under wide-ranging future conditions. This study involves the simultaneous application of two RDM approaches; qualitative and quantitative, in the Cauvery River Basin in Karnataka (population ~23 million), India. The study aims to (a) determine robust water resources adaptation options for the 2030s and 2050s and (b) compare the usefulness of a qualitative stakeholder-driven approach with a quantitative modelling approach. For developing a large set of future scenarios a combination of climate narratives and socio-economic narratives was used. Using structured expert elicitation with a group of climate experts in the Indian Summer Monsoon, climatic narratives were developed. Socio-economic narratives were developed to reflect potential future urban and agricultural water demand. In the qualitative RDM approach, a stakeholder workshop helped elicit key vulnerabilities, water resources adaptation options and performance criteria for evaluating options. During a second workshop, stakeholders discussed and evaluated adaptation options against the performance criteria for a large number of scenarios of climatic and socio-economic change in the basin. In the quantitative RDM approach, a Water Evaluation And Planning (WEAP) model was forced by precipitation and evapotranspiration data, coherent with the climatic narratives, together with water demand data based on socio-economic narratives. We find that compared to business-as-usual conditions options addressing urban water demand satisfy performance criteria across scenarios and provide co-benefits like energy savings and reduction in groundwater depletion, while options reducing agricultural water demand significantly affect downstream water availability. Water demand options demonstrate potential to improve environmental flow conditions and satisfy legal water supply requirements for downstream riparian states. On the other hand, currently planned large scale infrastructural projects demonstrate reduced value in certain scenarios, illustrating the impacts of lock-in effects of large scale infrastructure. From a methodological perspective, we find that while the stakeholder-driven approach revealed robust options in a resource-light manner and helped initiate much needed interaction amongst stakeholders, the modelling approach provides complementary quantitative information. The study reveals robust adaptation options for this important basin and provides a strong methodological basis for carrying out future studies that support adaptation decision making.

  6. Quantitative extraction of the bedrock exposure rate based on unmanned aerial vehicle data and Landsat-8 OLI image in a karst environment

    NASA Astrophysics Data System (ADS)

    Wang, Hongyan; Li, Qiangzi; Du, Xin; Zhao, Longcai

    2017-12-01

    In the karst regions of southwest China, rocky desertification is one of the most serious problems in land degradation. The bedrock exposure rate is an important index to assess the degree of rocky desertification in karst regions. Because of the inherent merits of macro-scale, frequency, efficiency, and synthesis, remote sensing is a promising method to monitor and assess karst rocky desertification on a large scale. However, actual measurement of the bedrock exposure rate is difficult and existing remote-sensing methods cannot directly be exploited to extract the bedrock exposure rate owing to the high complexity and heterogeneity of karst environments. Therefore, using unmanned aerial vehicle (UAV) and Landsat-8 Operational Land Imager (OLI) data for Xingren County, Guizhou Province, quantitative extraction of the bedrock exposure rate based on multi-scale remote-sensing data was developed. Firstly, we used an object-oriented method to carry out accurate classification of UAV images. From the results of rock extraction, the bedrock exposure rate was calculated at the 30 m grid scale. Parts of the calculated samples were used as training data; other data were used for model validation. Secondly, in each grid the band reflectivity of Landsat-8 OLI data was extracted and a variety of rock and vegetation indexes (e.g., NDVI and SAVI) were calculated. Finally, a network model was established to extract the bedrock exposure rate. The correlation coefficient of the network model was 0.855, that of the validation model was 0.677, and the root mean square error of the validation model was 0.073. This method is valuable for wide-scale estimation of the bedrock exposure rate in karst environments. Using the quantitative inversion model, a distribution map of the bedrock exposure rate in Xingren County was obtained.
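
    A simplified sketch of the second and third steps might look like this: compute vegetation indexes from the Landsat-8 OLI red and near-infrared bands (band 4 and band 5) and regress the UAV-derived bedrock exposure rate on them with a small network. The band values, the synthetic target, and the network architecture are placeholders; the paper does not specify these details.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def ndvi(nir, red):
        """Normalized difference vegetation index."""
        return (nir - red) / (nir + red + 1e-9)

    def savi(nir, red, L=0.5):
        """Soil-adjusted vegetation index with soil brightness factor L."""
        return (nir - red) * (1.0 + L) / (nir + red + L)

    # Hypothetical per-grid predictors from Landsat-8 OLI surface reflectance
    # (band 4 = red, band 5 = NIR); the target y stands in for the bedrock
    # exposure rate derived from the classified UAV imagery at 30 m grids.
    rng = np.random.default_rng(2)
    red = rng.uniform(0.05, 0.35, 500)
    nir = rng.uniform(0.10, 0.55, 500)
    X = np.column_stack([red, nir, ndvi(nir, red), savi(nir, red)])
    y = np.clip(0.8 - 1.5 * ndvi(nir, red) + 0.05 * rng.normal(size=500), 0.0, 1.0)

    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X[:400], y[:400])
    print("validation R^2:", model.score(X[400:], y[400:]))
    ```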

  7. How Many Are Enough? A Quantitative Analysis of the Effects of the Number of Response Options on the Academic Performance of Students with Disabilities on Large-Scale Assessments

    ERIC Educational Resources Information Center

    Freeman, Sarah Reives

    2013-01-01

    The main focus of this study is to determine the effect of test design on the academic performance of students with disabilities participating in the NCEXTEND2 modified assessment program during the 2010-2011 school year. Participation of all students in state and federal accountability measure is required by No Child Left Behind (2001) and the…

  8. A Large-Scale Quantitative Proteomic Approach to Identifying Sulfur Mustard-Induced Protein Phosphorylation Cascades

    DTIC Science & Technology

    2010-01-01

    snapshot of SM-induced toxicity. Over the past few years, innovations in systems biology and biotechnology have led to important advances in our under...perturbations. SILAC has been used to study tumor metastasis (3, 4), focal adhesion-associated proteins, growth factor signaling, and insulin regulation (5...stained with colloidal Coomassie blue. After it was destained, the gel lane was excised into six regions, and each region was cut into 1 mm cubes

  9. Putative regulatory sites unraveled by network-embedded thermodynamic analysis of metabolome data

    PubMed Central

    Kümmel, Anne; Panke, Sven; Heinemann, Matthias

    2006-01-01

    As one of the most recent members of the omics family, large-scale quantitative metabolomics data are currently complementing our systems biology data pool and offer the chance to integrate the metabolite level into the functional analysis of cellular networks. Network-embedded thermodynamic analysis (NET analysis) is presented as a framework for mechanistic and model-based analysis of these data. By coupling the data to an operating metabolic network via the second law of thermodynamics and the metabolites' Gibbs energies of formation, NET analysis allows inferring functional principles from quantitative metabolite data; for example it identifies reactions that are subject to active allosteric or genetic regulation as exemplified with quantitative metabolite data from Escherichia coli and Saccharomyces cerevisiae. Moreover, the optimization framework of NET analysis was demonstrated to be a valuable tool to systematically investigate data sets for consistency, for the extension of sub-omic metabolome data sets and for resolving intracompartmental concentrations from cell-averaged metabolome data. Without requiring any kind of kinetic modeling, NET analysis represents a perfectly scalable and unbiased approach to uncover insights from quantitative metabolome data. PMID:16788595
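
    The second-law constraint at the heart of NET analysis can be stated compactly as follows (illustrative notation only; the published framework additionally handles compartments and transformed Gibbs energies at physiological conditions):

    ```latex
    % Thermodynamic feasibility constraint imposed on every reaction j that
    % carries net flux v_j in the operating metabolic network
    \Delta_r G_j = \Delta_r G_j^{\circ\prime} + RT \sum_i \nu_{ij} \ln c_i ,
    \qquad
    \operatorname{sign}(\Delta_r G_j) = -\operatorname{sign}(v_j) ,
    ```

    where $\nu_{ij}$ are stoichiometric coefficients, $c_i$ the metabolite concentrations (activities), and $\Delta_r G_j^{\circ\prime}$ the transformed standard Gibbs energy of reaction; NET analysis searches for concentration sets consistent with these constraints and flags reactions whose measured substrate/product ratios keep them far from equilibrium as candidate regulatory sites.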

  10. The impact of library services in primary care trusts in NHS North West England: a large-scale retrospective quantitative study of online resource usage in relation to types of service.

    PubMed

    Bell, Katherine; Glover, Steven William; Brodie, Colin; Roberts, Anne; Gleghorn, Colette

    2009-06-01

    Within NHS North West England there are 24 primary care trusts (PCTs), all with access to different types of library services. This study aims to evaluate the impact the type of library service has on online resource usage. We conducted a large-scale retrospective quantitative study across all PCT staff in NHS NW England using Athens sessions log data. We studied the Athens log usage of 30,381 staff, with 8,273 active Athens accounts and 100,599 sessions from 1 January 2007 to 31 December 2007. In 2007, PCTs with outreach librarians achieved 43% penetration of staff with active Athens accounts compared with PCTs with their own library service (28.23%); PCTs with service level agreements (SLAs) with acute hospital library services (22.5%) and with no library service (19.68%). This pattern was also observed when we looked at the average number of Athens user sessions per person, and usage of Dialog Datastar databases and Proquest full text journal collections. Our findings have shown a correlation of e-resource usage and type of library service. Outreach librarians have proved to be an efficient model for promoting and driving up resources usage. PCTs with no library service have shown the lowest level of resource usage.

  11. Quantifying the Hierarchical Order in Self-Aligned Carbon Nanotubes from Atomic to Micrometer Scale.

    PubMed

    Meshot, Eric R; Zwissler, Darwin W; Bui, Ngoc; Kuykendall, Tevye R; Wang, Cheng; Hexemer, Alexander; Wu, Kuang Jen J; Fornasiero, Francesco

    2017-06-27

    Fundamental understanding of structure-property relationships in hierarchically organized nanostructures is crucial for the development of new functionality, yet quantifying structure across multiple length scales is challenging. In this work, we used nondestructive X-ray scattering to quantitatively map the multiscale structure of hierarchically self-organized carbon nanotube (CNT) "forests" across 4 orders of magnitude in length scale, from 2.0 Å to 1.5 μm. Fully resolved structural features include the graphitic honeycomb lattice and interlayer walls (atomic), CNT diameter (nano), as well as the greater CNT ensemble (meso) and large corrugations (micro). Correlating orientational order across hierarchical levels revealed a cascading decrease as we probed finer structural feature sizes with enhanced sensitivity to small-scale disorder. Furthermore, we established qualitative relationships for single-, few-, and multiwall CNT forest characteristics, showing that multiscale orientational order is directly correlated with number density spanning 10⁹-10¹² cm⁻², yet order is inversely proportional to CNT diameter, number of walls, and atomic defects. Lastly, we captured and quantified ultralow-q meridional scattering features and built a phenomenological model of the large-scale CNT forest morphology, which predicted and confirmed that these features arise due to microscale corrugations along the vertical forest direction. Providing detailed structural information at multiple length scales is important for design and synthesis of CNT materials as well as other hierarchically organized nanostructures.
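
    As a rough illustration of how orientational order can be extracted from an azimuthal intensity scan of a scattering feature, the sketch below computes a simple two-dimensional order parameter from a synthetic profile. The profile, the estimator, and its normalization are assumptions for illustration; the paper's exact estimator may differ.

```python
# Hedged sketch: quantify alignment from an azimuthal intensity profile I(phi)
# using the 2D order parameter S = 2<cos^2 phi> - 1 (1 = perfectly aligned,
# 0 = isotropic). The profile below is synthetic, not measured data.
import numpy as np

phi = np.linspace(-np.pi / 2, np.pi / 2, 361)    # azimuthal angle from the alignment axis, rad
I = np.exp(-(phi / 0.2) ** 2) + 0.1              # synthetic aligned peak + isotropic background

cos2 = np.sum(I * np.cos(phi) ** 2) / np.sum(I)  # intensity-weighted <cos^2 phi>
S = 2.0 * cos2 - 1.0
print(f"2D orientational order parameter: {S:.2f}")
```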

  12. Knowledge-Guided Robust MRI Brain Extraction for Diverse Large-Scale Neuroimaging Studies on Humans and Non-Human Primates

    PubMed Central

    Wang, Yaping; Nie, Jingxin; Yap, Pew-Thian; Li, Gang; Shi, Feng; Geng, Xiujuan; Guo, Lei; Shen, Dinggang

    2014-01-01

    Accurate and robust brain extraction is a critical step in most neuroimaging analysis pipelines. In particular, for large-scale multi-site neuroimaging studies involving a significant number of subjects with diverse age and diagnostic groups, automatic and consistent brain extraction is highly desirable. In this paper, we introduce population-specific probability maps to guide the brain extraction of diverse subject groups, including both healthy and diseased adult human populations, both developing and aging human populations, as well as non-human primates. Specifically, the proposed method combines an atlas-based approach, for coarse skull-stripping, with a deformable-surface-based approach that is guided by local intensity information and population-specific prior information learned from a set of real brain images for more localized refinement. Comprehensive quantitative evaluations were performed on the diverse large-scale populations of ADNI dataset with over 800 subjects (55∼90 years of age, multi-site, various diagnosis groups), OASIS dataset with over 400 subjects (18∼96 years of age, wide age range, various diagnosis groups), and NIH pediatrics dataset with 150 subjects (5∼18 years of age, multi-site, wide age range as a complementary age group to the adult dataset). The results demonstrate that our method consistently yields the best overall results across almost the entire human life span, with only a single set of parameters. To demonstrate its capability to work on non-human primates, the proposed method is further evaluated using a rhesus macaque dataset with 20 subjects. Quantitative comparisons with popularly used state-of-the-art methods, including BET, Two-pass BET, BET-B, BSE, HWA, ROBEX and AFNI, demonstrate that the proposed method performs favorably, with superior performance on all testing datasets, indicating its robustness and effectiveness. PMID:24489639

  13. Bridging the gap between small and large scale sediment budgets? - A scaling challenge in the Upper Rhone Basin, Switzerland

    NASA Astrophysics Data System (ADS)

    Schoch, Anna; Blöthe, Jan; Hoffmann, Thomas; Schrott, Lothar

    2016-04-01

    A large number of sediment budgets have been compiled on different temporal and spatial scales in alpine regions. Detailed sediment budgets based on the quantification of a number of sediment storages (e.g. talus cones, moraine deposits) exist only for a few small scale drainage basins (up to 10² km²). In contrast, large scale sediment budgets (> 10³ km²) consider only long term sediment sinks such as valley fills and lakes. To date, these studies have often neglected small scale sediment storages in the headwaters. However, the significance of these sediment storages has been reported. A quantitative verification of whether headwaters function as sediment source regions is lacking. Despite substantial transport energy in mountain environments due to steep gradients and high relief, sediment flux in large river systems is frequently disconnected from alpine headwaters. This leads to significant storage of coarse-grained sediment along the flow path from rockwall source regions to large sedimentary sinks in major alpine valleys. To improve the knowledge on sediment budgets in large scale alpine catchments and to bridge the gap between small and large scale sediment budgets, we apply a multi-method approach comprising investigations on different spatial scales in the Upper Rhone Basin (URB). The URB is the largest inneralpine basin in the European Alps with a size of > 5400 km². It is a closed system with Lake Geneva acting as an ultimate sediment sink for suspended and clastic sediment. We examine the spatial pattern and volumes of sediment storages as well as the morphometry on the local and catchment-wide scale. We mapped sediment storages and bedrock in five sub-regions of the study area (Goms, Lötschen valley, Val d'Illiez, Vallée de la Liène, Turtmann valley) in the field and from high-resolution remote sensing imagery to investigate the spatial distribution of different sediment storage types (e.g. talus deposits, debris flow cones, alluvial fans). These sub-regions cover all three litho-tectonic units of the URB (Helvetic nappes, Penninic nappes, External massifs) and different catchment sizes to capture the inherent variability. Different parameters characterizing topography, surface characteristics, and vegetation cover are analyzed for each storage type. These data are then used in geostatistical models (PCA, stepwise logistic regression) to predict the spatial distribution of sediment storage for the whole URB. We further conduct morphometric analyses of the URB to gain information on the varying degree of glacial imprint and postglacial landscape evolution and their control on the spatial distribution of sediment storage in a large scale drainage basin. Geophysical methods (ground penetrating radar and electrical resistivity tomography) are applied on different sediment storage types on the local scale to estimate mean thicknesses. Additional data from published studies are used to complement our dataset. We integrate the local data in the statistical model on the spatial distribution of sediment storages for the whole URB. Hence, we can extrapolate the stored sediment volumes to the regional scale in order to bridge the gap between small and large scale studies.
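
    The statistical step described above can be sketched with a logistic model that predicts the presence of a sediment storage type from topographic predictors. Everything in the example below (predictor names, coefficients, data) is synthetic; it only illustrates the kind of model the study applies.

```python
# Hedged sketch of a logistic model for sediment-storage presence; the data
# and predictor set are synthetic placeholders, not the study's mapped dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
slope = rng.uniform(0, 45, n)            # deg
elevation = rng.uniform(1000, 3500, n)   # m a.s.l.
curvature = rng.normal(0, 1, n)

# synthetic "truth": storage more likely on steep, high, concave terrain
logit = -4 + 0.08 * slope + 0.001 * (elevation - 1000) - 0.5 * curvature
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([slope, elevation, curvature])
model = LogisticRegression(max_iter=1000).fit(X, presence)
print("coefficients:", dict(zip(["slope", "elevation", "curvature"], model.coef_[0])))
print("storage probability for a 35 deg, 2500 m, planar cell:",
      model.predict_proba([[35, 2500, 0]])[0, 1])
```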

  14. Quantitative Analysis of Repertoire-Scale Immunoglobulin Properties in Vaccine-Induced B-Cell Responses

    DTIC Science & Technology

    2017-05-10

    repertoire-wide properties. Finally, through the use of appropriate statistical analyses, the repertoire profiles can be quantitatively compared and...cell response to eVLP and quantitatively compare GC B-cell repertoires from immunization conditions. We partitioned the resulting clonotype... Quantitative analysis of repertoire-scale immunoglobulin properties in vaccine-induced B-cell responses Ilja V. Khavrutskii, Sidhartha Chaudhury

  15. Proposal for a quantitative index of flood disasters.

    PubMed

    Feng, Lihua; Luo, Gaoyuan

    2010-07-01

    Drawing on calculations of wind scale and earthquake magnitude, this paper develops a new quantitative method for measuring flood magnitude and disaster intensity. Flood magnitude is the quantitative index that describes the scale of a flood; the flood's disaster intensity is the quantitative index describing the losses caused. Both indices have numerous theoretical and practical advantages with definable concepts and simple applications, which lend them key practical significance.

  16. Ketamine as a novel treatment for major depressive disorder and bipolar depression: a systematic review and quantitative meta-analysis.

    PubMed

    Lee, Ellen E; Della Selva, Megan P; Liu, Anson; Himelhoch, Seth

    2015-01-01

    Given the significant disability, morbidity and mortality associated with depression, the promising recent trials of ketamine highlight a novel intervention. A meta-analysis was conducted to assess the efficacy of ketamine in comparison with placebo for the reduction of depressive symptoms in patients who meet criteria for a major depressive episode. Two electronic databases were searched in September 2013 for English-language studies that were randomized placebo-controlled trials of ketamine treatment for patients with major depressive disorder or bipolar depression and utilized a standardized rating scale. Studies including participants receiving electroconvulsive therapy and adolescent/child participants were excluded. Five studies were included in the quantitative meta-analysis. The quantitative meta-analysis showed that ketamine significantly reduced depressive symptoms. The overall effect size at day 1 was large and statistically significant with an overall standardized mean difference of 1.01 (95% confidence interval 0.69-1.34) (P<.001), with the effects sustained at 7 days postinfusion. The heterogeneity of the studies was low and not statistically significant, and the funnel plot showed no publication bias. The large and statistically significant effect of ketamine on depressive symptoms supports a promising, new and effective pharmacotherapy with rapid onset, high efficacy and good tolerability. Copyright © 2015. Published by Elsevier Inc.
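
    The pooling step of such a meta-analysis can be sketched with inverse-variance weighting of standardized mean differences. The per-study effect sizes and variances below are synthetic placeholders, not the values from the five ketamine trials.

```python
# Hedged sketch of fixed-effect inverse-variance pooling with heterogeneity
# statistics; the per-study standardized mean differences (SMD) are synthetic.
import numpy as np

smd = np.array([1.2, 0.9, 1.1, 0.8, 1.0])     # per-study effect sizes (illustrative)
var = np.array([0.10, 0.08, 0.12, 0.09, 0.11])

w = 1.0 / var                                  # inverse-variance weights
pooled = np.sum(w * smd) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)

# Cochran's Q and I^2 quantify between-study heterogeneity
Q = np.sum(w * (smd - pooled) ** 2)
I2 = max(0.0, (Q - (len(smd) - 1)) / Q) * 100
print(f"pooled SMD = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), I^2 = {I2:.0f}%")
```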

  17. Preservation of large-scale chromatin structure in FISH experiments

    PubMed Central

    Hepperger, Claudia; Otten, Simone; von Hase, Johann

    2006-01-01

    The nuclear organization of specific endogenous chromatin regions can be investigated only by fluorescence in situ hybridization (FISH). One of the two fixation procedures is typically applied: (1) buffered formaldehyde or (2) hypotonic shock with methanol acetic acid fixation followed by dropping of nuclei on glass slides and air drying. In this study, we compared the effects of these two procedures and some variations on nuclear morphology and on FISH signals. We analyzed mouse erythroleukemia and mouse embryonic stem cells because their clusters of subcentromeric heterochromatin provide an easy means to assess preservation of chromatin. Qualitative and quantitative analyses revealed that formaldehyde fixation provided good preservation of large-scale chromatin structures, while classical methanol acetic acid fixation after hypotonic treatment severely impaired nuclear shape and led to disruption of chromosome territories, heterochromatin structures, and large transgene arrays. Our data show that such preparations do not faithfully reflect in vivo nuclear architecture. PMID:17119992

  18. Allan deviation analysis of financial return series

    NASA Astrophysics Data System (ADS)

    Hernández-Pérez, R.

    2012-05-01

    We perform a scaling analysis for the return series of different financial assets applying the Allan deviation (ADEV), which is used in time and frequency metrology to quantitatively characterize the stability of frequency standards, since it has been demonstrated to be a robust quantity for analyzing fluctuations of non-stationary time series over different observation intervals. The data used are opening price daily series for assets from different markets during a time span of around ten years. We found that the ADEV results for the return series at short scales resemble those expected for an uncorrelated series, consistent with the efficient market hypothesis. On the other hand, the ADEV results for absolute return series at short scales (first one or two decades) decrease following approximately a scaling relation up to a point that differs for almost every asset, after which the ADEV deviates from scaling, which suggests that the presence of clustering, long-range dependence and non-stationarity signatures in the series drives the results for large observation intervals.
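
    A minimal implementation of the Allan deviation for a return series is sketched below, using synthetic uncorrelated returns, for which the ADEV is expected to fall roughly as the inverse square root of the observation interval.

```python
# Hedged sketch of the Allan deviation computation for a return series; the
# series here is synthetic white noise, the behaviour the paper associates
# with uncorrelated returns at short scales.
import numpy as np

def allan_deviation(x, m):
    """Non-overlapping Allan deviation for an averaging window of m samples."""
    n = len(x) // m
    means = x[: n * m].reshape(n, m).mean(axis=1)       # window averages
    return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))  # ADEV definition

returns = np.random.default_rng(1).normal(0, 0.01, 4096)  # synthetic daily returns
for m in (1, 2, 4, 8, 16, 32):
    print(f"tau = {m:3d} days  ADEV = {allan_deviation(returns, m):.5f}")
```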

  19. Flow turbulence topology in regular porous media: From macroscopic to microscopic scale with direct numerical simulation

    NASA Astrophysics Data System (ADS)

    Chu, Xu; Weigand, Bernhard; Vaikuntanathan, Visakh

    2018-06-01

    Microscopic analysis of turbulence topology in a regular porous medium is presented with a series of direct numerical simulations. The regular porous media are comprised of square cylinders in a staggered array. Triply periodic boundary conditions enable efficient investigations in a representative elementary volume. Three flow patterns—channel with sudden contraction, impinging surface, and wake—are observed and studied quantitatively in contrast to the qualitative experimental studies reported in the literature. Among these, shear layers in the channel show the highest turbulence intensity due to a favorable pressure gradient and shed due to an adverse pressure gradient downstream. The turbulent energy budget indicates a strong production rate after the flow contraction and a strong dissipation on both shear and impinging walls. Energy spectra and pre-multiplied spectra detect large scale energetic structures in the shear layer and a breakup of scales in the impinging layer. However, these large scale structures break into less energetic small structures at high Reynolds number conditions. This suggests an absence of coherent structures in densely packed porous media at high Reynolds numbers. Anisotropy analysis with a barycentric map shows that the turbulence in porous media is highly isotropic in the macro-scale, which is not the case in the micro-scale. In the end, proper orthogonal decomposition is employed to distinguish the energy-conserving structures. The results support the pore scale prevalence hypothesis. However, energetic coherent structures are observed in the case with sparsely packed porous media.
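
    The proper orthogonal decomposition step mentioned above can be sketched as a snapshot POD via the singular value decomposition of the mean-subtracted snapshot matrix. The snapshot data here are random placeholders standing in for DNS velocity fields.

```python
# Hedged sketch of snapshot POD via SVD; real DNS snapshots would replace the
# random placeholder matrix below.
import numpy as np

n_points, n_snapshots = 2000, 200
U = np.random.default_rng(2).normal(size=(n_points, n_snapshots))  # velocity snapshots as columns

U_mean = U.mean(axis=1, keepdims=True)
Phi, s, Vt = np.linalg.svd(U - U_mean, full_matrices=False)

energy = s ** 2 / np.sum(s ** 2)     # fraction of fluctuating kinetic energy per mode
cumulative = np.cumsum(energy)
print("energy captured by the first 5 modes:", cumulative[4])
# Phi[:, k] is the k-th spatial POD mode; s[k] * Vt[k] are its temporal coefficients.
```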

  20. Large wood in the Snowy River estuary, Australia

    NASA Astrophysics Data System (ADS)

    Hinwood, Jon B.; McLean, Errol J.

    2017-02-01

    In this paper we report on 8 years of data collection and interpretation of large wood in the Snowy River estuary in southeastern Australia, providing quantitative data on the amount, sources, transport, decay, and geomorphic actions. No prior census data for an estuary are known to the authors, despite the environmental and economic importance of estuaries and the significant differences between a fluvial channel and an estuarine channel. Southeastern Australian estuaries contain a significant quantity of large wood that is derived from many sources, including river flood flows, local bank erosion, and anthropogenic sources. Wind and tide are shown to be as important as river flow in transporting and stranding large wood. Tidal action facilitates trapping of large wood on intertidal bars and shoals; but channels are wider and generally deeper, so log jams are less likely than in rivers. Estuarine large wood contributes to localised scour and accretion and hence to the modification of estuarine habitat, but in the study area it did not have large-scale impacts on either the hydraulic gradients or the geomorphology.

  1. Quantitative Assessment of the Relationship between p21 rs1059234 Polymorphism and Cancer Risk.

    PubMed

    Huang, Yong-Sheng; Fan, Qian-Qian; Li, Chuang; Nie, Meng; Quan, Hong-Yang; Wang, Lin

    2015-01-01

    p21 is a cyclin-dependent kinase inhibitor, which can arrest cell proliferation and serve as a tumor suppressor. Though many studies have been published assessing the relationship between the p21 rs1059234 polymorphism and various cancer risks, no definite conclusion on this association has been reached. To derive a more precise quantitative assessment of the relationship, a large scale meta-analysis of 5,963 cases and 8,405 controls from 16 eligible published case-control studies was performed. Our analysis suggested that rs1059234 was not associated with the integral cancer risk for either the dominant model [(T/T+C/T) vs C/C, OR=1.00, 95% CI: 0.84-1.18] or the recessive model [T/T vs (C/C+C/T), OR=1.03, 95% CI: 0.93-1.15]. However, further stratified analysis showed rs1059234 was strongly associated with the risk of squamous cell carcinoma of head and neck (SCCHN). Thus, larger scale primary studies are still required to further evaluate the interaction of the p21 rs1059234 polymorphism and cancer risk in specific cancer subtypes.

  2. Quantitative elemental imaging of heterogeneous catalysts using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Trichard, F.; Sorbier, L.; Moncayo, S.; Blouët, Y.; Lienemann, C.-P.; Motto-Ros, V.

    2017-07-01

    Currently, catalysis is used in almost all industrial processes; it improves productivity, synthesis yields and waste treatment, and decreases energy costs. The increasingly stringent requirements, in terms of reaction selectivity and environmental standards, impose progressively increasing accuracy and control of operations. Meanwhile, the development of characterization techniques has been challenging, and the techniques often require equipment of high complexity. In this paper, we demonstrate a novel elemental approach for performing quantitative space-resolved analysis with ppm-scale quantification limits and μm-scale resolution. This approach, based on laser-induced breakdown spectroscopy (LIBS), is distinguished by its simplicity, all-optical design, and speed of operation. This work analyzes palladium-based porous alumina catalysts, which are commonly used in the selective hydrogenation process, using the LIBS method. We report an exhaustive study of the quantification capability of LIBS and its ability to perform imaging measurements over a large dynamic range, typically from a few ppm to wt%. These results offer new insight into the use of LIBS-based imaging in industry and pave the way for innumerable applications.

  3. To what extent does immigration affect inequality?

    NASA Astrophysics Data System (ADS)

    Berman, Yonatan; Aste, Tomaso

    2016-11-01

    The current surge in income and wealth inequality in most western countries, along with the continuous immigration to those countries, demands a quantitative analysis of the effect immigration has on economic inequality. This paper presents a quantitative analysis framework providing a way to calculate this effect. It shows that in most cases, the effect of immigration on wealth and income inequality is limited, mainly due to the relatively small scale of immigration waves. For a large scale flow of immigrants, such as the immigration to the US, the UK and Australia in the past few decades, we estimate that 10%-15% of the wealth and income inequality increase can be attributed to immigration. The results demonstrate that immigration could possibly decrease inequality substantially, if the characteristics of the immigrants resemble the characteristics of the destination middle class population in terms of wealth or income. We empirically found that the simple linear relation ΔS = 0.18 ρ roughly describes the increase in the wealth share of the top 10% due to immigration of a fraction ρ of the population.
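
    A worked example of the reported empirical relation, treating both ΔS and ρ as fractions (an interpretation assumed here for illustration):

```python
# Worked example of the reported relation Delta_S = 0.18 * rho: an immigration
# wave equal to 5% of the resident population (interpretation of units assumed).
rho = 0.05
delta_S = 0.18 * rho
print(f"increase in top-10% wealth share: {delta_S:.3f} "
      f"(about {100 * delta_S:.1f} percentage points)")
```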

  4. Quantitative spatiotemporal analysis of antibody fragment diffusion and endocytic consumption in tumor spheroids.

    PubMed

    Thurber, Greg M; Wittrup, K Dane

    2008-05-01

    Antibody-based cancer treatment depends upon distribution of the targeting macromolecule throughout tumor tissue, and spatial heterogeneity could significantly limit efficacy in many cases. Antibody distribution in tumor tissue is a function of drug dosage, antigen concentration, binding affinity, antigen internalization, drug extravasation from blood vessels, diffusion in the tumor extracellular matrix, and systemic clearance rates. We have isolated the effects of a subset of these variables by live-cell microscopic imaging of single-chain antibody fragments against carcinoembryonic antigen in LS174T tumor spheroids. The measured rates of scFv penetration and retention were compared with theoretical predictions based on simple scaling criteria. The theory predicts that antibody dose must be large enough to drive a sufficient diffusive flux of antibody to overcome cellular internalization, and exposure time must be long enough to allow penetration to the spheroid center. The experimental results in spheroids are quantitatively consistent with these predictions. Therefore, simple scaling criteria can be applied to accurately predict antibody and antibody fragment penetration distance in tumor tissue.

  5. Quantitative Spatiotemporal Analysis of Antibody Fragment Diffusion and Endocytic Consumption in Tumor Spheroids

    PubMed Central

    Thurber, Greg M.; Wittrup, K. Dane

    2010-01-01

    Antibody-based cancer treatment depends upon distribution of the targeting macromolecule throughout tumor tissue, and spatial heterogeneity could significantly limit efficacy in many cases. Antibody distribution in tumor tissue is a function of drug dosage, antigen concentration, binding affinity, antigen internalization, drug extravasation from blood vessels, diffusion in the tumor extracellular matrix, and systemic clearance rates. We have isolated the effects of a subset of these variables by live-cell microscopic imaging of single-chain antibody fragments against carcinoembryonic antigen in LS174T tumor spheroids. The measured rates of scFv penetration and retention were compared with theoretical predictions based on simple scaling criteria. The theory predicts that antibody dose must be large enough to drive a sufficient diffusive flux of antibody to overcome cellular internalization, and exposure time must be long enough to allow penetration to the spheroid center. The experimental results in spheroids are quantitatively consistent with these predictions. Therefore, simple scaling criteria can be applied to accurately predict antibody and antibody fragment penetration distance in tumor tissue. PMID:18451160
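
    The two scaling criteria paraphrased in this abstract can be sketched as simple dimensionless comparisons. The parameter values and the exact form of the Thiele-modulus-like group below are illustrative assumptions, not the paper's derived criteria.

```python
# Hedged sketch of the two penetration criteria described above, with
# illustrative parameter values: (1) exposure time vs. the diffusion time
# R^2/D to reach the spheroid center, and (2) a Thiele-modulus-like ratio of
# cellular internalization to diffusive supply of antibody.
R = 200e-6              # spheroid radius, m
D = 10e-12              # scFv diffusivity in tissue, m^2/s
Ab_surf = 100e-9        # antibody concentration at the spheroid surface, mol/L
Ag = 500e-9             # accessible antigen concentration in tissue, mol/L
k_e = 1e-4              # antigen internalization rate constant, 1/s
t_exposure = 24 * 3600  # exposure time, s

t_diff = R ** 2 / D                              # characteristic diffusion time to the center
thiele_sq = (k_e * Ag * R ** 2) / (D * Ab_surf)  # internalization vs. diffusive supply

print(f"diffusion time ~ {t_diff / 3600:.1f} h vs exposure {t_exposure / 3600:.0f} h")
print(f"Thiele-modulus-like number ~ {thiele_sq:.1f} "
      "(>>1 would suggest consumption-limited penetration)")
```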

  6. Uncertainties in ecosystem service maps: a comparison on the European scale.

    PubMed

    Schulp, Catharina J E; Burkhard, Benjamin; Maes, Joachim; Van Vliet, Jasper; Verburg, Peter H

    2014-01-01

    Safeguarding the benefits that ecosystems provide to society is increasingly included as a target in international policies. To support such policies, ecosystem service maps are made. However, there is little attention to the accuracy of these maps. We made a systematic review and quantitative comparison of ecosystem service maps on the European scale to generate insights into the uncertainty of ecosystem service maps and discuss the possibilities for quantitative validation. Maps of climate regulation and recreation were reasonably similar, while large uncertainties among maps of erosion protection and flood regulation were observed. Pollination maps had a moderate similarity. Differences among the maps were caused by differences in indicator definition, level of process understanding, mapping aim, data sources and methodology. Absence of suitable observed data on ecosystem services provisioning hampers independent validation of the maps. Consequently, there are, so far, no accurate measures for ecosystem service map quality. Policy makers and other users need to be cautious when applying ecosystem service maps for decision-making. The results illustrate the need for better process understanding and data acquisition to advance ecosystem service mapping, modelling and validation.

  7. Accreting transition discs with large cavities created by X-ray photoevaporation in C and O depleted discs

    NASA Astrophysics Data System (ADS)

    Ercolano, Barbara; Weber, Michael L.; Owen, James E.

    2018-01-01

    Circumstellar discs with large dust depleted cavities and vigorous accretion on to the central star are often considered signposts for (multiple) giant planet formation. In this Letter, we show that X-ray photoevaporation operating in discs with modest (factors 3-10) gas-phase depletion of carbon and oxygen at large radii ( > 15 au) yields the inner radius and accretion rates for most of the observed discs, without the need to invoke giant planet formation. We present one-dimensional viscous evolution models of discs affected by X-ray photoevaporation assuming moderate gas-phase depletion of carbon and oxygen, well within the range reported by recent observations. Our models use a simplified prescription for scaling the X-ray photoevaporation rates and profiles at different metallicity, and our quantitative result depends on this scaling. While more rigorous hydrodynamical modelling of mass-loss profiles at low metallicities is required to constrain the observational parameter space that can be explained by our models, the general conclusion that metal sequestering at large radii may be responsible for the observed diversity of transition discs is shown to be robust. Gap opening by giant planet formation may still be responsible for a number of observed transition discs with large cavities and very high accretion rate.

  8. Anomalous transport theory for the reversed field pinch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terry, P.W.; Hegna, C.C; Sovinec, C.R.

    1996-09-01

    Physically motivated transport models with predictive capabilities and significance beyond the reversed field pinch (RFP) are presented. It is shown that the ambipolar constrained electron heat loss observed in MST can be quantitatively modeled by taking account of the clumping in parallel streaming electrons and the resultant self-consistent interaction with collective modes; that the discrete dynamo process is a relaxation oscillation whose dependence on the tearing instability and profile relaxation physics leads to amplitude and period scaling predictions consistent with experiment; that the Lundquist number scaling in relaxed plasmas driven by magnetic turbulence has a weak S^(-1/4) scaling; and that radial E×B shear flow can lead to large reductions in the edge particle flux with little change in the heat flux, as observed in the RFP and tokamak. 24 refs.

  9. The Role of Endocytosis during Morphogenetic Signaling

    PubMed Central

    Gonzalez-Gaitan, Marcos; Jülicher, Frank

    2014-01-01

    Morphogens are signaling molecules that are secreted by a localized source and spread in a target tissue where they are involved in the regulation of growth and patterning. Both the activity of morphogenetic signaling and the kinetics of ligand spreading in a tissue depend on endocytosis and intracellular trafficking. Here, we review quantitative approaches to study how large-scale morphogen profiles and signals emerge in a tissue from cellular trafficking processes and endocytic pathways. Starting from the kinetics of endosomal networks, we discuss the role of cellular trafficking and receptor dynamics in the formation of morphogen gradients. These morphogen gradients scale during growth, which implies that overall tissue size influences cellular trafficking kinetics. Finally, we discuss how such morphogen profiles can be used to control tissue growth. We emphasize the role of theory in efforts to bridge between scales. PMID:24984777

  10. A Quantitative Socio-hydrological Characterization of Water Security in Large-Scale Irrigation Systems

    NASA Astrophysics Data System (ADS)

    Siddiqi, A.; Muhammad, A.; Wescoat, J. L., Jr.

    2017-12-01

    Large-scale, legacy canal systems, such as the irrigation infrastructure in the Indus Basin in Punjab, Pakistan, have been primarily conceived, constructed, and operated with a techno-centric approach. The emerging socio-hydrological approaches provide a new lens for studying such systems to potentially identify fresh insights for addressing contemporary challenges of water security. In this work, using the partial definition of water security as "the reliable availability of an acceptable quantity and quality of water", supply reliability is construed as a partial measure of water security in irrigation systems. A set of metrics are used to quantitatively study reliability of surface supply in the canal systems of Punjab, Pakistan using an extensive dataset of 10-daily surface water deliveries over a decade (2007-2016) and of high frequency (10-minute) flow measurements over one year. The reliability quantification is based on comparison of actual deliveries and entitlements, which are a combination of hydrological and social constructs. The socio-hydrological lens highlights critical issues of how flows are measured, monitored, perceived, and experienced from the perspective of operators (government officials) and users (farmers). The analysis reveals varying levels of reliability (and by extension security) of supply when data is examined across multiple temporal and spatial scales. The results shed new light on the evolution of water security (as partially measured by supply reliability) for surface irrigation in the Punjab province of Pakistan and demonstrate that "information security" (defined as reliable availability of sufficiently detailed data) is vital for enabling water security. It is found that forecasting and management (that are social processes) lead to differences between entitlements and actual deliveries, and there is significant potential to positively affect supply reliability through interventions in the social realm.
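
    One simple reliability metric of the kind described above is the fraction of 10-day periods in which actual deliveries fall within a tolerance of the entitlement. The sketch below uses synthetic deliveries and an assumed 20% tolerance; the study's metric definitions may differ.

```python
# Hedged sketch of a supply-reliability metric: share of 10-day periods with
# deliveries within a tolerance of entitlement, plus the overall volumetric
# delivery ratio. All values are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
entitlement = np.full(36, 100.0)                     # 36 ten-day periods, arbitrary volume units
delivered = entitlement * rng.normal(0.9, 0.15, 36)  # synthetic actual deliveries

tolerance = 0.20
within = np.abs(delivered - entitlement) <= tolerance * entitlement
reliability = within.mean()
volumetric_ratio = delivered.sum() / entitlement.sum()
print(f"period reliability = {reliability:.2f}, volumetric delivery ratio = {volumetric_ratio:.2f}")
```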

  11. Harnessing Connectivity in a Large-Scale Small-Molecule Sensitivity Dataset | Office of Cancer Genomics

    Cancer.gov

    Identifying genetic alterations that prime a cancer cell to respond to a particular therapeutic agent can facilitate the development of precision cancer medicines. Cancer cell-line (CCL) profiling of small-molecule sensitivity has emerged as an unbiased method to assess the relationships between genetic or cellular features of CCLs and small-molecule response. Here, we developed annotated cluster multidimensional enrichment analysis to explore the associations between groups of small molecules and groups of CCLs in a new, quantitative sensitivity dataset.

  12. Transnational Terrorism in East Africa: A Qualitative and Quantitative Analysis of the Recent Rise in Kenyan Violence

    DTIC Science & Technology

    2014-06-01

    vibrant coastal beach tourism industry that is at odds with the locally dominant Islamic religion and culture; the perception that the country's...other large scale cross-border attack by Al-Shabaab was the July 2010 twin bombing of a popular sports bar and restaurant in the Ugandan capital...Kenyan nationals from neighboring countries, as the majority of them frequently travel by road to and from Kenya for business, leisure, and tourism

  13. Recent advances in research on climate and human conflict

    NASA Astrophysics Data System (ADS)

    Hsiang, S. M.

    2014-12-01

    A rapidly growing body of empirical, quantitative research examines whether rates of human conflict can be systematically altered by climatic changes. We discuss recent advances in this field, including Bayesian meta-analyses of the effect of temperature and rainfall on current and future large-scale conflicts, the impact of climate variables on gang violence and suicides in Mexico, and probabilistic projections of personal violence and property crime in the United States under RCP scenarios. Criticisms of this research field will also be explained and addressed.

  14. Quantitative mapping of rainfall rates over the oceans utilizing Nimbus-5 ESMR data

    NASA Technical Reports Server (NTRS)

    Rao, M. S. V.; Abbott, W. V.

    1976-01-01

    The electrically scanning microwave radiometer (ESMR) data from the Nimbus 5 satellite was used to deduce estimates of precipitation amount over the oceans. An atlas of the global oceanic rainfall was prepared and the global rainfall maps analyzed and related to available ground truth information as well as to large scale processes in the atmosphere. It was concluded that the ESMR system provides the most reliable and direct approach yet known for the estimation of rainfall over sparsely documented, wide oceanic regions.

  15. Hydrodynamic predictions for 5.44 TeV Xe+Xe collisions

    NASA Astrophysics Data System (ADS)

    Giacalone, Giuliano; Noronha-Hostler, Jacquelyn; Luzum, Matthew; Ollitrault, Jean-Yves

    2018-03-01

    We argue that relativistic hydrodynamics is able to make robust predictions for soft particle production in Xe+Xe collisions at the CERN Large Hadron Collider (LHC). The change of system size from Pb+Pb to Xe+Xe provides a unique opportunity to test the scaling laws inherent to fluid dynamics. Using event-by-event hydrodynamic simulations, we make quantitative predictions for several observables: mean transverse momentum, anisotropic flow coefficients, and their fluctuations. Results are shown as a function of collision centrality.

  16. Sampling large landscapes with small-scale stratification-User's Manual

    USGS Publications Warehouse

    Bart, Jonathan

    2011-01-01

    This manual explains procedures for partitioning a large landscape into plots, assigning the plots to strata, and selecting plots in each stratum to be surveyed. These steps are referred to as the "sampling large landscapes (SLL) process." We assume that users of the manual have a moderate knowledge of ArcGIS and Microsoft® Excel. The manual is written for a single user but in many cases, some steps will be carried out by a biologist designing the survey and some steps will be carried out by a quantitative assistant. Thus, the manual essentially may be passed back and forth between these users. The SLL process primarily has been used to survey birds, and we refer to birds as subjects of the counts. The process, however, could be used to count any objects.

  17. Quantifying patterns of research interest evolution

    NASA Astrophysics Data System (ADS)

    Jia, Tao; Wang, Dashun; Szymanski, Boleslaw

    Changing and shifting research interest is an integral part of a scientific career. Despite extensive investigations of various factors that influence a scientist's choice of research topics, quantitative assessments of mechanisms that give rise to macroscopic patterns characterizing research interest evolution of individual scientists remain limited. Here we perform a large-scale analysis of extensive publication records, finding that research interest change follows a reproducible pattern characterized by an exponential distribution. We identify three fundamental features responsible for the observed exponential distribution, which arise from a subtle interplay between exploitation and exploration in research interest evolution. We develop a random walk based model, which adequately reproduces our empirical observations. Our study presents one of the first quantitative analyses of macroscopic patterns governing research interest change, documenting a high degree of regularity underlying scientific research and individual careers.
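
    A minimal sketch of fitting an exponential distribution to interest-change values is shown below; the data are synthetic and the paper's actual interest-change measure is not reproduced here.

```python
# Hedged sketch: maximum-likelihood fit of an exponential distribution to a set
# of per-scientist research-interest-change values (synthetic placeholders).
import numpy as np

changes = np.random.default_rng(4).exponential(scale=0.3, size=10_000)
lam = 1.0 / changes.mean()   # MLE rate of an exponential distribution
print(f"fitted rate lambda = {lam:.2f}  (P(change > 0.5) ~ {np.exp(-lam * 0.5):.3f})")
```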

  18. Assessing signal-to-noise in quantitative proteomics: multivariate statistical analysis in DIGE experiments.

    PubMed

    Friedman, David B

    2012-01-01

    All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable a relevant biological signal to be distinguished from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
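
    A minimal sketch of this multivariate workflow, assuming a placeholder matrix of standardized log spot volumes, is shown below; replicates of the same condition should cluster in the PCA score plot, while outliers or fouled samples separate along the leading components.

```python
# Hedged sketch of PCA on a DIGE-style sample-by-spot abundance matrix; the
# data matrix is a random placeholder with one synthetic condition effect.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n_samples, n_spots = 12, 800                 # e.g. 2 conditions x 6 biological replicates
X = rng.normal(size=(n_samples, n_spots))    # standardized log spot volumes (placeholder)
X[:6, :50] += 1.0                            # synthetic condition effect in 50 spots

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
# Samples from the same condition should group together in these scores;
# a sample far from its group flags a potential outlier or fouled gel.
print(scores.round(2))
```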

  19. Combining Phage and Yeast Cell Surface Antibody Display to Identify Novel Cell Type-Selective Internalizing Human Monoclonal Antibodies.

    PubMed

    Bidlingmaier, Scott; Su, Yang; Liu, Bin

    2015-01-01

    Using phage antibody display, large libraries can be generated and screened to identify monoclonal antibodies with affinity for target antigens. However, while library size and diversity is an advantage of the phage display method, there is limited ability to quantitatively enrich for specific binding properties such as affinity. One way of overcoming this limitation is to combine the scale of phage display selections with the flexibility and quantitativeness of FACS-based yeast surface display selections. In this chapter we describe protocols for generating yeast surface antibody display libraries using phage antibody display selection outputs as starting material and FACS-based enrichment of target antigen-binding clones from these libraries. These methods should be widely applicable for the identification of monoclonal antibodies with specific binding properties.

  20. Photosynthetic Control of Atmospheric Carbonyl Sulfide during the Growing Season

    NASA Technical Reports Server (NTRS)

    Campbell, J. Elliott; Carmichael, Gregory R.; Chai, T.; Mena-Carrasco, M.; Tang, Y.; Blake, D. R.; Blake, N. J.; Vay, Stephanie A.; Collatz, G. James; Baker, I.

    2008-01-01

    Climate models incorporate photosynthesis-climate feedbacks, yet we lack robust tools for large-scale assessments of these processes. Recent work suggests that carbonyl sulfide (COS), a trace gas consumed by plants, could provide a valuable constraint on photosynthesis. Here we analyze airborne observations of COS and carbon dioxide concentrations during the growing season over North America with a three-dimensional atmospheric transport model. We successfully modeled the persistent vertical drawdown of atmospheric COS using the quantitative relation between COS and photosynthesis that has been measured in plant chamber experiments. Furthermore, this drawdown is driven by plant uptake rather than other continental and oceanic fluxes in the model. These results provide quantitative evidence that COS gradients in the continental growing season may have broad use as a measurement-based photosynthesis tracer.

  1. A Systematic Review of Barriers and Facilitators to Minority Research Participation Among African Americans, Latinos, Asian Americans, and Pacific Islanders

    PubMed Central

    Duran, Nelida; Norris, Keith

    2014-01-01

    To assess the experienced or perceived barriers and facilitators to health research participation for major US racial/ethnic minority populations, we conducted a systematic review of qualitative and quantitative studies from a search on PubMed and Web of Science from January 2000 to December 2011. With 44 articles included in the review, we found distinct and shared barriers and facilitators. Despite different expressions of mistrust, all groups represented in these studies were willing to participate for altruistic reasons embedded in cultural and community priorities. Greater comparative understanding of barriers and facilitators to racial/ethnic minorities’ research participation can improve population-specific recruitment and retention strategies and could better inform future large-scale prospective quantitative and in-depth ethnographic studies. PMID:24328648

  2. Quantitative Measurement of Local Infrared Absorption and Dielectric Function with Tip-Enhanced Near-Field Microscopy.

    PubMed

    Govyadinov, Alexander A; Amenabar, Iban; Huth, Florian; Carney, P Scott; Hillenbrand, Rainer

    2013-05-02

    Scattering-type scanning near-field optical microscopy (s-SNOM) and Fourier transform infrared nanospectroscopy (nano-FTIR) are emerging tools for nanoscale chemical material identification. Here, we push s-SNOM and nano-FTIR one important step further by enabling them to quantitatively measure local dielectric constants and infrared absorption. Our technique is based on an analytical model, which allows for a simple inversion of the near-field scattering problem. It yields the dielectric permittivity and absorption of samples with 2 orders of magnitude improved spatial resolution compared to far-field measurements and is applicable to a large class of samples including polymers and biological matter. We verify the capabilities by determining the local dielectric permittivity of a PMMA film from nano-FTIR measurements, which is in excellent agreement with far-field ellipsometric data. We further obtain local infrared absorption spectra with unprecedented accuracy in peak position and shape, which is the key to quantitative chemometrics on the nanometer scale.

  3. MathPatch - Raising Retention and Performance in an Intro-geoscience Class by Raising Students' Quantitative Skills

    NASA Astrophysics Data System (ADS)

    Baer, E. M.; Whittington, C.; Burn, H.

    2008-12-01

    The geological sciences are fundamentally quantitative. However, the diversity of students' mathematical preparation and skills makes the successful use of quantitative concepts difficult in introductory level classes. At Highline Community College, we have implemented a one-credit co-requisite course to give students supplemental instruction for quantitative skills used in the course. The course, formally titled "Quantitative Geology," nicknamed "MathPatch," runs parallel to our introductory Physical Geology course. MathPatch teaches the quantitative skills required for the geology class right before they are needed. Thus, students learn only the skills they need and are given opportunities to apply them immediately. Topics include complex-graph reading, unit conversions, large numbers, scientific notation, scale and measurement, estimation, powers of 10, and other fundamental mathematical concepts used in basic geological concepts. Use of this course over the past 8 years has successfully accomplished the goals of increasing students' quantitative skills, success and retention. Students master the quantitative skills to a greater extent than before the course was implemented, and less time is spent covering basic quantitative skills in the classroom. Because the course supports the use of quantitative skills, the large number of faculty that teach Geology 101 are more comfortable in using quantitative analysis, and indeed see it as an expectation of the course at Highline. Also significant, retention in the geology course has increased substantially, from 75% to 85%. Although successful, challenges persist with requiring MathPatch as a supplementary course. First, we have seen enrollments decrease in Geology 101, which may be the result of adding this co-requisite. Students resist mandatory enrollment in the course, although they are not good at evaluating their own need for the course. The logistics of utilizing MathPatch in an evening class with fewer, longer class meetings have been challenging. Finally, in order to better serve our students' needs, we began to offer on-line sections of MathPatch; this mode of instruction is not as clearly effective, although it is very popular. Through the new The Math You Need project, we hope to improve the effectiveness of the on-line instruction so it can provide comparable results to the face-to-face sections of this class.

  4. Will Systems Biology Deliver Its Promise and Contribute to the Development of New or Improved Vaccines? What Really Constitutes the Study of "Systems Biology" and How Might Such an Approach Facilitate Vaccine Design.

    PubMed

    Germain, Ronald N

    2017-10-16

    A dichotomy exists in the field of vaccinology about the promise versus the hype associated with application of "systems biology" approaches to rational vaccine design. Some feel it is the only way to efficiently uncover currently unknown parameters controlling desired immune responses or discover what elements actually mediate these responses. Others feel that traditional experimental, often reductionist, methods for incrementally unraveling complex biology provide a more solid way forward, and that "systems" approaches are costly ways to collect data without gaining true insight. Here I argue that both views are inaccurate. This is largely because of confusion about what can be gained from classical experimentation versus statistical analysis of large data sets (bioinformatics) versus methods that quantitatively explain emergent properties of complex assemblies of biological components, with the latter reflecting what was previously called "physiology." Reductionist studies will remain essential for generating detailed insight into the functional attributes of specific elements of biological systems, but such analyses lack the power to provide a quantitative and predictive understanding of global system behavior. But by employing (1) large-scale screening methods for discovery of unknown components and connections in the immune system (omics), (2) statistical analysis of large data sets (bioinformatics), and (3) the capacity of quantitative computational methods to translate these individual components and connections into models of emergent behavior (systems biology), we will be able to better understand how the overall immune system functions and to determine with greater precision how to manipulate it to produce desired protective responses. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  5. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  6. Probing the non-equilibrium force fluctuation spectrum of actomyosin cortices in vivo

    NASA Astrophysics Data System (ADS)

    Tan, Tzer Han; Swartz, Zachary; Keren, Kinneret; Fakhri, Nikta

    Mechanics of the cortex govern the shape of animal cells, and its dynamics underlie cell migration, cytokinesis and embryogenesis. The molecular players involved are largely known, yet it is unclear how their collective dynamics give rise to large scale behavior. This is mostly due to the lack of experimental tools to probe the spatially varying active mechanical properties of the cortex. Here, we introduce a novel technique based on fluorescent single walled carbon nanotubes to generate the non-equilibrium force fluctuation spectrum of actomyosin cortices in starfish oocytes. The quantitative measurements combined with a theoretical model reveal the role of stress organization in active mechanics and dynamics of the cortex.

  7. Response of human populations to large-scale emergencies

    NASA Astrophysics Data System (ADS)

    Bagrow, James; Wang, Dashun; Barabási, Albert-László

    2010-03-01

    Until recently, little quantitative data regarding collective human behavior during dangerous events such as bombings and riots have been available, despite its importance for emergency management, safety and urban planning. Understanding how populations react to danger is critical for prediction, detection and intervention strategies. Using a large telecommunications dataset, we study for the first time the spatiotemporal, social and demographic response properties of people during several disasters, including a bombing, a city-wide power outage, and an earthquake. Call activity rapidly increases after an event and we find that, when faced with a truly life-threatening emergency, information rapidly propagates through a population's social network. Other events, such as sports games, do not exhibit this propagation.

  8. Origin of the cosmic network in {Lambda}CDM: Nature vs nurture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shandarin, Sergei; Habib, Salman; Heitmann, Katrin

    The large-scale structure of the Universe, as traced by the distribution of galaxies, is now being revealed by large-volume cosmological surveys. The structure is characterized by galaxies distributed along filaments, the filaments connecting in turn to form a percolating network. Our objective here is to quantitatively specify the underlying mechanisms that drive the formation of the cosmic network: By combining percolation-based analyses with N-body simulations of gravitational structure formation, we elucidate how the network has its origin in the properties of the initial density field (nature) and how its contrast is then amplified by the nonlinear mapping induced by the gravitational instability (nurture).

  9. From Fibrils to Toughness: Multi-Scale Mechanics of Fibrillating Interfaces in Stretchable Electronics

    PubMed Central

    van der Sluis, Olaf; Vossen, Bart; Geers, Marc

    2018-01-01

    Metal-elastomer interfacial systems, often encountered in stretchable electronics, demonstrate remarkably high interface fracture toughness values. Evidently, a large gap exists between the rather small adhesion energy levels at the microscopic scale (‘intrinsic adhesion’) and the large measured macroscopic work-of-separation. This energy gap is closed here by unravelling the underlying dissipative mechanisms through a systematic numerical/experimental multi-scale approach. This self-contained contribution collects and reviews previously published results and addresses the remaining open questions by providing new and independent results obtained from an alternative experimental set-up. In particular, the experimental studies on Cu-PDMS (Poly(dimethylsiloxane)) samples conclusively reveal the essential role of fibrillation mechanisms at the micro-meter scale during the metal-elastomer delamination process. The micro-scale numerical analyses on single and multiple fibrils show that the dynamic release of the stored elastic energy by multiple fibril fracture, including the interaction with the adjacent deforming bulk PDMS and its highly nonlinear behaviour, provides a mechanistic understanding of the high work-of-separation. An experimentally validated quantitative relation between the macroscopic work-of-separation and peel front height is established from the simulation results. Finally, it is shown that a micro-mechanically motivated shape of the traction-separation law in cohesive zone models is essential to describe the delamination process in fibrillating metal-elastomer systems in a physically meaningful way. PMID:29393908

  10. Spectral saliency via automatic adaptive amplitude spectrum analysis

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan

    2016-03-01

    Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect the visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing the amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations, via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also reserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. Quantitative and qualitative performance is evaluated with three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.
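
    The frequency-domain idea can be sketched in a few lines in the spirit of amplitude-spectrum smoothing: suppress the smooth part of the log amplitude spectrum and reconstruct with the original phase. The fixed kernel size below stands in for the paper's automatic adaptive scale selection, so this is an illustrative simplification rather than the proposed method.

```python
# Hedged sketch of spectral saliency via amplitude-spectrum smoothing with a
# fixed (non-adaptive) kernel; the toy image contains a single salient block.
import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def spectral_saliency(img, kernel=3, blur=3.0):
    F = np.fft.fft2(img)
    log_amp, phase = np.log(np.abs(F) + 1e-9), np.angle(F)
    residual = log_amp - uniform_filter(log_amp, size=kernel)  # suppress the smooth spectrum
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return gaussian_filter(sal, blur)                          # smooth the saliency map

img = np.zeros((128, 128))
img[40:60, 70:90] = 1.0                                        # toy salient region
sal_map = spectral_saliency(img)
print("peak saliency location:", np.unravel_index(sal_map.argmax(), sal_map.shape))
```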

  11. Application of statistical mechanical methods to the modeling of social networks

    NASA Astrophysics Data System (ADS)

    Strathman, Anthony Robert

    With the recent availability of large-scale social data sets, social networks have become open to quantitative analysis via the methods of statistical physics. We examine the statistical properties of a real large-scale social network, generated from cellular phone call-trace logs. We find this network, like many other social networks to be assortative (r = 0.31) and clustered (i.e., strongly transitive, C = 0.21). We measure fluctuation scaling to identify the presence of internal structure in the network and find that structural inhomogeneity effectively disappears at the scale of a few hundred nodes, though there is no sharp cutoff. We introduce an agent-based model of social behavior, designed to model the formation and dissolution of social ties. The model is a modified Metropolis algorithm containing agents operating under the basic sociological constraints of reciprocity, communication need and transitivity. The model introduces the concept of a social temperature. We go on to show that this simple model reproduces the global statistical network features (incl. assortativity, connected fraction, mean degree, clustering, and mean shortest path length) of the real network data and undergoes two phase transitions, one being from a "gas" to a "liquid" state and the second from a liquid to a glassy state as function of this social temperature.
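
    A heavily reduced toy version of the agent-based mechanism described above is sketched below: a Metropolis-style update of a directed tie matrix with an energy that rewards reciprocity and closed triads and penalizes tie maintenance, controlled by a social temperature T. The energy function and parameters are illustrative assumptions, not the model's actual specification.

```python
# Hedged toy sketch of a Metropolis-style tie-formation model with a
# "social temperature"; the energy terms below are illustrative stand-ins
# for reciprocity, transitivity, and communication cost.
import numpy as np

rng = np.random.default_rng(6)
N, T, steps = 60, 0.5, 20_000
A = np.zeros((N, N), dtype=int)                 # directed ties, no self-loops

def energy(A):
    reciprocity = np.sum(A * A.T)               # mutual ties lower the energy
    triads = np.trace(A @ A @ A)                # closed directed triangles
    cost = A.sum()                              # maintaining ties costs effort
    return -1.0 * reciprocity - 0.1 * triads + 0.5 * cost

E = energy(A)
for _ in range(steps):
    i, j = rng.integers(N, size=2)
    if i == j:
        continue
    A[i, j] ^= 1                                # propose toggling one tie
    dE = energy(A) - E
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        E += dE                                 # accept the proposal
    else:
        A[i, j] ^= 1                            # reject: revert the toggle
print("mean out-degree:", A.sum() / N)
```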

  12. Primordial Magnetic Field Effects on the CMB and Large-Scale Structure

    DOE PAGES

    Yamazaki, Dai G.; Ichiki, Kiyotomo; Kajino, Toshitaka; ...

    2010-01-01

    Magnetic fields are everywhere in nature, and they play an important role in every astronomical environment which involves the formation of plasma and currents. It is natural therefore to suppose that magnetic fields could be present in the turbulent high-temperature environment of the big bang. Such a primordial magnetic field (PMF) would be expected to manifest itself in the cosmic microwave background (CMB) temperature and polarization anisotropies, and also in the formation of large-scale structure. In this paper, we summarize the theoretical framework which we have developed to calculate the PMF power spectrum to high precision. Using this formulation, we summarize calculations of the effects of a PMF which take accurate quantitative account of the time evolution of the cutoff scale. We review the constructed numerical program, which is without approximation, and an improvement over the approach used in a number of previous works for studying the effect of the PMF on the cosmological perturbations. We demonstrate how the PMF is an important cosmological physical process on small scales. We also summarize the current constraints on the PMF amplitude B_λ and the power spectral index n_B which have been deduced from the available CMB observational data by using our computational framework.

  13. pGlyco 2.0 enables precision N-glycoproteomics with comprehensive quality control and one-step mass spectrometry for intact glycopeptide identification.

    PubMed

    Liu, Ming-Qi; Zeng, Wen-Feng; Fang, Pan; Cao, Wei-Qian; Liu, Chao; Yan, Guo-Quan; Zhang, Yang; Peng, Chao; Wu, Jian-Qiang; Zhang, Xiao-Jin; Tu, Hui-Jun; Chi, Hao; Sun, Rui-Xiang; Cao, Yong; Dong, Meng-Qiu; Jiang, Bi-Yun; Huang, Jiang-Ming; Shen, Hua-Li; Wong, Catherine C L; He, Si-Min; Yang, Peng-Yuan

    2017-09-05

    The precise and large-scale identification of intact glycopeptides is a critical step in glycoproteomics. Owing to the complexity of glycosylation, the current overall throughput, data quality and accessibility of intact glycopeptide identification lag behind those in routine proteomic analyses. Here, we propose a workflow for the precise high-throughput identification of intact N-glycopeptides at the proteome scale using stepped-energy fragmentation and a dedicated search engine. pGlyco 2.0 conducts comprehensive quality control including false discovery rate evaluation at all three levels of matches to glycans, peptides and glycopeptides, improving the current level of accuracy of intact glycopeptide identification. The N-glycoproteomes of samples metabolically labeled with 15N/13C were analyzed quantitatively and utilized to validate the glycopeptide identification, which could be used as a novel benchmark pipeline to compare different search engines. Finally, we report a large-scale glycoproteome dataset consisting of 10,009 distinct site-specific N-glycans on 1988 glycosylation sites from 955 glycoproteins in five mouse tissues. Protein glycosylation is a heterogeneous post-translational modification that generates greater proteomic diversity that is difficult to analyze. Here the authors describe pGlyco 2.0, a workflow for the precise one-step identification of intact N-glycopeptides at the proteome scale.

  14. Estimation of regional-scale groundwater flow properties in the Bengal Basin of India and Bangladesh

    USGS Publications Warehouse

    Michael, H.A.; Voss, C.I.

    2009-01-01

    Quantitative evaluation of management strategies for long-term supply of safe groundwater for drinking from the Bengal Basin aquifer (India and Bangladesh) requires estimation of the large-scale hydrogeologic properties that control flow. The Basin consists of a stratified, heterogeneous sequence of sediments with aquitards that may separate aquifers locally, but evidence does not support existence of regional confining units. Considered at a large scale, the Basin may be aptly described as a single aquifer with higher horizontal than vertical hydraulic conductivity. Though data are sparse, estimation of regional-scale aquifer properties is possible from three existing data types: hydraulic heads, 14C concentrations, and driller logs. Estimation is carried out with inverse groundwater modeling using measured heads, by model calibration using estimated water ages based on 14C, and by statistical analysis of driller logs. Similar estimates of hydraulic conductivities result from all three data types; a resulting typical value of vertical anisotropy (ratio of horizontal to vertical conductivity) is 10⁴. The vertical anisotropy estimate is supported by simulation of flow through geostatistical fields consistent with driller log data. The high estimated value of vertical anisotropy in hydraulic conductivity indicates that even disconnected aquitards, if numerous, can strongly control the equivalent hydraulic parameters of an aquifer system. © US Government 2009.
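
    The order of magnitude of this anisotropy can be rationalized with the standard layered-medium averages (used here only as an illustrative approximation, not as the inverse model of the study): horizontal flow averages conductivities arithmetically while vertical flow averages them harmonically,

      \[
      K_h = \sum_i f_i K_i, \qquad
      K_v = \Big(\sum_i \frac{f_i}{K_i}\Big)^{-1}, \qquad
      \text{anisotropy} = \frac{K_h}{K_v},
      \]

    where f_i are layer thickness fractions. For example, 90% sand (K = 10 m/d) interbedded with 10% clay (K = 10^-4 m/d) gives K_h ≈ 9 m/d and K_v ≈ 10^-3 m/d, i.e. an anisotropy of order 10⁴, consistent with the estimate above.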

  15. Reliable determination of oxygen and hydrogen isotope ratios in atmospheric water vapour adsorbed on 3A molecular sieve.

    PubMed

    Han, Liang-Feng; Gröning, Manfred; Aggarwal, Pradeep; Helliker, Brent R

    2006-01-01

    The isotope ratio of atmospheric water vapour is determined by wide-ranging feedback effects from the isotope ratio of water in biological water pools, soil surface horizons, open water bodies and precipitation. Accurate determination of atmospheric water vapour isotope ratios is important for a broad range of research areas from leaf-scale to global-scale isotope studies. In spite of the importance of stable isotopic measurements of atmospheric water vapour, there is a paucity of published data available, largely because of the requirement for liquid nitrogen or dry ice for quantitative trapping of water vapour. We report results from a non-cryogenic method for quantitatively trapping atmospheric water vapour using 3A molecular sieve, although water is removed from the column using standard cryogenic methods. The molecular sieve column was conditioned with water of a known isotope ratio to 'set' the background signature of the molecular sieve. Two separate prototypes were developed, one for large collection volumes (3 mL) and one for small collection volumes (90 µL). Atmospheric water vapour was adsorbed to the column by pulling air through the column for several days to reach the desired final volume. Water was recovered from the column by baking at 250 °C in a dry helium or nitrogen air stream and cryogenically trapped. For the large-volume apparatus, the recovered water differed from water that was simultaneously trapped by liquid nitrogen (the experimental control) by 2.6 per thousand with a standard deviation (SD) of 1.5 per thousand for δ2H and by 0.3 per thousand with a SD of 0.2 per thousand for δ18O. Water-vapour recovery was not satisfactory for the small-volume apparatus. Copyright © 2006 John Wiley & Sons, Ltd.

  16. Radar-based Quantitative Precipitation Forecasting using Spatial-scale Decomposition Method for Urban Flood Management

    NASA Astrophysics Data System (ADS)

    Yoon, S.; Lee, B.; Nakakita, E.; Lee, G.

    2016-12-01

    Recent climate changes and abnormal weather phenomena have resulted in increased occurrences of localized torrential rainfall. Urban areas in Korea have suffered from localized heavy rainfall, including the notable Seoul flood disasters in 2010 and 2011. The urban hydrological environment has changed in relation to precipitation, with reduced concentration times, a decreased storage rate, and increased peak discharge. These changes have altered and accelerated the severity of damage to urban areas. In order to prevent such urban flash flood damage, the lead time for evacuation must be secured through improved radar-based quantitative precipitation forecasting (QPF). The purpose of this research is to improve QPF products using a spatial-scale decomposition method that accounts for the lifetime of storms, and to compare the accuracy of the traditional QPF method and the proposed method in terms of urban flood management. The research proceeds as follows. First, image filtering is applied to separate the spatial scales of the rainfall field. Second, the separated small- and large-scale rainfall fields are extrapolated by different forecasting methods. Third, the forecasted rainfall fields are combined at each lead time. Finally, the results of this method are evaluated and compared with the results of a uniform advection model for urban flood modeling. It is expected that urban flood information based on the improved QPF will help to reduce casualties and property damage caused by urban flooding.
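
    A minimal sketch of the spatial-scale separation step, assuming a gridded rain-rate field and a Gaussian low-pass filter as the image filter; the cutoff scale and grid spacing below are hypothetical values chosen for illustration.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def decompose_rain_field(rain, cutoff_km=32.0, grid_km=1.0):
          """Split a 2-D rain-rate field into large-scale (long-lived) and
          small-scale (short-lived convective) components."""
          large = gaussian_filter(rain, sigma=cutoff_km / grid_km)
          small = rain - large
          return large, small

    Each component would then be extrapolated with its own forecasting method and the two forecasts recombined at every lead time, as outlined above.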

  17. Quantitative NDE of Composite Structures at NASA

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Leckey, Cara A. C.; Howell, Patricia A.; Johnston, Patrick H.; Burke, Eric R.; Zalameda, Joseph N.; Winfree, William P.; Seebo, Jeffery P.

    2015-01-01

    The use of composite materials continues to increase in the aerospace community due to the potential benefits of reduced weight, increased strength, and manufacturability. Ongoing work at NASA involves the use of large-scale composite structures for spacecraft (payload shrouds, cryotanks, crew modules, etc.). NASA is also working to enable the use and certification of composites in aircraft structures through the Advanced Composites Project (ACP). The rapid, in situ characterization of a wide range of composite materials and structures has become a critical concern for the industry. In many applications it is necessary to monitor changes in these materials over long time periods. The quantitative characterization of composite defects such as fiber waviness, reduced bond strength, delamination damage, and microcracking is of particular interest. The research approaches of NASA's Nondestructive Evaluation Sciences Branch include investigation of conventional, guided wave, and phase-sensitive ultrasonic methods, infrared thermography and X-ray computed tomography techniques. The use of simulation tools for optimizing and developing these methods is also an active area of research. This paper will focus on current research activities related to large-area NDE for rapidly characterizing aerospace composites.

  18. Optimization of a resazurin-based microplate assay for large-scale compound screenings against Klebsiella pneumoniae.

    PubMed

    Kim, Hyung Jun; Jang, Soojin

    2018-01-01

    A new resazurin-based assay was evaluated and optimized using a microplate (384-well) format for high-throughput screening of antibacterial molecules against Klebsiella pneumoniae. Growth of the bacteria in 384-well plates was more effectively measured and had a more than sixfold higher signal-to-background ratio using the resazurin-based assay compared with absorbance measurements at 600 nm. Determination of minimum inhibitory concentrations of the antibiotics revealed that the optimized assay quantitatively measured antibacterial activity of various antibiotics. An edge effect observed in the initial assay was significantly reduced by a 1-h incubation of the bacteria-containing plates at room temperature. There was an approximately 10% decrease in signal variability between the edge and the middle wells along with improvement in the assay robustness (Z' = 0.99). This optimized resazurin-based assay is an efficient, inexpensive, and robust assay that can quantitatively measure antibacterial activity using a high-throughput screening system to assess a large number of compounds for discovery of new antibiotics against K. pneumoniae.
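
    The robustness figure quoted above (Z' = 0.99) is the standard screening-window coefficient commonly attributed to Zhang et al.; a minimal sketch of its computation from plate control wells follows.

      import numpy as np

      def z_prime(positive_controls, negative_controls):
          """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|; values above 0.5 indicate an excellent assay."""
          pos = np.asarray(positive_controls, dtype=float)
          neg = np.asarray(negative_controls, dtype=float)
          return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())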

  19. Calibration of HST wide field camera for quantitative analysis of faint galaxy images

    NASA Technical Reports Server (NTRS)

    Ratnatunga, Kavan U.; Griffiths, Richard E.; Casertano, Stefano; Neuschaefer, Lyman W.; Wyckoff, Eric W.

    1994-01-01

    We present the methods adopted to optimize the calibration of images obtained with the Hubble Space Telescope (HST) Wide Field Camera (WFC) (1991-1993). Our main goal is to improve quantitative measurement of faint images, with special emphasis on the faint (I approximately 20-24 mag) stars and galaxies observed as a part of the Medium-Deep Survey. Several modifications to the standard calibration procedures have been introduced, including improved bias and dark images, and a new supersky flatfield obtained by combining a large number of relatively object-free Medium-Deep Survey exposures of random fields. The supersky flat has a pixel-to-pixel rms error of about 2.0% in F555W and of 2.4% in F785LP; large-scale variations are smaller than 1% rms. Overall, our modifications improve the quality of faint images with respect to the standard calibration by about a factor of five in photometric accuracy and about 0.3 mag in sensitivity, corresponding to about a factor of two in observing time. The relevant calibration images have been made available to the scientific community.
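
    A minimal sketch of how a supersky flat can be assembled from a stack of sky-dominated exposures; the actual Medium-Deep Survey pipeline details (object masking, weighting, bias and dark handling) are not reproduced here.

      import numpy as np

      def supersky_flat(exposures):
          """exposures: list of 2-D arrays of largely object-free frames."""
          normed = np.stack([img / np.median(img) for img in exposures])  # remove sky-level differences
          flat = np.median(normed, axis=0)   # the median rejects faint objects present in individual frames
          return flat / flat.mean()          # normalize to unit mean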

  20. Quantitative computational infrared imaging of buoyant diffusion flames

    NASA Astrophysics Data System (ADS)

    Newale, Ashish S.

    Studies of infrared radiation from turbulent buoyant diffusion flames impinging on structural elements have applications to the development of fire models. A numerical and experimental study of radiation from buoyant diffusion flames with and without impingement on a flat plate is reported. Quantitative images of the radiation intensity from the flames are acquired using a high-speed infrared camera. Large eddy simulations are performed using the Fire Dynamics Simulator (FDS version 6). The species concentrations and temperature from the simulations are used in conjunction with a narrow-band radiation model (RADCAL) to solve the radiative transfer equation. The computed infrared radiation intensities are rendered in the form of images and compared with the measurements. The measured and computed radiation intensities reveal necking and bulging with a characteristic frequency of 7.1 Hz, which is in agreement with previous empirical correlations. The results demonstrate the effects of the stagnation-point boundary layer on the upstream buoyant shear layer. The coupling between these two shear layers presents a model problem for the sub-grid scale modeling necessary for future large eddy simulations.

  1. SONAR Discovers RNA-Binding Proteins from Analysis of Large-Scale Protein-Protein Interactomes.

    PubMed

    Brannan, Kristopher W; Jin, Wenhao; Huelga, Stephanie C; Banks, Charles A S; Gilmore, Joshua M; Florens, Laurence; Washburn, Michael P; Van Nostrand, Eric L; Pratt, Gabriel A; Schwinn, Marie K; Daniels, Danette L; Yeo, Gene W

    2016-10-20

    RNA metabolism is controlled by an expanding, yet incomplete, catalog of RNA-binding proteins (RBPs), many of which lack characterized RNA binding domains. Approaches to expand the RBP repertoire to discover non-canonical RBPs are currently needed. Here, HaloTag fusion pull down of 12 nuclear and cytoplasmic RBPs followed by quantitative mass spectrometry (MS) demonstrates that proteins interacting with multiple RBPs in an RNA-dependent manner are enriched for RBPs. This motivated SONAR, a computational approach that predicts RNA binding activity by analyzing large-scale affinity precipitation-MS protein-protein interactomes. Without relying on sequence or structure information, SONAR identifies 1,923 human, 489 fly, and 745 yeast RBPs, including over 100 human candidate RBPs that contain zinc finger domains. Enhanced CLIP confirms RNA binding activity and identifies transcriptome-wide RNA binding sites for SONAR-predicted RBPs, revealing unexpected RNA binding activity for disease-relevant proteins and DNA binding proteins. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Helium ion microscopy of Lepidoptera scales.

    PubMed

    Boden, Stuart A; Asadollahbaik, Asa; Rutt, Harvey N; Bagnall, Darren M

    2012-01-01

    In this report, helium ion microscopy (HIM) is used to study the micro- and nanostructures responsible for structural color in the wings of two species of Lepidoptera from the Papilionidae family: Papilio ulysses (Blue Mountain Butterfly) and Parides sesostris (Emerald-patched Cattleheart). Electronic charging of uncoated scales from the wings of these butterflies, due to the incident ion beam, is successfully neutralized, leading to images displaying a large depth-of-field and a high level of surface detail, which would normally be obscured by the traditional coating methods used for scanning electron microscopy (SEM). The images are compared with those from variable-pressure SEM, demonstrating the superiority of HIM at high magnifications. In addition, the large depth-of-field capabilities of HIM are exploited through the creation of stereo pairs that allow exploration of the third dimension. Furthermore, the extraction of quantitative height information, which matches well with cross-sectional transmission electron microscopy measurements from the literature, is demonstrated. © Wiley Periodicals, Inc.

  3. Large-scale 3D modeling of projectile impact damage in brittle plates

    NASA Astrophysics Data System (ADS)

    Seagraves, A.; Radovitzky, R.

    2015-10-01

    The damage and failure of brittle plates subjected to projectile impact is investigated through large-scale three-dimensional simulation using the DG/CZM approach introduced by Radovitzky et al. [Comput. Methods Appl. Mech. Eng. 2011; 200(1-4), 326-344]. Two standard experimental setups are considered: first, we simulate edge-on impact experiments on Al2O3 tiles by Strassburger and Senf [Technical Report ARL-CR-214, Army Research Laboratory, 1995]. Qualitative and quantitative validation of the simulation results is pursued by direct comparison of simulations with experiments at different loading rates and good agreement is obtained. In the second example considered, we investigate the fracture patterns in normal impact of spheres on thin, unconfined ceramic plates over a wide range of loading rates. For both the edge-on and normal impact configurations, the full field description provided by the simulations is used to interpret the mechanisms underlying the crack propagation patterns and their strong dependence on loading rate.

  4. Identifying and modeling the structural discontinuities of human interactions

    NASA Astrophysics Data System (ADS)

    Grauwin, Sebastian; Szell, Michael; Sobolevsky, Stanislav; Hövel, Philipp; Simini, Filippo; Vanhoof, Maarten; Smoreda, Zbigniew; Barabási, Albert-László; Ratti, Carlo

    2017-04-01

    The idea of a hierarchical spatial organization of society lies at the core of seminal theories in human geography that have strongly influenced our understanding of social organization. Along the same line, the recent availability of large-scale human mobility and communication data has offered novel quantitative insights hinting at a strong geographical confinement of human interactions within neighboring regions, extending to local levels within countries. However, models of human interaction largely ignore this effect. Here, we analyze several country-wide networks of telephone calls - both mobile and landline - and in either case uncover a systematic decrease of communication induced by borders, which we identify as the missing variable in state-of-the-art models. Using this empirical evidence, we propose an alternative modeling framework that naturally stylizes the damping effect of borders. We show that this new notion substantially improves the predictive power of widely used interaction models. This increases our ability to understand, model and predict social activities and to plan the development of infrastructures across multiple scales.
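
    A minimal sketch of the modeling idea, assuming a gravity-type interaction law with a single multiplicative damping factor applied across borders; the exponents and the damping parameter are hypothetical illustrations, not the fitted values of the study.

      def expected_interaction(pop_i, pop_j, distance_km, same_region,
                               alpha=1.0, beta=1.0, gamma=2.0, border_damping=0.5):
          """Expected communication volume between two areas under a border-damped gravity law."""
          base = (pop_i ** alpha) * (pop_j ** beta) / (distance_km ** gamma)
          return base if same_region else border_damping * base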

  5. Identifying and modeling the structural discontinuities of human interactions

    PubMed Central

    Grauwin, Sebastian; Szell, Michael; Sobolevsky, Stanislav; Hövel, Philipp; Simini, Filippo; Vanhoof, Maarten; Smoreda, Zbigniew; Barabási, Albert-László; Ratti, Carlo

    2017-01-01

    The idea of a hierarchical spatial organization of society lies at the core of seminal theories in human geography that have strongly influenced our understanding of social organization. Along the same line, the recent availability of large-scale human mobility and communication data has offered novel quantitative insights hinting at a strong geographical confinement of human interactions within neighboring regions, extending to local levels within countries. However, models of human interaction largely ignore this effect. Here, we analyze several country-wide networks of telephone calls - both mobile and landline - and in either case uncover a systematic decrease of communication induced by borders, which we identify as the missing variable in state-of-the-art models. Using this empirical evidence, we propose an alternative modeling framework that naturally stylizes the damping effect of borders. We show that this new notion substantially improves the predictive power of widely used interaction models. This increases our ability to understand, model and predict social activities and to plan the development of infrastructures across multiple scales. PMID:28443647

  6. Fabrication of 2-inch nano patterned sapphire substrate with high uniformity by two-beam laser interference lithography

    NASA Astrophysics Data System (ADS)

    Dai, LongGui; Yang, Fan; Yue, Gen; Jiang, Yang; Jia, Haiqiang; Wang, Wenxin; Chen, Hong

    2014-11-01

    Generally, nano-scale patterned sapphire substrate (NPSS) has better performance than micro-scale patterned sapphire substrate (MPSS) in improving the light extraction efficiency of LEDs. Laser interference lithography (LIL) is one of the powerful fabrication methods for periodic nanostructures without photo-masks for different designs. However, the Lloyd's mirror LIL system has the disadvantage that fabricated patterns are inevitably distorted, especially for large-area two-dimensional (2D) periodic nanostructures. Herein, we introduce a two-beam LIL system to fabricate uniform large-area NPSS. Quantitative analysis and characterization indicate that high uniformity of the photoresist arrays is achieved. Through the combination of dry etching and wet etching techniques, well-defined NPSS with a period of 460 nm were prepared on the whole sapphire substrate. The deviation is 4.34% for the bottom width of the triangular truncated-pyramid arrays on the whole 2-inch sapphire substrate, which is suitable for application in the industrial production of NPSS.
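
    The pattern period of a two-beam interference exposure follows the standard relation below; the laser wavelength used in the worked example (325 nm, a common HeCd line) is an assumption for illustration, as the abstract does not state it.

      \[
      \Lambda = \frac{\lambda}{2\sin\theta}
      \quad\Rightarrow\quad
      \theta = \arcsin\!\left(\frac{\lambda}{2\Lambda}\right)
             = \arcsin\!\left(\frac{325\,\mathrm{nm}}{2 \times 460\,\mathrm{nm}}\right)
             \approx 20.7^{\circ}.
      \]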

  7. Non-radial pulsations and large-scale structure in stellar winds

    NASA Astrophysics Data System (ADS)

    Blomme, R.

    2009-07-01

    Almost all early-type stars show Discrete Absorption Components (DACs) in their ultraviolet spectral lines. These can be attributed to Co-rotating Interaction Regions (CIRs): large-scale spiral-shaped structures that sweep through the stellar wind. We used the Zeus hydrodynamical code to model the CIRs. In the model, the CIRs are caused by "spots" on the stellar surface. Through the radiative acceleration these spots create fast streams in the stellar wind material. Where the fast and slow streams collide, a CIR is formed. By varying the parameters of the spots, we quantitatively fit the observed DACs in HD 64760. An important result from our work is that the spots do not rotate with the same velocity as the stellar surface. The fact that the cause of the CIRs is not fixed on the surface eliminates many potential explanations. The only remaining explanation is that the CIRs are due to the interference pattern of a number of non-radial pulsations.

  8. A scalable double-barcode sequencing platform for characterization of dynamic protein-protein interactions.

    PubMed

    Schlecht, Ulrich; Liu, Zhimin; Blundell, Jamie R; St Onge, Robert P; Levy, Sasha F

    2017-05-25

    Several large-scale efforts have systematically catalogued protein-protein interactions (PPIs) of a cell in a single environment. However, little is known about how the protein interactome changes across environmental perturbations. Current technologies, which assay one PPI at a time, are too low throughput to make it practical to study protein interactome dynamics. Here, we develop a highly parallel protein-protein interaction sequencing (PPiSeq) platform that uses a novel double barcoding system in conjunction with the dihydrofolate reductase protein-fragment complementation assay in Saccharomyces cerevisiae. PPiSeq detects PPIs at a rate that is on par with current assays and, in contrast with current methods, quantitatively scores PPIs with enough accuracy and sensitivity to detect changes across environments. Both PPI scoring and the bulk of strain construction can be performed with cell pools, making the assay scalable and easily reproduced across environments. PPiSeq is therefore a powerful new tool for large-scale investigations of dynamic PPIs.

  9. Identifying and modeling the structural discontinuities of human interactions.

    PubMed

    Grauwin, Sebastian; Szell, Michael; Sobolevsky, Stanislav; Hövel, Philipp; Simini, Filippo; Vanhoof, Maarten; Smoreda, Zbigniew; Barabási, Albert-László; Ratti, Carlo

    2017-04-26

    The idea of a hierarchical spatial organization of society lies at the core of seminal theories in human geography that have strongly influenced our understanding of social organization. Along the same line, the recent availability of large-scale human mobility and communication data has offered novel quantitative insights hinting at a strong geographical confinement of human interactions within neighboring regions, extending to local levels within countries. However, models of human interaction largely ignore this effect. Here, we analyze several country-wide networks of telephone calls - both mobile and landline - and in either case uncover a systematic decrease of communication induced by borders, which we identify as the missing variable in state-of-the-art models. Using this empirical evidence, we propose an alternative modeling framework that naturally stylizes the damping effect of borders. We show that this new notion substantially improves the predictive power of widely used interaction models. This increases our ability to understand, model and predict social activities and to plan the development of infrastructures across multiple scales.

  10. Maritime transport in the Gulf of Bothnia 2030.

    PubMed

    Pekkarinen, Annukka; Repka, Sari

    2014-10-01

    Scenarios for shipping traffic in the Gulf of Bothnia (GoB) by 2030 are described in order to identify the main factors that should be taken into account when preparing a Maritime Spatial Plan (MSP) for the area. The application of futures research methodology to the planning of marine areas was also assessed. The methods include applying existing large-scale quantitative scenarios for maritime traffic in the GoB, and using a real-time Delphi process in which an expert group discussed the factors contributing to future maritime traffic in the GoB in order to establish the probability and significance of the factors having an impact on maritime traffic. MSP was tested on a transnational scale in the Bothnian Sea area as a pilot project.

  11. Measuring safety climate in health care.

    PubMed

    Flin, R; Burns, C; Mearns, K; Yule, S; Robertson, E M

    2006-04-01

    To review quantitative studies of safety climate in health care and to examine the psychometric properties of the questionnaires designed to measure this construct, a systematic literature review was undertaken covering sample and questionnaire design characteristics (source, number of items, scale type), construct validity (content validity, factor structure and internal reliability, concurrent validity), within-group agreement, and level of analysis. Twelve studies were examined. There was a lack of explicit theoretical underpinning for most questionnaires, and some instruments did not report standard psychometric criteria. Where this information was available, several questionnaires appeared to have limitations. More consideration should be given to psychometric factors in the design of healthcare safety climate instruments, especially as these are beginning to be used in large-scale surveys across healthcare organisations.

  12. Determination of burn patient outcome by large-scale quantitative discovery proteomics

    PubMed Central

    Finnerty, Celeste C.; Jeschke, Marc G.; Qian, Wei-Jun; Kaushal, Amit; Xiao, Wenzhong; Liu, Tao; Gritsenko, Marina A.; Moore, Ronald J.; Camp, David G.; Moldawer, Lyle L.; Elson, Constance; Schoenfeld, David; Gamelli, Richard; Gibran, Nicole; Klein, Matthew; Arnoldo, Brett; Remick, Daniel; Smith, Richard D.; Davis, Ronald; Tompkins, Ronald G.; Herndon, David N.

    2013-01-01

    Objective Emerging proteomics techniques can be used to establish proteomic outcome signatures and to identify candidate biomarkers for survival following traumatic injury. We applied high-resolution liquid chromatography-mass spectrometry (LC-MS) and multiplex cytokine analysis to profile the plasma proteome of survivors and non-survivors of massive burn injury to determine the proteomic survival signature following a major burn injury. Design Proteomic discovery study. Setting Five burn hospitals across the U.S. Patients Thirty-two burn patients (16 non-survivors and 16 survivors), 19–89 years of age, were admitted within 96 h of injury to the participating hospitals with burns covering >20% of the total body surface area and required at least one surgical intervention. Interventions None. Measurements and Main Results We found differences in circulating levels of 43 proteins involved in the acute phase response, hepatic signaling, the complement cascade, inflammation, and insulin resistance. Thirty-two of the proteins identified were not previously known to play a role in the response to burn. IL-4, IL-8, GM-CSF, MCP-1, and β2-microglobulin correlated well with survival and may serve as clinical biomarkers. Conclusions These results demonstrate the utility of these techniques for establishing proteomic survival signatures and for use as a discovery tool to identify candidate biomarkers for survival. This is the first clinical application of a high-throughput, large-scale LC-MS-based quantitative plasma proteomic approach for biomarker discovery for the prediction of patient outcome following burn, trauma or critical illness. PMID:23507713

  13. Confirmatory factor analytic structure and measurement invariance of quantitative autistic traits measured by the social responsiveness scale-2.

    PubMed

    Frazier, Thomas W; Ratliff, Kristin R; Gruber, Chris; Zhang, Yi; Law, Paul A; Constantino, John N

    2014-01-01

    Understanding the factor structure of autistic symptomatology is critical to the discovery and interpretation of causal mechanisms in autism spectrum disorder. We applied confirmatory factor analysis and assessment of measurement invariance to a large (N = 9635) accumulated collection of reports on quantitative autistic traits using the Social Responsiveness Scale, representing a broad diversity of age, severity, and reporter type. A two-factor structure (corresponding to social communication impairment and restricted, repetitive behavior) as elaborated in the updated Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) criteria for autism spectrum disorder exhibited acceptable model fit in confirmatory factor analysis. Measurement invariance was appreciable across age, sex, and reporter (self vs other), but somewhat less apparent between clinical and nonclinical populations in this sample comprised of both familial and sporadic autism spectrum disorders. The statistical power afforded by this large sample allowed relative differentiation of three factors among items encompassing social communication impairment (emotion recognition, social avoidance, and interpersonal relatedness) and two factors among items encompassing restricted, repetitive behavior (insistence on sameness and repetitive mannerisms). Cross-trait correlations remained extremely high, that is, on the order of 0.66-0.92. These data clarify domains of statistically significant factoral separation that may relate to partially-but not completely-overlapping biological mechanisms, contributing to variation in human social competency. Given such robust intercorrelations among symptom domains, understanding their co-emergence remains a high priority in conceptualizing common neural mechanisms underlying autistic syndromes.

  14. Bayesian analysis of the dynamic cosmic web in the SDSS galaxy survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leclercq, Florent; Wandelt, Benjamin; Jasche, Jens, E-mail: florent.leclercq@polytechnique.org, E-mail: jasche@iap.fr, E-mail: wandelt@iap.fr

    Recent application of the Bayesian algorithm BORG to the Sloan Digital Sky Survey (SDSS) main sample galaxies resulted in the physical inference of the formation history of the observed large-scale structure from its origin to the present epoch. In this work, we use these inferences as inputs for a detailed probabilistic cosmic web-type analysis. To do so, we generate a large set of data-constrained realizations of the large-scale structure using a fast, fully non-linear gravitational model. We then perform a dynamic classification of the cosmic web into four distinct components (voids, sheets, filaments, and clusters) on the basis of the tidal field. Our inference framework automatically and self-consistently propagates typical observational uncertainties to web-type classification. As a result, this study produces accurate cosmographic classification of large-scale structure elements in the SDSS volume. By also providing the history of these structure maps, the approach allows an analysis of the origin and growth of the early traces of the cosmic web present in the initial density field and of the evolution of global quantities such as the volume and mass filling fractions of different structures. For the problem of web-type classification, the results described in this work constitute the first connection between theory and observations at non-linear scales including a physical model of structure formation and the demonstrated capability of uncertainty quantification. A connection between cosmology and information theory using real data also naturally emerges from our probabilistic approach. Our results constitute quantitative chrono-cosmography of the complex web-like patterns underlying the observed galaxy distribution.
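
    A minimal sketch of the web-type classification step, assuming the tidal tensor has already been computed on a grid; the eigenvalue threshold is a free parameter of such classifications and the value below is purely illustrative.

      import numpy as np

      WEB_TYPES = ("void", "sheet", "filament", "cluster")

      def classify_web(tidal_tensors, lambda_th=0.0):
          """tidal_tensors: array of shape (..., 3, 3); returns, per voxel, the number of
          collapsing directions (0-3), which indexes WEB_TYPES."""
          eigenvalues = np.linalg.eigvalsh(tidal_tensors)
          return (eigenvalues > lambda_th).sum(axis=-1)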

  15. [Modeling continuous scaling of NDVI based on fractal theory].

    PubMed

    Luan, Hai-Jun; Tian, Qing-Jiu; Yu, Tao; Hu, Xin-Li; Huang, Yan; Du, Ling-Tong; Zhao, Li-Min; Wei, Xi; Han, Jie; Zhang, Zhou-Wei; Li, Shao-Peng

    2013-07-01

    Scale effect is one of the important scientific problems in quantitative remote sensing. It can be used to study the relationship between retrievals from images of different resolutions, and its study has become an effective way to confront challenges such as the validation of quantitative remote sensing products. Traditional up-scaling methods cannot describe the scale-changing features of retrievals over an entire series of scales; moreover, they face serious parameter correction issues because imaging parameters (geometric, spectral, etc.) vary between sensors. Using a single-sensor image, fractal methodology was employed to address these problems. Taking NDVI (computed from land surface radiance) as an example and based on an Enhanced Thematic Mapper Plus (ETM+) image, a scheme was proposed to model the continuous scaling of retrievals. The experimental results indicated that: (1) for NDVI, a scale effect exists and can be described by a fractal model of continuous scaling; (2) the fractal method is suitable for the validation of NDVI. These results show that fractal analysis is an effective methodology for studying the scaling behaviour of quantitative remote sensing retrievals.
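
    A minimal sketch of a scaling analysis of NDVI under block aggregation, assuming NIR and red surface-radiance bands as input; the statistic regressed here (variance of the aggregated field against aggregation factor) is an illustrative choice and not necessarily the fractal model used in the study.

      import numpy as np

      def ndvi(nir, red):
          return (nir - red) / (nir + red + 1e-12)

      def scaling_slope(field, factors=(1, 2, 4, 8, 16)):
          """Fit log(variance of the block-averaged field) against log(aggregation factor)."""
          variances = []
          for f in factors:
              h, w = (field.shape[0] // f) * f, (field.shape[1] // f) * f
              coarse = field[:h, :w].reshape(h // f, f, w // f, f).mean(axis=(1, 3))
              variances.append(coarse.var())
          slope, _ = np.polyfit(np.log(factors), np.log(variances), 1)
          return slope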

  16. Assessing Psychodynamic Conflict.

    PubMed

    Simmonds, Joshua; Constantinides, Prometheas; Perry, J Christopher; Drapeau, Martin; Sheptycki, Amanda R

    2015-09-01

    Psychodynamic psychotherapies suggest that symptomatic relief is provided, in part, by the resolution of psychic conflicts. Clinical researchers have used innovative methods to investigate such phenomena. This article aims to review the literature on quantitative psychodynamic conflict rating scales. An electronic search of the literature was conducted to retrieve quantitative observer-rated scales used to assess conflict, noting each measure's theoretical model, information source, and required training and clinical experience. Scales were also examined for levels of reliability and validity. Five quantitative observer-rated conflict scales were identified. Reliability varied from poor to excellent, with each measure demonstrating good validity. However, a small number of studies and limited links to current conflict theory suggest that further clinical research is needed.

  17. Developing a Multiplexed Quantitative Cross-Linking Mass Spectrometry Platform for Comparative Structural Analysis of Protein Complexes.

    PubMed

    Yu, Clinton; Huszagh, Alexander; Viner, Rosa; Novitsky, Eric J; Rychnovsky, Scott D; Huang, Lan

    2016-10-18

    Cross-linking mass spectrometry (XL-MS) represents a recently popularized hybrid methodology for defining protein-protein interactions (PPIs) and analyzing structures of large protein assemblies. In particular, XL-MS strategies have been demonstrated to be effective in elucidating molecular details of PPIs at the peptide resolution, providing a complementary set of structural data that can be utilized to refine existing complex structures or direct de novo modeling of unknown protein structures. To study structural and interaction dynamics of protein complexes, quantitative cross-linking mass spectrometry (QXL-MS) strategies based on isotope-labeled cross-linkers have been developed. Although successful, these approaches are mostly limited to pairwise comparisons. In order to establish a robust workflow enabling comparative analysis of multiple cross-linked samples simultaneously, we have developed a multiplexed QXL-MS strategy, namely, QMIX (Quantitation of Multiplexed, Isobaric-labeled cross (X)-linked peptides) by integrating MS-cleavable cross-linkers with isobaric labeling reagents. This study has established a new analytical platform for quantitative analysis of cross-linked peptides, which can be directly applied for multiplexed comparisons of the conformational dynamics of protein complexes and PPIs at the proteome scale in future studies.

  18. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates.

    PubMed

    Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard

    2013-09-06

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical replicates due to sample scarcity, duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets with varying numbers of missing values. We applied three tools, including the standard t test, the moderated t test (also known as limma), and rank products for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using the limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved rank products algorithm and the combined analysis is available.
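
    A minimal sketch of a rank-product statistic adapted to missing values, assuming a features-by-replicates matrix of log fold-changes with NaN entries; the permutation-based significance assessment of the full method is omitted, and this is an illustration rather than the published R implementation.

      import numpy as np

      def rank_product(log_ratios):
          """log_ratios: 2-D array (features x replicates) with NaN for missing values.
          Returns the geometric mean of normalized within-replicate ranks (rank 1 = most up-regulated)."""
          ranks = np.full(log_ratios.shape, np.nan)
          for j in range(log_ratios.shape[1]):
              col = log_ratios[:, j]
              observed = ~np.isnan(col)
              order = (-col[observed]).argsort().argsort() + 1   # 1 = largest increase
              ranks[observed, j] = order / observed.sum()        # normalize by number of ranked features
          return np.exp(np.nanmean(np.log(ranks), axis=1))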

  19. Identifying Preserved Storm Events on Beaches from Trenches and Cores

    NASA Astrophysics Data System (ADS)

    Wadman, H. M.; Gallagher, E. L.; McNinch, J.; Reniers, A.; Koktas, M.

    2014-12-01

    Recent research suggests that even small-scale variations in grain size in the shallow stratigraphy of sandy beaches can significantly influence large-scale morphology change. However, few quantitative studies of variations in shallow stratigraphic layers, as differentiated by variations in mean grain size, have been conducted, in no small part due to the difficulty of collecting undisturbed sediment cores in the energetic lower beach and swash zone. Due to this lack of quantitative stratigraphic grain size data, most coastal morphology models assume that uniform grain sizes dominate sandy beaches, allowing for little to no temporal or spatial variation in grain size heterogeneity. In a first-order attempt to quantify small-scale temporal and spatial variations in beach stratigraphy, thirty-five vibracores were collected at the USACE Field Research Facility (FRF), Duck, NC, in March-April of 2014 using the FRF's Coastal Research and Amphibious Buggy (CRAB). Vibracores were collected at set locations along a cross-shore profile from the toe of the dune to a water depth of ~1 m in the surf zone. Vibracores were repeatedly collected from the same locations throughout a tidal cycle, as well as before and after a nor'easter event. In addition, two ~1.5 m deep trenches were dug in the cross-shore and along-shore directions (each ~14 m in length) after coring was completed to allow better interpretation of the stratigraphic sequences observed in the vibracores. The elevations of coherent stratigraphic layers, as revealed in vibracore-based fence diagrams and trench data, are used to relate specific observed stratigraphic sequences to individual storm events observed at the FRF. These data provide a first-order, quantitative examination of the small-scale temporal and spatial variability of shallow grain size along an open, sandy coastline. The data will be used to refine morphological model predictions to include variations in grain size and associated shallow stratigraphy.

  20. Formation of fold-and-thrust belts on Venus by thick-skinned deformation

    NASA Astrophysics Data System (ADS)

    Zuber, M. T.; Parmentier, E. M.

    1995-10-01

    On Venus, fold-and-thrust belts—which accommodate large-scale horizontal crustal convergence—are often located at the margins of kilometre-high plateaux1-5. Such mountain belts, typically hundreds of kilometres long and tens to hundreds of kilometres wide, surround the Lakshmi Planum plateau in the Ishtar Terra highland (Fig. 1). In explaining the origin of fold-and-thrust belts, it is important to understand the relative importance of thick-skinned deformation of the whole lithosphere and thin-skinned, large-scale overthrusting of near-surface layers. Previous quantitative analyses of mountain belts on Venus have been restricted to thin-skinned models6-8, but this style of deformation does not account for the pronounced topographic highs at the plateau edge. We propose that the long-wavelength topography of these venusian fold-and-thrust belts is more readily explained by horizontal shortening of a laterally heterogeneous lithosphere. In this thick-skinned model, deformation within the mechanically strong outer layer of Venus controls mountain building. Our results suggest that lateral variations in either the thermal or mechanical structure of the interior provide a mechanism for focusing deformation due to convergent, global-scale forces on Venus.

  1. Urban area thermal monitoring: Liepaja case study using satellite and aerial thermal data

    NASA Astrophysics Data System (ADS)

    Gulbe, Linda; Caune, Vairis; Korats, Gundars

    2017-12-01

    The aim of this study is to explore large-scale (60 m/pixel) and small-scale (individual building level) temperature distribution patterns from thermal remote sensing data and to conclude what kind of information could be extracted from thermal remote sensing on a regular basis. The Landsat program provides frequent large-scale thermal images useful for the analysis of city temperature patterns. During the study, the correlation of temperature patterns with vegetation content (based on NDVI) and building coverage (based on OpenStreetMap data) was examined. Landsat-based temperature patterns were independent of the season, negatively correlated with vegetation content and positively correlated with building coverage. Small-scale analysis included spatial and raster descriptor analysis of polygons corresponding to the roofs of individual buildings in order to evaluate roof insulation. Remote sensing and spatial descriptors are only weakly related to heat consumption data; however, the median and entropy of thermal aerial data can help to identify poorly insulated roofs. Automated quantitative roof analysis has high potential for acquiring city-wide information about roof insulation, but its quality is limited by the reference data; information on building types and roof materials would be crucial for further studies.

  2. Extracting Primordial Non-Gaussianity from Large Scale Structure in the Post-Planck Era

    NASA Astrophysics Data System (ADS)

    Dore, Olivier

    Astronomical observations have become a unique tool to probe fundamental physics. Cosmology, in particular, has emerged as a data-driven science whose phenomenological modeling has achieved great success: in the post-Planck era, key cosmological parameters are measured to percent precision. A single model reproduces a wealth of astronomical observations involving very distinct physical processes at different times. This success leads to fundamental physical questions. One of the most salient is the origin of the primordial perturbations that grew to form the large-scale structures we now observe. More and more cosmological observables point to inflationary physics as the origin of the structure observed in the universe. Inflationary physics predicts the statistical properties of the primordial perturbations, which are thought to be slightly non-Gaussian. The detection of this small deviation from Gaussianity represents the next frontier in early-Universe physics. Measuring it would provide direct, unique and quantitative insights into the physics at play when the Universe was only a fraction of a second old, probing energies untouchable otherwise. On par with the well-known relic gravitational wave radiation - the famous "B-modes" - it is one of the few probes of inflation. This departure from Gaussianity leads to a very specific signature in the large-scale clustering of galaxies. By observing large-scale structure, we can thus establish a direct connection with fundamental theories of the early universe. In the post-Planck era, large-scale structures are our most promising pathway to measuring this primordial signal. Current estimates suggest that the next generation of space- or ground-based large-scale structure surveys (e.g. the ESA EUCLID or NASA WFIRST missions) might enable a detection of this signal. This potentially huge payoff requires us to solidify the theoretical predictions supporting these measurements. Even if the exact signal we are looking for is of unknown amplitude, it is clear that we must measure it as well as these groundbreaking data sets will permit. We propose to develop the supporting theoretical work to the point where the complete non-Gaussian signature can be extracted from these data sets. We will do so by developing three complementary directions: we will develop the appropriate formalism to measure and model galaxy clustering on the largest scales; we will study the impact of non-Gaussianity on higher-order statistics, the most promising statistics for our purpose; and we will make explicit the connection between these observables and the microphysics of a large class of inflation models, while also identifying fundamental limitations to this interpretation.

  3. Development of four self-report measures of job stressors and strain: Interpersonal Conflict at Work Scale, Organizational Constraints Scale, Quantitative Workload Inventory, and Physical Symptoms Inventory.

    PubMed

    Spector, P E; Jex, S M

    1998-10-01

    Despite the widespread use of self-report measures of both job-related stressors and strains, relatively few carefully developed scales for which validity data exist are available. In this article, we discuss 3 job stressor scales (Interpersonal Conflict at Work Scale, Organizational Constraints Scale, and Quantitative Workload Inventory) and 1 job strain scale (Physical Symptoms Inventory). Using meta-analysis, we combined the results of 18 studies to provide estimates of relations between our scales and other variables. Data showed moderate convergent validity for the 3 job stressor scales, suggesting some objectivity to these self-reports. Norms for each scale are provided.

  4. Random cascade model in the limit of infinite integral scale as the exponential of a nonstationary 1/f noise: Application to volatility fluctuations in stock markets

    NASA Astrophysics Data System (ADS)

    Muzy, Jean-François; Baïle, Rachel; Bacry, Emmanuel

    2013-04-01

    In this paper we propose a new model for volatility fluctuations in financial time series. This model relies on a nonstationary Gaussian process that exhibits aging behavior. It turns out that its properties, over any finite time interval, are very close to continuous cascade models. These latter models are indeed well known to reproduce faithfully the main stylized facts of financial time series. However, it involves a large-scale parameter (the so-called “integral scale” where the cascade is initiated) that is hard to interpret in finance. Moreover, the empirical value of the integral scale is in general deeply correlated to the overall length of the sample. This feature is precisely predicted by our model, which, as illustrated by various examples from daily stock index data, quantitatively reproduces the empirical observations.
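
    A minimal sketch of the discrete multiplicative cascade that the continuous model generalizes, with an integral scale of 2**n_levels elementary steps; the log-normal intermittency parameter below is a hypothetical value for illustration.

      import numpy as np

      def cascade_volatility(n_levels=12, lambda2=0.02, seed=0):
          """Return 2**n_levels multiplicative volatility weights from a log-normal cascade."""
          rng = np.random.default_rng(seed)
          sigma = np.sqrt(lambda2 * np.log(2))
          weights = np.ones(1)
          for _ in range(n_levels):
              w = np.exp(rng.normal(-sigma ** 2 / 2, sigma, size=2 * weights.size))
              weights = np.repeat(weights, 2) * w   # each interval splits in two with fresh weights
          return weights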

  5. Scaling identity connects human mobility and social interactions.

    PubMed

    Deville, Pierre; Song, Chaoming; Eagle, Nathan; Blondel, Vincent D; Barabási, Albert-László; Wang, Dashun

    2016-06-28

    Massive datasets that capture human movements and social interactions have catalyzed rapid advances in our quantitative understanding of human behavior during the past years. One important aspect affecting both areas is the critical role space plays. Indeed, growing evidence suggests both our movements and communication patterns are associated with spatial costs that follow reproducible scaling laws, each characterized by its specific critical exponents. Although human mobility and social networks develop concomitantly as two prolific yet largely separated fields, we lack any known relationships between the critical exponents explored by them, despite the fact that they often study the same datasets. Here, by exploiting three different mobile phone datasets that capture simultaneously these two aspects, we discovered a new scaling relationship, mediated by a universal flux distribution, which links the critical exponents characterizing the spatial dependencies in human mobility and social networks. Therefore, the widely studied scaling laws uncovered in these two areas are not independent but connected through a deeper underlying reality.

  6. Intrinsic fluctuations of the proton saturation momentum scale in high multiplicity p+p collisions

    DOE PAGES

    McLerran, Larry; Tribedy, Prithwish

    2015-11-02

    High multiplicity events in p+p collisions are studied using the theory of the Color Glass Condensate. Here, we show that intrinsic fluctuations of the proton saturation momentum scale are needed in addition to the sub-nucleonic color charge fluctuations to explain the very high multiplicity tail of distributions in p+p collisions. It is presumed that the origin of such intrinsic fluctuations is non-perturbative in nature. Classical Yang-Mills simulations using the IP-Glasma model are performed to make quantitative estimations. Furthermore, we find that fluctuations as large as O(1) of the average values of the saturation momentum scale can lead to the rare high multiplicity events seen in p+p data at RHIC and LHC energies. Using the available data on multiplicity distributions we try to constrain the distribution of the proton saturation momentum scale and make predictions for the multiplicity distribution in 13 TeV p+p collisions.

  7. Valleys' Asymmetric Characteristics of the Loess Plateau in Northwestern Shanxi Based on DEM

    NASA Astrophysics Data System (ADS)

    Duan, J.

    2016-12-01

    The valleys of the Loess Plateau in northwestern Shanxi show great asymmetry. Using multi-scale DEMs, high-resolution satellite images and digital terrain analysis methods, this study puts forward a quantitative index to describe the asymmetric morphology. Several typical areas are selected to test and verify its spatial variability. Results show: (1) In terms of spatial distribution, the Pianguanhe, Xianchuanhe and Yangjiachuan basins are the areas with the most significant asymmetric characteristics. (2) In terms of scale, large-scale valleys show randomness, equilibrium and relative symmetry, while small-scale valleys show directionality and asymmetry. (3) The asymmetric morphology is orientation-dependent, with east-west trending valleys being the most obvious. Combined with field surveys, the formation mechanism can be interpreted as follows: (1) uneven distribution of loess in the valleys; (2) differences in vegetation, water, heat conditions and other factors produce differences in water erosion capability, which lead to the asymmetric characteristics.

  8. Scaling identity connects human mobility and social interactions

    PubMed Central

    Deville, Pierre; Song, Chaoming; Eagle, Nathan; Blondel, Vincent D.; Barabási, Albert-László; Wang, Dashun

    2016-01-01

    Massive datasets that capture human movements and social interactions have catalyzed rapid advances in our quantitative understanding of human behavior during the past years. One important aspect affecting both areas is the critical role space plays. Indeed, growing evidence suggests both our movements and communication patterns are associated with spatial costs that follow reproducible scaling laws, each characterized by its specific critical exponents. Although human mobility and social networks develop concomitantly as two prolific yet largely separated fields, we lack any known relationships between the critical exponents explored by them, despite the fact that they often study the same datasets. Here, by exploiting three different mobile phone datasets that capture simultaneously these two aspects, we discovered a new scaling relationship, mediated by a universal flux distribution, which links the critical exponents characterizing the spatial dependencies in human mobility and social networks. Therefore, the widely studied scaling laws uncovered in these two areas are not independent but connected through a deeper underlying reality. PMID:27274050

  9. Numerical simulation of a plane turbulent mixing layer, with applications to isothermal, rapid reactions

    NASA Technical Reports Server (NTRS)

    Lin, P.; Pratt, D. T.

    1987-01-01

    A hybrid method has been developed for the numerical prediction of turbulent mixing in a spatially-developing, free shear layer. Most significantly, the computation incorporates the effects of large-scale structures, Schmidt number and Reynolds number on mixing, which have been overlooked in the past. In flow field prediction, large-eddy simulation was conducted by a modified 2-D vortex method with subgrid-scale modeling. The predicted mean velocities, shear layer growth rates, Reynolds stresses, and the RMS of longitudinal velocity fluctuations were found to be in good agreement with experiments, although the lateral velocity fluctuations were overpredicted. In scalar transport, the Monte Carlo method was extended to the simulation of the time-dependent pdf transport equation. For the first time, the mixing frequency in Curl's coalescence/dispersion model was estimated by using Broadwell and Breidenthal's theory of micromixing, which involves Schmidt number, Reynolds number and the local vorticity. Numerical tests were performed for a gaseous case and an aqueous case. Evidence that pure freestream fluids are entrained into the layer by large-scale motions was found in the predicted pdf. Mean concentration profiles were found to be insensitive to Schmidt number, while the unmixedness was higher for higher Schmidt number. Applications were made to mixing layers with isothermal, fast reactions. The predicted difference in product thickness of the two cases was in reasonable quantitative agreement with experimental measurements.

  10. Thymidylate synthase (TS) gene expression in primary lung cancer patients: a large-scale study in Japanese population.

    PubMed

    Tanaka, F; Wada, H; Fukui, Y; Fukushima, M

    2011-08-01

    Previous small-sized studies showed lower thymidylate synthase (TS) expression in adenocarcinoma of the lung, which may explain the higher antitumor activity of TS-inhibiting agents such as pemetrexed. To quantitatively measure TS gene expression in a large-scale Japanese population (n = 2621) with primary lung cancer, laser-captured microdissected sections were cut from primary tumors, surrounding normal lung tissues and involved nodes. The TS gene expression level in primary tumor was significantly higher than that in normal lung tissue (mean TS/β-actin, 3.4 and 1.0, respectively; P < 0.01), and was higher still in involved nodes (mean TS/β-actin, 7.7; P < 0.01). Analyses of TS gene expression levels in primary tumor according to histologic cell type revealed that small-cell carcinoma showed the highest TS expression (mean TS/β-actin, 13.8) and that squamous cell carcinoma showed higher TS expression than adenocarcinoma (mean TS/β-actin, 4.3 and 2.3, respectively; P < 0.01); TS gene expression increased significantly with decreasing grade of tumor cell differentiation. There was no significant difference in TS gene expression according to any other patient characteristic, including tumor progression. Lower TS expression in adenocarcinoma of the lung was thus confirmed in a large-scale study.

  11. Single myelin fiber imaging in living rodents without labeling by deep optical coherence microscopy.

    PubMed

    Ben Arous, Juliette; Binding, Jonas; Léger, Jean-François; Casado, Mariano; Topilko, Piotr; Gigan, Sylvain; Boccara, A Claude; Bourdieu, Laurent

    2011-11-01

    Myelin sheath disruption is responsible for multiple neuropathies in the central and peripheral nervous system. Myelin imaging has thus become an important diagnosis tool. However, in vivo imaging has been limited to either low-resolution techniques unable to resolve individual fibers or to low-penetration imaging of single fibers, which cannot provide quantitative information about large volumes of tissue, as required for diagnostic purposes. Here, we perform myelin imaging without labeling and at micron-scale resolution with >300-μm penetration depth on living rodents. This was achieved with a prototype [termed deep optical coherence microscopy (deep-OCM)] of a high-numerical aperture infrared full-field optical coherence microscope, which includes aberration correction for the compensation of refractive index mismatch and high-frame-rate interferometric measurements. We were able to measure the density of individual myelinated fibers in the rat cortex over a large volume of gray matter. In the peripheral nervous system, deep-OCM allows, after minor surgery, in situ imaging of single myelinated fibers over a large fraction of the sciatic nerve. This allows quantitative comparison of normal and Krox20 mutant mice, in which myelination in the peripheral nervous system is impaired. This opens promising perspectives for myelin chronic imaging in demyelinating diseases and for minimally invasive medical diagnosis.

  12. Single myelin fiber imaging in living rodents without labeling by deep optical coherence microscopy

    NASA Astrophysics Data System (ADS)

    Ben Arous, Juliette; Binding, Jonas; Léger, Jean-François; Casado, Mariano; Topilko, Piotr; Gigan, Sylvain; Claude Boccara, A.; Bourdieu, Laurent

    2011-11-01

    Myelin sheath disruption is responsible for multiple neuropathies in the central and peripheral nervous system. Myelin imaging has thus become an important diagnosis tool. However, in vivo imaging has been limited to either low-resolution techniques unable to resolve individual fibers or to low-penetration imaging of single fibers, which cannot provide quantitative information about large volumes of tissue, as required for diagnostic purposes. Here, we perform myelin imaging without labeling and at micron-scale resolution with >300-μm penetration depth on living rodents. This was achieved with a prototype [termed deep optical coherence microscopy (deep-OCM)] of a high-numerical aperture infrared full-field optical coherence microscope, which includes aberration correction for the compensation of refractive index mismatch and high-frame-rate interferometric measurements. We were able to measure the density of individual myelinated fibers in the rat cortex over a large volume of gray matter. In the peripheral nervous system, deep-OCM allows, after minor surgery, in situ imaging of single myelinated fibers over a large fraction of the sciatic nerve. This allows quantitative comparison of normal and Krox20 mutant mice, in which myelination in the peripheral nervous system is impaired. This opens promising perspectives for myelin chronic imaging in demyelinating diseases and for minimally invasive medical diagnosis.

  13. Activity-Based Introductory Physics Reform *

    NASA Astrophysics Data System (ADS)

    Thornton, Ronald

    2004-05-01

    Physics education research has shown that learning environments that engage students and allow them to take an active part in their learning can lead to large conceptual gains compared to those of good traditional instruction. Examples of successful curricula and methods include Peer Instruction, Just in Time Teaching, RealTime Physics, Workshop Physics, Scale-Up, and Interactive Lecture Demonstrations (ILDs). RealTime Physics promotes interaction among students in a laboratory setting and makes use of powerful real-time data logging tools to teach concepts as well as quantitative relationships. An active learning environment is often difficult to achieve in large lecture sessions, and Workshop Physics and Scale-Up largely eliminate lectures in favor of collaborative student activities. Peer Instruction, Just in Time Teaching, and Interactive Lecture Demonstrations (ILDs) make lectures more interactive in complementary ways. This presentation will introduce these reforms and use Interactive Lecture Demonstrations (ILDs) with the audience to illustrate the types of curricula and tools described above. ILDs make use of real experiments, real-time data logging tools and student interaction to create an active learning environment in large lecture classes. A short video of students involved in interactive lecture demonstrations will be shown. The results of research studies at various institutions to measure the effectiveness of these methods will be presented.

  14. Molecular cytogenetic mapping of 24 CEPH YACs and 24 gene-specific large insert probes to chromosome 17.

    PubMed

    Bärlund, M; Nupponen, N N; Karhu, R; Tanner, M M; Paavola, P; Kallioniemi, O P; Kallioniemi, A

    1998-01-01

    Defining boundaries of chromosomal rearrangements at the molecular level would benefit from landmarks that link the cytogenetic map to physical, genetic, and transcript maps, as well as from large-insert FISH probes for such loci to detect numerical and structural rearrangements in metaphase or interphase cells. Here, we determined the locations of 24 genetically mapped CEPH-Mega YACs along the FLpter scale (fractional length from p-telomere) by quantitative fluorescence in situ hybridization analysis. This generated a set of cytogenetically mapped probes for chromosome 17 with an average spacing of about 5 cM. We then developed large-insert YAC, BAC, PAC, or P1 clones to the following 24 known genes, and determined refined map locations along the same FLpter scale: pter-TP53-TOP3-cen-TNFAIP1-ERBB2-TOP2A-BRCA1-TCF11-NME1-HLF-ZNF147/CLN80-BCL5/MPO/SFRS1-TBX2-PECAM1-DDX5/PRKCA-ICAM2-GH1/PRKAR1A-GRB2-CDK3/FKHL13-qter. Taken together, these 48 cytogenetically mapped large-insert probes provide tools for the molecular analysis of chromosome 17 rearrangements, such as mapping amplification, deletion, and translocation breakpoints in this chromosome, in cancer and other diseases.

  15. Analytical methods for toxic gases from thermal degradation of polymers

    NASA Technical Reports Server (NTRS)

    Hsu, M.-T. S.

    1977-01-01

    Toxic gases evolved from the thermal oxidative degradation of synthetic or natural polymers in small laboratory chambers or in large scale fire tests are measured by several different analytical methods. Gas detector tubes are used for fast on-site detection of suspect toxic gases. The infrared spectroscopic method provides excellent qualitative and quantitative analysis for some toxic gases. Permanent gases such as carbon monoxide, carbon dioxide, methane and ethylene can be quantitatively determined by gas chromatography. Highly toxic and corrosive gases such as nitrogen oxides, hydrogen cyanide, hydrogen fluoride, hydrogen chloride and sulfur dioxide should be passed into a scrubbing solution for subsequent analysis by either specific ion electrodes or spectrophotometric methods. Low-concentration toxic organic vapors can be concentrated in a cold trap and then analyzed by gas chromatography and mass spectrometry. The limitations of the different methods are discussed.

  16. Past and present cosmic structure in the SDSS DR7 main sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jasche, J.; Leclercq, F.; Wandelt, B.D., E-mail: jasche@iap.fr, E-mail: florent.leclercq@polytechnique.org, E-mail: wandelt@iap.fr

    2015-01-01

    We present a chrono-cosmography project, aiming at the inference of the four-dimensional formation history of the observed large scale structure from its origin to the present epoch. To do so, we perform a full-scale Bayesian analysis of the northern galactic cap of the Sloan Digital Sky Survey (SDSS) Data Release 7 main galaxy sample, relying on a fully probabilistic, physical model of the non-linearly evolved density field. Besides inferring initial conditions from observations, our methodology naturally and accurately reconstructs non-linear features at the present epoch, such as walls and filaments, corresponding to high-order correlation functions generated by late-time structure formation. Our inference framework self-consistently accounts for typical observational systematic and statistical uncertainties such as noise, survey geometry and selection effects. We further account for luminosity dependent galaxy biases and automatic noise calibration within a fully Bayesian approach. As a result, this analysis provides highly detailed and accurate reconstructions of the present density field on scales larger than ∼ 3 Mpc/h, constrained by SDSS observations. This approach also leads to the first quantitative inference of plausible formation histories of the dynamic large scale structure underlying the observed galaxy distribution. The results described in this work constitute the first full Bayesian non-linear analysis of the cosmic large scale structure with the demonstrated capability of uncertainty quantification. Some of these results will be made publicly available along with this work. The level of detail of inferred results and the high degree of control on observational uncertainties pave the path towards high precision chrono-cosmography, the subject of simultaneously studying the dynamics and the morphology of the inhomogeneous Universe.

  17. A Spatial Method to Calculate Small-Scale Fisheries Extent

    NASA Astrophysics Data System (ADS)

    Johnson, A. F.; Moreno-Báez, M.; Giron-Nava, A.; Corominas, J.; Erisman, B.; Ezcurra, E.; Aburto-Oropeza, O.

    2016-02-01

    Despite global catch per unit effort having redoubled since the 1950s, the global fishing fleet is estimated to be twice the size that the oceans can sustainably support. In order to gauge the collateral impacts of fishing intensity, we must be able to estimate the spatial extent and number of fishing vessels in the oceans. The methods that currently exist are built around electronic tracking and logbook systems and generally focus on industrial fisheries. Spatial extent therefore remains elusive for many small-scale fishing fleets, even though these fisheries land the same biomass for human consumption as industrial fisheries. Current methods are data-intensive and require extensive extrapolation when estimates are made across large spatial scales. We present an accessible, spatial method for calculating the extent of small-scale fisheries based on two simple measures that are available, or at least easily estimable, in even the most data-poor fisheries: the number of boats and the local coastal human population. We demonstrate that this method is fishery-type independent and can be used to quantitatively evaluate the efficacy of growth in small-scale fisheries. This method provides an important first step towards estimating the fishing extent of the small-scale fleet, globally.

  18. Towards a Quantitative Use of Satellite Remote Sensing in Crop Growth Models for Large Scale Agricultural Production Estimate (Invited)

    NASA Astrophysics Data System (ADS)

    Defourny, P.

    2013-12-01

    The development of better agricultural monitoring capabilities is widely considered a critical step for strengthening food production information and market transparency, thanks to timely information about crop status, crop area and yield forecasts. The documentation of global production will contribute to tackling price volatility by allowing local, national and international operators to make decisions and anticipate market trends with reduced uncertainty. Several operational agricultural monitoring systems currently operate at national and international scales. Most are based on methods derived from pioneering experiences completed some decades ago, and use remote sensing to qualitatively compare one year to others in order to estimate the risk of deviation from a normal year. The GEO Agricultural Monitoring Community of Practice described the current monitoring capabilities at the national and global levels, and an overall diagram summarized the diverse relationships between satellite EO and agricultural information. There is now a large gap between the current operational large scale systems and the scientific state of the art in crop remote sensing, probably because the latter has mainly focused on local studies. The poor availability of suitable in-situ and satellite data over extended areas hampers large scale demonstrations, preventing the much-needed upscaling research effort. For the cropland extent, this paper reports a recent research achievement using the full ENVISAT MERIS 300 m archive in the context of the ESA Climate Change Initiative. A flexible combination of classification methods, depending on the region of the world, allows mapping the land cover as well as the global croplands at 300 m for the period 2008-2012. This wall-to-wall product is then compared with the FP7 Geoland-2 results obtained using a Landsat-based sampling strategy over the IGADD countries. On the other hand, the vegetation indices and biophysical variables such as the Green Area Index (GAI), fAPAR and fCover, usually retrieved from MODIS, MERIS and SPOT-Vegetation, describe the quality of the green vegetation development. The GLOBAM (Belgium) and EU FP-7 MOCCCASIN (Russia) projects improved the standard products and demonstrated them over large areas. The GAI retrieved from MODIS time series using a purity index criterion successfully depicted the inter-annual variability. Furthermore, the quantitative assimilation of these GAI time series into a crop growth model improved the yield estimates over several years. These results showed that GAI assimilation works best at the district or provincial level. In the context of GEO Agricultural Monitoring, the Joint Experiment for Crop Assessment and Monitoring (JECAM) was designed to enable the global agricultural monitoring community to compare such methods and results over a variety of regional cropping systems. For a network of test sites around the world, satellite and field measurements are currently being collected and will be made available for collaborative effort. This experiment should facilitate international standards for data products and reporting, eventually supporting the development of a global system of systems for agricultural crop assessment and monitoring.

  19. How large a dataset should be in order to estimate scaling exponents and other statistics correctly in studies of solar wind turbulence

    NASA Astrophysics Data System (ADS)

    Rowlands, G.; Kiyani, K. H.; Chapman, S. C.; Watkins, N. W.

    2009-12-01

    Quantitative analyses of solar wind fluctuations are often performed in the context of intermittent turbulence and center around methods to quantify statistical scaling, such as power spectra and structure functions, which assume a stationary process. The solar wind exhibits large scale secular changes, so the question arises as to whether the time series of the fluctuations is non-stationary. One approach is to seek local stationarity by parsing the time interval over which statistical analysis is performed; hence, natural systems such as the solar wind unavoidably provide observations over restricted intervals. Consequently, due to the reduction of sample size leading to poorer estimates, a stationary stochastic process (time series) can yield anomalous time variation in the scaling exponents, suggestive of nonstationarity. The variance in the estimates of scaling exponents computed from an interval of N observations is known, for finite variance processes and certain statistical estimators, to vary as ~1/N as N becomes large; however, the convergence to this behavior depends on the details of the process and may be slow. We study the variation in the scaling of second-order moments of the time-series increments with N for a variety of synthetic and “real world” time series, and we find that, in particular for heavy tailed processes and realizable N, one is far from this ~1/N limiting behavior. We propose a semiempirical estimate for the minimum N needed to make a meaningful estimate of the scaling exponents for model stochastic processes and compare these with some “real world” time series from the solar wind. With fewer data points a stationary time series becomes indistinguishable from a nonstationary process, and we illustrate this with nonstationary synthetic datasets. Reference article: K. H. Kiyani, S. C. Chapman and N. W. Watkins, Phys. Rev. E 79, 036109 (2009).
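
    As a minimal illustration of the ~1/N behavior discussed above (not the authors' semiempirical estimator), the sketch below estimates the second-order scaling exponent of synthetic Brownian-motion series of increasing length N and shows how the variance of the estimate shrinks with N.

      # Sketch: variance of a second-order scaling-exponent estimate vs. sample size N,
      # using synthetic Brownian motion (second-order structure-function exponent of 1).
      import numpy as np

      rng = np.random.default_rng(2)

      def second_order_exponent(x, lags=(1, 2, 4, 8, 16)):
          # Scaling exponent of the second-order structure function of the series x.
          s2 = [np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags]
          slope, _ = np.polyfit(np.log(lags), np.log(s2), 1)
          return slope

      for N in (10**3, 10**4, 10**5):
          estimates = [second_order_exponent(np.cumsum(rng.normal(size=N)))
                       for _ in range(200)]
          print(f"N = {N:6d}: variance of exponent estimate = {np.var(estimates):.2e}")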

  20. Multiscale factors affecting human attitudes toward snow leopards and wolves.

    PubMed

    Suryawanshi, Kulbhushansingh R; Bhatia, Saloni; Bhatnagar, Yash Veer; Redpath, Stephen; Mishra, Charudutt

    2014-12-01

    The threat posed by large carnivores to livestock and humans makes peaceful coexistence between them difficult. Effective implementation of conservation laws and policies depends on the attitudes of local residents toward the target species. There are many known correlates of human attitudes toward carnivores, but they have only been assessed at the scale of the individual. Because human societies are organized hierarchically, attitudes are presumably influenced by different factors at different scales of social organization, but this scale dependence has not been examined. We used structured interview surveys to quantitatively assess the attitudes of a Buddhist pastoral community toward snow leopards (Panthera uncia) and wolves (Canis lupus). We interviewed 381 individuals from 24 villages within 6 study sites across the high-elevation Spiti Valley in the Indian Trans-Himalaya. We gathered information on key explanatory variables that together captured variation in individual and village-level socioeconomic factors. We used hierarchical linear models to examine how the effect of these factors on human attitudes changed with the scale of analysis from the individual to the community. Factors significant at the individual level were gender, education, and age of the respondent (for wolves and snow leopards), number of income sources in the family (wolves), agricultural production, and large-bodied livestock holdings (snow leopards). At the community level, the significant factors included the number of smaller-bodied herded livestock killed by wolves and mean agricultural production (wolves) and village size and large livestock holdings (snow leopards). Our results show that scaling up from the individual to higher levels of social organization can highlight important factors that influence attitudes of people toward wildlife and toward formal conservation efforts in general. Such scale-specific information can help managers apply conservation measures at appropriate scales. Our results reiterate the need for conflict management programs to be multipronged. © 2014 Society for Conservation Biology.

  1. Fast Measurement and Reconstruction of Large Workpieces with Freeform Surfaces by Combining Local Scanning and Global Position Data

    PubMed Central

    Chen, Zhe; Zhang, Fumin; Qu, Xinghua; Liang, Baoqiu

    2015-01-01

    In this paper, we propose a new approach for the measurement and reconstruction of large workpieces with freeform surfaces. The system consists of a handheld laser scanning sensor and a position sensor. The laser scanning sensor is used to acquire the surface and geometry information, and the position sensor is utilized to unify the scanning sensors into a global coordinate system. The measurement process includes data collection, multi-sensor data fusion and surface reconstruction. With the multi-sensor data fusion, errors accumulated during the image alignment and registration process are minimized, and the measuring precision is significantly improved. After the dense, accurate acquisition of the three-dimensional (3-D) coordinates, the surface is reconstructed using commercial software based on Non-Uniform Rational B-Spline (NURBS) surfaces. The system has been evaluated, both qualitatively and quantitatively, using reference measurements provided by a commercial laser scanning sensor. The method has been applied to the reconstruction of a large gear rim, with an accuracy of up to 0.0963 mm. The results prove that this new combined method is promising for measuring and reconstructing large-scale objects with complex surface geometry. Compared with reported methods of large-scale shape measurement, it offers high freedom of motion, high precision and high measurement speed over a wide measurement range. PMID:26091396

  2. Characterizing Listener Engagement with Popular Songs Using Large-Scale Music Discovery Data

    PubMed Central

    Kaneshiro, Blair; Ruan, Feng; Baker, Casey W.; Berger, Jonathan

    2017-01-01

    Music discovery in everyday situations has been facilitated in recent years by audio content recognition services such as Shazam. The widespread use of such services has produced a wealth of user data, specifying where and when a global audience takes action to learn more about music playing around them. Here, we analyze a large collection of Shazam queries of popular songs to study the relationship between the timing of queries and corresponding musical content. Our results reveal that the distribution of queries varies over the course of a song, and that salient musical events drive an increase in queries during a song. Furthermore, we find that the distribution of queries at the time of a song's release differs from the distribution following a song's peak and subsequent decline in popularity, possibly reflecting an evolution of user intent over the “life cycle” of a song. Finally, we derive insights into the data size needed to achieve consistent query distributions for individual songs. The combined findings of this study suggest that music discovery behavior, and other facets of the human experience of music, can be studied quantitatively using large-scale industrial data. PMID:28386241

  3. Inferring cortical function in the mouse visual system through large-scale systems neuroscience.

    PubMed

    Hawrylycz, Michael; Anastassiou, Costas; Arkhipov, Anton; Berg, Jim; Buice, Michael; Cain, Nicholas; Gouwens, Nathan W; Gratiy, Sergey; Iyer, Ramakrishnan; Lee, Jung Hoon; Mihalas, Stefan; Mitelut, Catalin; Olsen, Shawn; Reid, R Clay; Teeter, Corinne; de Vries, Saskia; Waters, Jack; Zeng, Hongkui; Koch, Christof

    2016-07-05

    The scientific mission of the Project MindScope is to understand neocortex, the part of the mammalian brain that gives rise to perception, memory, intelligence, and consciousness. We seek to quantitatively evaluate the hypothesis that neocortex is a relatively homogeneous tissue, with smaller functional modules that perform a common computational function replicated across regions. We here focus on the mouse as a mammalian model organism with genetics, physiology, and behavior that can be readily studied and manipulated in the laboratory. We seek to describe the operation of cortical circuitry at the computational level by comprehensively cataloging and characterizing its cellular building blocks along with their dynamics and their cell type-specific connectivities. The project is also building large-scale experimental platforms (i.e., brain observatories) to record the activity of large populations of cortical neurons in behaving mice subject to visual stimuli. A primary goal is to understand the series of operations from visual input in the retina to behavior by observing and modeling the physical transformations of signals in the corticothalamic system. We here focus on the contribution that computer modeling and theory make to this long-term effort.

  4. Combining semi-automated image analysis techniques with machine learning algorithms to accelerate large-scale genetic studies.

    PubMed

    Atkinson, Jonathan A; Lobet, Guillaume; Noll, Manuel; Meyer, Patrick E; Griffiths, Marcus; Wells, Darren M

    2017-10-01

    Genetic analyses of plant root systems require large datasets of extracted architectural traits. To quantify such traits from images of root systems, researchers often have to choose between automated tools (that are prone to error and extract only a limited number of architectural traits) or semi-automated ones (that are highly time consuming). We trained a Random Forest algorithm to infer architectural traits from automatically extracted image descriptors. The training was performed on a subset of the dataset, then applied to its entirety. This strategy allowed us to (i) decrease the image analysis time by 73% and (ii) extract meaningful architectural traits based on image descriptors. We also show that these traits are sufficient to identify the quantitative trait loci that had previously been discovered using a semi-automated method. We have shown that combining semi-automated image analysis with machine learning algorithms has the power to increase the throughput of large-scale root studies. We expect that such an approach will enable the quantification of more complex root systems for genetic studies. We also believe that our approach could be extended to other areas of plant phenotyping. © The Authors 2017. Published by Oxford University Press.
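
    For readers who want a concrete picture of the strategy described above, the sketch below shows one way (not the authors' code) to train a Random Forest on a manually annotated subset and then predict an architectural trait for the full image set from automatically extracted descriptors; the file and column names are hypothetical.

      # Hypothetical sketch of the train-on-a-subset / predict-on-everything workflow.
      import pandas as pd
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import train_test_split

      # descriptors.csv: one row per root-system image with automatically extracted
      # descriptors (desc_*) and, for the annotated subset only, a measured trait.
      data = pd.read_csv("descriptors.csv")
      descriptor_cols = [c for c in data.columns if c.startswith("desc_")]
      annotated = data.dropna(subset=["total_root_length"])

      X_train, X_test, y_train, y_test = train_test_split(
          annotated[descriptor_cols], annotated["total_root_length"],
          test_size=0.2, random_state=0)

      model = RandomForestRegressor(n_estimators=500, random_state=0)
      model.fit(X_train, y_train)
      print("held-out R^2:", model.score(X_test, y_test))

      # Apply the trained model to every image, annotated or not.
      data["predicted_root_length"] = model.predict(data[descriptor_cols])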

  5. Validating the simulation of large-scale parallel applications using statistical characteristics

    DOE PAGES

    Zhang, Deli; Wilke, Jeremiah; Hendry, Gilbert; ...

    2016-03-01

    Simulation is a widely adopted method to analyze and predict the performance of large-scale parallel applications. Validating the hardware model is highly important for complex simulations with a large number of parameters. Common practice involves calculating the percent error between the projected and the real execution time of a benchmark program. However, in a high-dimensional parameter space, this coarse-grained approach often suffers from parameter insensitivity, which may not be known a priori. Moreover, the traditional approach cannot be applied to the validation of software models, such as application skeletons used in online simulations. In this work, we present a methodology and a toolset for validating both hardware and software models by quantitatively comparing fine-grained statistical characteristics obtained from execution traces. Although statistical information has been used in tasks like performance optimization, this is the first attempt to apply it to simulation validation. Our experimental results show that the proposed evaluation approach offers a significant improvement in fidelity compared to evaluation using total execution time, and the proposed metrics serve as reliable criteria that progress toward automating the simulation tuning process.
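
    To make the contrast concrete, the sketch below (a hypothetical illustration, not the authors' toolset) compares the coarse-grained check, percent error of total execution time, with a finer-grained check that compares distributions of per-event durations extracted from execution traces; the Kolmogorov-Smirnov distance is an assumed, illustrative choice of metric.

      # Coarse vs. fine-grained validation of a simulated run against a real run.
      import numpy as np
      from scipy.stats import ks_2samp

      real_events = np.loadtxt("real_trace_durations.txt")        # per-event times from the real trace
      sim_events = np.loadtxt("simulated_trace_durations.txt")    # per-event times from the simulator

      # Coarse-grained: percent error of the total execution time.
      percent_error = 100.0 * abs(sim_events.sum() - real_events.sum()) / real_events.sum()

      # Fine-grained: distance between the two event-duration distributions.
      ks_stat, p_value = ks_2samp(real_events, sim_events)

      print(f"total-time percent error: {percent_error:.2f}%")
      print(f"KS distance between event-duration distributions: {ks_stat:.3f} (p = {p_value:.3f})")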

  6. Combining semi-automated image analysis techniques with machine learning algorithms to accelerate large-scale genetic studies

    PubMed Central

    Atkinson, Jonathan A.; Lobet, Guillaume; Noll, Manuel; Meyer, Patrick E.; Griffiths, Marcus

    2017-01-01

    Genetic analyses of plant root systems require large datasets of extracted architectural traits. To quantify such traits from images of root systems, researchers often have to choose between automated tools (that are prone to error and extract only a limited number of architectural traits) or semi-automated ones (that are highly time consuming). We trained a Random Forest algorithm to infer architectural traits from automatically extracted image descriptors. The training was performed on a subset of the dataset, then applied to its entirety. This strategy allowed us to (i) decrease the image analysis time by 73% and (ii) extract meaningful architectural traits based on image descriptors. We also show that these traits are sufficient to identify the quantitative trait loci that had previously been discovered using a semi-automated method. We have shown that combining semi-automated image analysis with machine learning algorithms has the power to increase the throughput of large-scale root studies. We expect that such an approach will enable the quantification of more complex root systems for genetic studies. We also believe that our approach could be extended to other areas of plant phenotyping. PMID:29020748

  7. Characterizing Listener Engagement with Popular Songs Using Large-Scale Music Discovery Data.

    PubMed

    Kaneshiro, Blair; Ruan, Feng; Baker, Casey W; Berger, Jonathan

    2017-01-01

    Music discovery in everyday situations has been facilitated in recent years by audio content recognition services such as Shazam. The widespread use of such services has produced a wealth of user data, specifying where and when a global audience takes action to learn more about music playing around them. Here, we analyze a large collection of Shazam queries of popular songs to study the relationship between the timing of queries and corresponding musical content. Our results reveal that the distribution of queries varies over the course of a song, and that salient musical events drive an increase in queries during a song. Furthermore, we find that the distribution of queries at the time of a song's release differs from the distribution following a song's peak and subsequent decline in popularity, possibly reflecting an evolution of user intent over the "life cycle" of a song. Finally, we derive insights into the data size needed to achieve consistent query distributions for individual songs. The combined findings of this study suggest that music discovery behavior, and other facets of the human experience of music, can be studied quantitatively using large-scale industrial data.

  8. Double-exponential decay of orientational correlations in semiflexible polyelectrolytes.

    PubMed

    Bačová, P; Košovan, P; Uhlík, F; Kuldová, J; Limpouchová, Z; Procházka, K

    2012-06-01

    In this paper we revisited the problem of the persistence length of polyelectrolytes. We performed a series of molecular dynamics simulations using the Debye-Hückel approximation for electrostatics to test several equations which go beyond the classical description of Odijk, Skolnick and Fixman (OSF). The data confirm earlier observations that in the limit of large contour separations the decay of orientational correlations can be described by a single-exponential function and the decay length can be described by the OSF relation. However, at short contour separations the behaviour is more complex. Recent equations which introduce more complicated expressions and an additional length scale could describe the results very well on both the short and the long length scale. The equation of Manghi and Netz, when used without adjustable parameters, could capture the qualitative trend but deviated in a quantitative comparison. Better quantitative agreement within the estimated error could be obtained using three equations with one adjustable parameter: (1) the equation of Manghi and Netz; (2) the equation proposed by us in this paper; (3) the equation proposed by Cannavacciuolo and Pedersen. Two characteristic length scales can be identified in the data: the intrinsic or bare persistence length and the electrostatic persistence length. All three equations use a single parameter to describe a smooth crossover from the short-range behaviour dominated by the intrinsic stiffness of the chain to the long-range OSF-like behaviour.

  9. The Quaternary fossil-pollen record and global change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grimm, E.C.

    Fossil pollen provides one of the most valuable records of vegetation and climate change during the recent geological past. Advantages of the fossil-pollen record are that deposits containing fossil pollen are widespread, especially in areas having natural lakes, that fossil pollen occurs in continuous stratigraphic sequences spanning millennia, and that fossil pollen occurs in quantitative assemblages permitting a multivariate approach for reconstructing past vegetation and climates. Because of stratigraphic continuity, fossil pollen records climate cycles on a wide range of scales, from annual to the 100 ka Milankovitch cycles. Receiving particular emphasis recently are decadal- to century-scale changes, possible from the sediments of varved lakes, and late Pleistocene events on a 5-10 ka scale possibly correlating with the Heinrich events in the North Atlantic marine record or the Dansgaard-Oeschger events in the Greenland ice-core record. Researchers have long reconstructed vegetation and climate by qualitative interpretation of the fossil-pollen record. Recently, quantitative interpretation has developed with the aid of large fossil-pollen databases and sophisticated numerical models. In addition, fossil pollen are important climate proxy data for validating General Circulation Models, which are used for predicting the possible magnitude of future climate change. Fossil-pollen data also contribute to an understanding of ecological issues associated with global climate change, including questions of how and how rapidly ecosystems might respond to abrupt climate change.

  10. Monthly mean large-scale analyses of upper-tropospheric humidity and wind field divergence derived from three geostationary satellites

    NASA Technical Reports Server (NTRS)

    Schmetz, Johannes; Menzel, W. Paul; Velden, Christopher; Wu, Xiangqian; Vandeberg, Leo; Nieman, Steve; Hayden, Christopher; Holmlund, Kenneth; Geijo, Carlos

    1995-01-01

    This paper describes the results from a collaborative study between the European Space Operations Center, the European Organization for the Exploitation of Meteorological Satellites, the National Oceanic and Atmospheric Administration, and the Cooperative Institute for Meteorological Satellite Studies investigating the relationship between satellite-derived monthly mean fields of wind and humidity in the upper troposphere for March 1994. Three geostationary meteorological satellites (GOES-7, Meteosat-3, and Meteosat-5) are used to cover an area from roughly 160 deg W to 50 deg E. The wind fields are derived from tracking features in successive images of upper-tropospheric water vapor (WV) as depicted in the 6.5-micron absorption band. The upper-tropospheric relative humidity (UTH) is inferred from measured water vapor radiances with a physical retrieval scheme based on radiative forward calculations. Quantitative information on large-scale circulation patterns in the upper troposphere is possible with the dense spatial coverage of the WV wind vectors. The monthly mean wind field is used to estimate the large-scale divergence; values range between about -5 x 10^-6 and 5 x 10^-6 per second when averaged over a scale length of about 1000-2000 km. The spatial patterns of the UTH field and the divergence of the wind field closely resemble one another, suggesting that UTH patterns are principally determined by the large-scale circulation. Since the upper-tropospheric humidity absorbs upwelling radiation from lower-tropospheric levels and therefore contributes significantly to the atmospheric greenhouse effect, this work implies that studies on the climate relevance of water vapor should include three-dimensional modeling of the atmospheric dynamics. The fields of UTH and WV winds are useful parameters for a climate-monitoring system based on satellite data. The results from this 1-month analysis suggest the desirability of further GOES and Meteosat studies to characterize the changes in the upper-tropospheric moisture sources and sinks over the past decade.
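
    As a rough illustration of the divergence estimate mentioned above (a sketch under an assumed gridding, not the authors' processing chain), horizontal divergence can be computed from gridded monthly-mean winds by finite differences on the sphere:

      # Horizontal divergence of a gridded wind field (u eastward, v northward) on a
      # regular latitude-longitude grid; grid spacing and field values are placeholders.
      import numpy as np

      a = 6.371e6                                    # Earth radius [m]
      lat = np.deg2rad(np.arange(-60.0, 61.0, 2.5))  # hypothetical grid
      lon = np.deg2rad(np.arange(0.0, 360.0, 2.5))
      u = np.zeros((lat.size, lon.size))             # replace with satellite-derived winds [m/s]
      v = np.zeros((lat.size, lon.size))

      coslat = np.cos(lat)[:, None]
      # Spherical divergence: (1 / (a cos(lat))) * (du/dlon + d(v cos(lat))/dlat)
      du_dlon = np.gradient(u, lon, axis=1)
      dv_coslat_dlat = np.gradient(v * coslat, lat, axis=0)
      divergence = (du_dlon + dv_coslat_dlat) / (a * coslat)   # [1/s]
      print("divergence range [1/s]:", divergence.min(), divergence.max())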

  11. Evaluation of a large-scale quantitative respirator-fit testing program for healthcare workers: survey results.

    PubMed

    Wilkinson, Irene J; Pisaniello, Dino; Ahmad, Junaid; Edwards, Suzanne

    2010-09-01

    To present the evaluation of a large-scale quantitative respirator-fit testing program. Concurrent questionnaire survey of fit testers and test subjects. Ambulatory care, home nursing care, and acute care hospitals across South Australia. Quantitative facial-fit testing was performed with TSI PortaCount instruments for healthcare workers (HCWs) who wore 5 different models of a disposable P2 (N95-equivalent) respirator. The questionnaire included questions about the HCW's age, sex, race, occupational category, main area of work, smoking status, facial characteristics, prior training and experience in use of respiratory masks, and number of attempts to obtain a respirator fit. A total of 6,160 HCWs were successfully fitted during the period from January through July 2007. Of the 4,472 HCWs who responded to the questionnaire and were successfully fitted, 3,707 (82.9%) were successfully fitted with the first tested respirator, 551 (12.3%) required testing with a second model, and 214 (4.8%) required 3 or more tests. We noted an increased pass rate on the first attempt over time. Asians (excluding those from South and Central Asia) had the highest failure rate (16.3% [45 of 276 Asian HCWs were unsuccessfully fitted]), and whites had the lowest (9.8% [426 of 4,338 white HCWs]). Race was highly correlated with facial shape. Among occupational groups, doctors had the highest failure rate (13.4% [81 of 604 doctors]), but they also had the highest proportion of Asians. Prior education and/or training in respirator use were not associated with a higher pass rate. Certain facial characteristics were associated with higher or lower pass rates with regard to fit testing, and fit testers were able to select a suitable respirator on the basis of a visual assessment in the majority of cases. For the fit tester, training and experience were important factors; however, for the HCW being fitted, prior experience in respirator use was not an important factor.

  12. Scientific goals of the Cooperative Multiscale Experiment (CME)

    NASA Technical Reports Server (NTRS)

    Cotton, William

    1993-01-01

    Mesoscale Convective Systems (MCS) form the focus of CME. Recent developments in global climate models, the urgent need to improve the representation of the physics of convection, radiation, the boundary layer, and orography, and the surge of interest in coupling hydrologic, chemistry, and atmospheric models of various scales, have emphasized the need for a broad interdisciplinary and multi-scale approach to understanding and predicting MCS's and their interactions with processes at other scales. The role of mesoscale systems in the large-scale atmospheric circulation, the representation of organized convection and other mesoscale flux sources in terms of bulk properties, and the mutually consistent treatment of water vapor, clouds, radiation, and precipitation, are all key scientific issues concerning which CME will seek to increase understanding. The manner in which convective, mesoscale, and larger scale processes interact to produce and organize MCS's, the moisture cycling properties of MCS's, and the use of coupled cloud/mesoscale models to better understand these processes, are also major objectives of CME. Particular emphasis will be placed on the multi-scale role of MCS's in the hydrological cycle and in the production and transport of chemical trace constituents. The scientific goals of the CME consist of the following: understand how the large and small scales of motion influence the location, structure, intensity, and life cycles of MCS's; understand processes and conditions that determine the relative roles of balanced (slow manifold) and unbalanced (fast manifold) circulations in the dynamics of MCS's throughout their life cycles; assess the predictability of MCS's and improve the quantitative forecasting of precipitation and severe weather events; quantify the upscale feedback of MCS's to the large-scale environment and determine interrelationships between MCS occurrence and variations in the large-scale flow and surface forcing; provide a data base for initialization and verification of coupled regional, mesoscale/hydrologic, mesoscale/chemistry, and prototype mesoscale/cloud-resolving models for prediction of severe weather, ceilings, and visibility; provide a data base for initialization and validation of cloud-resolving models, and for assisting in the fabrication, calibration, and testing of cloud and MCS parameterization schemes; and provide a data base for validation of four dimensional data assimilation schemes and algorithms for retrieving cloud and state parameters from remote sensing instrumentation.

  13. CPTAC Evaluates Long-Term Reproducibility of Quantitative Proteomics Using Breast Cancer Xenografts | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    Liquid chromatography tandem-mass spectrometry (LC-MS/MS)- based methods such as isobaric tags for relative and absolute quantification (iTRAQ) and tandem mass tags (TMT) have been shown to provide overall better quantification accuracy and reproducibility over other LC-MS/MS techniques. However, large scale projects like the Clinical Proteomic Tumor Analysis Consortium (CPTAC) require comparisons across many genomically characterized clinical specimens in a single study and often exceed the capability of traditional iTRAQ-based quantification.

  14. A random-walk/giant-loop model for interphase chromosomes.

    PubMed Central

    Sachs, R K; van den Engh, G; Trask, B; Yokota, H; Hearst, J E

    1995-01-01

    Fluorescence in situ hybridization data on distances between defined genomic sequences are used to construct a quantitative model for the overall geometric structure of a human chromosome. We suggest that the large-scale geometry during the G0/G1 part of the cell cycle may consist of flexible chromatin loops, averaging approximately 3 million bp, with a random-walk backbone. A fully explicit, three-parametric polymer model of this random-walk/giant-loop structure can account well for the data. More general models consistent with the data are briefly discussed. PMID:7708711
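
    The random-walk backbone implies that the mean-square spatial distance between two loci grows roughly linearly with their genomic separation, which is the behavior the FISH distance data are compared against. The sketch below (illustrative parameters only, not the paper's three-parameter giant-loop model) checks this scaling numerically for ideal random-walk chains:

      # Mean-square distance between chain positions of ideal random walks vs. segment separation.
      import numpy as np

      rng = np.random.default_rng(0)
      n_steps, n_chains, b = 200, 2000, 1.0          # segments per chain, ensemble size, segment length

      # Random unit steps in 3D, accumulated into chain conformations.
      steps = rng.normal(size=(n_chains, n_steps, 3))
      steps *= b / np.linalg.norm(steps, axis=2, keepdims=True)
      positions = np.cumsum(steps, axis=1)

      for sep in (10, 50, 100):
          r2 = np.mean(np.sum((positions[:, sep] - positions[:, 0]) ** 2, axis=1))
          print(f"{sep:3d} segments apart: <R^2> = {r2:6.1f}   (random-walk theory: {b**2 * sep:6.1f})")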

  15. Paleoclimatic signature in terrestrial flood deposits.

    PubMed

    Koltermann, C E; Gorelick, S M

    1992-06-26

    Large-scale process simulation was used to reconstruct the geologic evolution during the past 600,000 years of an alluvial fan in northern California. In order to reproduce the sedimentary record, the simulation accounted for the dynamics of river flooding, sedimentation, subsidence, land movement that resulted from faulting, and sea level changes. Paleoclimatic trends induced fluctuations in stream flows and dominated the development of the sedimentary deposits. The process simulation approach serves as a quantitative means to explore the genesis of sedimentary architecture and its link to past climatic conditions and fault motion.

  16. An Introduction to Magnetospheric Physics by Means of Simple Models

    NASA Technical Reports Server (NTRS)

    Stern, D. P.

    1981-01-01

    The large scale structure and behavior of the Earth's magnetosphere is discussed. The model is suitable for inclusion in courses on space physics, plasmas, astrophysics or the Earth's environment, as well as for self-study. Nine quantitative problems, dealing with properties of linear superpositions of a dipole and a constant field are presented. Topics covered include: open and closed models of the magnetosphere; field line motion; the role of magnetic merging (reconnection); magnetospheric convection; and the origin of the magnetopause, polar cusps, and high latitude lobes.
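
    A minimal sketch of the kind of model the article builds on, the linear superposition of a dipole field and a constant external field, is given below; the numerical values are illustrative assumptions, not those used in the article.

      # Total magnetic field of a point dipole plus a uniform external field (SI units).
      import numpy as np

      MU0_OVER_4PI = 1e-7

      def dipole_field(r, m):
          # Field of a point dipole with moment m evaluated at position r.
          rnorm = np.linalg.norm(r)
          rhat = r / rnorm
          return MU0_OVER_4PI * (3.0 * np.dot(m, rhat) * rhat - m) / rnorm**3

      m = np.array([0.0, 0.0, -8.0e22])            # Earth-like dipole moment [A m^2]
      B_uniform = np.array([0.0, 0.0, -20e-9])     # assumed constant field [T], e.g. a southward IMF

      r = np.array([10 * 6.371e6, 0.0, 0.0])       # a point 10 Earth radii out in the equatorial plane
      B_total = dipole_field(r, m) + B_uniform
      print("total field at 10 R_E [nT]:", B_total * 1e9)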

  17. [Advances in mass spectrometry-based approaches for neuropeptide analysis].

    PubMed

    Ji, Qianyue; Ma, Min; Peng, Xin; Jia, Chenxi

    2017-07-25

    Neuropeptides are an important class of endogenous bioactive substances involved in the function of the nervous system, connecting the brain with other neural and peripheral organs. Mass spectrometry-based neuropeptidomics is designed to study neuropeptides in a large-scale manner and to obtain important molecular information for further understanding the mechanisms of nervous system regulation and the pathogenesis of neurological diseases. This review summarizes the basic strategies for the study of neuropeptides using mass spectrometry, including sample preparation and processing, qualitative and quantitative methods, and mass spectrometry imaging.

  18. Quantitative properties of clustering within modern microscopic nuclear models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Volya, A.; Tchuvil’sky, Yu. M., E-mail: tchuvl@nucl-th.sinp.msu.ru

    2016-09-15

    A method for studying cluster spectroscopic properties of nuclear fragmentation, such as spectroscopic amplitudes, cluster form factors, and spectroscopic factors, is developed on the basis of modern precision nuclear models that take into account the mixing of large-scale shell-model configurations. Alpha-cluster channels are considered as an example. A mathematical proof of the need for taking into account the channel-wave-function renormalization generated by exchange terms of the antisymmetrization operator (Fliessbach effect) is given. Examples where this effect is confirmed by a high quality of the description of experimental data are presented. By and large, the method in question substantially extends the possibilities for studying clustering phenomena in nuclei and for improving the quality of their description.

  19. Role of Correlations in the Collective Behavior of Microswimmer Suspensions

    NASA Astrophysics Data System (ADS)

    Stenhammar, Joakim; Nardini, Cesare; Nash, Rupert W.; Marenduzzo, Davide; Morozov, Alexander

    2017-07-01

    In this Letter, we study the collective behavior of a large number of self-propelled microswimmers immersed in a fluid. Using unprecedentedly large-scale lattice Boltzmann simulations, we reproduce the transition to bacterial turbulence. We show that, even well below the transition, swimmers move in a correlated fashion that cannot be described by a mean-field approach. We develop a novel kinetic theory that captures these correlations and is nonperturbative in the swimmer density. To provide an experimentally accessible measure of correlations, we calculate the diffusivity of passive tracers and reveal its nontrivial density dependence. The theory is in quantitative agreement with the lattice Boltzmann simulations and captures the asymmetry between pusher and puller swimmers below the transition to turbulence.

  20. Dynamic evolution of the spectrum of long-period fiber Bragg gratings fabricated from hydrogen-loaded optical fiber by ultraviolet laser irradiation.

    PubMed

    Fujita, Keio; Masuda, Yuji; Nakayama, Keisuke; Ando, Maki; Sakamoto, Kenji; Mohri, Jun-pei; Yamauchi, Makoto; Kimura, Masanori; Mizutani, Yasuo; Kimura, Susumu; Yokouchi, Takashi; Suzaki, Yoshifumi; Ejima, Seiki

    2005-11-20

    Long-period fiber Bragg gratings fabricated by exposure of hydrogen-loaded fiber to UV laser light exhibit large-scale dynamic evolution for approximately two weeks at room temperature. During this time two distinct features show up in their spectrum: a large upswing in wavelength and a substantial deepening of the transmission minimum. The dynamic evolution of the transmission spectrum is explained quantitatively by use of Malo's theory of UV-induced quenching [Electron. Lett. 30, 442 (1994)] followed by refilling of hydrogen in the fiber core and the theory of hydrogen diffusion in the fiber material. The amount of hydrogen quenched by the UV irradiation is 6% of the loaded hydrogen.

  1. Scale and time dependence of serial correlations in word-length time series of written texts

    NASA Astrophysics Data System (ADS)

    Rodriguez, E.; Aguilar-Cornejo, M.; Femat, R.; Alvarez-Ramirez, J.

    2014-11-01

    This work considered the quantitative analysis of large written texts. To this end, the text was converted into a time series by taking the sequence of word lengths. Detrended fluctuation analysis (DFA) was used to characterize long-range serial correlations of the time series. The DFA was implemented within a rolling-window framework for estimating variations in correlation strength, quantified in terms of the scaling exponent, along the text. Also, a filtering derivative was used to compute the dependence of the scaling exponent on the scale. The analysis was applied to three famous English-language literary narratives: Alice in Wonderland (by Lewis Carroll), Dracula (by Bram Stoker) and Sense and Sensibility (by Jane Austen). The results showed that high correlations appear for scales of about 50-200 words, suggesting that at these scales the text has the strongest coherence. The scaling exponent was not constant along the text, showing important variations with apparent cyclical behavior. An interesting coincidence between the scaling exponent variations and changes in narrative units (e.g., chapters) was found. This suggests that the scaling exponent obtained from the DFA is able to detect changes in narration structure as expressed by the usage of words of different lengths.
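
    The basic pipeline, converting a text into its word-length series and estimating a DFA scaling exponent, can be sketched as follows (a minimal first-order DFA under simple whitespace tokenization; the input file name is hypothetical):

      # Word-length series of a text and its DFA scaling exponent (first-order DFA).
      import numpy as np

      def dfa_exponent(x, scales):
          y = np.cumsum(x - np.mean(x))                    # integrated profile
          fluct = []
          for s in scales:
              n_seg = len(y) // s
              segments = y[: n_seg * s].reshape(n_seg, s)
              t = np.arange(s)
              rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
                     for seg in segments]                  # locally detrended fluctuation
              fluct.append(np.mean(rms))
          slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
          return slope

      with open("alice_in_wonderland.txt", encoding="utf-8") as fh:
          lengths = np.array([len(w) for w in fh.read().split()], dtype=float)

      scales = np.unique(np.logspace(1, 3, 20).astype(int))   # roughly 10 to 1000 words
      print("DFA scaling exponent:", dfa_exponent(lengths, scales))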

  2. Psychometric Properties of the Quantitative Myasthenia Gravis Score and the Myasthenia Gravis Composite Scale.

    PubMed

    Barnett, Carolina; Merkies, Ingemar S J; Katzberg, Hans; Bril, Vera

    2015-09-02

    The Quantitative Myasthenia Gravis Score and the Myasthenia Gravis Composite are two commonly used outcome measures in myasthenia gravis. So far, their measurement properties have not been compared, so we aimed to study their psychometric properties using the Rasch model. A total of 251 patients with stable myasthenia gravis were assessed with both scales, and 211 patients returned for a second assessment. We studied fit to the Rasch model at the first visit, and compared item fit, thresholds, differential item functioning, local dependence, person separation index, and tests for unidimensionality. We also assessed test-retest reliability and estimated the minimal detectable change. Neither scale fit the Rasch model (χ², p < 0.05). The Myasthenia Gravis Composite had lower discrimination properties than the Quantitative Myasthenia Gravis Score (person separation index: 0.14 and 0.7, respectively). There was local dependence in both scales, as well as differential item functioning for ocular and generalized disease. Disordered thresholds were found in 6 (60%) items of the Myasthenia Gravis Composite and in 4 (31%) of the Quantitative Myasthenia Gravis Score. Both tools had adequate test-retest reliability (ICCs > 0.8). The minimal detectable change was 4.9 points for the Myasthenia Gravis Composite and 4.3 points for the Quantitative Myasthenia Gravis Score. Neither scale fulfilled Rasch model expectations. The Quantitative Myasthenia Gravis Score has higher discrimination than the Myasthenia Gravis Composite. Both tools have items with disordered thresholds, differential item functioning and local dependency. There was evidence of multidimensionality in the QMGS. The minimal detectable change values are higher than the minimal significant changes reported in previous studies. These findings might inform future modifications of these tools.
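
    The minimal detectable change figures quoted above are conventionally derived from test-retest reliability; a hedged sketch of that standard computation (MDC95 = 1.96 x sqrt(2) x SEM, with SEM = SD x sqrt(1 - ICC)), not necessarily the authors' exact procedure, is shown below with illustrative numbers.

      # Standard-error-of-measurement based minimal detectable change (MDC95).
      import numpy as np

      def mdc95(visit1_scores, visit2_scores, icc):
          scores = np.concatenate([visit1_scores, visit2_scores])
          sem = np.std(scores, ddof=1) * np.sqrt(1.0 - icc)   # standard error of measurement
          return 1.96 * np.sqrt(2.0) * sem

      # Illustrative numbers only (not the study data).
      rng = np.random.default_rng(1)
      visit1 = rng.normal(12.0, 5.0, size=211)
      visit2 = visit1 + rng.normal(0.0, 2.0, size=211)
      print("MDC95 ~", round(mdc95(visit1, visit2, icc=0.85), 1), "points")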

  3. Determinants of fruit and vegetable consumption among children and adolescents: a review of the literature. Part II: qualitative studies.

    PubMed

    Krølner, Rikke; Rasmussen, Mette; Brug, Johannes; Klepp, Knut-Inge; Wind, Marianne; Due, Pernille

    2011-10-14

    Large proportions of children do not fulfil the World Health Organization recommendation of eating at least 400 grams of fruit and vegetables (FV) per day. To promote an increased FV intake among children it is important to identify factors which influence their consumption. Both qualitative and quantitative studies are needed. Earlier reviews have analysed evidence from quantitative studies. The aim of this paper is to present a systematic review of qualitative studies of determinants of children's FV intake. Relevant studies were identified by searching Anthropology Plus, Cinahl, CSA illumine, Embase, International Bibliography of the Social Sciences, Medline, PsycINFO, and Web of Science using combinations of synonyms for FV intake, children/adolescents and qualitative methods as search terms. The literature search was completed by December 1st 2010. Papers were included if they applied qualitative methods to investigate 6-18-year-olds' perceptions of factors influencing their FV consumption. Quantitative studies, review studies, studies reported in other languages than English, and non-peer reviewed or unpublished manuscripts were excluded. The papers were reviewed systematically using standardised templates for summary of papers, quality assessment, and synthesis of findings across papers. The review included 31 studies, mostly based on US populations and focus group discussions. The synthesis identified the following potential determinants for FV intake which supplement the quantitative knowledge base: Time costs; lack of taste guarantee; satiety value; appropriate time/occasions/settings for eating FV; sensory and physical aspects; variety, visibility, methods of preparation; access to unhealthy food; the symbolic value of food for image, gender identity and social interaction with peers; short term outcome expectancies. The review highlights numerous potential determinants which have not been investigated thoroughly in quantitative studies. Future large scale quantitative studies should attempt to quantify the importance of these factors. Further, mechanisms behind gender, age and socioeconomic differences in FV consumption are proposed which should be tested quantitatively in order to better tailor interventions to vulnerable groups. Finally, the review provides input to the conceptualisation and measurements of concepts (i.e. peer influence, availability in schools) which may refine survey instruments and theoretical frameworks concerning eating behaviours.

  4. 3D-PTV around Operational Wind Turbines

    NASA Astrophysics Data System (ADS)

    Brownstein, Ian; Dabiri, John

    2016-11-01

    Laboratory studies and numerical simulations of wind turbines are typically constrained in how they can inform operational turbine behavior. Laboratory experiments are usually unable to match both pertinent parameters of full-scale wind turbines, the Reynolds number (Re) and tip speed ratio, using scaled-down models. Additionally, numerical simulations of the flow around wind turbines are constrained by the large domain size and high Re that need to be simulated. When these simulations are performed, turbine geometry is typically simplified, resulting in flow structures near the rotor not being well resolved. In order to bypass these limitations, a quantitative flow visualization method was developed to take in situ measurements of the flow around wind turbines at the Field Laboratory for Optimized Wind Energy (FLOWE) in Lancaster, CA. The apparatus constructed was able to seed an approximately 9 m x 9 m x 5 m volume in the wake of the turbine using artificial snow. Quantitative measurements were obtained by tracking the evolution of the artificial snow using a four-camera setup. The methodology for calibrating and collecting data, as well as preliminary results detailing the flow around a 2 kW vertical-axis wind turbine (VAWT), will be presented.

  5. A centennial tribute to G.K. Gilbert's Hydraulic Mining Débris in the Sierra Nevada

    NASA Astrophysics Data System (ADS)

    James, L. A.; Phillips, J. D.; Lecce, S. A.

    2017-10-01

    G.K. Gilbert's (1917) classic monograph, Hydraulic-Mining Débris in the Sierra Nevada, is described and put into the context of modern geomorphic knowledge. The emphasis here is on large-scale applied fluvial geomorphology, but other key elements (e.g., coastal geomorphology) are also briefly covered. A brief synopsis outlines key elements of the monograph, followed by discussions of highly influential aspects including the integrated watershed perspective, the extreme example of anthropogenic sedimentation, the computation of a quantitative, semidistributed sediment budget, and the advent of sediment-wave theory. Although Gilbert did not address concepts of equilibrium and grade in much detail, the rivers of the northwestern Sierra Nevada were highly disrupted and thrown into a condition of nonequilibrium. Therefore, concepts of equilibrium and grade, for which Gilbert's early work is often cited, are discussed. Gilbert's work is put into the context of complex nonlinear dynamics in geomorphic systems and how these concepts can be used to interpret the nonequilibrium systems described by Gilbert. Broad, basin-scale studies were common in the period, but few were as quantitative and empirically rigorous or employed such a range of methodologies as Gilbert's Professional Paper 105. None demonstrated such an extreme case of anthropogeomorphic change.

  6. Nonlinear optical microscopy and ultrasound imaging of human cervical structure

    NASA Astrophysics Data System (ADS)

    Reusch, Lisa M.; Feltovich, Helen; Carlson, Lindsey C.; Hall, Gunnsteinn; Campagnola, Paul J.; Eliceiri, Kevin W.; Hall, Timothy J.

    2013-03-01

    The cervix softens and shortens as its collagen microstructure rearranges in preparation for birth, but premature change may lead to premature birth. The global preterm birth rate has not decreased despite decades of research, likely because cervical microstructure is poorly understood. Our group has developed a multilevel approach to evaluating the human cervix. We are developing quantitative ultrasound (QUS) techniques for noninvasive interrogation of cervical microstructure and corroborating those results with high-resolution images of microstructure from second harmonic generation imaging (SHG) microscopy. We obtain ultrasound measurements from hysterectomy specimens, prepare the tissue for SHG, and stitch together several hundred images to create a comprehensive view of large areas of cervix. The images are analyzed for collagen orientation and alignment with curvelet transform, and registered with QUS data, facilitating multiscale analysis in which the micron-scale SHG images and millimeter-scale ultrasound data interpretation inform each other. This novel combination of modalities allows comprehensive characterization of cervical microstructure in high resolution. Through a detailed comparative study, we demonstrate that SHG imaging both corroborates the quantitative ultrasound measurements and provides further insight. Ultimately, a comprehensive understanding of specific microstructural cervical change in pregnancy should lead to novel approaches to the prevention of preterm birth.

  7. FracPaQ: a MATLAB™ Toolbox for the Quantification of Fracture Patterns

    NASA Astrophysics Data System (ADS)

    Healy, D.; Rizzo, R. E.; Cornwell, D. G.; Timms, N.; Farrell, N. J.; Watkins, H.; Gomez-Rivas, E.; Smith, M.

    2016-12-01

    The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, shapes and spatial distributions often exhibit some kind of order. In detail, there may be relationships among the different fracture attributes, e.g., small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture patterns and fracture attributes. This presentation describes an open-source toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin-section micrographs, geological maps, outcrop or aerial photographs, or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel-plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales. Our current focus for the application of the software is on quantifying the fracture patterns in and around fault zones. There is a large body of published work on the quantification of relatively simple joint patterns, but fault zones present a bigger, and arguably more important, challenge. The method presented is inherently scale independent, and a key task will be to analyse and integrate quantitative fracture pattern data from micro- to macro-scales. Planned future releases will incorporate multi-scale analyses based on a wavelet method to look for scale transitions, and will combine fracture traces from multiple 2-D images to derive the statistically equivalent 3-D fracture pattern.
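
    FracPaQ itself is a MATLAB™ toolbox; the Python fragment below is not its code, only a sketch of two of the quantities the abstract names: a 2-D fracture intensity (total trace length per unit area) and a parallel-plate (cubic-law) permeability estimate for a parallel fracture set. The trace lengths, aperture and spacing are hypothetical.

        def fracture_intensity_p21(trace_lengths_m, area_m2):
            # P21 intensity: total fracture trace length per unit area (1/m)
            return sum(trace_lengths_m) / area_m2

        def parallel_plate_permeability(aperture_m, spacing_m):
            # Cubic-law estimate for parallel fractures: k = b^3 / (12 s), in m^2
            return aperture_m ** 3 / (12.0 * spacing_m)

        traces = [0.8, 1.3, 0.4, 2.1]   # hypothetical trace lengths (m) in a mapped window
        print(fracture_intensity_p21(traces, area_m2=4.0))
        print(parallel_plate_permeability(aperture_m=1e-4, spacing_m=0.5))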

  8. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this quantitative epistasis method enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
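
    The abstract does not give the scoring formula; as an illustration only, a common multiplicative-model epistasis score compares the double-perturbation phenotype with the product of the single-perturbation effects, all expressed relative to wild type. The phenotype values below are hypothetical.

        def epistasis_score(w_a, w_b, w_ab):
            # Multiplicative model: epsilon = W_ab - W_a * W_b (values relative to wild type = 1.0)
            return w_ab - w_a * w_b

        # Hypothetical relative body-length measurements:
        print(epistasis_score(w_a=0.9, w_b=0.8, w_ab=0.6))   # negative -> aggravating interaction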

  9. War, space, and the evolution of Old World complex societies.

    PubMed

    Turchin, Peter; Currie, Thomas E; Turner, Edward A L; Gavrilets, Sergey

    2013-10-08

    How did human societies evolve from small groups, integrated by face-to-face cooperation, to huge anonymous societies of today, typically organized as states? Why is there so much variation in the ability of different human populations to construct viable states? Existing theories are usually formulated as verbal models and, as a result, do not yield sharply defined, quantitative predictions that could be unambiguously tested with data. Here we develop a cultural evolutionary model that predicts where and when the largest-scale complex societies arose in human history. The central premise of the model, which we test, is that costly institutions that enabled large human groups to function without splitting up evolved as a result of intense competition between societies-primarily warfare. Warfare intensity, in turn, depended on the spread of historically attested military technologies (e.g., chariots and cavalry) and on geographic factors (e.g., rugged landscape). The model was simulated within a realistic landscape of the Afroeurasian landmass and its predictions were tested against a large dataset documenting the spatiotemporal distribution of historical large-scale societies in Afroeurasia between 1,500 BCE and 1,500 CE. The model-predicted pattern of spread of large-scale societies was very similar to the observed one. Overall, the model explained 65% of variance in the data. An alternative model, omitting the effect of diffusing military technologies, explained only 16% of variance. Our results support theories that emphasize the role of institutions in state-building and suggest a possible explanation why a long history of statehood is positively correlated with political stability, institutional quality, and income per capita.

  10. War, space, and the evolution of Old World complex societies

    PubMed Central

    Turchin, Peter; Currie, Thomas E.; Turner, Edward A. L.; Gavrilets, Sergey

    2013-01-01

    How did human societies evolve from small groups, integrated by face-to-face cooperation, to huge anonymous societies of today, typically organized as states? Why is there so much variation in the ability of different human populations to construct viable states? Existing theories are usually formulated as verbal models and, as a result, do not yield sharply defined, quantitative predictions that could be unambiguously tested with data. Here we develop a cultural evolutionary model that predicts where and when the largest-scale complex societies arose in human history. The central premise of the model, which we test, is that costly institutions that enabled large human groups to function without splitting up evolved as a result of intense competition between societies—primarily warfare. Warfare intensity, in turn, depended on the spread of historically attested military technologies (e.g., chariots and cavalry) and on geographic factors (e.g., rugged landscape). The model was simulated within a realistic landscape of the Afroeurasian landmass and its predictions were tested against a large dataset documenting the spatiotemporal distribution of historical large-scale societies in Afroeurasia between 1,500 BCE and 1,500 CE. The model-predicted pattern of spread of large-scale societies was very similar to the observed one. Overall, the model explained 65% of variance in the data. An alternative model, omitting the effect of diffusing military technologies, explained only 16% of variance. Our results support theories that emphasize the role of institutions in state-building and suggest a possible explanation why a long history of statehood is positively correlated with political stability, institutional quality, and income per capita. PMID:24062433

  11. Evolution of Precipitation Structure During the November DYNAMO MJO Event: Cloud-Resolving Model Intercomparison and Cross Validation Using Radar Observations

    NASA Astrophysics Data System (ADS)

    Li, Xiaowen; Janiga, Matthew A.; Wang, Shuguang; Tao, Wei-Kuo; Rowe, Angela; Xu, Weixin; Liu, Chuntao; Matsui, Toshihisa; Zhang, Chidong

    2018-04-01

    The evolution of precipitation structures is simulated and compared with radar observations for the November Madden-Julian Oscillation (MJO) event during the DYNAmics of the MJO (DYNAMO) field campaign. Three ground-based, ship-borne, and spaceborne precipitation radars and three cloud-resolving models (CRMs) driven by observed large-scale forcing are used to study precipitation structures at different locations over the central equatorial Indian Ocean. Convective strength is represented by 0-dBZ echo-top heights, and convective organization by contiguous 17-dBZ areas. The multi-radar and multi-model framework allows for more stringent model validation. The emphasis is on testing the models' ability to simulate subtle differences observed at different radar sites when the MJO event passed through. The results show that CRMs forced by site-specific large-scale forcing can reproduce not only common features in cloud populations but also subtle variations observed by different radars. The comparisons also revealed common deficiencies in CRM simulations, which underestimate radar echo-top heights for the strongest convection within large, organized precipitation features. Cross validations with multiple radars and models also enable quantitative comparisons in CRM sensitivity studies using different large-scale forcing, microphysical schemes and parameters, resolutions, and domain sizes. In terms of radar echo-top height temporal variations, many model sensitivity tests have better correlations than radar/model comparisons, indicating robustness in model performance on this aspect. It is further shown that well-validated model simulations could be used to constrain uncertainties in observed echo-top heights when the low-resolution surveillance scanning strategy is used.

  12. Disproportionate photosynthetic decline and inverse relationship between constitutive and induced volatile emissions upon feeding of Quercus robur leaves by large larvae of gypsy moth (Lymantria dispar)

    PubMed Central

    Copolovici, Lucian; Pag, Andreea; Kännaste, Astrid; Bodescu, Adina; Tomescu, Daniel; Copolovici, Dana; Soran, Maria-Loredana; Niinemets, Ülo

    2018-01-01

    Gypsy moth (Lymantria dispar L., Lymantriinae) is a major pest of pedunculate oak (Quercus robur) forests in Europe, but how its infestations scale with foliage physiological characteristics, in particular with photosynthesis rates and emissions of volatile organic compounds, has not been studied. Differently from the majority of insect herbivores, large larvae of L. dispar rapidly consume leaf area, and can also bite through tough tissues, including secondary and primary leaf veins. Given the rapid and devastating feeding responses, we hypothesized that infestation of Q. robur leaves by L. dispar leads to disproportionate scaling of leaf photosynthesis and constitutive isoprene emissions with damaged leaf area, and to less prominent enhancements of induced volatile release. Leaves with 0% (control) to 50% of leaf area removed by larvae were studied. Across this range of infestation severity, all physiological characteristics were quantitatively correlated with the degree of damage, but all these traits changed disproportionately with the degree of damage. The net assimilation rate was reduced by almost 10-fold and constitutive isoprene emissions by more than 7-fold, whereas the emissions of green leaf volatiles, monoterpenes, methyl salicylate and the homoterpene (3E)-4,8-dimethyl-1,3,7-nonatriene scaled negatively and almost linearly with net assimilation rate through damage treatments. This study demonstrates that feeding by large insect herbivores disproportionately alters photosynthetic rate and constitutive isoprene emissions. Furthermore, the leaves have a surprisingly large capacity for enhancement of induced emissions even when foliage photosynthetic function is severely impaired. PMID:29367792

  13. "What else are you worried about?" – Integrating textual responses into quantitative social science research

    PubMed Central

    Brümmer, Martin; Schmukle, Stefan C.; Goebel, Jan; Wagner, Gert G.

    2017-01-01

    Open-ended questions have routinely been included in large-scale survey and panel studies, yet there is some perplexity about how to actually incorporate the answers to such questions into quantitative social science research. Tools developed recently in the domain of natural language processing offer a wide range of options for the automated analysis of such textual data, but their implementation has lagged behind. In this study, we demonstrate straightforward procedures that can be applied to process and analyze textual data for the purposes of quantitative social science research. Using more than 35,000 textual answers to the question “What else are you worried about?” from participants of the German Socio-economic Panel Study (SOEP), we (1) analyzed characteristics of respondents that determined whether they answered the open-ended question, (2) used the textual data to detect relevant topics that were reported by the respondents, and (3) linked the features of the respondents to the worries they reported in their textual data. The potential uses as well as the limitations of the automated analysis of textual data are discussed. PMID:28759628

  14. Quantitative and Functional Requirements for Bioluminescent Cancer Models.

    PubMed

    Feys, Lynn; Descamps, Benedicte; Vanhove, Christian; Vermeulen, Stefan; Vandesompele, J O; Vanderheyden, Katrien; Messens, Kathy; Bracke, Marc; De Wever, Olivier

    2016-01-01

    Bioluminescent cancer models are widely used but detailed quantification of the luciferase signal and functional comparison with a non-transfected control cell line are generally lacking. In the present study, we provide quantitative and functional tests for luciferase-transfected cells. We quantified the luciferase expression in BLM and HCT8/E11 transfected cancer cells, and examined the effect of long-term luciferin exposure. The present study also investigated functional differences between parental and transfected cancer cells. Our results showed that quantification of different single-cell-derived populations is superior with droplet digital polymerase chain reaction. Quantification of luciferase protein level and luciferase bioluminescent activity is only useful when there is a significant difference in copy number. Continuous exposure of cell cultures to luciferin leads to inhibitory effects on mitochondrial activity, cell growth and bioluminescence. These inhibitory effects correlate with luciferase copy number. Cell culture and mouse xenograft assays showed no significant functional differences between luciferase-transfected and parental cells. Luciferase-transfected cells should be validated by quantitative and functional assays before starting large-scale experiments.

  15. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on quantification methods based on mass spectrometry. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve quantification accuracy and dynamic range.
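
    freeQuant's exact algorithm (its shared-peptide handling and ion-intensity coupling) is not reproduced here; as a sketch of the basic idea of length-normalized spectral-count quantification, the widely used NSAF measure is shown below with hypothetical counts and protein lengths.

        def nsaf(spectral_counts, lengths):
            # Normalized Spectral Abundance Factor: (SpC / length), normalized over all proteins
            saf = [c / l for c, l in zip(spectral_counts, lengths)]
            total = sum(saf)
            return [s / total for s in saf]

        # Hypothetical spectral counts and protein lengths (residues):
        print(nsaf([120, 45, 10], [500, 300, 250]))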

  16. Hd6, a rice quantitative trait locus involved in photoperiod sensitivity, encodes the α subunit of protein kinase CK2

    PubMed Central

    Takahashi, Yuji; Shomura, Ayahiko; Sasaki, Takuji; Yano, Masahiro

    2001-01-01

    Hd6 is a quantitative trait locus involved in rice photoperiod sensitivity. It was detected in backcross progeny derived from a cross between the japonica variety Nipponbare and the indica variety Kasalath. To isolate a gene at Hd6, we used a large segregating population for the high-resolution and fine-scale mapping of Hd6 and constructed genomic clone contigs around the Hd6 region. Linkage analysis with P1-derived artificial chromosome clone-derived DNA markers delimited Hd6 to a 26.4-kb genomic region. We identified a gene encoding the α subunit of protein kinase CK2 (CK2α) in this region. The Nipponbare allele of CK2α contains a premature stop codon, and the resulting truncated product is undoubtedly nonfunctional. Genetic complementation analysis revealed that the Kasalath allele of CK2α increases days-to-heading. Map-based cloning with advanced backcross progeny enabled us to identify a gene underlying a quantitative trait locus even though it exhibited a relatively small effect on the phenotype. PMID:11416158

  17. "What else are you worried about?" - Integrating textual responses into quantitative social science research.

    PubMed

    Rohrer, Julia M; Brümmer, Martin; Schmukle, Stefan C; Goebel, Jan; Wagner, Gert G

    2017-01-01

    Open-ended questions have routinely been included in large-scale survey and panel studies, yet there is some perplexity about how to actually incorporate the answers to such questions into quantitative social science research. Tools developed recently in the domain of natural language processing offer a wide range of options for the automated analysis of such textual data, but their implementation has lagged behind. In this study, we demonstrate straightforward procedures that can be applied to process and analyze textual data for the purposes of quantitative social science research. Using more than 35,000 textual answers to the question "What else are you worried about?" from participants of the German Socio-economic Panel Study (SOEP), we (1) analyzed characteristics of respondents that determined whether they answered the open-ended question, (2) used the textual data to detect relevant topics that were reported by the respondents, and (3) linked the features of the respondents to the worries they reported in their textual data. The potential uses as well as the limitations of the automated analysis of textual data are discussed.
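
    The study's actual pipeline for the German-language SOEP answers is not reproduced here; the sketch below only illustrates the general kind of automated topic detection such work relies on, using scikit-learn on a handful of hypothetical English answer strings.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        answers = ["worried about my pension", "climate change and the environment",
                   "rising rents", "pension cuts and old-age poverty"]   # hypothetical responses

        vectorizer = CountVectorizer(stop_words="english")
        counts = vectorizer.fit_transform(answers)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

        terms = vectorizer.get_feature_names_out()
        for k, weights in enumerate(lda.components_):
            top = [terms[i] for i in weights.argsort()[-3:][::-1]]   # three heaviest terms per topic
            print(f"topic {k}: {top}")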

  18. Organizational structure, team process, and future directions of interprofessional health care teams.

    PubMed

    Cole, Kenneth D; Waite, Martha S; Nichols, Linda O

    2003-01-01

    For a nationwide Geriatric Interdisciplinary Team Training (GITT) program evaluation of 8 sites and 26 teams, team evaluators developed a quantitative and qualitative team observation scale (TOS), examining structure, process, and outcome, with specific focus on the training function. Qualitative data provided an important expansion of quantitative data, highlighting positive effects that were not statistically significant, such as role modeling and training occurring within the clinical team. Qualitative data could also identify "too much" of a coded variable, such as time spent in individual team members' assessments and treatment plans. As healthcare organizations face increasing demands for productivity and changing reimbursement, traditional models of teamwork, with large teams and structured meetings, may no longer be as functional as they once were. To meet these constraints and to train students in teamwork, teams of the future will have to make choices, from developing and setting specific models to increasing the use of information technology to create virtual teams. Both quantitative and qualitative data will be needed to evaluate these new types of teams and the important outcomes they produce.

  19. High-Content Microscopy Analysis of Subcellular Structures: Assay Development and Application to Focal Adhesion Quantification.

    PubMed

    Kroll, Torsten; Schmidt, David; Schwanitz, Georg; Ahmad, Mubashir; Hamann, Jana; Schlosser, Corinne; Lin, Yu-Chieh; Böhm, Konrad J; Tuckermann, Jan; Ploubidou, Aspasia

    2016-07-01

    High-content analysis (HCA) converts raw light microscopy images to quantitative data through the automated extraction, multiparametric analysis, and classification of the relevant information content. Combined with automated high-throughput image acquisition, HCA applied to the screening of chemicals or RNAi-reagents is termed high-content screening (HCS). Its power in quantifying cell phenotypes makes HCA applicable also to routine microscopy. However, developing effective HCA and bioinformatic analysis pipelines for acquisition of biologically meaningful data in HCS is challenging. Here, the step-by-step development of an HCA assay protocol and an HCS bioinformatics analysis pipeline are described. The protocol's power is demonstrated by application to focal adhesion (FA) detection, quantitative analysis of multiple FA features, and functional annotation of signaling pathways regulating FA size, using primary data of a published RNAi screen. The assay and the underlying strategy are aimed at researchers performing microscopy-based quantitative analysis of subcellular features, on a small scale or in large HCS experiments.
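
    The published protocol is not reproduced here; the following is only a generic segmentation and feature-extraction sketch of the kind such HCA pipelines build on, using scikit-image on a toy image. The threshold choice, size filter and feature set are illustrative assumptions.

        import numpy as np
        from skimage import filters, measure, morphology

        def focal_adhesion_features(image: np.ndarray, min_area_px: int = 5):
            # Generic HCA-style step: Otsu threshold, remove specks, label, measure per-object features
            mask = image > filters.threshold_otsu(image)
            mask = morphology.remove_small_objects(mask, min_size=min_area_px)
            labels = measure.label(mask)
            return [(r.area, r.eccentricity, r.mean_intensity)
                    for r in measure.regionprops(labels, intensity_image=image)]

        rng = np.random.default_rng(0)
        toy = rng.random((64, 64))               # stand-in for a fluorescence image
        print(len(focal_adhesion_features(toy)))  # number of detected objects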

  20. Quantitative image analysis of cellular heterogeneity in breast tumors complements genomic profiling.

    PubMed

    Yuan, Yinyin; Failmezger, Henrik; Rueda, Oscar M; Ali, H Raza; Gräf, Stefan; Chin, Suet-Feung; Schwarz, Roland F; Curtis, Christina; Dunning, Mark J; Bardwell, Helen; Johnson, Nicola; Doyle, Sarah; Turashvili, Gulisa; Provenzano, Elena; Aparicio, Sam; Caldas, Carlos; Markowetz, Florian

    2012-10-24

    Solid tumors are heterogeneous tissues composed of a mixture of cancer and normal cells, which complicates the interpretation of their molecular profiles. Furthermore, tissue architecture is generally not reflected in molecular assays, rendering this rich information underused. To address these challenges, we developed a computational approach based on standard hematoxylin and eosin-stained tissue sections and demonstrated its power in a discovery and validation cohort of 323 and 241 breast tumors, respectively. To deconvolute cellular heterogeneity and detect subtle genomic aberrations, we introduced an algorithm based on tumor cellularity to increase the comparability of copy number profiles between samples. We next devised a predictor for survival in estrogen receptor-negative breast cancer that integrated both image-based and gene expression analyses and significantly outperformed classifiers that use single data types, such as microarray expression signatures. Image processing also allowed us to describe and validate an independent prognostic factor based on quantitative analysis of spatial patterns between stromal cells, which are not detectable by molecular assays. Our quantitative, image-based method could benefit any large-scale cancer study by refining and complementing molecular assays of tumor samples.
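
    The abstract describes a cellularity-based correction of copy-number profiles without giving the algorithm; the sketch below, a simplification not taken from the paper, shows the basic two-component mixture arithmetic such a correction rests on, with hypothetical input values.

        import math

        def tumor_log2_ratio(observed_log2_ratio: float, cellularity: float) -> float:
            # Deconvolute a mixed tumor/normal copy-number ratio:
            # observed linear ratio = rho * tumor_ratio + (1 - rho) * 1.0
            r_obs = 2.0 ** observed_log2_ratio
            r_tumor = (r_obs - (1.0 - cellularity)) / cellularity
            return math.log2(r_tumor)

        # Hypothetical segment: observed log2 ratio of -0.3 at 60% tumor cellularity
        print(round(tumor_log2_ratio(-0.3, 0.6), 2))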

  1. Patterns of Metabolite Changes Identified from Large-Scale Gene Perturbations in Arabidopsis Using a Genome-Scale Metabolic Network

    PubMed Central

    Kim, Taehyong; Dreher, Kate; Nilo-Poyanco, Ricardo; Lee, Insuk; Fiehn, Oliver; Lange, Bernd Markus; Nikolau, Basil J.; Sumner, Lloyd; Welti, Ruth; Wurtele, Eve S.; Rhee, Seung Y.

    2015-01-01

    Metabolomics enables quantitative evaluation of metabolic changes caused by genetic or environmental perturbations. However, little is known about how perturbing a single gene changes the metabolic system as a whole and which network and functional properties are involved in this response. To answer this question, we investigated the metabolite profiles from 136 mutants with single gene perturbations of functionally diverse Arabidopsis (Arabidopsis thaliana) genes. Fewer than 10 metabolites were changed significantly relative to the wild type in most of the mutants, indicating that the metabolic network was robust to perturbations of single metabolic genes. These changed metabolites were closer to each other in a genome-scale metabolic network than expected by chance, supporting the notion that the genetic perturbations changed the network more locally than globally. Surprisingly, the changed metabolites were close to the perturbed reactions in only 30% of the mutants of the well-characterized genes. To determine the factors that contributed to the distance between the observed metabolic changes and the perturbation site in the network, we examined nine network and functional properties of the perturbed genes. Only the isozyme number affected the distance between the perturbed reactions and changed metabolites. This study revealed patterns of metabolic changes from large-scale gene perturbations and relationships between characteristics of the perturbed genes and metabolic changes. PMID:25670818
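
    The network-distance idea described in the abstract reduces, in its simplest form, to shortest-path queries on a bipartite metabolite/reaction graph. The toy graph and node names below are hypothetical and only illustrate that arithmetic, not the paper's genome-scale analysis.

        import networkx as nx

        # Toy bipartite metabolite/reaction graph (hypothetical, for illustration only)
        G = nx.Graph()
        G.add_edges_from([("glucose", "R1"), ("R1", "G6P"), ("G6P", "R2"),
                          ("R2", "F6P"), ("F6P", "R3"), ("R3", "FBP")])

        perturbed_reaction = "R1"
        changed_metabolites = ["F6P", "FBP"]
        for m in changed_metabolites:
            # Distance (in graph edges) between the perturbation site and each changed metabolite
            print(m, nx.shortest_path_length(G, perturbed_reaction, m))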

  2. Heterogeneity and scale of sustainable development in cities.

    PubMed

    Brelsford, Christa; Lobo, José; Hand, Joe; Bettencourt, Luís M A

    2017-08-22

    Rapid worldwide urbanization is at once the main cause and, potentially, the main solution to global sustainable development challenges. The growth of cities is typically associated with increases in socioeconomic productivity, but it also creates strong inequalities. Despite a growing body of evidence characterizing these heterogeneities in developed urban areas, not much is known systematically about their most extreme forms in developing cities and their consequences for sustainability. Here, we characterize the general patterns of income and access to services in a large number of developing cities, with an emphasis on an extensive, high-resolution analysis of the urban areas of Brazil and South Africa. We use detailed census data to construct sustainable development indices in hundreds of thousands of neighborhoods and show that their statistics are scale-dependent and point to the critical role of large cities in creating higher average incomes and greater access to services within their national context. We then quantify the general statistical trajectory toward universal basic service provision at different scales to show that it is characterized by varying levels of inequality, with initial increases in access being typically accompanied by growing disparities over characteristic spatial scales. These results demonstrate how extensions of these methods to other goals and data can be used over time and space to produce a simple but general quantitative assessment of progress toward internationally agreed sustainable development goals.

  3. Heterogeneity and scale of sustainable development in cities

    PubMed Central

    Brelsford, Christa; Lobo, José; Hand, Joe

    2017-01-01

    Rapid worldwide urbanization is at once the main cause and, potentially, the main solution to global sustainable development challenges. The growth of cities is typically associated with increases in socioeconomic productivity, but it also creates strong inequalities. Despite a growing body of evidence characterizing these heterogeneities in developed urban areas, not much is known systematically about their most extreme forms in developing cities and their consequences for sustainability. Here, we characterize the general patterns of income and access to services in a large number of developing cities, with an emphasis on an extensive, high-resolution analysis of the urban areas of Brazil and South Africa. We use detailed census data to construct sustainable development indices in hundreds of thousands of neighborhoods and show that their statistics are scale-dependent and point to the critical role of large cities in creating higher average incomes and greater access to services within their national context. We then quantify the general statistical trajectory toward universal basic service provision at different scales to show that it is characterized by varying levels of inequality, with initial increases in access being typically accompanied by growing disparities over characteristic spatial scales. These results demonstrate how extensions of these methods to other goals and data can be used over time and space to produce a simple but general quantitative assessment of progress toward internationally agreed sustainable development goals. PMID:28461489

  4. Porous Silicon Antibody Microarrays for Quantitative Analysis: Measurement of Free and Total PSA in Clinical Plasma Samples

    PubMed Central

    Tojo, Axel; Malm, Johan; Marko-Varga, György; Lilja, Hans; Laurell, Thomas

    2014-01-01

    Antibody microarrays have become widespread, but their use for quantitative analyses in clinical samples has not yet been established. We investigated an immunoassay based on nanoporous silicon antibody microarrays for quantification of total prostate-specific antigen (PSA) in 80 clinical plasma samples, and provide quantitative data from a duplex microarray assay that simultaneously quantifies free and total PSA in plasma. To further develop the assay, the porous silicon chips were placed into a standard 96-well microtiter plate for higher-throughput analysis. The samples analyzed by this quantitative microarray were 80 plasma samples obtained from men undergoing clinical PSA testing (dynamic range: 0.14-44 ng/ml, LOD: 0.14 ng/ml). The second dataset, measuring free PSA (dynamic range: 0.40-74.9 ng/ml, LOD: 0.47 ng/ml) and total PSA (dynamic range: 0.87-295 ng/ml, LOD: 0.76 ng/ml), was also obtained from the clinical routine. The reference for the quantification was a commercially available assay, the ProStatus PSA Free/Total DELFIA. In the analysis of 80 plasma samples, the microarray platform performed well across the range of total PSA levels. This assay might have the potential to substitute for the large-scale microtiter plate format in diagnostic applications. The duplex assay paves the way for a future quantitative multiplex assay, which analyses several prostate cancer biomarkers simultaneously. PMID:22921878

  5. Individuals with autism have higher 8-Iso-PGF2α levels than controls, but no correlation with quantitative assay of Paraoxonase 1 serum levels.

    PubMed

    Pop, Bianca; Niculae, Alexandru-Ștefan; Pop, Tudor Lucian; Răchișan, Andreea Liana

    2017-12-01

    Autism spectrum disorder (ASD) represents a very large set of neurodevelopmental issues with diverse clinical outcomes. Various hypotheses have been put forth for the etiology of autism spectrum disorder, including issues pertaining to oxidative stress. In this study, we conducted measurements of serum 8-iso-prostaglandin F2α (8-iso-PGF2α, a product of non-enzymatically mediated oxidation of polyunsaturated fatty acids) in a population of individuals with autism and a group of age- and sex-matched controls. A quantitative assay of Paraoxonase 1 (PON1) was conducted. Data regarding comorbidities, structural MRI scans, medication, intelligence quotient (IQ) and Childhood Autism Rating Scale (CARS) scores were also included in our study. Our results show that patients diagnosed with autism have higher levels of 8-iso-PGF2α than their neurotypical counterparts. Levels of this particular metabolite, however, do not correlate with quantitative serum levels of Paraoxonase 1, which has been shown to be altered in individuals with autism. Neither 8-iso-PGF2α nor quantitative levels of PON1 provide any meaningful correlation with clinical or neuroimaging data in this study group. Future research should focus on providing data regarding PON1 phenotype, in addition to standard quantitative measurements, in relation to 8-iso-PGF2α as well as other clinical and structural brain findings.

  6. Quantitative Evaluation of Musical Scale Tunings

    ERIC Educational Resources Information Center

    Hall, Donald E.

    1974-01-01

    The acoustical and mathematical basis of the problem of tuning the twelve-tone chromatic scale is reviewed. A quantitative measurement showing how well any tuning succeeds in providing just intonation for any specific piece of music is explained and applied to musical examples using a simple computer program. (DT)
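
    The article's specific figure of merit is not reproduced here; the sketch below only shows the cents-deviation arithmetic that any such tuning measure builds on, comparing 12-tone equal temperament intervals with just-intonation frequency ratios.

        import math

        def cents(ratio: float) -> float:
            # Interval size in cents: 1200 * log2(frequency ratio)
            return 1200.0 * math.log2(ratio)

        just_intervals = {"major third": 5/4, "perfect fourth": 4/3,
                          "perfect fifth": 3/2, "major sixth": 5/3}
        equal_semitones = {"major third": 4, "perfect fourth": 5,
                           "perfect fifth": 7, "major sixth": 9}

        for name, ratio in just_intervals.items():
            deviation = equal_semitones[name] * 100.0 - cents(ratio)
            print(f"{name}: {deviation:+.1f} cents vs just intonation")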

  7. Quantitative challenges to our understanding of the tectonostratigraphic evolution of rift basin systems

    NASA Astrophysics Data System (ADS)

    Olsen, P. E.; Kent, D. V.

    2012-12-01

    Pervasive orbitally-paced lake level cycles combined with magnetic polarity stratigraphy in central Pangean early Mesozoic rift basins provide a thus far unique and very large-scale quantitative basis for observing patterns of basin fill and comparisons with other basins. The 32 Myr accumulation rate history of the Newark basin is segmented into intervals lasting millions of years with virtually no change in the long-term accumulation rate (at the 400-kyr scale), and the transitions between segments are abrupt and apparently basin-wide. This is startling, because the basin geometry was, and is, a half graben - triangular in cross section and dish-shaped in along-strike section. The long periods of time with virtually no change are challenging given a simple model of basin growth (1), suggesting some kind of compensation in sediment input for the increasing surface area of the basin through time. Perhaps even more challenging are observations, based on magnetic polarity stratigraphy and the cyclicity, that basins distributed over a huge area of central Pangea (~700,000 km2) show parallel and correlative quantitative changes in accumulation rate with those of the Newark basin. The synchronous changes in accumulation rate in these basins suggest a very large-scale linkage, the only plausible mechanism for which would seem to be at the plate-tectonic scale, perhaps involving extension rates. Together, we can speculate that some kind of balance between extension rates, basin accommodation space and perhaps regional drainage basin size might have been in operation. The most dramatic accumulation rate change in the basins' histories occurred close to, and perhaps causally related to, the Triassic-Jurassic boundary and end-Triassic extinction. The Newark basin, for example, exhibits a 4-to-5-fold increase in accumulation rate during the emplacement of the brief (<1 Myr) and areally extensive Central Atlantic Magmatic Province (CAMP) beginning at 201.5 Ma, the only igneous event known during this long rifting episode. Parallel and correlative accumulation rate changes are seen in several of the other northern basins within central Pangea. Surprisingly, the rate of accommodation growth apparently increased dramatically during this time, because not only did the accumulation rate dramatically increase, the lakes apparently deepened during the same time as a huge volume of CAMP igneous material entered the basins. At the same time, the more southern basins in the southeastern US apparently ceased to subside (2). Our ability to measure time in these rift basins using the orbitally-paced cycles, coupled with the ability to correlate between the basins using magnetic polarity stratigraphy, challenges us to form new mechanistic explanations and quantitative models to test against this rich library of observations. References: 1) Schlische RW & Olsen PE, 1990, Jour. Geol. 98:135. 2) Schlische et al., 2003, in Hames WE et al. (eds), Geophys. Monogr. 136:61.

  8. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using (1) qualitative assessment (visual scoring) and (2) quantitative measurement (image analysis) of digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy-dispersive X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using the Friedman test followed by the Wilcoxon signed-ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded results similar to those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating results similar to those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield results similar to both the SEM analysis and elemental mapping.

  9. Knowledge-driven computational modeling in Alzheimer's disease research: Current state and future trends.

    PubMed

    Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J

    2017-11-01

    Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD.

  10. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.

  11. A New, Continuous 5400 Yr-long Paleotsunami Record from Lake Huelde, Chiloe Island, South Central Chile.

    NASA Astrophysics Data System (ADS)

    Kempf, P.; Moernaut, J.; Vandoorne, W.; Van Daele, M. E.; Pino, M.; Urrutia, R.; De Batist, M. A. O.

    2014-12-01

    After the last decade of extreme tsunami events with catastrophic damage to infrastructure and a horrendous number of casualties, it is clear that more and better paleotsunami records are needed to improve our understanding of the recurrence intervals and intensities of large-scale tsunamis. Coastal lakes (e.g. Bradley Lake, Cascadia; Kelsey et al., 2005) have the potential to contain long and continuous sedimentary records, which is an important asset in view of the centennial- to millennial-scale recurrence times of great tsunami-triggering earthquakes. Lake Huelde on Chiloé Island (42.5°S), Chile, is a coastal lake located in the middle of the Valdivia segment, which is known for having produced the strongest ever instrumentally recorded earthquake in 1960 AD (MW: 9.5), and other large earthquakes prior to that: i.e. 1837 AD, 1737 AD (no report of a tsunami) and 1575 AD (Lomnitz, 1970, 2004, Cisternas et al., 2005). We present a new 5400 yr-long paleotsunami record with a Bayesian age-depth model based on 23 radiocarbon dates, which exceeds all previous paleotsunami records from the Valdivia segment in terms of both length and continuity. Eighteen events are described and a semi-quantitative measure of the event intensity at the study area is given, revealing at least two predecessors of the 1960 AD event in the mid to late Holocene that are equal in intensity. The resulting implications from the age-depth model and from the semi-quantitative intensity reconstruction are discussed in this contribution.

  12. Enhanced conformational sampling technique provides an energy landscape view of large-scale protein conformational transitions.

    PubMed

    Shao, Qiang

    2016-10-26

    Large-scale conformational changes in proteins are important for their functions. Tracking the conformational change in real time at the level of a single protein molecule, however, remains a great challenge. In this article, we present a novel in silico approach with the combination of normal mode analysis and integrated-tempering-sampling molecular simulation (NMA-ITS) to give quantitative data for exploring the conformational transition pathway in multi-dimensional energy landscapes starting only from the knowledge of the two endpoint structures of the protein. The open-to-closed transitions of three proteins, including nCaM, AdK, and HIV-1 PR, were investigated using NMA-ITS simulations. The three proteins have varied structural flexibilities and domain communications in their respective conformational changes. The transition state structure in the conformational change of nCaM and the associated free-energy barrier are in agreement with those measured in a standard explicit-solvent REMD simulation. The experimentally measured transition intermediate structures of the intrinsically flexible AdK are captured by the conformational transition pathway measured here. The dominant transition pathways between the closed and fully open states of HIV-1 PR are very similar to those observed in recent REMD simulations. Finally, the evaluated relaxation times of the conformational transitions of three proteins are roughly at the same level as reported experimental data. Therefore, the NMA-ITS method is applicable for a variety of cases, providing both qualitative and quantitative insights into the conformational changes associated with the real functions of proteins.

  13. SOME USES OF MODELS OF QUANTITATIVE GENETIC SELECTION IN SOCIAL SCIENCE.

    PubMed

    Weight, Michael D; Harpending, Henry

    2017-01-01

    The theory of selection of quantitative traits is widely used in evolutionary biology, agriculture and other related fields. The fundamental model known as the breeder's equation is simple, robust over short time scales, and it is often possible to estimate plausible parameters. In this paper it is suggested that the results of this model provide useful yardsticks for the description of social traits and the evaluation of transmission models. The differences on a standard personality test between samples of Old Order Amish and Indiana rural young men from the same county and the decline of homicide in Medieval Europe are used as illustrative examples of the overall approach. It is shown that the decline of homicide is unremarkable under a threshold model while the differences between rural Amish and non-Amish young men are too large to be a plausible outcome of simple genetic selection in which assortative mating by affiliation is equivalent to truncation selection.
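
    The breeder's equation referred to in the abstract is R = h²S: the per-generation response R equals the narrow-sense heritability h² times the selection differential S. A one-line illustration with hypothetical numbers:

        def response_to_selection(heritability: float, selection_differential: float) -> float:
            # Breeder's equation: R = h^2 * S, in the same units as the trait (here, SD units)
            return heritability * selection_differential

        # Hypothetical: h^2 = 0.4, parents average 1.0 SD above the population mean
        print(response_to_selection(0.4, 1.0))   # expected offspring mean shift of 0.4 SD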

  14. The memory remains: Understanding collective memory in the digital age

    PubMed Central

    García-Gavilanes, Ruth; Mollgaard, Anders; Tsvetkova, Milena; Yasseri, Taha

    2017-01-01

    Recently developed information communication technologies, particularly the Internet, have affected how we, both as individuals and as a society, create, store, and recall information. The Internet also provides us with a great opportunity to study memory using transactional large-scale data in a quantitative framework similar to the practice in natural sciences. We make use of online data by analyzing viewership statistics of Wikipedia articles on aircraft crashes. We study the relation between recent events and past events and particularly focus on understanding memory-triggering patterns. We devise a quantitative model that explains the flow of viewership from a current event to past events based on similarity in time, geography, topic, and the hyperlink structure of Wikipedia articles. We show that, on average, the secondary flow of attention to past events generated by these remembering processes is larger than the primary attention flow to the current event. We report these previously unknown cascading effects. PMID:28435881

  15. Quantitative analysis of the evolution of novelty in cinema through crowdsourced keywords.

    PubMed

    Sreenivasan, Sameet

    2013-09-26

    The generation of novelty is central to any creative endeavor. Novelty generation and the relationship between novelty and individual hedonic value have long been subjects of study in social psychology. However, few studies have utilized large-scale datasets to quantitatively investigate these issues. Here we consider the domain of American cinema and explore these questions using a database of films spanning a 70 year period. We use crowdsourced keywords from the Internet Movie Database as a window into the contents of films, and prescribe novelty scores for each film based on occurrence probabilities of individual keywords and keyword-pairs. These scores provide revealing insights into the dynamics of novelty in cinema. We investigate how novelty influences the revenue generated by a film, and find a relationship that resembles the Wundt-Berlyne curve. We also study the statistics of keyword occurrence and the aggregate distribution of keywords over a 100 year period.
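
    The paper's exact scoring is not reproduced here; the toy version below only illustrates the general idea of scoring a film's novelty by the negative log occurrence probabilities of its keywords and keyword pairs, with add-one smoothing for unseen combinations. The film keyword sets are hypothetical.

        import math
        from collections import Counter
        from itertools import combinations

        films = [{"war", "romance"}, {"war", "robot"}, {"robot", "space"}, {"romance", "space"}]
        kw_counts = Counter(k for f in films for k in f)
        pair_counts = Counter(frozenset(p) for f in films for p in combinations(sorted(f), 2))
        n = len(films)

        def novelty(keywords):
            # Higher score = rarer keywords and keyword pairs in the historical corpus
            kw_term = sum(-math.log((kw_counts[k] + 1) / (n + 1)) for k in keywords)
            pair_term = sum(-math.log((pair_counts[frozenset(p)] + 1) / (n + 1))
                            for p in combinations(sorted(keywords), 2))
            return kw_term + pair_term

        print(round(novelty({"war", "space"}), 2))   # hypothetical new film's keyword set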

  16. Automatic analysis of quantitative NMR data of pharmaceutical compound libraries.

    PubMed

    Liu, Xuejun; Kolpak, Michael X; Wu, Jiejun; Leo, Gregory C

    2012-08-07

    In drug discovery, chemical library compounds are usually dissolved in DMSO at a certain concentration and then distributed to biologists for target screening. Quantitative ¹H NMR (qNMR) is the preferred method for the determination of the actual concentrations of compounds because the relative single-proton peak areas of two chemical species represent the relative molar concentrations of the two compounds, that is, the compound of interest and a calibrant. Thus, an analyte concentration can be determined using a calibration compound at a known concentration. One particularly time-consuming step in the qNMR analysis of compound libraries is the manual integration of peaks. In this report, an automated method is presented for performing this task without prior knowledge of compound structures and by using an external calibration spectrum. The script for automated integration is fast and adaptable to large-scale data sets, eliminating the need for manual integration in ~80% of the cases.
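
    The per-proton integral ratio described in the abstract reduces to a simple proportion; the calibrant, integral values and concentrations below are hypothetical and only illustrate that arithmetic, not the authors' automated integration script.

        def qnmr_concentration(area_analyte, n_protons_analyte,
                               area_calibrant, n_protons_calibrant,
                               conc_calibrant_mM):
            # Molar concentration from per-proton integrals against a calibrant of known concentration
            per_proton_analyte = area_analyte / n_protons_analyte
            per_proton_calibrant = area_calibrant / n_protons_calibrant
            return conc_calibrant_mM * per_proton_analyte / per_proton_calibrant

        # Hypothetical integrals: analyte singlet (3H) vs a 10 mM calibrant signal (2H)
        print(qnmr_concentration(4.2e6, 3, 6.0e6, 2, 10.0))   # ~4.7 mM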

  17. Parametric studies with an atmospheric diffusion model that assesses toxic fuel hazards due to the ground clouds generated by rocket launches

    NASA Technical Reports Server (NTRS)

    Stewart, R. B.; Grose, W. L.

    1975-01-01

    Parametric studies were made with a multilayer atmospheric diffusion model to place quantitative limits on the uncertainty of predicting ground-level toxic rocket-fuel concentrations. Exhaust distributions in the ground cloud, stabilized cloud geometry, atmospheric coefficients, the effects of exhaust plume afterburning of carbon monoxide (CO), the assumed surface mixing-layer division in the model, and model sensitivity to different meteorological regimes were studied. Large-scale differences in ground-level predictions are quantitatively described. Cloud alongwind growth for several meteorological conditions is shown to be in error because of incorrect application of previous diffusion theory. In addition, rocket-plume calculations indicate that almost all of the rocket-motor carbon monoxide is afterburned to carbon dioxide (CO2), thus reducing toxic hazards due to CO. The afterburning is also shown to have a significant effect on cloud stabilization height and on ground-level concentrations of exhaust products.

  18. The memory remains: Understanding collective memory in the digital age.

    PubMed

    García-Gavilanes, Ruth; Mollgaard, Anders; Tsvetkova, Milena; Yasseri, Taha

    2017-04-01

    Recently developed information communication technologies, particularly the Internet, have affected how we, both as individuals and as a society, create, store, and recall information. The Internet also provides us with a great opportunity to study memory using transactional large-scale data in a quantitative framework similar to the practice in natural sciences. We make use of online data by analyzing viewership statistics of Wikipedia articles on aircraft crashes. We study the relation between recent events and past events and particularly focus on understanding memory-triggering patterns. We devise a quantitative model that explains the flow of viewership from a current event to past events based on similarity in time, geography, topic, and the hyperlink structure of Wikipedia articles. We show that, on average, the secondary flow of attention to past events generated by these remembering processes is larger than the primary attention flow to the current event. We report these previously unknown cascading effects.

  19. Evidence for ice-ocean albedo feedback in the Arctic Ocean shifting to a seasonal ice zone.

    PubMed

    Kashiwase, Haruhiko; Ohshima, Kay I; Nihashi, Sohey; Eicken, Hajo

    2017-08-15

    Ice-albedo feedback due to the albedo contrast between water and ice is a major factor in seasonal sea ice retreat, and has received increasing attention with the Arctic Ocean shifting to a seasonal ice cover. However, quantitative evaluation of such feedbacks is still insufficient. Here we provide quantitative evidence that heat input through the open water fraction is the primary driver of seasonal and interannual variations in Arctic sea ice retreat. Analyses of satellite data (1979-2014) and a simplified ice-upper ocean coupled model reveal that divergent ice motion in the early melt season triggers large-scale feedback which subsequently amplifies summer sea ice anomalies. The magnitude of divergence controlling the feedback has doubled since 2000 due to a more mobile ice cover, which can partly explain the recent drastic ice reduction in the Arctic Ocean.

  20. Quantitative analysis of the evolution of novelty in cinema through crowdsourced keywords

    PubMed Central

    Sreenivasan, Sameet

    2013-01-01

    The generation of novelty is central to any creative endeavor. Novelty generation and the relationship between novelty and individual hedonic value have long been subjects of study in social psychology. However, few studies have utilized large-scale datasets to quantitatively investigate these issues. Here we consider the domain of American cinema and explore these questions using a database of films spanning a 70-year period. We use crowdsourced keywords from the Internet Movie Database as a window into the contents of films, and assign novelty scores to each film based on occurrence probabilities of individual keywords and keyword pairs. These scores provide revealing insights into the dynamics of novelty in cinema. We investigate how novelty influences the revenue generated by a film, and find a relationship that resembles the Wundt-Berlyne curve. We also study the statistics of keyword occurrence and the aggregate distribution of keywords over a 100-year period. PMID:24067890
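
    One plausible way to operationalize keyword-based novelty, in the spirit of the abstract (the paper's exact scoring may differ), is to score a film by how rare its keywords and keyword pairs were among previously released films. The sketch below uses a negative-log empirical probability with add-one smoothing; all names and the toy data are hypothetical.

```python
# Hypothetical sketch of a keyword-rarity novelty score in the spirit of the
# abstract above; the paper's exact scoring may differ. A film is scored by
# how rare its keywords and keyword pairs were among previously released films.
from collections import Counter
from itertools import combinations
from math import log

def novelty_score(film_keywords, prior_films):
    """Mean negative log empirical probability of keywords and keyword pairs."""
    kw_counts, pair_counts = Counter(), Counter()
    for kws in prior_films:
        kws = set(kws)
        kw_counts.update(kws)
        pair_counts.update(combinations(sorted(kws), 2))
    n = len(prior_films)
    scores = []
    for kw in set(film_keywords):
        scores.append(-log((kw_counts[kw] + 1) / (n + 1)))        # add-one smoothing
    for pair in combinations(sorted(set(film_keywords)), 2):
        scores.append(-log((pair_counts[pair] + 1) / (n + 1)))
    return sum(scores) / len(scores) if scores else 0.0

prior = [["heist", "betrayal"], ["romance", "war"], ["heist", "romance"]]
print(novelty_score(["heist", "time-travel"], prior))   # rarer keywords -> higher score
```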

  1. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.

  2. Multi-approaches analysis reveals local adaptation in the emmer wheat (Triticum dicoccoides) at macro- but not micro-geographical scale.

    PubMed

    Volis, Sergei; Ormanbekova, Danara; Yermekbayev, Kanat; Song, Minshu; Shulgina, Irina

    2015-01-01

    Detecting local adaptation and its spatial scale is one of the most important questions of evolutionary biology. However, recognizing the effect of local selection can be challenging when there is considerable environmental variation across distances spanning the whole species range. We analyzed patterns of local adaptation in emmer wheat, Triticum dicoccoides, at two spatial scales, small (inter-population distance less than one km) and large (inter-population distance more than 50 km), using several approaches. Plants originating from four distinct habitats at two geographic scales (cold edge, arid edge, and two topographically dissimilar core locations) were reciprocally transplanted and their success over time was measured as 1) lifetime fitness in the year of planting, and 2) population growth four years after planting. In addition, we analyzed molecular (SSR) and quantitative trait variation and calculated the QST/FST ratio. No home advantage was detected at the small spatial scale. At the large spatial scale, home advantage was detected for the core population and the cold edge population in the year of introduction via measurement of lifetime plant performance. However, superior performance of the arid edge population in its own environment became evident only after several generations, via measurement of experimental population growth rate; SSR genotyping allowed the number of plants and seeds per introduced genotype per site to be counted. These results highlight the importance of multi-generation surveys of population growth rate in tests of local adaptation. Despite the predominant self-fertilization of T. dicoccoides and the associated high degree of structuring of genetic variation, the results of the QST-FST comparison were in general agreement with the pattern of local adaptation at the two spatial scales detected by reciprocal transplanting.
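
    The QST/FST comparison mentioned above rests on a standard formula. Shown below is the textbook definition of QST for an outcrossing diploid (for a highly selfing species such as T. dicoccoides the within-population term is often weighted differently); the variance components in the example are hypothetical.

```python
# Textbook form of the Q_ST statistic compared against F_ST in the abstract
# above. For a predominantly selfing species the within-population variance
# term is often down-weighted; this sketch shows only the standard
# outcrossing-diploid definition, with hypothetical variance components.

def qst(var_between: float, var_within_additive: float) -> float:
    """Q_ST = sigma^2_B / (sigma^2_B + 2 * sigma^2_W)."""
    return var_between / (var_between + 2.0 * var_within_additive)

# Hypothetical variance components for a quantitative trait:
print(qst(var_between=0.8, var_within_additive=0.5))  # ~0.44
# A Q_ST substantially above the neutral F_ST estimated from SSRs would point
# to divergent selection (local adaptation) on the trait.
```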

  3. A Method for Semi-quantitative Assessment of Exposure to Pesticides of Applicators and Re-entry Workers: An Application in Three Farming Systems in Ethiopia.

    PubMed

    Negatu, Beyene; Vermeulen, Roel; Mekonnen, Yalemtshay; Kromhout, Hans

    2016-07-01

    The aim of this study was to develop an inexpensive and easily adaptable semi-quantitative exposure assessment method to characterize exposure to pesticides in applicators and re-entry farmers and farm workers in Ethiopia. Two specific semi-quantitative exposure algorithms for pesticide applicators and re-entry workers were developed and applied to 601 farm workers employed in 3 distinctly different farming systems [small-scale irrigated, large-scale greenhouses (LSGH), and large-scale open (LSO)] in Ethiopia. The algorithm for applicators was based on exposure-modifying factors including application methods, farm layout (open or closed), pesticide mixing conditions, cleaning of spraying equipment, intensity of pesticide application per day, utilization of personal protective equipment (PPE), personal hygienic behavior, annual frequency of application, and duration of employment at the farm. The algorithm for re-entry work was based on an expert-based re-entry exposure intensity score, utilization of PPE, personal hygienic behavior, annual frequency of re-entry work, and duration of employment at the farm. The algorithms allowed estimation of daily, annual, and cumulative lifetime exposure for applicators and re-entry workers by farming system, gender, and age group. For all metrics, the highest exposures occurred in LSGH for both applicators and female re-entry workers. For male re-entry workers, the highest cumulative exposure occurred in LSO farms. Female re-entry workers appeared to have higher daily and annual exposures than male re-entry workers, but their cumulative exposures were similar because males on average had longer tenure. Factors related to intensity of exposure (such as application method and farm layout) were the main drivers of estimated potential exposure. Use of personal protection, hygienic behavior, and duration of employment contributed less to the contrast in exposure estimates. This study indicated that farmers' and farm workers' exposure to pesticides can be inexpensively characterized, ranked, and classified. Our method could be extended to assess exposure to specific active ingredients provided that detailed information on the pesticides used is available. The resulting exposure estimates will consequently be used in occupational epidemiology studies in Ethiopia and other similar countries with few resources. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
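
    A semi-quantitative exposure algorithm of this kind typically combines multiplicative modifying factors into a daily score and then scales by frequency and tenure. The sketch below only illustrates that structure; the factor values and weights are hypothetical placeholders, not those used in the study.

```python
# Illustrative sketch of a multiplicative semi-quantitative exposure
# algorithm of the kind described above. The factor values below are
# hypothetical placeholders, not the values used in the study.

def daily_exposure_score(intensity: float, ppe_factor: float,
                         hygiene_factor: float) -> float:
    """Daily score = task intensity attenuated by PPE use and hygiene."""
    return intensity * ppe_factor * hygiene_factor

def cumulative_exposure(daily: float, days_per_year: int,
                        years_employed: float) -> float:
    """Lifetime cumulative score = daily score x annual frequency x tenure."""
    return daily * days_per_year * years_employed

daily = daily_exposure_score(intensity=4.0,      # e.g. knapsack spraying in a greenhouse
                             ppe_factor=0.8,     # partial PPE use
                             hygiene_factor=0.9) # washes hands, no shower
print(cumulative_exposure(daily, days_per_year=60, years_employed=5))
```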

  4. Large-scale human skin lipidomics by quantitative, high-throughput shotgun mass spectrometry.

    PubMed

    Sadowski, Tomasz; Klose, Christian; Gerl, Mathias J; Wójcik-Maciejewicz, Anna; Herzog, Ronny; Simons, Kai; Reich, Adam; Surma, Michal A

    2017-03-07

    The lipid composition of human skin is essential for its function; however the simultaneous quantification of a wide range of stratum corneum (SC) and sebaceous lipids is not trivial. We developed and validated a quantitative high-throughput shotgun mass spectrometry-based platform for lipid analysis of tape-stripped SC skin samples. It features coverage of 16 lipid classes; total quantification to the level of individual lipid molecules; high reproducibility and high-throughput capabilities. With this method we conducted a large lipidomic survey of 268 human SC samples, where we investigated the relationship between sampling depth and lipid composition, lipidome variability in samples from 14 different sampling sites on the human body and finally, we assessed the impact of age and sex on lipidome variability in 104 healthy subjects. We found sebaceous lipids to constitute an abundant component of the SC lipidome as they diffuse into the topmost SC layers forming a gradient. Lipidomic variability with respect to sampling depth, site and subject is considerable, and mainly accredited to sebaceous lipids, while stratum corneum lipids vary less. This stresses the importance of sampling design and the role of sebaceous lipids in skin studies.

  5. Voltage Imaging of Waking Mouse Cortex Reveals Emergence of Critical Neuronal Dynamics

    PubMed Central

    Scott, Gregory; Fagerholm, Erik D.; Mutoh, Hiroki; Leech, Robert; Sharp, David J.; Shew, Woodrow L.

    2014-01-01

    Complex cognitive processes require neuronal activity to be coordinated across multiple scales, ranging from local microcircuits to cortex-wide networks. However, multiscale cortical dynamics are not well understood because few experimental approaches have provided sufficient support for hypotheses involving multiscale interactions. To address these limitations, we used, in experiments involving mice, genetically encoded voltage indicator imaging, which measures cortex-wide electrical activity at high spatiotemporal resolution. Here we show that, as mice recovered from anesthesia, scale-invariant spatiotemporal patterns of neuronal activity gradually emerge. We show for the first time that this scale-invariant activity spans four orders of magnitude in awake mice. In contrast, we found that the cortical dynamics of anesthetized mice were not scale invariant. Our results bridge empirical evidence from disparate scales and support theoretical predictions that the awake cortex operates in a dynamical regime known as criticality. The criticality hypothesis predicts that small-scale cortical dynamics are governed by the same principles as those governing larger-scale dynamics. Importantly, these scale-invariant principles also optimize certain aspects of information processing. Our results suggest that during the emergence from anesthesia, criticality arises as information processing demands increase. We expect that, as measurement tools advance toward larger scales and greater resolution, the multiscale framework offered by criticality will continue to provide quantitative predictions and insight on how neurons, microcircuits, and large-scale networks are dynamically coordinated in the brain. PMID:25505314

  6. New well pattern optimization methodology in mature low-permeability anisotropic reservoirs

    NASA Astrophysics Data System (ADS)

    Qin, Jiazheng; Liu, Yuetian; Feng, Yueli; Ding, Yao; Liu, Liu; He, Youwei

    2018-02-01

    In China, many well patterns were designed before the principal permeability direction of low-permeability anisotropic reservoirs was known. After several years of production, it has become clear that the well line direction is often not parallel to the principal permeability direction. However, traditional well location optimization methods (in terms of objective functions such as net present value and/or ultimate recovery) are inapplicable, since wells are not free to move around in a mature oilfield. Thus, well pattern optimization (WPO) of mature low-permeability anisotropic reservoirs is a significant but challenging task, since the original well pattern (WP) is distorted and reconstructed by permeability anisotropy. In this paper, we investigate the destruction and reconstruction of the WP when the principal permeability direction and the well line direction are not parallel. A new methodology was developed to quantitatively optimize the well locations of a mature large-scale WP through a WPO algorithm based on coordinate transformation (i.e., rotating and stretching). For a mature oilfield, the large-scale WP is already in place, so further infill drilling is not economically viable. This paper circumvents this difficulty by combining the WPO algorithm with adjustment of well status (open or shut-in) and schedule. Finally, the methodology is applied to an example. Cumulative oil production of the optimized WP is higher, and water cut is lower, which highlights the potential of the WPO methodology in mature large-scale field development projects.
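
    The coordinate transformation underlying such anisotropy corrections is standard: rotate into the principal permeability axes and stretch the low-permeability direction by the square root of the permeability ratio, which maps the reservoir onto an equivalent isotropic plane. The sketch below illustrates that transformation only, under one common axis convention; it does not reproduce the paper's full WPO algorithm, and all names and numbers are hypothetical.

```python
# Minimal sketch of the rotation-plus-stretching coordinate transformation
# used for anisotropic reservoirs: rotate well coordinates into the principal
# permeability axes, then stretch the low-permeability direction by
# sqrt(k_max / k_min) so the medium becomes equivalently isotropic.
import numpy as np

def to_equivalent_isotropic(xy: np.ndarray, theta_deg: float,
                            k_max: float, k_min: float) -> np.ndarray:
    """Map well coordinates (N x 2) into the equivalent isotropic plane."""
    theta = np.radians(theta_deg)
    rot = np.array([[np.cos(theta), np.sin(theta)],
                    [-np.sin(theta), np.cos(theta)]])   # rotate to principal axes
    stretch = np.diag([1.0, np.sqrt(k_max / k_min)])    # stretch the low-k direction
    return xy @ rot.T @ stretch.T

# Hypothetical example: three wells, principal direction at 30 degrees,
# permeability contrast of 10.
wells = np.array([[0.0, 0.0], [300.0, 0.0], [0.0, 300.0]])
print(to_equivalent_isotropic(wells, theta_deg=30.0, k_max=50.0, k_min=5.0))
```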

  7. PENDISC: a simple method for constructing a mathematical model from time-series data of metabolite concentrations.

    PubMed

    Sriyudthsak, Kansuporn; Iwata, Michio; Hirai, Masami Yokota; Shiraishi, Fumihide

    2014-06-01

    The availability of large-scale datasets has led to more effort being made to understand characteristics of metabolic reaction networks. However, because the large-scale data are semi-quantitative and may contain biological variations and/or analytical errors, it remains a challenge to construct a mathematical model with precise parameters using only these data. The present work proposes a simple method, referred to as PENDISC (Parameter Estimation in a Non-DImensionalized S-system with Constraints), to assist the complex process of parameter estimation in the construction of a mathematical model for a given metabolic reaction system. The PENDISC method was evaluated using two simple mathematical models: a linear metabolic pathway model with inhibition and a branched metabolic pathway model with inhibition and activation. The results indicate that a smaller number of data points and rate constant parameters enhances the agreement between calculated values and time-series data of metabolite concentrations, and leads to faster convergence when the same initial estimates are used for the fitting. This method is also shown to be applicable to noisy time-series data and to unmeasurable metabolite concentrations in a network, and to have the potential to handle metabolome data from a relatively large-scale metabolic reaction system. Furthermore, it was applied to aspartate-derived amino acid biosynthesis in the plant Arabidopsis thaliana. The result confirms that the constructed mathematical model agrees satisfactorily with the time-series datasets of seven metabolite concentrations.

  8. A case report of evaluating a large-scale health systems improvement project in an uncontrolled setting: a quality improvement initiative in KwaZulu-Natal, South Africa.

    PubMed

    Mate, Kedar S; Ngidi, Wilbroda Hlolisile; Reddy, Jennifer; Mphatswe, Wendy; Rollins, Nigel; Barker, Pierre

    2013-11-01

    New approaches are needed to evaluate quality improvement (QI) within large-scale public health efforts. This case report details challenges to large-scale QI evaluation, and proposes solutions relying on adaptive study design. We used two sequential evaluative methods to study a QI effort to improve delivery of HIV preventive care in public health facilities in three districts in KwaZulu-Natal, South Africa, over a 3-year period. We initially used a cluster randomised controlled trial (RCT) design. During the RCT study period, tensions arose between intervention implementation and evaluation design due to loss of integrity of the randomisation unit over time, pressure to implement changes across the randomisation unit boundaries, and use of administrative rather than functional structures for the randomisation. In response to this loss of design integrity, we switched to a more flexible intervention design and a mixed-methods quasiexperimental evaluation relying on both a qualitative analysis and an interrupted time series quantitative analysis. Cluster RCT designs may not be optimal for evaluating complex interventions to improve implementation in uncontrolled 'real world' settings. More flexible, context-sensitive evaluation designs offer a better balance of the need to adjust the intervention during the evaluation to meet implementation challenges while providing the data required to evaluate effectiveness. Our case study involved HIV care in a resource-limited setting, but these issues likely apply to complex improvement interventions in other settings.

  9. Ab initio calculations for industrial materials engineering: successes and challenges.

    PubMed

    Wimmer, Erich; Najafabadi, Reza; Young, George A; Ballard, Jake D; Angeliu, Thomas M; Vollmer, James; Chambers, James J; Niimi, Hiroaki; Shaw, Judy B; Freeman, Clive; Christensen, Mikael; Wolf, Walter; Saxe, Paul

    2010-09-29

    Computational materials science based on ab initio calculations has become an important partner to experiment. This is demonstrated here for the effect of impurities and alloying elements on the strength of a Zr twist grain boundary, the dissociative adsorption and diffusion of iodine on a zirconium surface, the diffusion of oxygen atoms in a Ni twist grain boundary and in bulk Ni, and the dependence of the work function of a TiN-HfO(2) junction on the replacement of N by O atoms. In all of these cases, computations provide atomic-scale understanding as well as quantitative materials property data of value to industrial research and development. There are two key challenges in applying ab initio calculations, namely a higher accuracy in the electronic energy and the efficient exploration of large parts of the configurational space. While progress in these areas is fueled by advances in computer hardware, innovative theoretical concepts combined with systematic large-scale computations will be needed to realize the full potential of ab initio calculations for industrial applications.

  10. Regression-Based Identification of Behavior-Encoding Neurons During Large-Scale Optical Imaging of Neural Activity at Cellular Resolution

    PubMed Central

    Miri, Andrew; Daie, Kayvon; Burdine, Rebecca D.; Aksay, Emre

    2011-01-01

    The advent of methods for optical imaging of large-scale neural activity at cellular resolution in behaving animals presents the problem of identifying behavior-encoding cells within the resulting image time series. Rapid and precise identification of cells with particular neural encoding would facilitate targeted activity measurements and perturbations useful in characterizing the operating principles of neural circuits. Here we report a regression-based approach to semiautomatically identify neurons, based on the correlation of fluorescence time series with quantitative measurements of behavior. The approach is illustrated with a novel preparation allowing synchronous eye tracking and two-photon laser scanning fluorescence imaging of calcium changes in populations of hindbrain neurons during spontaneous eye movement in the larval zebrafish. Putative velocity-to-position oculomotor integrator neurons were identified that showed a broad spatial distribution and diversity of encoding. Optical identification of integrator neurons was confirmed with targeted loose-patch electrical recording and laser ablation. The general regression-based approach we demonstrate should be widely applicable to calcium imaging time series in behaving animals. PMID:21084686
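
    A minimal version of the regression/correlation step can be sketched as follows: correlate each cell's fluorescence trace with a behavioral regressor (here, eye position) and keep cells whose correlation exceeds a threshold. Real pipelines also convolve the regressor with the calcium indicator's response kernel; the names, threshold, and toy data below are hypothetical.

```python
# Minimal sketch of regression-based identification of behavior-encoding
# cells: correlate each cell's fluorescence time series with a behavioral
# regressor and flag cells above a correlation threshold. Names, threshold,
# and toy data are hypothetical.
import numpy as np

def behavior_encoding_cells(fluorescence: np.ndarray, behavior: np.ndarray,
                            threshold: float = 0.5) -> np.ndarray:
    """Return indices of cells (rows) whose traces correlate with behavior."""
    f = fluorescence - fluorescence.mean(axis=1, keepdims=True)
    b = behavior - behavior.mean()
    r = (f @ b) / (np.linalg.norm(f, axis=1) * np.linalg.norm(b))
    return np.flatnonzero(np.abs(r) >= threshold)

rng = np.random.default_rng(0)
eye_position = np.sin(np.linspace(0, 6 * np.pi, 500))
traces = rng.normal(size=(100, 500))
traces[7] += 2.0 * eye_position        # plant one encoding cell
print(behavior_encoding_cells(traces, eye_position))  # -> [7]
```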

  11. Moderate point: Balanced entropy and enthalpy contributions in soft matter

    NASA Astrophysics Data System (ADS)

    He, Baoji; Wang, Yanting

    2017-03-01

    Various soft materials share some common features, such as significant entropic effect, large fluctuations, sensitivity to thermodynamic conditions, and mesoscopic characteristic spatial and temporal scales. However, no quantitative definitions have yet been provided for soft matter, and the intrinsic mechanisms leading to their common features are unclear. In this work, from the viewpoint of statistical mechanics, we show that soft matter works in the vicinity of a specific thermodynamic state named moderate point, at which entropy and enthalpy contributions among substates along a certain order parameter are well balanced or have a minimal difference. Around the moderate point, the order parameter fluctuation, the associated response function, and the spatial correlation length maximize, which explains the large fluctuation, the sensitivity to thermodynamic conditions, and mesoscopic spatial and temporal scales of soft matter, respectively. Possible applications to switching chemical bonds or allosteric biomachines determining their best working temperatures are also briefly discussed. Project supported by the National Basic Research Program of China (Grant No. 2013CB932804) and the National Natural Science Foundation of China (Grant Nos. 11274319 and 11421063).

  12. Large-scale Chromosomal Movements During Interphase Progression in Drosophila

    PubMed Central

    Csink, Amy K.; Henikoff, Steven

    1998-01-01

    We examined the effect of cell cycle progression on various levels of chromosome organization in Drosophila. Using bromodeoxyuridine incorporation and DNA quantitation in combination with fluorescence in situ hybridization, we detected gross chromosomal movements in diploid interphase nuclei of larvae. At the onset of S-phase, an increased separation was seen between proximal and distal positions of a long chromosome arm. Progression through S-phase disrupted heterochromatic associations that have been correlated with gene silencing. Additionally, we have found that large-scale G1 nuclear architecture is continually dynamic. Nuclei display a Rabl configuration for only ∼2 h after mitosis, and with further progression of G1-phase can establish heterochromatic interactions between distal and proximal parts of the chromosome arm. We also find evidence that somatic pairing of homologous chromosomes is disrupted during S-phase more rapidly for a euchromatic than for a heterochromatic region. Such interphase chromosome movements suggest a possible mechanism that links gene regulation via nuclear positioning to the cell cycle: delayed maturation of heterochromatin during G1-phase delays establishment of a silent chromatin state. PMID:9763417

  13. Large scale systematic proteomic quantification from non-metastatic to metastatic colorectal cancer

    NASA Astrophysics Data System (ADS)

    Yin, Xuefei; Zhang, Yang; Guo, Shaowen; Jin, Hong; Wang, Wenhai; Yang, Pengyuan

    2015-07-01

    A large-scale systematic proteomic quantification of formalin-fixed, paraffin-embedded (FFPE) colorectal cancer tissues from stage I to stage IIIC was performed. In total, 1017 proteins were identified, of which 338 showed quantitative changes by the label-free method, while 341 of the 6294 proteins quantified by the iTRAQ method showed significant expression changes. According to gene ontology (GO) annotation and ingenuity pathway analysis (IPA), expression of proteins related to migration increased while that of proteins related to binding and adhesion decreased during colorectal cancer development. We focused on integrin alpha 5 (ITA5) of the integrin family, consistent with the metastasis-related pathway. The expression level of ITA5 decreased in metastatic tissues, a result further verified by Western blotting. Two other cell migration-related proteins, vitronectin (VTN) and actin-related protein (ARP3), were also shown to be up-regulated both by mass spectrometry (MS)-based quantification and by Western blotting. Our result represents one of the largest datasets in colorectal cancer proteomics research to date. Our strategy reveals a disease-driven omics pattern for metastatic colorectal cancer.

  14. European large-scale farmland investments and the land-water-energy-food nexus

    NASA Astrophysics Data System (ADS)

    Siciliano, Giuseppina; Rulli, Maria Cristina; D'Odorico, Paolo

    2017-12-01

    The escalating human demand for food, water, energy, fibres and minerals has resulted in increasing commercial pressures on land and water resources, which are partly reflected by the recent increase in transnational land investments. Studies have shown that many of the land-water issues associated with land acquisitions are directly related to the areas of energy and food production. This paper explores the land-water-energy-food nexus in relation to large-scale farmland investments pursued by investors from European countries. The analysis is based on a "resource assessment approach" which evaluates the linkages between land acquisitions for agricultural (including both energy and food production) and forestry purposes, and the availability of land and water in the target countries. To that end, the water appropriated by agricultural and forestry productions is quantitatively assessed and its impact on water resource availability is analysed. The analysis is meant to provide useful information to investors from EU countries and policy makers on aspects of resource acquisition, scarcity, and access to promote responsible land investments in the target countries.

  15. Evaluation of effectiveness of various devices for attenuation of trailing vortices based on model tests in a large towing basin

    NASA Technical Reports Server (NTRS)

    Kirkman, K. L.; Brown, C. E.; Goodman, A.

    1973-01-01

    The effectiveness of various candidate aircraft-wing devices for attenuation of trailing vortices generated by large aircraft is evaluated on the basis of results of experiments conducted with a 0.03-scale model of a Boeing 747 transport aircraft using a technique developed at the HYDRONAUTICS Ship Model Basin. Emphasis is on the effects produced by these devices in the far-field (up to 8 kilometers downstream of the full-scale generating aircraft) where the unaltered vortex-wakes could still be hazardous to small following aircraft. The evaluation is based primarily on quantitative measurements of the respective vortex velocity distributions made by means of hot-film probe traverses in a transverse plane at selected stations downstream. The effects of these altered wakes on the rolling moment induced on a small following aircraft are also studied using a modified lifting-surface theory with a synthesized Gates Learjet as a typical example. Lift and drag measurements concurrently obtained in the model tests are used to appraise the effects of each device investigated on the performance characteristics of the generating aircraft.

  16. Fiber networks amplify active stress

    NASA Astrophysics Data System (ADS)

    Lenz, Martin; Ronceray, Pierre; Broedersz, Chase

    Large-scale force generation is essential for biological functions such as cell motility, embryonic development, and muscle contraction. In these processes, forces generated at the molecular level by motor proteins are transmitted by disordered fiber networks, resulting in large-scale active stresses. While fiber networks are well characterized macroscopically, this stress generation by microscopic active units is not well understood. I will present a comprehensive theoretical study of force transmission in these networks. I will show that the linear, small-force response of the networks is remarkably simple, as the macroscopic active stress depends only on the geometry of the force-exerting unit. In contrast, as non-linear buckling occurs around these units, local active forces are rectified towards isotropic contraction and strongly amplified. This stress amplification is reinforced by the networks' disordered nature, but saturates for high densities of active units. I will show that our predictions are quantitatively consistent with experiments on reconstituted tissues and actomyosin networks, and that they shed light on the role of the network microstructure in shaping active stresses in cells and tissue.

  17. Residual fields from extinct dynamos

    NASA Astrophysics Data System (ADS)

    Parker, E. N.

    The generation of magnetic fields in convective zones of declining vigor and/or thickness is considered, the goal being to explain the magnetic fields observed in A-stars. The investigation is restricted to kinematical dynamos in order to show some of the many possibilities, which depend on the assumed conditions of decline of the convection. The examples illustrate the quantitative detail required to describe the convection in order to extract any firm conclusions concerning specific stars. The first example treats the basic problem of diffusion from a layer of declining thickness. The second has a buoyant rise added to the field in the layer. The third deals with plane dynamo waves in a region with declining eddy diffusivity, dynamo coefficient, and large-scale shear. It is noted that the dynamo number may increase or decrease with declining convection, with an increase expected if the large-scale shear does not decline as rapidly as the eddy diffusivity. It is shown that one of the components of the field may increase without bound even when the dynamo number declines to zero.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaidheeswaran, Avinash; Shaffer, Franklin; Gopalan, Balaji

    Here, the statistics of fluctuating velocity components are studied in the riser of a closed-loop circulating fluidized bed with fluid catalytic cracking catalyst particles. Our analysis shows distinct similarities as well as deviations compared to existing theories and bench-scale experiments. The study confirms anisotropic and non-Maxwellian distribution of fluctuating velocity components. The velocity distribution functions (VDFs) corresponding to transverse fluctuations exhibit symmetry, and follow a stretched-exponential behavior up to three standard deviations. The form of the transverse VDF is largely determined by interparticle interactions. The tails become more overpopulated with an increase in particle loading. The observed deviations from the Gaussian distribution are represented using the leading order term in the Sonine expansion, which is commonly used to approximate the VDFs in kinetic theory for granular flows. The vertical fluctuating VDFs are asymmetric and the skewness shifts as the wall is approached. In comparison to transverse fluctuations, the vertical VDF is determined by the local hydrodynamics. This is an observation of particle velocity fluctuations in a large-scale system and their quantitative comparison with the Maxwell-Boltzmann statistics.

  19. Scale-invariant properties of public-debt growth

    NASA Astrophysics Data System (ADS)

    Petersen, A. M.; Podobnik, B.; Horvatic, D.; Stanley, H. E.

    2010-05-01

    Public debt is one of the important economic variables that quantitatively describes a nation's economy. Because bankruptcy is a risk faced even by institutions as large as governments (e.g., Iceland), national debt should be strictly controlled with respect to national wealth. Also, the problem of eliminating extreme poverty in the world is closely connected to the study of extremely poor debtor nations. We analyze the time evolution of national public debt and find "convergence": initially less-indebted countries increase their debt more quickly than initially more-indebted countries. We also analyze the public debt-to-GDP ratio $\mathcal{R}$, a proxy for default risk, and approximate the probability density function $P(\mathcal{R})$ with a Gamma distribution, which can be used to establish thresholds for sustainable debt. We also observe "convergence" in $\mathcal{R}$: countries with initially small $\mathcal{R}$ increase their $\mathcal{R}$ more quickly than countries with initially large $\mathcal{R}$. The scaling relationships for debt and $\mathcal{R}$ have practical applications, e.g., the Maastricht Treaty requires members of the European Monetary Union to maintain $\mathcal{R} < 0.6$.
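
    Fitting a Gamma distribution to debt-to-GDP ratios and reading off thresholds can be sketched with standard tools. The snippet below uses synthetic ratios, not real country data, and is only an illustration of the approach described.

```python
# Sketch of fitting a Gamma distribution to debt-to-GDP ratios R and using
# quantiles of the fit as sustainability thresholds, in the spirit of the
# abstract above. The ratios below are synthetic, not real country data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
R = rng.gamma(shape=2.0, scale=0.25, size=150)   # synthetic debt-to-GDP ratios

# Fit a Gamma distribution with the location fixed at zero.
shape, loc, scale = stats.gamma.fit(R, floc=0.0)
print(f"fitted shape={shape:.2f}, scale={scale:.2f}")

# Fraction of countries exceeding the Maastricht-style threshold R = 0.6,
# and the 95th-percentile ratio implied by the fitted distribution.
print("P(R > 0.6) =", 1.0 - stats.gamma.cdf(0.6, shape, loc=loc, scale=scale))
print("95th percentile:", stats.gamma.ppf(0.95, shape, loc=loc, scale=scale))
```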

  20. Coordinated phenotype switching with large-scale chromosome flip-flop inversion observed in bacteria.

    PubMed

    Cui, Longzhu; Neoh, Hui-min; Iwamoto, Akira; Hiramatsu, Keiichi

    2012-06-19

    Genome inversions are ubiquitous in organisms ranging from prokaryotes to eukaryotes. Typical examples can be identified by comparing the genomes of two or more closely related organisms, where genome inversion footprints are clearly visible. Although the evolutionary implications of this phenomenon are huge, little is known about the function and biological meaning of this process. Here, we report our findings on a bacterium that generates a reversible, large-scale inversion of its chromosome (about half of its total genome) at high frequencies of up to once every four generations. This inversion switches on or off bacterial phenotypes, including colony morphology, antibiotic susceptibility, hemolytic activity, and expression of dozens of genes. Quantitative measurements and mathematical analyses indicate that this reversible switching is stochastic but self-organized so as to maintain two forms of stable cell populations (i.e., small colony variant, normal colony variant) as a bet-hedging strategy. Thus, this heritable and reversible genome fluctuation seems to govern the bacterial life cycle; it has a profound impact on the course and outcomes of bacterial infections.

  1. Spatially resolved mapping of electrical conductivity across individual domain (grain) boundaries in graphene.

    PubMed

    Clark, Kendal W; Zhang, X-G; Vlassiouk, Ivan V; He, Guowei; Feenstra, Randall M; Li, An-Ping

    2013-09-24

    All large-scale graphene films contain extended topological defects dividing graphene into domains or grains. Here, we spatially map electronic transport near specific domain and grain boundaries in both epitaxial graphene grown on SiC and CVD graphene on Cu subsequently transferred to a SiO2 substrate, with one-to-one correspondence to boundary structures. Boundaries coinciding with the substrate step on SiC exhibit a significant potential barrier for electron transport of epitaxial graphene due to the reduced charge transfer from the substrate near the step edge. Moreover, monolayer-bilayer boundaries exhibit a high resistance that can change depending on the height of the substrate step coinciding at the boundary. In CVD graphene, the resistance of a grain boundary changes with the width of the disordered transition region between adjacent grains. A quantitative modeling of boundary resistance reveals the increased electron Fermi wave vector within the boundary region, possibly due to boundary induced charge density variation. Understanding how resistance changes with domain (grain) boundary structure in graphene is a crucial first step for controlled engineering of defects in large-scale graphene films.

  2. On the nature of the NAA diffusion attenuated MR signal in the central nervous system.

    PubMed

    Kroenke, Christopher D; Ackerman, Joseph J H; Yablonskiy, Dmitriy A

    2004-11-01

    In the brain, on a macroscopic scale, diffusion of the intraneuronal constituent N-acetyl-L-aspartate (NAA) appears to be isotropic. In contrast, on a microscopic scale, NAA diffusion is likely highly anisotropic, with displacements perpendicular to neuronal fibers being markedly hindered, and parallel displacements less so. In this report we first substantiate that local anisotropy influences NAA diffusion in vivo by observing differing diffusivities parallel and perpendicular to human corpus callosum axonal fibers. We then extend our measurements to large voxels within rat brains. As expected, the macroscopic apparent diffusion coefficient (ADC) of NAA is practically isotropic due to averaging of the numerous and diverse fiber orientations. We demonstrate that the substantially non-monoexponential diffusion-mediated MR signal decay vs. b value can be quantitatively explained by a theoretical model of NAA confined to an ensemble of differently oriented neuronal fibers. On the microscopic scale, NAA diffusion is found to be strongly anisotropic, with displacements occurring almost exclusively parallel to the local fiber axis. This parallel diffusivity, ADC_parallel, is 0.36 ± 0.01 μm²/ms, and ADC_perpendicular is essentially zero. From ADC_parallel the apparent viscosity of the neuron cytoplasm is estimated to be twice as large as that of a temperature-matched dilute aqueous solution. © 2004 Wiley-Liss, Inc.
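
    For the idealized geometry described (diffusion only along the fiber axis, fibers uniformly oriented), the powder-averaged signal has the standard closed form S(b)/S0 = sqrt(pi / (4 b ADC_parallel)) * erf(sqrt(b ADC_parallel)). The sketch below evaluates that expression using the parallel diffusivity quoted in the abstract; it is a simplified stand-in for, not a reproduction of, the authors' full model.

```python
# The idealized "stick" powder-average signal implied by the model described
# above (diffusion only along the fiber axis, fibers uniformly oriented):
#   S(b)/S0 = sqrt(pi / (4 * b * D_par)) * erf(sqrt(b * D_par))
# This is the standard expression for that geometry; the paper's full model
# may include additional terms. D_par below is taken from the abstract.
import numpy as np
from scipy.special import erf

def stick_powder_average(b, d_par):
    """Normalized signal S(b)/S0 for randomly oriented 1D diffusion."""
    b = np.asarray(b, dtype=float)
    out = np.ones_like(b)
    nz = b > 0
    x = np.sqrt(b[nz] * d_par)
    out[nz] = np.sqrt(np.pi) / (2.0 * x) * erf(x)
    return out

d_par = 0.36e-3                                      # mm^2/s, i.e. 0.36 um^2/ms
b_values = np.array([0, 1000, 5000, 10000, 20000])   # s/mm^2
print(stick_powder_average(b_values, d_par))         # markedly non-monoexponential decay
```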

  3. Punctuated Evolution of Influenza Virus Neuraminidase (A/H1N1) under Opposing Migration and Vaccination Pressures

    PubMed Central

    Phillips, J. C.

    2014-01-01

    Influenza virus contains two highly variable envelope glycoproteins, hemagglutinin (HA) and neuraminidase (NA). The structure and properties of HA, which is responsible for binding the virus to the cell that is being infected, change significantly when the virus is transmitted from avian or swine species to humans. Here we focus first on the simpler problem of the much smaller human individual evolutionary amino acid mutational changes in NA, which cleaves sialic acid groups and is required for influenza virus replication. Our thermodynamic panorama shows that very small amino acid changes can be monitored very accurately across many historic (1945–2011) Uniprot and NCBI strains using hydropathicity scales to quantify the roughness of water film packages. Quantitative sequential analysis is most effective with the fractal differential hydropathicity scale based on protein self-organized criticality (SOC). Our analysis shows that large-scale vaccination programs have been responsible for a very large convergent reduction in common influenza severity in the last century. Hydropathic analysis is capable of interpreting and even predicting trends of functional changes in mutation prolific viruses directly from amino acid sequences alone. An engineered strain of NA1 is described which could well be significantly less virulent than current circulating strains. PMID:25143953

  4. Role of natural analogs in performance assessment of nuclear waste repositories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sagar, B.; Wittmeyer, G.W.

    1995-09-01

    Mathematical models of the flow of water and transport of radionuclides in porous media will be used to assess the ability of deep geologic repositories to safely contain nuclear waste. These models must, in some sense, be validated to ensure that they adequately describe the physical processes occurring within the repository and its geologic setting. Inasmuch as the spatial and temporal scales over which these models must be applied in performance assessment are very large, validation of these models against laboratory and small-scale field experiments may be considered inadequate. Natural analogs may provide validation data that are representative of physico-chemical processes that occur over spatial and temporal scales as large or larger than those relevant to repository design. The authors discuss the manner in which natural analog data may be used to increase confidence in performance assessment models and conclude that, while these data may be suitable for testing the basic laws governing flow and transport, there is insufficient control of boundary and initial conditions and forcing functions to permit quantitative validation of complex, spatially distributed flow and transport models. The authors also express their opinion that, for collecting adequate data from natural analogs, resources will have to be devoted to them that are much larger than are devoted to them at present.

  5. Mass-velocity and size-velocity distributions of ejecta cloud from shock-loaded tin surface using large scale molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Durand, Olivier; Soulard, Laurent

    2015-06-01

    The mass (volume and areal densities) versus velocity as well as the size versus velocity distributions of a shock-induced cloud of particles are investigated using large-scale molecular dynamics (MD) simulations. A generic 3D tin crystal with a sinusoidal free surface roughness is set in contact with vacuum and shock-loaded so that it melts directly on shock. At the reflection of the shock wave onto the perturbations of the free surface, 2D sheets/jets of liquid metal are ejected. The simulations show that the distributions may be described by an analytical model based on the propagation of a fragmentation zone, from the tip of the sheets to the free surface, within which the kinetic energy of the atoms decreases as this zone comes closer to the free surface at late times. As this kinetic energy drives (i) the (self-similar) expansion of the zone once it has broken away from the sheet and (ii) the average size of the particles which result from fragmentation in the zone, the ejected mass and the average size of the particles progressively increase in the cloud as fragmentation occurs closer to the free surface. Although it operates at nanometric scales, our model quantitatively reproduces experimental profiles and may help in their analysis.

  6. Formation and representation: Critical analyses of identity, supply, and demand in science, technology, engineering, and mathematics

    NASA Astrophysics Data System (ADS)

    Mandayam Doddamane, Prabha

    2011-12-01

    Considerable research, policy, and programmatic efforts have been dedicated to addressing the participation of particular populations in STEM for decades. Each of these efforts claims equity-related goals; yet, they heavily frame the problem, through pervasive STEM pipeline model discourse, in terms of national needs, workforce supply, and competitiveness. This particular framing of the problem may, indeed, be counter to equity goals, especially when paired with policy that largely relies on statistical significance and broad aggregation of data over exploring the identities and experiences of the populations targeted for equitable outcomes in that policy. In this study, I used the mixed-methods approach of critical discourse and critical quantitative analyses to understand how the pipeline model ideology has become embedded within academic discourse, research, and data surrounding STEM education and work and to provide alternatives for quantitative analysis. Using critical theory as a lens, I first conducted a critical discourse analysis of contemporary STEM workforce studies with a particular eye to pipeline ideology. Next, I used that analysis to inform logistic regression analyses of the 2006 SESTAT data. This quantitative analysis compared and contrasted different ways of thinking about identity and retention. Overall, the findings of this study show that many subjective choices are made in the construction of the large-scale datasets used to inform much national science and engineering policy and that these choices greatly influence likelihood of retention outcomes.

  7. Progress with modeling activity landscapes in drug discovery.

    PubMed

    Vogt, Martin

    2018-04-19

    Activity landscapes (ALs) are representations and models of compound data sets annotated with a target-specific activity. In contrast to quantitative structure-activity relationship (QSAR) models, ALs aim at characterizing structure-activity relationships (SARs) on a large-scale level encompassing all active compounds for specific targets. The popularity of AL modeling has grown substantially with the public availability of large activity-annotated compound data sets. AL modeling crucially depends on molecular representations and similarity metrics used to assess structural similarity. Areas covered: The concepts of AL modeling are introduced and its basis in quantitatively assessing molecular similarity is discussed. The different types of AL modeling approaches are introduced. AL designs can broadly be divided into three categories: compound-pair based, dimensionality reduction, and network approaches. Recent developments for each of these categories are discussed focusing on the application of mathematical, statistical, and machine learning tools for AL modeling. AL modeling using chemical space networks is covered in more detail. Expert opinion: AL modeling has remained a largely descriptive approach for the analysis of SARs. Beyond mere visualization, the application of analytical tools from statistics, machine learning and network theory has aided in the sophistication of AL designs and provides a step forward in transforming ALs from descriptive to predictive tools. To this end, optimizing representations that encode activity relevant features of molecules might prove to be a crucial step.

  8. Mutation update and uncommon phenotypes in a French cohort of 96 patients with WFS1-related disorders.

    PubMed

    Chaussenot, A; Rouzier, C; Quere, M; Plutino, M; Ait-El-Mkadem, S; Bannwarth, S; Barth, M; Dollfus, H; Charles, P; Nicolino, M; Chabrol, B; Vialettes, B; Paquis-Flucklinger, V

    2015-05-01

    WFS1 mutations are responsible for Wolfram syndrome (WS) characterized by juvenile-onset diabetes mellitus and optic atrophy, and for low-frequency sensorineural hearing loss (LFSNHL). Our aim was to analyze the French cohort of 96 patients with WFS1-related disorders in order (i) to update clinical and molecular data with 37 novel affected individuals, (ii) to describe uncommon phenotypes and, (iii) to precise the frequency of large-scale rearrangements in WFS1. We performed quantitative polymerase chain reaction (PCR) in 13 patients, carrying only one heterozygous variant, to identify large-scale rearrangements in WFS1. Among the 37 novel patients, 15 carried 15 novel deleterious putative mutations, including one large deletion of 17,444 base pairs. The analysis of the cohort revealed unexpected phenotypes including (i) late-onset symptoms in 13.8% of patients with a probable autosomal recessive transmission; (ii) two siblings with recessive optic atrophy without diabetes mellitus and, (iii) six patients from four families with dominantly-inherited deafness and optic atrophy. We highlight the expanding spectrum of WFS1-related disorders and we show that, even if large deletions are rare events, they have to be searched in patients with classical WS carrying only one WFS1 mutation after sequencing. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Visualizing vascular structures in virtual environments

    NASA Astrophysics Data System (ADS)

    Wischgoll, Thomas

    2013-01-01

    In order to learn more about the cause of coronary heart diseases and develop diagnostic tools, the extraction and visualization of vascular structures from volumetric scans for further analysis is an important step. By determining a geometric representation of the vasculature, the geometry can be inspected and additional quantitative data calculated and incorporated into the visualization of the vasculature. To provide a more user-friendly visualization tool, virtual environment paradigms can be utilized. This paper describes techniques for interactive rendering of large-scale vascular structures within virtual environments. This can be applied to almost any virtual environment configuration, such as CAVE-type displays. Specifically, the tools presented in this paper were tested on a Barco I-Space and a large 62x108 inch passive projection screen with a Kinect sensor for user tracking.

  10. Venus Interior Structure Mission (VISM): Establishing a Seismic Network on Venus

    NASA Technical Reports Server (NTRS)

    Stofan, E. R.; Saunders, R. S.; Senske, D.; Nock, K.; Tralli, D.; Lundgren, P.; Smrekar, S.; Banerdt, B.; Kaiser, W.; Dudenhoefer, J.

    1993-01-01

    Magellan radar data show the surface of Venus to contain a wide range of geologic features (large volcanoes, extensive rift valleys, etc.). Although networks of interconnecting zones of deformation are identified, a system of spreading ridges and subduction zones like those that dominate the tectonic style of the Earth do not appear to be present. In addition, the absence of a mantle low-viscosity zone suggests a strong link between mantle dynamics and the surface. As a natural follow-on to the Magellan mission, establishing a network of seismometers on Venus will provide detailed quantitative information on the large scale interior structure of the planet. When analyzed in conjunction with image, gravity, and topography information, these data will aid in constraining mechanisms that drive surface deformation.

  11. Cavallo's multiplier for in situ generation of high voltage

    NASA Astrophysics Data System (ADS)

    Clayton, S. M.; Ito, T. M.; Ramsey, J. C.; Wei, W.; Blatnik, M. A.; Filippone, B. W.; Seidel, G. M.

    2018-05-01

    A classic electrostatic induction machine, Cavallo's multiplier, is suggested for in situ production of very high voltage in cryogenic environments. The device is suitable for generating a large electrostatic field under conditions of very small load current. Operation of the Cavallo multiplier is analyzed, with quantitative description in terms of mutual capacitances between electrodes in the system. A demonstration apparatus was constructed, and measured voltages are compared to predictions based on measured capacitances in the system. The simplicity of the Cavallo multiplier makes it amenable to electrostatic analysis using finite element software, and electrode shapes can be optimized to take advantage of a high dielectric strength medium such as liquid helium. A design study is presented for a Cavallo multiplier in a large-scale, cryogenic experiment to measure the neutron electric dipole moment.

  12. Convergence between biological, behavioural and genetic determinants of obesity.

    PubMed

    Ghosh, Sujoy; Bouchard, Claude

    2017-12-01

    Multiple biological, behavioural and genetic determinants or correlates of obesity have been identified to date. Genome-wide association studies (GWAS) have contributed to the identification of more than 100 obesity-associated genetic variants, but their roles in causal processes leading to obesity remain largely unknown. Most variants are likely to have tissue-specific regulatory roles through joint contributions to biological pathways and networks, through changes in gene expression that influence quantitative traits, or through the regulation of the epigenome. The recent availability of large-scale functional genomics resources provides an opportunity to re-examine obesity GWAS data to begin elucidating the function of genetic variants. Interrogation of knockout mouse phenotype resources provides a further avenue to test for evidence of convergence between genetic variation and biological or behavioural determinants of obesity.

  13. Flow topologies and turbulence scales in a jet-in-cross-flow

    DOE PAGES

    Oefelein, Joseph C.; Ruiz, Anthony M.; Lacaze, Guilhem

    2015-04-03

    This study presents a detailed analysis of the flow topologies and turbulence scales in the jet-in-cross-flow experiment of [Su and Mungal JFM 2004]. The analysis is performed using the Large Eddy Simulation (LES) technique with a highly resolved grid and time-step and well controlled boundary conditions. This enables quantitative agreement with the first and second moments of turbulence statistics measured in the experiment. LES is used to perform the analysis since experimental measurements of time-resolved 3D fields are still in their infancy and because sampling periods are generally limited with direct numerical simulation. A major focal point is the comprehensive characterization of the turbulence scales and their evolution. Time-resolved probes are used with long sampling periods to obtain maps of the integral scales, Taylor microscales, and turbulent kinetic energy spectra. Scalar-fluctuation scales are also quantified. In the near-field, coherent structures are clearly identified, both in physical and spectral space. Along the jet centerline, turbulence scales grow according to a classical one-third power law. However, the derived maps of turbulence scales reveal strong inhomogeneities in the flow. From the modeling perspective, these insights are useful to design optimized grids and improve numerical predictions in similar configurations.
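
    Estimating an integral time scale from a time-resolved probe is commonly done by integrating the autocorrelation of the fluctuating signal up to its first zero crossing. The sketch below illustrates that generic procedure on a synthetic signal; it is not the post-processing used in the study, and all names and numbers are hypothetical.

```python
# Generic estimate of the integral time scale from a time-resolved velocity
# probe: integrate the autocorrelation of the fluctuating signal up to its
# first zero crossing. Illustrative only; not the study's post-processing.
import numpy as np

def integral_time_scale(u, dt, max_lag=1000):
    """Integral time scale of a fluctuating signal u(t) sampled every dt."""
    up = u - u.mean()
    denom = float(np.dot(up, up))
    acf = np.array([np.dot(up[: up.size - k], up[k:]) / denom
                    for k in range(max_lag)])
    zero = np.argmax(acf <= 0.0)              # index of first non-positive lag
    cutoff = zero if zero > 0 else acf.size   # integrate to zero crossing if found
    return float(np.sum(acf[:cutoff]) * dt)   # rectangle-rule integration

# Synthetic first-order (Ornstein-Uhlenbeck-like) signal with known time scale tau:
rng = np.random.default_rng(2)
dt, n, tau = 1e-4, 50_000, 5e-3
alpha = np.exp(-dt / tau)
u = np.zeros(n)
for i in range(1, n):
    u[i] = alpha * u[i - 1] + rng.normal(scale=np.sqrt(1.0 - alpha**2))
print(integral_time_scale(u, dt))             # should be close to tau = 5e-3
```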

  14. Conformal standard model, leptogenesis, and dark matter

    NASA Astrophysics Data System (ADS)

    Lewandowski, Adrian; Meissner, Krzysztof A.; Nicolai, Hermann

    2018-02-01

    The conformal standard model is a minimal extension of the Standard Model (SM) of particle physics based on the assumed absence of large intermediate scales between the TeV scale and the Planck scale, which incorporates only right-chiral neutrinos and a new complex scalar in addition to the usual SM degrees of freedom, but no other features such as supersymmetric partners. In this paper, we present a comprehensive quantitative analysis of this model, and show that all outstanding issues of particle physics proper can in principle be solved "in one go" within this framework. This includes in particular the stabilization of the electroweak scale, "minimal" leptogenesis and the explanation of dark matter, with a small mass and very weakly interacting Majoron as the dark matter candidate (for which we propose to use the name "minoron"). The main testable prediction of the model is a new and almost sterile scalar boson that would manifest itself as a narrow resonance in the TeV region. We give a representative range of parameter values consistent with our assumptions and with observation.

  15. Smile line assessment comparing quantitative measurement and visual estimation.

    PubMed

    Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie

    2011-02-01

    Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
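
    The kappa statistics reported above can be reproduced on any pair of ratings with standard tools. The snippet below uses scikit-learn's cohen_kappa_score on hypothetical 3-grade smile-line ratings; a weighted kappa penalizes larger disagreements more heavily.

```python
# Sketch of the kind of inter-rater agreement computation reported above,
# using Cohen's kappa from scikit-learn. The smile-line grades below are
# hypothetical, not data from the study.
from sklearn.metrics import cohen_kappa_score

# 3-grade scale (1 = low, 2 = average, 3 = high smile line) from two raters:
rater_1 = [1, 2, 2, 3, 1, 2, 3, 3, 2, 1]
rater_2 = [1, 2, 3, 3, 1, 2, 3, 2, 2, 1]

print("unweighted kappa:", cohen_kappa_score(rater_1, rater_2))
# Weighted kappa penalizes large disagreements (e.g. low vs. high) more:
print("weighted kappa:  ", cohen_kappa_score(rater_1, rater_2, weights="quadratic"))
```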

  16. Fuzzy-based propagation of prior knowledge to improve large-scale image analysis pipelines

    PubMed Central

    Mikut, Ralf

    2017-01-01

    Many automatically analyzable scientific questions are well-posed and a variety of information about expected outcomes is available a priori. Although often neglected, this prior knowledge can be systematically exploited to make automated analysis operations sensitive to a desired phenomenon or to evaluate extracted content with respect to this prior knowledge. For instance, the performance of processing operators can be greatly enhanced by a more focused detection strategy and by direct information about the ambiguity inherent in the extracted data. We present a new concept that increases the result quality awareness of image analysis operators by estimating and distributing the degree of uncertainty involved in their output based on prior knowledge. This allows the use of simple processing operators that are suitable for analyzing large-scale spatiotemporal (3D+t) microscopy images without compromising result quality. On the foundation of fuzzy set theory, we transform available prior knowledge into a mathematical representation and extensively use it to enhance the result quality of various processing operators. These concepts are illustrated on a typical bioimage analysis pipeline comprised of seed point detection, segmentation, multiview fusion and tracking. The functionality of the proposed approach is further validated on a comprehensive simulated 3D+t benchmark data set that mimics embryonic development and on large-scale light-sheet microscopy data of a zebrafish embryo. The general concept introduced in this contribution represents a new approach to efficiently exploit prior knowledge to improve the result quality of image analysis pipelines. The generality of the concept makes it applicable to practically any field with processing strategies that are arranged as linear pipelines. The automated analysis of terabyte-scale microscopy data will especially benefit from sophisticated and efficient algorithms that enable a quantitative and fast readout. PMID:29095927
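    The following sketch illustrates the general idea of encoding prior knowledge as fuzzy membership functions and combining them into a per-detection confidence (a fuzzy AND via the minimum). The membership shapes, thresholds, and feature names are invented for illustration and are not taken from the paper.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 below a, ramps to 1 on [b, c], 0 above d."""
    x = np.asarray(x, dtype=float)
    rising = np.clip((x - a) / max(b - a, 1e-12), 0.0, 1.0)
    falling = np.clip((d - x) / max(d - c, 1e-12), 0.0, 1.0)
    return np.minimum(rising, falling)

def detection_confidence(radius_px, intensity, z_position_um):
    """Combine prior knowledge about plausible nucleus size, brightness, and depth
    into a single [0, 1] confidence for a detected seed point (minimum = fuzzy AND)."""
    size_ok = trapezoid(radius_px, 2, 4, 8, 12)           # expected nucleus radius in pixels
    bright_ok = trapezoid(intensity, 50, 120, 255, 300)   # expected intensity range
    depth_ok = trapezoid(z_position_um, 0, 10, 300, 350)  # plausible depth in the sample
    return np.minimum(np.minimum(size_ok, bright_ok), depth_ok)

print(detection_confidence(radius_px=5.0, intensity=140.0, z_position_um=120.0))   # fully plausible -> 1.0
print(detection_confidence(radius_px=11.0, intensity=140.0, z_position_um=120.0))  # oversized -> penalized
```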

  17. Dynamics of Topological Excitations in a Model Quantum Spin Ice

    NASA Astrophysics Data System (ADS)

    Huang, Chun-Jiong; Deng, Youjin; Wan, Yuan; Meng, Zi Yang

    2018-04-01

    We study the quantum spin dynamics of a frustrated XXZ model on a pyrochlore lattice by using large-scale quantum Monte Carlo simulation and stochastic analytic continuation. In the low-temperature quantum spin ice regime, we observe signatures of coherent photon and spinon excitations in the dynamic spin structure factor. As the temperature rises to the classical spin ice regime, the photon disappears from the dynamic spin structure factor, whereas the dynamics of the spinon remain coherent in a broad temperature window. Our results provide experimentally relevant, quantitative information for the ongoing pursuit of quantum spin ice materials.
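    For reference, the quantity probed here is the dynamic spin structure factor. A common convention (normalization and spin component vary between papers, and the abstract does not fix them) is shown below, together with the imaginary-time kernel that stochastic analytic continuation inverts:

```latex
S^{zz}(\mathbf{q},\omega) = \frac{1}{N}\sum_{i,j} e^{\,i\mathbf{q}\cdot(\mathbf{r}_i-\mathbf{r}_j)}
\int_{-\infty}^{\infty} \mathrm{d}t\, e^{\,i\omega t}\, \langle S_i^{z}(t)\, S_j^{z}(0)\rangle ,
\qquad
G(\mathbf{q},\tau) = \int_{0}^{\infty} \mathrm{d}\omega\, S^{zz}(\mathbf{q},\omega)\,
\bigl(e^{-\omega\tau} + e^{-\omega(\beta-\tau)}\bigr).
```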

  18. Triggering up states in all-to-all coupled neurons

    NASA Astrophysics Data System (ADS)

    Ngo, H.-V. V.; Köhler, J.; Mayer, J.; Claussen, J. C.; Schuster, H. G.

    2010-03-01

    Slow-wave sleep in mammals is characterized by a change of large-scale cortical activity currently paraphrased as cortical Up/Down states. A recent experiment demonstrated bistable collective behaviour in ferret slices, with the remarkable property that the Up states can be switched on and off by pulses, or excitations, of the same polarity, whereby the effect of the second pulse depends significantly on the time interval between the pulses. Here we present a simple time-discrete model of a neural network that exhibits this type of behaviour and quantitatively reproduces the time dependence found in the experiments.
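    A minimal sketch of the bistability itself (not the authors' network, whose specific form and pulse-timing dependence are not reproduced here): a discrete-time mean-field rate model with strong all-to-all recurrent excitation has coexisting Down (near 0) and Up (near 1) fixed points, and a sufficiently strong excitatory pulse switches it from Down to Up.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate(steps, pulse_times, w=8.0, theta=4.0, pulse=5.0):
    """Discrete-time mean-field rate model with all-to-all recurrent excitation:
    x_{t+1} = sigmoid(w * x_t - theta + I_t).
    With these parameters the map has two stable fixed points (Down near 0,
    Up near 1) separated by an unstable one near 0.5."""
    x = np.zeros(steps)
    for t in range(steps - 1):
        I = pulse if t in pulse_times else 0.0
        x[t + 1] = sigmoid(w * x[t] - theta + I)
    return x

activity = simulate(steps=200, pulse_times={50})
print(activity[45], activity[60])  # Down state before the pulse, Up state after
```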

  19. Alluvial Fans on Mars

    NASA Technical Reports Server (NTRS)

    Kraal, E. R.; Moore, J. M.; Howard, A. D.; Asphaug, E. A.

    2005-01-01

    Moore and Howard [1] reported the discovery of large alluvial fans in craters on Mars. Their initial survey from 0° to 30°S found that these fans clustered in three distinct regions and occurred at elevations of around +1 km relative to the MOLA-defined Mars datum. However, due to incomplete image coverage, Moore and Howard [1] could not conduct a comprehensive survey. They also recognized, though did not quantitatively address, gravity scaling issues. Here, we briefly discuss the identification of alluvial fans on Mars, then consider the general equations governing the deposition of alluvial fans and hypothesize a method for learning about grain size in alluvial fans on Mars.
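    One standard relation that makes the gravity scaling explicit (the abstract does not state which equations the authors actually use) is the Shields criterion for grain entrainment:

```latex
\tau_c = \theta_c \,(\rho_s - \rho_f)\, g\, D ,
```

    with \(\tau_c\) the critical bed shear stress, \(\theta_c\) the critical Shields parameter (of order 0.05 for fully turbulent flow), \(\rho_s\) and \(\rho_f\) the sediment and fluid densities, \(g\) the gravitational acceleration, and \(D\) the grain diameter. With \(g \approx 3.71\ \mathrm{m\,s^{-2}}\) on Mars versus \(9.81\ \mathrm{m\,s^{-2}}\) on Earth, the shear stress needed to entrain a given grain size is roughly a factor of 2.6 smaller, which is one way gravity enters fan deposition.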

  20. Service Discovery Oriented Management System Construction Method

    NASA Astrophysics Data System (ADS)

    Li, Huawei; Ren, Ying

    2017-10-01

    To address the lack of a uniform method for designing service quality management systems in large-scale, complex service environments, this paper proposes a construction method for a distributed, service-discovery-oriented management system. Three measurement functions are proposed to compute nearest-neighbor user similarity at different levels. In view of the currently low efficiency of service quality management systems, three solutions are proposed to improve system efficiency. Finally, the key technologies of a distributed service quality management system based on service discovery are summarized through quantitative factor addition and subtraction experiments.
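    The abstract names, but does not define, its three similarity measures. As a generic illustration of nearest-neighbor user similarity over per-service quality ratings, the sketch below uses cosine similarity; all data, names, and parameters are hypothetical.

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine similarity between two users' QoS rating vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def nearest_neighbors(ratings, target_index, k=2):
    """Return the indices of the k users most similar to the target user."""
    target = ratings[target_index]
    scores = [
        (i, cosine_similarity(target, row))
        for i, row in enumerate(ratings) if i != target_index
    ]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return [i for i, _ in scores[:k]]

# Hypothetical per-service quality ratings (rows = users, columns = services)
ratings = np.array([
    [4.0, 3.5, 0.0, 5.0],
    [4.5, 3.0, 0.5, 4.5],
    [1.0, 0.5, 4.0, 1.0],
])
print(nearest_neighbors(ratings, target_index=0, k=1))  # -> [1]
```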
