Sample records for large-scale systematic study

  1. The use of data from national and other large-scale user experience surveys in local quality work: a systematic review.

    PubMed

    Haugum, Mona; Danielsen, Kirsten; Iversen, Hilde Hestad; Bjertnaes, Oyvind

    2014-12-01

    An important goal for national and large-scale surveys of user experiences is quality improvement. However, large-scale surveys are normally conducted by a professional external surveyor, creating an institutionalized division between the measurement of user experiences and the quality work that is performed locally. The aim of this study was to identify and describe scientific studies related to the use of national and large-scale surveys of user experiences in local quality work. Ovid EMBASE, Ovid MEDLINE, Ovid PsycINFO and the Cochrane Database of Systematic Reviews. Scientific publications about user experiences and satisfaction about the extent to which data from national and other large-scale user experience surveys are used for local quality work in the health services. Themes of interest were identified and a narrative analysis was undertaken. Thirteen publications were included, all differed substantially in several characteristics. The results show that large-scale surveys of user experiences are used in local quality work. The types of follow-up activity varied considerably from conducting a follow-up analysis of user experience survey data to information sharing and more-systematic efforts to use the data as a basis for improving the quality of care. This review shows that large-scale surveys of user experiences are used in local quality work. However, there is a need for more, better and standardized research in this field. The considerable variation in follow-up activities points to the need for systematic guidance on how to use data in local quality work. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  2. Developing a "Semi-Systematic" Approach to Using Large-Scale Data-Sets for Small-Scale Interventions: The "Baby Matterz" Initiative as a Case Study

    ERIC Educational Resources Information Center

    O'Brien, Mark

    2011-01-01

    The appropriateness of using statistical data to inform the design of any given service development or initiative often depends upon judgements regarding scale. Large-scale data sets, perhaps national in scope, whilst potentially important in informing the design, implementation and roll-out of experimental initiatives, will often remain unused…

  3. Characterizing unknown systematics in large scale structure surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Nishant; Ho, Shirley; Myers, Adam D.

    Photometric large scale structure (LSS) surveys probe the largest volumes in the Universe, but are inevitably limited by systematic uncertainties. Imperfect photometric calibration leads to biases in our measurements of the density fields of LSS tracers such as galaxies and quasars, and as a result in cosmological parameter estimation. Earlier studies have proposed using cross-correlations between different redshift slices or cross-correlations between different surveys to reduce the effects of such systematics. In this paper we develop a method to characterize unknown systematics. We demonstrate that while we do not have sufficient information to correct for unknown systematics in the data,more » we can obtain an estimate of their magnitude. We define a parameter to estimate contamination from unknown systematics using cross-correlations between different redshift slices and propose discarding bins in the angular power spectrum that lie outside a certain contamination tolerance level. We show that this method improves estimates of the bias using simulated data and further apply it to photometric luminous red galaxies in the Sloan Digital Sky Survey as a case study.« less

  4. A comparison of working in small-scale and large-scale nursing homes: A systematic review of quantitative and qualitative evidence.

    PubMed

    Vermeerbergen, Lander; Van Hootegem, Geert; Benders, Jos

    2017-02-01

    Ongoing shortages of care workers, together with an ageing population, make it of utmost importance to increase the quality of working life in nursing homes. Since the 1970s, normalised and small-scale nursing homes have been increasingly introduced to provide care in a family and homelike environment, potentially providing a richer work life for care workers as well as improved living conditions for residents. 'Normalised' refers to the opportunities given to residents to live in a manner as close as possible to the everyday life of persons not needing care. The study purpose is to provide a synthesis and overview of empirical research comparing the quality of working life - together with related work and health outcomes - of professional care workers in normalised small-scale nursing homes as compared to conventional large-scale ones. A systematic review of qualitative and quantitative studies. A systematic literature search (April 2015) was performed using the electronic databases Pubmed, Embase, PsycInfo, CINAHL and Web of Science. References and citations were tracked to identify additional, relevant studies. We identified 825 studies in the selected databases. After checking the inclusion and exclusion criteria, nine studies were selected for review. Two additional studies were selected after reference and citation tracking. Three studies were excluded after requesting more information on the research setting. The findings from the individual studies suggest that levels of job control and job demands (all but "time pressure") are higher in normalised small-scale homes than in conventional large-scale nursing homes. Additionally, some studies suggested that social support and work motivation are higher, while risks of burnout and mental strain are lower, in normalised small-scale nursing homes. Other studies found no differences or even opposing findings. The studies reviewed showed that these inconclusive findings can be attributed to care workers in some normalised small-scale homes experiencing isolation and too high job demands in their work roles. This systematic review suggests that normalised small-scale homes are a good starting point for creating a higher quality of working life in the nursing home sector. Higher job control enables care workers to manage higher job demands in normalised small-scale homes. However, some jobs would benefit from interventions to address care workers' perceptions of too low social support and of too high job demands. More research is needed to examine strategies to enhance these working life issues in normalised small-scale settings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis

    NASA Astrophysics Data System (ADS)

    Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi

    2017-03-01

    Digital histopathology images with more than 1 Gigapixel are drawing more and more attention in clinical, biomedical research, and computer vision fields. Among the multiple observable features spanning multiple scales in the pathology images, the nuclear morphology is one of the central criteria for diagnosis and grading. As a result it is also the mostly studied target in image computing. Large amount of research papers have devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have yet been formulated rigorously and systematically. Some researches report a human verified segmentation with thousands of nuclei, whereas a single whole slide image may contain up to million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for pathologist. We propose a systematic validation and evaluation approach based on large scale image synthesis. This could facilitate a more quantitatively validated study for current and future histopathology image analysis field.

  6. Impact of systemic sclerosis oral manifestations on patients' health-related quality of life: a systematic review.

    PubMed

    Smirani, Rawen; Truchetet, Marie-Elise; Poursac, Nicolas; Naveau, Adrien; Schaeverbeke, Thierry; Devillard, Raphaël

    2018-06-01

    Oropharyngeal features are frequent and often understated in the treatment clinical guidelines of systemic sclerosis in spite of important consequences on comfort, esthetics, nutrition and daily life. The aim of this systematic review was to assess a correlation between the oropharyngeal manifestations of systemic sclerosis and patients' health-related quality of life. A systematic search was conducted using four databases [PubMed ® , Cochrane Database ® , Dentistry & Oral Sciences Source ® , and SCOPUS ® ] up to January 2018, according to the Preferred reporting items for systematic reviews and meta analyses. Grey literature and hand search were also included. Study selection, risk bias assessment (Newcastle-Ottawa scale) and data extraction were performed by two independent reviewers. The review protocol was registered on PROSPERO database with the code CRD42018085994. From 375 screened studies, 6 cross-sectional studies were included in the systematic review. The total number of patients included per study ranged from 84 to 178. These studies reported a statistically significant association between oropharyngeal manifestations of systemic sclerosis (mainly assessed by maximal mouth opening and the mouth handicap in systemic sclerosis scale) and an impaired quality of life (measured by different scales). Studies were unequal concerning risk of bias mostly because of low level of evidence, different recruiting sources of samples, and different scales to assess the quality of life. This systematic review demonstrates a correlation between oropharyngeal manifestations of systemic sclerosis and impaired quality of life, despite the low level of evidence of included studies. Large-scaled studies are needed to provide stronger evidence of this association. This article is protected by copyright. All rights reserved. This article is protected by copyright. All rights reserved.

  7. Enablers and Barriers to Large-Scale Uptake of Improved Solid Fuel Stoves: A Systematic Review

    PubMed Central

    Puzzolo, Elisa; Stanistreet, Debbi; Pope, Daniel; Bruce, Nigel G.

    2013-01-01

    Background: Globally, 2.8 billion people rely on household solid fuels. Reducing the resulting adverse health, environmental, and development consequences will involve transitioning through a mix of clean fuels and improved solid fuel stoves (IS) of demonstrable effectiveness. To date, achieving uptake of IS has presented significant challenges. Objectives: We performed a systematic review of factors that enable or limit large-scale uptake of IS in low- and middle-income countries. Methods: We conducted systematic searches through multidisciplinary databases, specialist websites, and consulting experts. The review drew on qualitative, quantitative, and case studies and used standardized methods for screening, data extraction, critical appraisal, and synthesis. We summarized our findings as “factors” relating to one of seven domains—fuel and technology characteristics; household and setting characteristics; knowledge and perceptions; finance, tax, and subsidy aspects; market development; regulation, legislation, and standards; programmatic and policy mechanisms—and also recorded issues that impacted equity. Results: We identified 31 factors influencing uptake from 57 studies conducted in Asia, Africa, and Latin America. All domains matter. Although factors such as offering technologies that meet household needs and save fuel, user training and support, effective financing, and facilitative government action appear to be critical, none guarantee success: All factors can be influential, depending on context. The nature of available evidence did not permit further prioritization. Conclusions: Achieving adoption and sustained use of IS at a large scale requires that all factors, spanning household/community and program/societal levels, be assessed and supported by policy. We propose a planning tool that would aid this process and suggest further research to incorporate an evaluation of effectiveness. Citation: Rehfuess EA, Puzzolo E, Stanistreet D, Pope D, Bruce NG. 2014. Enablers and barriers to large-scale uptake of improved solid fuel stoves: a systematic review. Environ Health Perspect 122:120–130; http://dx.doi.org/10.1289/ehp.1306639 PMID:24300100

  8. The impact of new forms of large-scale general practice provider collaborations on England's NHS: a systematic review.

    PubMed

    Pettigrew, Luisa M; Kumpunen, Stephanie; Mays, Nicholas; Rosen, Rebecca; Posaner, Rachel

    2018-03-01

    Over the past decade, collaboration between general practices in England to form new provider networks and large-scale organisations has been driven largely by grassroots action among GPs. However, it is now being increasingly advocated for by national policymakers. Expectations of what scaling up general practice in England will achieve are significant. To review the evidence of the impact of new forms of large-scale general practice provider collaborations in England. Systematic review. Embase, MEDLINE, Health Management Information Consortium, and Social Sciences Citation Index were searched for studies reporting the impact on clinical processes and outcomes, patient experience, workforce satisfaction, or costs of new forms of provider collaborations between general practices in England. A total of 1782 publications were screened. Five studies met the inclusion criteria and four examined the same general practice networks, limiting generalisability. Substantial financial investment was required to establish the networks and the associated interventions that were targeted at four clinical areas. Quality improvements were achieved through standardised processes, incentives at network level, information technology-enabled performance dashboards, and local network management. The fifth study of a large-scale multisite general practice organisation showed that it may be better placed to implement safety and quality processes than conventional practices. However, unintended consequences may arise, such as perceptions of disenfranchisement among staff and reductions in continuity of care. Good-quality evidence of the impacts of scaling up general practice provider organisations in England is scarce. As more general practice collaborations emerge, evaluation of their impacts will be important to understand which work, in which settings, how, and why. © British Journal of General Practice 2018.

  9. A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions

    PubMed Central

    Taylor, Richard L.; Bentley, Christopher D. B.; Pedernales, Julen S.; Lamata, Lucas; Solano, Enrique; Carvalho, André R. R.; Hope, Joseph J.

    2017-01-01

    Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyse how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyze the performance of a large-scale digital simulator, and find that fidelity of around 70% is realizable for π-pulse infidelities below 10−5 in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period. PMID:28401945

  10. A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions.

    PubMed

    Taylor, Richard L; Bentley, Christopher D B; Pedernales, Julen S; Lamata, Lucas; Solano, Enrique; Carvalho, André R R; Hope, Joseph J

    2017-04-12

    Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyse how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyze the performance of a large-scale digital simulator, and find that fidelity of around 70% is realizable for π-pulse infidelities below 10 -5 in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period.

  11. Uneven flows: On cosmic bulk flows, local observers, and gravity

    NASA Astrophysics Data System (ADS)

    Hellwing, Wojciech A.; Bilicki, Maciej; Libeskind, Noam I.

    2018-05-01

    Using N -body simulations we study the impact of various systematic effects on the low-order moments of the cosmic velocity field: the bulk flow (BF) and the cosmic Mach number (CMN). We consider two types of systematics: those related to survey properties and those induced by the observer's location in the Universe. In the former category we model sparse sampling, velocity errors, and survey incompleteness (radial and geometrical). In the latter, we consider local group (LG) analogue observers, placed in a specific location within the cosmic web, satisfying various observational criteria. We differentiate such LG observers from Copernican ones, who are at random locations. We report strong systematic effects on the measured BF and CMN induced by sparse sampling, velocity errors and radial incompleteness. For BF most of these effects exceed 10% for scales R ≲100 h-1 Mpc . For CMN some of these systematics can be catastrophically large (i.e., >50 %) also on bigger scales. Moreover, we find that the position of the observer in the cosmic web significantly affects the locally measured BF (CMN), with effects as large as ˜20 % (30 % ) at R ≲50 h-1 Mpc for a LG-like observer as compared to a random one. This effect is comparable to the sample variance at the same scales. Such location-dependent effects have not been considered previously in BF and CMN studies and here we report their magnitude and scale for the first time. To highlight the importance of these systematics, we additionally study a model of modified gravity with ˜15 % enhanced growth rate (compared to general relativity). We found that the systematic effects can mimic the modified gravity signal. The worst-case scenario is realized for a case of a LG-like observer, when the effects induced by local structures are degenerate with the enhanced growth rate fostered by modified gravity. Our results indicate that dedicated constrained simulations and realistic mock galaxy catalogs will be absolutely necessary to fully benefit from the statistical power of the forthcoming peculiar velocity data from surveys such as TAIPAN, WALLABY, COSMICFLOWS-4 and SKA.

  12. An Open-Source Galaxy Redshift Survey Simulator for next-generation Large Scale Structure Surveys

    NASA Astrophysics Data System (ADS)

    Seijak, Uros

    Galaxy redshift surveys produce three-dimensional maps of the galaxy distribution. On large scales these maps trace the underlying matter fluctuations in a relatively simple manner, so that the properties of the primordial fluctuations along with the overall expansion history and growth of perturbations can be extracted. The BAO standard ruler method to measure the expansion history of the universe using galaxy redshift surveys is thought to be robust to observational artifacts and understood theoretically with high precision. These same surveys can offer a host of additional information, including a measurement of the growth rate of large scale structure through redshift space distortions, the possibility of measuring the sum of neutrino masses, tighter constraints on the expansion history through the Alcock-Paczynski effect, and constraints on the scale-dependence and non-Gaussianity of the primordial fluctuations. Extracting this broadband clustering information hinges on both our ability to minimize and subtract observational systematics to the observed galaxy power spectrum, and our ability to model the broadband behavior of the observed galaxy power spectrum with exquisite precision. Rapid development on both fronts is required to capitalize on WFIRST's data set. We propose to develop an open-source computational toolbox that will propel development in both areas by connecting large scale structure modeling and instrument and survey modeling with the statistical inference process. We will use the proposed simulator to both tailor perturbation theory and fully non-linear models of the broadband clustering of WFIRST galaxies and discover novel observables in the non-linear regime that are robust to observational systematics and able to distinguish between a wide range of spatial and dynamic biasing models for the WFIRST galaxy redshift survey sources. We have demonstrated the utility of this approach in a pilot study of the SDSS-III BOSS galaxies, in which we improved the redshift space distortion growth rate measurement precision by a factor of 2.5 using customized clustering statistics in the non-linear regime that were immunized against observational systematics. We look forward to addressing the unique challenges of modeling and empirically characterizing the WFIRST galaxies and observational systematics.

  13. Constraining the baryon-dark matter relative velocity with the large-scale 3-point correlation function of the SDSS BOSS DR12 CMASS galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slepian, Zachary; Slosar, Anze; Eisenstein, Daniel J.

    We search for a galaxy clustering bias due to a modulation of galaxy number with the baryon-dark matter relative velocity resulting from recombination-era physics. We find no detected signal and place the constraint bv <0.01 on the relative velocity bias for the CMASS galaxies. This bias is an important potential systematic of Baryon Acoustic Oscillation (BAO) method measurements of the cosmic distance scale using the 2-point clustering. Our limit on the relative velocity bias indicates a systematic shift of no more than 0.3% rms in the distance scale inferred from the BAO feature in the BOSS 2-point clustering, well belowmore » the 1% statistical error of this measurement. In conclusion, this constraint is the most stringent currently available and has important implications for the ability of upcoming large-scale structure surveys such as DESI to self-protect against the relative velocity as a possible systematic.« less

  14. Constraining the baryon-dark matter relative velocity with the large-scale 3-point correlation function of the SDSS BOSS DR12 CMASS galaxies

    DOE PAGES

    Slepian, Zachary; Slosar, Anze; Eisenstein, Daniel J.; ...

    2017-10-24

    We search for a galaxy clustering bias due to a modulation of galaxy number with the baryon-dark matter relative velocity resulting from recombination-era physics. We find no detected signal and place the constraint bv <0.01 on the relative velocity bias for the CMASS galaxies. This bias is an important potential systematic of Baryon Acoustic Oscillation (BAO) method measurements of the cosmic distance scale using the 2-point clustering. Our limit on the relative velocity bias indicates a systematic shift of no more than 0.3% rms in the distance scale inferred from the BAO feature in the BOSS 2-point clustering, well belowmore » the 1% statistical error of this measurement. In conclusion, this constraint is the most stringent currently available and has important implications for the ability of upcoming large-scale structure surveys such as DESI to self-protect against the relative velocity as a possible systematic.« less

  15. Constraining the baryon-dark matter relative velocity with the large-scale three-point correlation function of the SDSS BOSS DR12 CMASS galaxies

    NASA Astrophysics Data System (ADS)

    Slepian, Zachary; Eisenstein, Daniel J.; Blazek, Jonathan A.; Brownstein, Joel R.; Chuang, Chia-Hsun; Gil-Marín, Héctor; Ho, Shirley; Kitaura, Francisco-Shu; McEwen, Joseph E.; Percival, Will J.; Ross, Ashley J.; Rossi, Graziano; Seo, Hee-Jong; Slosar, Anže; Vargas-Magaña, Mariana

    2018-02-01

    We search for a galaxy clustering bias due to a modulation of galaxy number with the baryon-dark matter relative velocity resulting from recombination-era physics. We find no detected signal and place the constraint bv < 0.01 on the relative velocity bias for the CMASS galaxies. This bias is an important potential systematic of baryon acoustic oscillation (BAO) method measurements of the cosmic distance scale using the two-point clustering. Our limit on the relative velocity bias indicates a systematic shift of no more than 0.3 per cent rms in the distance scale inferred from the BAO feature in the BOSS two-point clustering, well below the 1 per cent statistical error of this measurement. This constraint is the most stringent currently available and has important implications for the ability of upcoming large-scale structure surveys such as the Dark Energy Spectroscopic Instrument (DESI) to self-protect against the relative velocity as a possible systematic.

  16. United States Temperature and Precipitation Extremes: Phenomenology, Large-Scale Organization, Physical Mechanisms and Model Representation

    NASA Astrophysics Data System (ADS)

    Black, R. X.

    2017-12-01

    We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.

  17. Recovery of Large Angular Scale CMB Polarization for Instruments Employing Variable-Delay Polarization Modulators

    NASA Technical Reports Server (NTRS)

    Miller, N. J.; Chuss, D. T.; Marriage, T. A.; Wollack, E. J.; Appel, J. W.; Bennett, C. L.; Eimer, J.; Essinger-Hileman, T.; Fixsen, D. J.; Harrington, K.; hide

    2016-01-01

    Variable-delay Polarization Modulators (VPMs) are currently being implemented in experiments designed to measure the polarization of the cosmic microwave background on large angular scales because of their capability for providing rapid, front-end polarization modulation and control over systematic errors. Despite the advantages provided by the VPM, it is important to identify and mitigate any time-varying effects that leak into the synchronously modulated component of the signal. In this paper, the effect of emission from a 300 K VPM on the system performance is considered and addressed. Though instrument design can greatly reduce the influence of modulated VPM emission, some residual modulated signal is expected. VPM emission is treated in the presence of rotational misalignments and temperature variation. Simulations of time-ordered data are used to evaluate the effect of these residual errors on the power spectrum. The analysis and modeling in this paper guides experimentalists on the critical aspects of observations using VPMs as front-end modulators. By implementing the characterizations and controls as described, front-end VPM modulation can be very powerful for mitigating 1/ f noise in large angular scale polarimetric surveys. None of the systematic errors studied fundamentally limit the detection and characterization of B-modes on large scales for a tensor-to-scalar ratio of r= 0.01. Indeed, r less than 0.01 is achievable with commensurately improved characterizations and controls.

  18. Systematic methods for defining coarse-grained maps in large biomolecules.

    PubMed

    Zhang, Zhiyong

    2015-01-01

    Large biomolecules are involved in many important biological processes. It would be difficult to use large-scale atomistic molecular dynamics (MD) simulations to study the functional motions of these systems because of the computational expense. Therefore various coarse-grained (CG) approaches have attracted rapidly growing interest, which enable simulations of large biomolecules over longer effective timescales than all-atom MD simulations. The first issue in CG modeling is to construct CG maps from atomic structures. In this chapter, we review the recent development of a novel and systematic method for constructing CG representations of arbitrarily complex biomolecules, in order to preserve large-scale and functionally relevant essential dynamics (ED) at the CG level. In this ED-CG scheme, the essential dynamics can be characterized by principal component analysis (PCA) on a structural ensemble, or elastic network model (ENM) of a single atomic structure. Validation and applications of the method cover various biological systems, such as multi-domain proteins, protein complexes, and even biomolecular machines. The results demonstrate that the ED-CG method may serve as a very useful tool for identifying functional dynamics of large biomolecules at the CG level.

  19. Enablers and barriers to large-scale uptake of improved solid fuel stoves: a systematic review.

    PubMed

    Rehfuess, Eva A; Puzzolo, Elisa; Stanistreet, Debbi; Pope, Daniel; Bruce, Nigel G

    2014-02-01

    Globally, 2.8 billion people rely on household solid fuels. Reducing the resulting adverse health, environmental, and development consequences will involve transitioning through a mix of clean fuels and improved solid fuel stoves (IS) of demonstrable effectiveness. To date, achieving uptake of IS has presented significant challenges. We performed a systematic review of factors that enable or limit large-scale uptake of IS in low- and middle-income countries. We conducted systematic searches through multidisciplinary databases, specialist websites, and consulting experts. The review drew on qualitative, quantitative, and case studies and used standardized methods for screening, data extraction, critical appraisal, and synthesis. We summarized our findings as "factors" relating to one of seven domains-fuel and technology characteristics; household and setting characteristics; knowledge and perceptions; finance, tax, and subsidy aspects; market development; regulation, legislation, and standards; programmatic and policy mechanisms-and also recorded issues that impacted equity. We identified 31 factors influencing uptake from 57 studies conducted in Asia, Africa, and Latin America. All domains matter. Although factors such as offering technologies that meet household needs and save fuel, user training and support, effective financing, and facilitative government action appear to be critical, none guarantee success: All factors can be influential, depending on context. The nature of available evidence did not permit further prioritization. Achieving adoption and sustained use of IS at a large scale requires that all factors, spanning household/community and program/societal levels, be assessed and supported by policy. We propose a planning tool that would aid this process and suggest further research to incorporate an evaluation of effectiveness.

  20. The Diversity of School Organizational Configurations

    ERIC Educational Resources Information Center

    Lee, Linda C.

    2013-01-01

    School reform on a large scale has largely been unsuccessful. Approaches designed to document and understand the variety of organizational conditions that comprise our school systems are needed so that reforms can be tailored and results scaled. Therefore, this article develops a configurational framework that allows a systematic analysis of many…

  1. Personality in 100,000 Words: A large-scale analysis of personality and word use among bloggers

    PubMed Central

    Yarkoni, Tal

    2010-01-01

    Previous studies have found systematic associations between personality and individual differences in word use. Such studies have typically focused on broad associations between major personality domains and aggregate word categories, potentially masking more specific associations. Here I report the results of a large-scale analysis of personality and word use in a large sample of blogs (N=694). The size of the dataset enabled pervasive correlations with personality to be identified for a broad range of lexical variables, including both aggregate word categories and individual English words. The results replicated category-level findings from previous offline studies, identified numerous novel associations at both a categorical and single-word level, and underscored the value of complementary approaches to the study of personality and word use. PMID:20563301

  2. Computational study of 3-D hot-spot initiation in shocked insensitive high-explosive

    NASA Astrophysics Data System (ADS)

    Najjar, F. M.; Howard, W. M.; Fried, L. E.; Manaa, M. R.; Nichols, A., III; Levesque, G.

    2012-03-01

    High-explosive (HE) material consists of large-sized grains with micron-sized embedded impurities and pores. Under various mechanical/thermal insults, these pores collapse generating hightemperature regions leading to ignition. A hydrodynamic study has been performed to investigate the mechanisms of pore collapse and hot spot initiation in TATB crystals, employing a multiphysics code, ALE3D, coupled to the chemistry module, Cheetah. This computational study includes reactive dynamics. Two-dimensional high-resolution large-scale meso-scale simulations have been performed. The parameter space is systematically studied by considering various shock strengths, pore diameters and multiple pore configurations. Preliminary 3-D simulations are undertaken to quantify the 3-D dynamics.

  3. Towards resolving the complete fern tree of life.

    PubMed

    Lehtonen, Samuli

    2011-01-01

    In the past two decades, molecular systematic studies have revolutionized our understanding of the evolutionary history of ferns. The availability of large molecular data sets together with efficient computer algorithms, now enables us to reconstruct evolutionary histories with previously unseen completeness. Here, the most comprehensive fern phylogeny to date, representing over one-fifth of the extant global fern diversity, is inferred based on four plastid genes. Parsimony and maximum-likelihood analyses provided a mostly congruent results and in general supported the prevailing view on the higher-level fern systematics. At a deep phylogenetic level, the position of horsetails depended on the optimality criteria chosen, with horsetails positioned as the sister group either of Marattiopsida-Polypodiopsida clade or of the Polypodiopsida. The analyses demonstrate the power of using a 'supermatrix' approach to resolve large-scale phylogenies and reveal questionable taxonomies. These results provide a valuable background for future research on fern systematics, ecology, biogeography and other evolutionary studies.

  4. As a Matter of Force—Systematic Biases in Idealized Turbulence Simulations

    NASA Astrophysics Data System (ADS)

    Grete, Philipp; O’Shea, Brian W.; Beckwith, Kris

    2018-05-01

    Many astrophysical systems encompass very large dynamical ranges in space and time, which are not accessible by direct numerical simulations. Thus, idealized subvolumes are often used to study small-scale effects including the dynamics of turbulence. These turbulent boxes require an artificial driving in order to mimic energy injection from large-scale processes. In this Letter, we show and quantify how the autocorrelation time of the driving and its normalization systematically change the properties of an isothermal compressible magnetohydrodynamic flow in the sub- and supersonic regime and affect astrophysical observations such as Faraday rotation. For example, we find that δ-in-time forcing with a constant energy injection leads to a steeper slope in kinetic energy spectrum and less-efficient small-scale dynamo action. In general, we show that shorter autocorrelation times require more power in the acceleration field, which results in more power in compressive modes that weaken the anticorrelation between density and magnetic field strength. Thus, derived observables, such as the line-of-sight (LOS) magnetic field from rotation measures, are systematically biased by the driving mechanism. We argue that δ-in-time forcing is unrealistic and numerically unresolved, and conclude that special care needs to be taken in interpreting observational results based on the use of idealized simulations.

  5. Psychiatric Illness in a Cohort of Adults with Prader-Willi Syndrome

    ERIC Educational Resources Information Center

    Sinnema, Margje; Boer, Harm; Collin, Philippe; Maaskant, Marian A.; van Roozendaal, Kees E. P.; Schrander-Stumpel, Constance T. R. M.; Curfs, Leopold M. G.

    2011-01-01

    Previous studies have suggested an association between PWS and comorbid psychiatric illness. Data on prevalence rates of psychopathology is still scarce. This paper describes a large-scale, systematic study investigating the prevalence of psychiatric illness in a Dutch adult PWS cohort. One hundred and two individuals were screened for psychiatric…

  6. Large-Scale Assessments of Students' Learning and Education Policy: Synthesising Evidence across World Regions

    ERIC Educational Resources Information Center

    Tobin, Mollie; Nugroho, Dita; Lietz, Petra

    2016-01-01

    This article synthesises findings from two systematic reviews that examined evidence of the link between large-scale assessments (LSAs) and education policy in economically developing countries and in countries of the Asia-Pacific. Analyses summarise evidence of assessment characteristics and policy goals of LSAs that influence education policy,…

  7. A roadmap for natural product discovery based on large-scale genomics and metabolomics

    USDA-ARS?s Scientific Manuscript database

    Actinobacteria encode a wealth of natural product biosynthetic gene clusters, whose systematic study is complicated by numerous repetitive motifs. By combining several metrics we developed a method for global classification of these gene clusters into families (GCFs) and analyzed the biosynthetic ca...

  8. Validation of a common data model for active safety surveillance research

    PubMed Central

    Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E

    2011-01-01

    Objective Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893

  9. Particle Acceleration in Mildly Relativistic Shearing Flows: The Interplay of Systematic and Stochastic Effects, and the Origin of the Extended High-energy Emission in AGN Jets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ruo-Yu; Rieger, F. M.; Aharonian, F. A., E-mail: ruoyu@mpi-hd.mpg.de, E-mail: frank.rieger@mpi-hd.mpg.de, E-mail: aharon@mpi-hd.mpg.de

    The origin of the extended X-ray emission in the large-scale jets of active galactic nuclei (AGNs) poses challenges to conventional models of acceleration and emission. Although electron synchrotron radiation is considered the most feasible radiation mechanism, the formation of the continuous large-scale X-ray structure remains an open issue. As astrophysical jets are expected to exhibit some turbulence and shearing motion, we here investigate the potential of shearing flows to facilitate an extended acceleration of particles and evaluate its impact on the resultant particle distribution. Our treatment incorporates systematic shear and stochastic second-order Fermi effects. We show that for typical parametersmore » applicable to large-scale AGN jets, stochastic second-order Fermi acceleration, which always accompanies shear particle acceleration, can play an important role in facilitating the whole process of particle energization. We study the time-dependent evolution of the resultant particle distribution in the presence of second-order Fermi acceleration, shear acceleration, and synchrotron losses using a simple Fokker–Planck approach and provide illustrations for the possible emergence of a complex (multicomponent) particle energy distribution with different spectral branches. We present examples for typical parameters applicable to large-scale AGN jets, indicating the relevance of the underlying processes for understanding the extended X-ray emission and the origin of ultrahigh-energy cosmic rays.« less

  10. RECOVERY OF LARGE ANGULAR SCALE CMB POLARIZATION FOR INSTRUMENTS EMPLOYING VARIABLE-DELAY POLARIZATION MODULATORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, N. J.; Marriage, T. A.; Appel, J. W.

    2016-02-20

    Variable-delay Polarization Modulators (VPMs) are currently being implemented in experiments designed to measure the polarization of the cosmic microwave background on large angular scales because of their capability for providing rapid, front-end polarization modulation and control over systematic errors. Despite the advantages provided by the VPM, it is important to identify and mitigate any time-varying effects that leak into the synchronously modulated component of the signal. In this paper, the effect of emission from a 300 K VPM on the system performance is considered and addressed. Though instrument design can greatly reduce the influence of modulated VPM emission, some residualmore » modulated signal is expected. VPM emission is treated in the presence of rotational misalignments and temperature variation. Simulations of time-ordered data are used to evaluate the effect of these residual errors on the power spectrum. The analysis and modeling in this paper guides experimentalists on the critical aspects of observations using VPMs as front-end modulators. By implementing the characterizations and controls as described, front-end VPM modulation can be very powerful for mitigating 1/f noise in large angular scale polarimetric surveys. None of the systematic errors studied fundamentally limit the detection and characterization of B-modes on large scales for a tensor-to-scalar ratio of r = 0.01. Indeed, r < 0.01 is achievable with commensurately improved characterizations and controls.« less

  11. The Effects of Run-of-River Hydroelectric Power Schemes on Fish Community Composition in Temperate Streams and Rivers

    PubMed Central

    2016-01-01

    The potential environmental impacts of large-scale storage hydroelectric power (HEP) schemes have been well-documented in the literature. In Europe, awareness of these potential impacts and limited opportunities for politically-acceptable medium- to large-scale schemes, have caused attention to focus on smaller-scale HEP schemes, particularly run-of-river (ROR) schemes, to contribute to meeting renewable energy targets. Run-of-river HEP schemes are often presumed to be less environmentally damaging than large-scale storage HEP schemes. However, there is currently a lack of peer-reviewed studies on their physical and ecological impact. The aim of this article was to investigate the effects of ROR HEP schemes on communities of fish in temperate streams and rivers, using a Before-After, Control-Impact (BACI) study design. The study makes use of routine environmental surveillance data collected as part of long-term national and international monitoring programmes at 23 systematically-selected ROR HEP schemes and 23 systematically-selected paired control sites. Six area-normalised metrics of fish community composition were analysed using a linear mixed effects model (number of species, number of fish, number of Atlantic salmon—Salmo salar, number of >1 year old Atlantic salmon, number of brown trout—Salmo trutta, and number of >1 year old brown trout). The analyses showed that there was a statistically significant effect (p<0.05) of ROR HEP construction and operation on the number of species. However, no statistically significant effects were detected on the other five metrics of community composition. The implications of these findings are discussed in this article and recommendations are made for best-practice study design for future fish community impact studies. PMID:27191717

  12. The Effects of Run-of-River Hydroelectric Power Schemes on Fish Community Composition in Temperate Streams and Rivers.

    PubMed

    Bilotta, Gary S; Burnside, Niall G; Gray, Jeremy C; Orr, Harriet G

    2016-01-01

    The potential environmental impacts of large-scale storage hydroelectric power (HEP) schemes have been well-documented in the literature. In Europe, awareness of these potential impacts and limited opportunities for politically-acceptable medium- to large-scale schemes, have caused attention to focus on smaller-scale HEP schemes, particularly run-of-river (ROR) schemes, to contribute to meeting renewable energy targets. Run-of-river HEP schemes are often presumed to be less environmentally damaging than large-scale storage HEP schemes. However, there is currently a lack of peer-reviewed studies on their physical and ecological impact. The aim of this article was to investigate the effects of ROR HEP schemes on communities of fish in temperate streams and rivers, using a Before-After, Control-Impact (BACI) study design. The study makes use of routine environmental surveillance data collected as part of long-term national and international monitoring programmes at 23 systematically-selected ROR HEP schemes and 23 systematically-selected paired control sites. Six area-normalised metrics of fish community composition were analysed using a linear mixed effects model (number of species, number of fish, number of Atlantic salmon-Salmo salar, number of >1 year old Atlantic salmon, number of brown trout-Salmo trutta, and number of >1 year old brown trout). The analyses showed that there was a statistically significant effect (p<0.05) of ROR HEP construction and operation on the number of species. However, no statistically significant effects were detected on the other five metrics of community composition. The implications of these findings are discussed in this article and recommendations are made for best-practice study design for future fish community impact studies.

  13. Sloan Digital Sky Survey III photometric quasar clustering: probing the initial conditions of the Universe

    NASA Astrophysics Data System (ADS)

    Ho, Shirley; Agarwal, Nishant; Myers, Adam D.; Lyons, Richard; Disbrow, Ashley; Seo, Hee-Jong; Ross, Ashley; Hirata, Christopher; Padmanabhan, Nikhil; O'Connell, Ross; Huff, Eric; Schlegel, David; Slosar, Anže; Weinberg, David; Strauss, Michael; Ross, Nicholas P.; Schneider, Donald P.; Bahcall, Neta; Brinkmann, J.; Palanque-Delabrouille, Nathalie; Yèche, Christophe

    2015-05-01

    The Sloan Digital Sky Survey has surveyed 14,555 square degrees of the sky, and delivered over a trillion pixels of imaging data. We present the large-scale clustering of 1.6 million quasars between z=0.5 and z=2.5 that have been classified from this imaging, representing the highest density of quasars ever studied for clustering measurements. This data set spans 0~ 11,00 square degrees and probes a volume of 80 h-3 Gpc3. In principle, such a large volume and medium density of tracers should facilitate high-precision cosmological constraints. We measure the angular clustering of photometrically classified quasars using an optimal quadratic estimator in four redshift slices with an accuracy of ~ 25% over a bin width of δl ~ 10-15 on scales corresponding to matter-radiation equality and larger (0l ~ 2-3). Observational systematics can strongly bias clustering measurements on large scales, which can mimic cosmologically relevant signals such as deviations from Gaussianity in the spectrum of primordial perturbations. We account for systematics by employing a new method recently proposed by Agarwal et al. (2014) to the clustering of photometrically classified quasars. We carefully apply our methodology to mitigate known observational systematics and further remove angular bins that are contaminated by unknown systematics. Combining quasar data with the photometric luminous red galaxy (LRG) sample of Ross et al. (2011) and Ho et al. (2012), and marginalizing over all bias and shot noise-like parameters, we obtain a constraint on local primordial non-Gaussianity of fNL = -113+154-154 (1σ error). We next assume that the bias of quasar and galaxy distributions can be obtained independently from quasar/galaxy-CMB lensing cross-correlation measurements (such as those in Sherwin et al. (2013)). This can be facilitated by spectroscopic observations of the sources, enabling the redshift distribution to be completely determined, and allowing precise estimates of the bias parameters. In this paper, if the bias and shot noise parameters are fixed to their known values (which we model by fixing them to their best-fit Gaussian values), we find that the error bar reduces to 1σ simeq 65. We expect this error bar to reduce further by at least another factor of five if the data is free of any observational systematics. We therefore emphasize that in order to make best use of large scale structure data we need an accurate modeling of known systematics, a method to mitigate unknown systematics, and additionally independent theoretical models or observations to probe the bias of dark matter halos.

  14. Scale dependence of halo bispectrum from non-Gaussian initial conditions in cosmological N-body simulations

    NASA Astrophysics Data System (ADS)

    Nishimichi, Takahiro; Taruya, Atsushi; Koyama, Kazuya; Sabiu, Cristiano

    2010-07-01

    We study the halo bispectrum from non-Gaussian initial conditions. Based on a set of large N-body simulations starting from initial density fields with local type non-Gaussianity, we find that the halo bispectrum exhibits a strong dependence on the shape and scale of Fourier space triangles near squeezed configurations at large scales. The amplitude of the halo bispectrum roughly scales as fNL2. The resultant scaling on the triangular shape is consistent with that predicted by Jeong & Komatsu based on perturbation theory. We systematically investigate this dependence with varying redshifts and halo mass thresholds. It is shown that the fNL dependence of the halo bispectrum is stronger for more massive haloes at higher redshifts. This feature can be a useful discriminator of inflation scenarios in future deep and wide galaxy redshift surveys.

  15. Impact of lateral boundary conditions on regional analyses

    NASA Astrophysics Data System (ADS)

    Chikhar, Kamel; Gauthier, Pierre

    2017-04-01

    Regional and global climate models are usually validated by comparison to derived observations or reanalyses. Using a model in data assimilation results in a direct comparison to observations to produce its own analyses that may reveal systematic errors. In this study, regional analyses over North America are produced based on the fifth-generation Canadian Regional Climate Model (CRCM5) combined with the variational data assimilation system of the Meteorological Service of Canada (MSC). CRCM5 is driven at its boundaries by global analyses from ERA-interim or produced with the global configuration of the CRCM5. Assimilation cycles for the months of January and July 2011 revealed systematic errors in winter through large values in the mean analysis increments. This bias is attributed to the coupling of the lateral boundary conditions of the regional model with the driving data particularly over the northern boundary where a rapidly changing large scale circulation created significant cross-boundary flows. Increasing the time frequency of the lateral driving and applying a large-scale spectral nudging improved significantly the circulation through the lateral boundaries which translated in a much better agreement with observations.

  16. The efficacy of cognitive prosthetic technology for people with memory impairments: a systematic review and meta-analysis.

    PubMed

    Jamieson, Matthew; Cullen, Breda; McGee-Lennon, Marilyn; Brewster, Stephen; Evans, Jonathan J

    2014-01-01

    Technology can compensate for memory impairment. The efficacy of assistive technology for people with memory difficulties and the methodology of selected studies are assessed. A systematic search was performed and all studies that investigated the impact of technology on memory performance for adults with impaired memory resulting from acquired brain injury (ABI) or a degenerative disease were included. Two 10-point scales were used to compare each study to an ideally reported single case experimental design (SCED) study (SCED scale; Tate et al., 2008 ) or randomised control group study (PEDro-P scale; Maher, Sherrington, Herbert, Moseley, & Elkins, 2003 ). Thirty-two SCED (mean = 5.9 on the SCED scale) and 11 group studies (mean = 4.45 on the PEDro-P scale) were found. Baseline and intervention performance for each participant in the SCED studies was re-calculated using non-overlap of all pairs (Parker & Vannest, 2009 ) giving a mean score of 0.85 on a 0 to 1 scale (17 studies, n = 36). A meta-analysis of the efficacy of technology vs. control in seven group studies gave a large effect size (d = 1.27) (n = 147). It was concluded that prosthetic technology can improve performance on everyday tasks requiring memory. There is a specific need for investigations of technology for people with degenerative diseases.

  17. Multiscale solvers and systematic upscaling in computational physics

    NASA Astrophysics Data System (ADS)

    Brandt, A.

    2005-07-01

    Multiscale algorithms can overcome the scale-born bottlenecks that plague most computations in physics. These algorithms employ separate processing at each scale of the physical space, combined with interscale iterative interactions, in ways which use finer scales very sparingly. Having been developed first and well known as multigrid solvers for partial differential equations, highly efficient multiscale techniques have more recently been developed for many other types of computational tasks, including: inverse PDE problems; highly indefinite (e.g., standing wave) equations; Dirac equations in disordered gauge fields; fast computation and updating of large determinants (as needed in QCD); fast integral transforms; integral equations; astrophysics; molecular dynamics of macromolecules and fluids; many-atom electronic structures; global and discrete-state optimization; practical graph problems; image segmentation and recognition; tomography (medical imaging); fast Monte-Carlo sampling in statistical physics; and general, systematic methods of upscaling (accurate numerical derivation of large-scale equations from microscopic laws).

  18. Large-scale optimization-based non-negative computational framework for diffusion equations: Parallel implementation and performance studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.

    It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, usedmore » for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on the state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.« less

  19. Large-scale optimization-based non-negative computational framework for diffusion equations: Parallel implementation and performance studies

    DOE PAGES

    Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.

    2016-07-26

    It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, usedmore » for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on the state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.« less

  20. Do large-scale hospital- and system-wide interventions improve patient outcomes: a systematic review.

    PubMed

    Clay-Williams, Robyn; Nosrati, Hadis; Cunningham, Frances C; Hillman, Kenneth; Braithwaite, Jeffrey

    2014-09-03

    While health care services are beginning to implement system-wide patient safety interventions, evidence on the efficacy of these interventions is sparse. We know that uptake can be variable, but we do not know the factors that affect uptake or how the interventions establish change and, in particular, whether they influence patient outcomes. We conducted a systematic review to identify how organisational and cultural factors mediate or are mediated by hospital-wide interventions, and to assess the effects of those factors on patient outcomes. A systematic review was conducted and reported in accordance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Database searches were conducted using MEDLINE from 1946, CINAHL from 1991, EMBASE from 1947, Web of Science from 1934, PsycINFO from 1967, and Global Health from 1910 to September 2012. The Lancet, JAMA, BMJ, BMJ Quality and Safety, The New England Journal of Medicine and Implementation Science were also hand searched for relevant studies published over the last 5 years. Eligible studies were required to focus on organisational determinants of hospital- and system-wide interventions, and to provide patient outcome data before and after implementation of the intervention. Empirical, peer-reviewed studies reporting randomised and non-randomised controlled trials, observational, and controlled before and after studies were included in the review. Six studies met the inclusion criteria. Improved outcomes were observed for studies where outcomes were measured at least two years after the intervention. Associations between organisational factors, intervention success and patient outcomes were undetermined: organisational culture and patient outcomes were rarely measured together, and measures for culture and outcome were not standardised. Common findings show the difficulty of introducing large-scale interventions, and that effective leadership and clinical champions, adequate financial and educational resources, and dedicated promotional activities appear to be common factors in successful system-wide change.The protocol has been registered in the international prospective register of systematic reviews, PROSPERO (Registration No. CRD42103003050).

  1. Monitoring Forest Condition in Europe: Impacts of Nitrogen and Sulfur Depositions on Forest Ecosystems

    Treesearch

    M. Lorenz; G. Becher; V. Mues; E. Ulrich

    2006-01-01

    Forest condition in Europe has been monitored over 19 years jointly by the United Nations Economic Commission for Europe (UNECE) and the European Union (EU). Large-scale variations of forest condition over space and time in relation to natural and anthropogenic factors are assessed on about 6,000 plots systematically spread across Europe. This large-scale monitoring...

  2. Systematic observations of the slip pulse properties of large earthquake ruptures

    USGS Publications Warehouse

    Melgar, Diego; Hayes, Gavin

    2017-01-01

    In earthquake dynamics there are two end member models of rupture: propagating cracks and self-healing pulses. These arise due to different properties of faults and have implications for seismic hazard; rupture mode controls near-field strong ground motions. Past studies favor the pulse-like mode of rupture; however, due to a variety of limitations, it has proven difficult to systematically establish their kinematic properties. Here we synthesize observations from a database of >150 rupture models of earthquakes spanning M7–M9 processed in a uniform manner and show the magnitude scaling properties of these slip pulses indicates self-similarity. Further, we find that large and very large events are statistically distinguishable relatively early (at ~15 s) in the rupture process. This suggests that with dense regional geophysical networks strong ground motions from a large rupture can be identified before their onset across the source region.

  3. Development of Systematic Approaches for Calibration of Subsurface Transport Models Using Hard and Soft Data on System Characteristics and Behavior

    DTIC Science & Technology

    2011-02-02

    who graduated during this period and will receive scholarships or fellowships for further studies in science, mathematics, engineering or technology...nature or are collected at discrete points or localized areas in the system. The qualitative data includes, geology , large-scale stratigraphy and

  4. Intercomparison of methods of coupling between convection and large-scale circulation: 2. Comparison over nonuniform surface conditions

    DOE PAGES

    Daleu, C. L.; Plant, R. S.; Woolnough, S. J.; ...

    2016-03-18

    As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. We investigated the sensitivity of each model to changes in SST, given a fixed reference state. We performed a systematic comparison of the WTG and DGW methods in different models, and a systematic comparison ofmore » the behavior of those models using the WTG method and the DGW method. The sensitivity to the SST depends on both the large-scale parameterization method and the choice of the cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or DGW method show an increase of precipitation with SST, while SCMs show sensitivities which are not always monotonic. CRMs using either the WTG or DGW method show a similar relationship between mean precipitation rate and column-relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles which are smoother and less top-heavy compared to those produced by the WTG simulations. Lastly, these large-scale parameterization methods provide a useful tool to identify the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.« less

  5. Risk of relapse after natalizumab withdrawal

    PubMed Central

    Vukusic, Sandra; Casey, Romain; Debard, Nadine; Stankoff, Bruno; Mrejen, Serge; Uhry, Zoe; Van Ganse, Eric; Castot, Anne; Clanet, Michel; Lubetzki, Catherine; Confavreux, Christian

    2016-01-01

    Objective: To assess disease activity within 12 months after natalizumab (NZ) discontinuation in a large French postmarketing cohort. Methods: In France, patients exposed at least once to NZ were included in the TYSEDMUS observational and multicenter cohort, part of the French NZ Risk Management Plan. Clinical disease activity during the year following NZ discontinuation was assessed in this cohort. Time to first relapse after NZ stop was analyzed using Kaplan-Meier method and potentially associated factors were studied using a multivariate Cox model. Results: Out of the 4,055 patients with multiple sclerosis (MS) included in TYSEDMUS, 1,253 discontinued NZ and 715 of them had relevant data for our study. The probability of relapse within the year after NZ stop was estimated at 45% (95% confidence interval 0.41–0.49). Conclusions: This large and systematic survey of patients with MS after NZ withdrawal allows quantifying the risk of increased disease activity following treatment discontinuation. This study provides large-scale, multicenter, systematic data after NZ cessation in real-life settings. PMID:27844037

  6. A systematic approach for the development of a kindergarten-based intervention for the prevention of obesity in preschool age children: the ToyBox-study.

    PubMed

    Manios, Y; Grammatikaki, E; Androutsos, O; Chinapaw, M J M; Gibson, E L; Buijs, G; Iotova, V; Socha, P; Annemans, L; Wildgruber, A; Mouratidou, T; Yngve, A; Duvinage, K; de Bourdeaudhuij, I

    2012-03-01

    The increasing childhood obesity epidemic calls for appropriate measures and effective policies to be applied early in life. Large-scale socioecological frameworks providing a holistic multifactorial and cost-effective approach necessary to support obesity prevention initiatives in this age are however currently missing. To address this missing link, ToyBox-study aims to build and evaluate a cost-effective kindergarten-based, family-involved intervention scheme to prevent obesity in early childhood, which could potentially be expanded on a pan-European scale. A multidisciplinary team of researchers from 10 countries have joined forces and will work to realize this according to a systematic stepwise approach that combines the use of the PRECEDE-PROCEED model and intervention mapping protocol. ToyBox-study will conduct systematic and narrative reviews, secondary data analyses, focus group research and societal assessment to design, implement and evaluate outcome, impact, process and cost effectiveness of the intervention. This is the first time that such a holistic approach has been used on a pan-European scale to promote healthy weight and healthy energy balance-related behaviours for the prevention of early childhood obesity. The results of ToyBox-study will be disseminated among key stakeholders including researchers, policy makers, practitioners and the general population. © 2012 The Authors. obesity reviews © 2012 International Association for the Study of Obesity.

  7. Conservation of lynx in the United States: A systematic approach to closing critical knowledge gaps [Chapter 17

    Treesearch

    Keith B. Aubry; Leonard F. Ruggiero; John R. Squires; Kevin S. McKelvey; Gary M. Koehler; Steven W. Buskirk; Charles J. Krebs

    2000-01-01

    Large-scale ecological studies and assessments are often implemented only after the focus of study generates substantial social, political, or legal pressure to take action (e.g., Thomas et al. 1990; Ruggiero et al. 1991; FEMAT 1993). In such a funding environment, the coordinated planning of research may suffer as the pressure to produce results escalates. To avoid...

  8. U.S. Regional Aquifer Analysis Program

    NASA Astrophysics Data System (ADS)

    Johnson, Ivan

    As a result of the severe 1976-1978 drought, Congress in 1978 requested that the U.S. Geological Survey (USGS) initiate studies of the nation's aquifers on a regional scale. This continuing USGS project, the Regional Aquifer System Analysis (RASA) Program, consists of systematic studies of the quality and quantity of water in the regional groundwater systems that supply a large part of the nation's water.

  9. An Exploratory Analysis of the Longitudinal Impact of Principal Change on Elementary School Achievement

    ERIC Educational Resources Information Center

    Hochbein, Craig; Cunningham, Brittany C.

    2013-01-01

    Recent reform initiatives, such as the Title I School Improvement Grants and Race to the Top, recommended a principal change to jump-start school turnaround. Yet, few educational researchers have examined principal change as way to improve schools in a state of systematic reform; furthermore, no large-scale quantitative study has determined the…

  10. Discussing the Flynn Effect: From Causes and Interpretation to Implications

    ERIC Educational Resources Information Center

    Kanaya, Tomoe

    2016-01-01

    Clark, Lawlor-Savage, and Goghari (this issue) point out that evidence of IQ rises had been documented decades before it was named the Flynn effect. These previous studies, however, were conducted sporadically and in isolated samples. Flynn (1984, 1987) examined them in a large-scale manner and was able to show their systematic and global nature.…

  11. Overview of the OGAP Formative Assessment Project and CPRE's Large-Scale Experimental Study of Implementation and Impacts

    ERIC Educational Resources Information Center

    Supovitz, Jonathan

    2016-01-01

    In this presentation discussed in this brief abstracted report, the author presents about an ongoing partnership with the Philadelphia School District (PSD) to implement and research the Ongoing Assessment Project (OGAP). OGAP is a systematic, intentional and iterative formative assessment system grounded in the research on how students learn…

  12. An Evaluation of the Conditions, Processes, and Consequences of Laptop Computing in K-12 Classrooms

    ERIC Educational Resources Information Center

    Cavanaugh, Cathy; Dawson, Kara; Ritzhaupt, Albert

    2011-01-01

    This article examines how laptop computing technology, teacher professional development, and systematic support resulted in changed teaching practices and increased student achievement in 47 K-12 schools in 11 Florida school districts. The overview of a large-scale study documents the type and magnitude of change in student-centered teaching,…

  13. Educational Interventions for Children with ASD: A Systematic Literature Review 2008-2013

    ERIC Educational Resources Information Center

    Bond, Caroline; Symes, Wendy; Hebron, Judith; Humphrey, Neil; Morewood, Gareth; Woods, Kevin

    2016-01-01

    Systematic literature reviews can play a key role in underpinning evidence-based practice. To date, large-scale reviews of interventions for individuals with Autism Spectrum Disorder (ASD) have focused primarily on research quality. To assist practitioners, the current review adopted a broader framework which allowed for greater consideration of…

  14. Functional Independent Scaling Relation for ORR/OER Catalysts

    DOE PAGES

    Christensen, Rune; Hansen, Heine A.; Dickens, Colin F.; ...

    2016-10-11

    A widely used adsorption energy scaling relation between OH* and OOH* intermediates in the oxygen reduction reaction (ORR) and oxygen evolution reaction (OER), has previously been determined using density functional theory and shown to dictate a minimum thermodynamic overpotential for both reactions. Here, we show that the oxygen–oxygen bond in the OOH* intermediate is, however, not well described with the previously used class of exchange-correlation functionals. By quantifying and correcting the systematic error, an improved description of gaseous peroxide species versus experimental data and a reduction in calculational uncertainty is obtained. For adsorbates, we find that the systematic error largelymore » cancels the vdW interaction missing in the original determination of the scaling relation. An improved scaling relation, which is fully independent of the applied exchange–correlation functional, is obtained and found to differ by 0.1 eV from the original. Lastly, this largely confirms that, although obtained with a method suffering from systematic errors, the previously obtained scaling relation is applicable for predictions of catalytic activity.« less

  15. Assessment of Somatization and Medically Unexplained Symptoms in Later Life

    PubMed Central

    van Driel, T. J. W.; Hilderink, P. H.; Hanssen, D. J. C.; de Boer, P.; Rosmalen, J. G. M.; Oude Voshaar, R. C.

    2017-01-01

    The assessment of medically unexplained symptoms and “somatic symptom disorders” in older adults is challenging due to somatic multimorbidity, which threatens the validity of somatization questionnaires. In a systematic review study, the Patient Health Questionnaire–15 (PHQ-15) and the somatization subscale of the Symptom Checklist 90-item version (SCL-90 SOM) are recommended out of 40 questionnaires for usage in large-scale studies. While both scales measure physical symptoms which in younger persons often refer to unexplained symptoms, in older persons, these symptoms may originate from somatic diseases. Using empirical data, we show that PHQ-15 and SCL-90 SOM among older patients correlate with proxies of somatization as with somatic disease burden. Updating the previous systematic review, revealed six additional questionnaires. Cross-validation studies are needed as none of 46 identified scales met the criteria of suitability for an older population. Nonetheless, specific recommendations can be made for studying older persons, namely the SCL-90 SOM and PHQ-15 for population-based studies, the Freiburg Complaint List and somatization subscale of the Brief Symptom Inventory 53-item version for studies in primary care, and finally the Schedule for Evaluating Persistent Symptoms and Somatic Symptom Experiences Questionnaire for monitoring treatment studies. PMID:28745072

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daleu, C. L.; Plant, R. S.; Woolnough, S. J.

    As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. We investigated the sensitivity of each model to changes in SST, given a fixed reference state. We performed a systematic comparison of the WTG and DGW methods in different models, and a systematic comparison ofmore » the behavior of those models using the WTG method and the DGW method. The sensitivity to the SST depends on both the large-scale parameterization method and the choice of the cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or DGW method show an increase of precipitation with SST, while SCMs show sensitivities which are not always monotonic. CRMs using either the WTG or DGW method show a similar relationship between mean precipitation rate and column-relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles which are smoother and less top-heavy compared to those produced by the WTG simulations. Lastly, these large-scale parameterization methods provide a useful tool to identify the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.« less

  17. Multi-scale comparison of source parameter estimation using empirical Green's function approach

    NASA Astrophysics Data System (ADS)

    Chen, X.; Cheng, Y.

    2015-12-01

    Analysis of earthquake source parameters requires correction of path effect, site response, and instrument responses. Empirical Green's function (EGF) method is one of the most effective methods in removing path effects and station responses by taking the spectral ratio between a larger and smaller event. Traditional EGF method requires identifying suitable event pairs, and analyze each event individually. This allows high quality estimations for strictly selected events, however, the quantity of resolvable source parameters is limited, which challenges the interpretation of spatial-temporal coherency. On the other hand, methods that exploit the redundancy of event-station pairs are proposed, which utilize the stacking technique to obtain systematic source parameter estimations for a large quantity of events at the same time. This allows us to examine large quantity of events systematically, facilitating analysis of spatial-temporal patterns, and scaling relationship. However, it is unclear how much resolution is scarified during this process. In addition to the empirical Green's function calculation, choice of model parameters and fitting methods also lead to biases. Here, using two regional focused arrays, the OBS array in the Mendocino region, and the borehole array in the Salton Sea geothermal field, I compare the results from the large scale stacking analysis, small-scale cluster analysis, and single event-pair analysis with different fitting methods to systematically compare the results within completely different tectonic environment, in order to quantify the consistency and inconsistency in source parameter estimations, and the associated problems.

  18. A systematic review of wheelchair skills tests for manual wheelchair users with a spinal cord injury: towards a standardized outcome measure.

    PubMed

    Fliess-Douer, Osnat; Vanlandewijck, Yves C; Lubel Manor, Galia; Van Der Woude, Lucas H V

    2010-10-01

    To review, analyse, evaluate and critically appraise available wheelchair skill tests in the international literature and to determine the need for a standardized measurement tool of manual wheeled mobility in those with spinal cord injury. A systematic review of literature (databases PubMed, Web of Science and Cochrane Library (1970-December 2009). Hand rim wheelchair users, mainly those with spinal cord injury. Studies' content and methodology were analysed qualitatively. Study quality was assessed using the scale of Gardner and Altman. Thirteen studies fell within the inclusion criteria and were critically reviewed. The 13 studies covered 11 tests, which involved 14 different skills. These 14 skills were categorized into: wheelchair manoeuvring and basic daily living skills; obstacle-negotiating skills; wheelie tasks; and transfers. The Wheelchair Skills Test version 2.4 (WST-2.4) and Wheelchair Circuit tests scored best on the Gardner and Altman scale, the Obstacle Course Assessment of Wheelchair User Performances (OCAWUP) test was found to be the most relevant for daily needs in a wheelchair. The different tests used different measurement scales, varying from binary to ordinal and continuous. Comparison of outcomes between tests was not possible because of differences in skills assessed, measurement scales, environment and equipment selected for each test. A lack of information regarding protocols as well as differences in terminology was also detected. This systematic review revealed large inconsistencies among the current available wheelchair skill tests. This makes it difficult to compare study results and to create norms and standards for wheelchair skill performance.

  19. Using Climate Regionalization to Understand Climate Forecast System Version 2 (CFSv2) Precipitation Performance for the Conterminous United States (CONUS)

    NASA Technical Reports Server (NTRS)

    Regonda, Satish K.; Zaitchik, Benjamin F.; Badr, Hamada S.; Rodell, Matthew

    2016-01-01

    Dynamically based seasonal forecasts are prone to systematic spatial biases due to imperfections in the underlying global climate model (GCM). This can result in low-forecast skill when the GCM misplaces teleconnections or fails to resolve geographic barriers, even if the prediction of large-scale dynamics is accurate. To characterize and address this issue, this study applies objective climate regionalization to identify discrepancies between the Climate Forecast SystemVersion 2 (CFSv2) and precipitation observations across the Contiguous United States (CONUS). Regionalization shows that CFSv2 1 month forecasts capture the general spatial character of warm season precipitation variability but that forecast regions systematically differ from observation in some transition zones. CFSv2 predictive skill for these misclassified areas is systematically reduced relative to correctly regionalized areas and CONUS as a whole. In these incorrectly regionalized areas, higher skill can be obtained by using a regional-scale forecast in place of the local grid cell prediction.

  20. Simulations of hypervelocity impacts for asteroid deflection studies

    NASA Astrophysics Data System (ADS)

    Heberling, T.; Ferguson, J. M.; Gisler, G. R.; Plesko, C. S.; Weaver, R.

    2016-12-01

    The possibility of kinetic-impact deflection of threatening near-earth asteroids will be tested for the first time in the proposed AIDA (Asteroid Impact Deflection Assessment) mission, involving two independent spacecraft, NASAs DART (Double Asteroid Redirection Test) and ESAs AIM (Asteroid Impact Mission). The impact of the DART spacecraft onto the secondary of the binary asteroid 65803 Didymos, at a speed of 5 to 7 km/s, is expected to alter the mutual orbit by an observable amount. The velocity imparted to the secondary depends on the geometry and dynamics of the impact, and especially on the momentum enhancement factor, conventionally called beta. We use the Los Alamos hydrocodes Rage and Pagosa to estimate beta in laboratory-scale benchmark experiments and in the large-scale asteroid deflection test. Simulations are performed in two- and three-dimensions, using a variety of equations of state and strength models for both the lab-scale and large-scale cases. This work is being performed as part of a systematic benchmarking study for the AIDA mission that includes other hydrocodes.

  1. Invertebrate iridoviruses: A glance over the last decade

    USDA-ARS?s Scientific Manuscript database

    Iridovirus is a genus of large dsDNA viruses that predominantly infects both invertebrate and vertebrate ectotherms and whose symptoms range in severity from minor reductions in fitness to systematic disease and large-scale mortality. Several characteristics have been useful for taxonomically classi...

  2. Planck 2015 results. III. LFI systematic uncertainties

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battaglia, P.; Battaner, E.; Benabed, K.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Burigana, C.; Butler, R. C.; Calabrese, E.; Catalano, A.; Christensen, P. R.; Colombo, L. P. L.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Dickinson, C.; Diego, J. M.; Doré, O.; Ducout, A.; Dupac, X.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Frailis, M.; Franceschet, C.; Franceschi, E.; Galeotta, S.; Galli, S.; Ganga, K.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gregorio, A.; Gruppuso, A.; Hansen, F. K.; Harrison, D. L.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Knoche, J.; Krachmalnicoff, N.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Meinhold, P. R.; Mennella, A.; Migliaccio, M.; Mitra, S.; Montier, L.; Morgante, G.; Mortlock, D.; Munshi, D.; Murphy, J. A.; Nati, F.; Natoli, P.; Noviello, F.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Pearson, T. J.; Perdereau, O.; Pettorino, V.; Piacentini, F.; Pointecouteau, E.; Polenta, G.; Pratt, G. W.; Puget, J.-L.; Rachen, J. P.; Reinecke, M.; Remazeilles, M.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Scott, D.; Stolyarov, V.; Stompor, R.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vassallo, T.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Watson, R.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.

    2016-09-01

    We present the current accounting of systematic effect uncertainties for the Low Frequency Instrument (LFI) that are relevant to the 2015 release of the Planck cosmological results, showing the robustness and consistency of our data set, especially for polarization analysis. We use two complementary approaches: (I) simulations based on measured data and physical models of the known systematic effects; and (II) analysis of difference maps containing the same sky signal ("null-maps"). The LFI temperature data are limited by instrumental noise. At large angular scales the systematic effects are below the cosmic microwave background (CMB) temperature power spectrum by several orders of magnitude. In polarization the systematic uncertainties are dominated by calibration uncertainties and compete with the CMB E-modes in the multipole range 10-20. Based on our model of all known systematic effects, we show that these effects introduce a slight bias of around 0.2σ on the reionization optical depth derived from the 70GHz EE spectrum using the 30 and 353GHz channels as foreground templates. At 30GHz the systematic effects are smaller than the Galactic foreground at all scales in temperature and polarization, which allows us to consider this channel as a reliable template of synchrotron emission. We assess the residual uncertainties due to LFI effects on CMB maps and power spectra after component separation and show that these effects are smaller than the CMB amplitude at all scales. We also assess the impact on non-Gaussianity studies and find it to be negligible. Some residuals still appear in null maps from particular sky survey pairs, particularly at 30 GHz, suggesting possible straylight contamination due to an imperfect knowledge of the beam far sidelobes.

  3. Planck 2015 results: III. LFI systematic uncertainties

    DOE PAGES

    Ade, P. A. R.; Aumont, J.; Baccigalupi, C.; ...

    2016-09-20

    In this paper, we present the current accounting of systematic effect uncertainties for the Low Frequency Instrument (LFI) that are relevant to the 2015 release of the Planck cosmological results, showing the robustness and consistency of our data set, especially for polarization analysis. We use two complementary approaches: (i) simulations based on measured data and physical models of the known systematic effects; and (ii) analysis of difference maps containing the same sky signal (“null-maps”). The LFI temperature data are limited by instrumental noise. At large angular scales the systematic effects are below the cosmic microwave background (CMB) temperature power spectrummore » by several orders of magnitude. In polarization the systematic uncertainties are dominated by calibration uncertainties and compete with the CMB E-modes in the multipole range 10–20. Based on our model of all known systematic effects, we show that these effects introduce a slight bias of around 0.2σ on the reionization optical depth derived from the 70GHz EE spectrum using the 30 and 353GHz channels as foreground templates. At 30GHz the systematic effects are smaller than the Galactic foreground at all scales in temperature and polarization, which allows us to consider this channel as a reliable template of synchrotron emission. We assess the residual uncertainties due to LFI effects on CMB maps and power spectra after component separation and show that these effects are smaller than the CMB amplitude at all scales. We also assess the impact on non-Gaussianity studies and find it to be negligible. Finally, some residuals still appear in null maps from particular sky survey pairs, particularly at 30 GHz, suggesting possible straylight contamination due to an imperfect knowledge of the beam far sidelobes.« less

  4. Planck 2015 results: III. LFI systematic uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ade, P. A. R.; Aumont, J.; Baccigalupi, C.

    In this paper, we present the current accounting of systematic effect uncertainties for the Low Frequency Instrument (LFI) that are relevant to the 2015 release of the Planck cosmological results, showing the robustness and consistency of our data set, especially for polarization analysis. We use two complementary approaches: (i) simulations based on measured data and physical models of the known systematic effects; and (ii) analysis of difference maps containing the same sky signal (“null-maps”). The LFI temperature data are limited by instrumental noise. At large angular scales the systematic effects are below the cosmic microwave background (CMB) temperature power spectrummore » by several orders of magnitude. In polarization the systematic uncertainties are dominated by calibration uncertainties and compete with the CMB E-modes in the multipole range 10–20. Based on our model of all known systematic effects, we show that these effects introduce a slight bias of around 0.2σ on the reionization optical depth derived from the 70GHz EE spectrum using the 30 and 353GHz channels as foreground templates. At 30GHz the systematic effects are smaller than the Galactic foreground at all scales in temperature and polarization, which allows us to consider this channel as a reliable template of synchrotron emission. We assess the residual uncertainties due to LFI effects on CMB maps and power spectra after component separation and show that these effects are smaller than the CMB amplitude at all scales. We also assess the impact on non-Gaussianity studies and find it to be negligible. Finally, some residuals still appear in null maps from particular sky survey pairs, particularly at 30 GHz, suggesting possible straylight contamination due to an imperfect knowledge of the beam far sidelobes.« less

  5. Geomorphic analysis of large alluvial rivers

    NASA Astrophysics Data System (ADS)

    Thorne, Colin R.

    2002-05-01

    Geomorphic analysis of a large river presents particular challenges and requires a systematic and organised approach because of the spatial scale and system complexity involved. This paper presents a framework and blueprint for geomorphic studies of large rivers developed in the course of basic, strategic and project-related investigations of a number of large rivers. The framework demonstrates the need to begin geomorphic studies early in the pre-feasibility stage of a river project and carry them through to implementation and post-project appraisal. The blueprint breaks down the multi-layered and multi-scaled complexity of a comprehensive geomorphic study into a number of well-defined and semi-independent topics, each of which can be performed separately to produce a clearly defined, deliverable product. Geomorphology increasingly plays a central role in multi-disciplinary river research and the importance of effective quality assurance makes it essential that audit trails and quality checks are hard-wired into study design. The structured approach presented here provides output products and production trails that can be rigorously audited, ensuring that the results of a geomorphic study can stand up to the closest scrutiny.

  6. Antitoxin Treatment of Inhalation Anthrax: A Systematic Review

    PubMed Central

    Huang, Eileen; Pillai, Satish K.; Bower, William A.; Hendricks, Katherine A.; Guarnizo, Julie T.; Hoyle, Jamechia D.; Gorman, Susan E.; Boyer, Anne E.; Quinn, Conrad P.; Meaney-Delman, Dana

    2016-01-01

    Concern about use of anthrax as a bioweapon prompted development of novel anthrax antitoxins for treatment. Clinical guidelines for the treatment of anthrax recommend antitoxin therapy in combination with intravenous antimicrobials; however, a large-scale or mass anthrax incident may exceed antitoxin availability and create a need for judicious antitoxin use. We conducted a systematic review of antitoxin treatment of inhalation anthrax in humans and experimental animals to inform antitoxin recommendations during a large-scale or mass anthrax incident. A comprehensive search of 11 databases and the FDA website was conducted to identify relevant animal studies and human reports: 28 animal studies and 3 human cases were identified. Antitoxin monotherapy at or shortly after symptom onset demonstrates increased survival compared to no treatment in animals. With early treatment, survival did not differ between antimicrobial monotherapy and antimicrobial-antitoxin therapy in nonhuman primates and rabbits. With delayed treatment, antitoxin-antimicrobial treatment increased rabbit survival. Among human cases, addition of antitoxin to combination antimicrobial treatment was associated with survival in 2 of the 3 cases treated. Despite the paucity of human data, limited animal data suggest that adjunctive antitoxin therapy may improve survival. Delayed treatment studies suggest improved survival with combined antitoxin-antimicrobial therapy, although a survival difference compared with antimicrobial therapy alone was not demonstrated statistically. In a mass anthrax incident with limited antitoxin supplies, antitoxin treatment of individuals who have not demonstrated a clinical benefit from antimicrobials, or those who present with more severe illness, may be warranted. Additional pathophysiology studies are needed, and a point-of-care assay correlating toxin levels with clinical status may provide important information to guide antitoxin use during a large-scale anthrax incident. PMID:26690378

  7. Antitoxin Treatment of Inhalation Anthrax: A Systematic Review.

    PubMed

    Huang, Eileen; Pillai, Satish K; Bower, William A; Hendricks, Katherine A; Guarnizo, Julie T; Hoyle, Jamechia D; Gorman, Susan E; Boyer, Anne E; Quinn, Conrad P; Meaney-Delman, Dana

    2015-01-01

    Concern about use of anthrax as a bioweapon prompted development of novel anthrax antitoxins for treatment. Clinical guidelines for the treatment of anthrax recommend antitoxin therapy in combination with intravenous antimicrobials; however, a large-scale or mass anthrax incident may exceed antitoxin availability and create a need for judicious antitoxin use. We conducted a systematic review of antitoxin treatment of inhalation anthrax in humans and experimental animals to inform antitoxin recommendations during a large-scale or mass anthrax incident. A comprehensive search of 11 databases and the FDA website was conducted to identify relevant animal studies and human reports: 28 animal studies and 3 human cases were identified. Antitoxin monotherapy at or shortly after symptom onset demonstrates increased survival compared to no treatment in animals. With early treatment, survival did not differ between antimicrobial monotherapy and antimicrobial-antitoxin therapy in nonhuman primates and rabbits. With delayed treatment, antitoxin-antimicrobial treatment increased rabbit survival. Among human cases, addition of antitoxin to combination antimicrobial treatment was associated with survival in 2 of the 3 cases treated. Despite the paucity of human data, limited animal data suggest that adjunctive antitoxin therapy may improve survival. Delayed treatment studies suggest improved survival with combined antitoxin-antimicrobial therapy, although a survival difference compared with antimicrobial therapy alone was not demonstrated statistically. In a mass anthrax incident with limited antitoxin supplies, antitoxin treatment of individuals who have not demonstrated a clinical benefit from antimicrobials, or those who present with more severe illness, may be warranted. Additional pathophysiology studies are needed, and a point-of-care assay correlating toxin levels with clinical status may provide important information to guide antitoxin use during a large-scale anthrax incident.

  8. A systematic review of treatments for anxiety in youth with autism spectrum disorders.

    PubMed

    Vasa, Roma A; Carroll, Laura M; Nozzolillo, Alixandra A; Mahajan, Rajneesh; Mazurek, Micah O; Bennett, Amanda E; Wink, Logan K; Bernal, Maria Pilar

    2014-12-01

    This study systematically examined the efficacy and safety of psychopharmacological and non-psychopharmacological treatments for anxiety in youth with autism spectrum disorders (ASD). Four psychopharmacological, nine cognitive behavioral therapy (CBT), and two alternative treatment studies met inclusion criteria. Psychopharmacological studies were descriptive or open label, sometimes did not specify the anxiety phenotype, and reported behavioral activation. Citalopram and buspirone yielded some improvement, whereas fluvoxamine did not. Non-psychopharmacological studies were mainly randomized controlled trials (RCTs) with CBT demonstrating moderate efficacy for anxiety disorders in youth with high functioning ASD. Deep pressure and neurofeedback provided some benefit. All studies were short-term and included small sample sizes. Large scale and long term RCTs examining psychopharmacological and non-psychopharmacological treatments are sorely needed.

  9. A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales

    PubMed Central

    Ayton, Gary S.; Voth, Gregory A.

    2009-01-01

    A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analyticsystematic as one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center of mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well known Gay-Berne ellipsoid of revolution liquid crystal model, and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an “aggressive” CG methodology designed to model multi-component biological membranes at very large length and timescales. PMID:19281167

  10. Bio-stimuli-responsive multi-scale hyaluronic acid nanoparticles for deepened tumor penetration and enhanced therapy.

    PubMed

    Huo, Mengmeng; Li, Wenyan; Chaudhuri, Arka Sen; Fan, Yuchao; Han, Xiu; Yang, Chen; Wu, Zhenghong; Qi, Xiaole

    2017-09-01

    In this study, we developed bio-stimuli-responsive multi-scale hyaluronic acid (HA) nanoparticles encapsulated with polyamidoamine (PAMAM) dendrimers as the subunits. These HA/PAMAM nanoparticles of large scale (197.10±3.00nm) were stable during systematic circulation then enriched at the tumor sites; however, they were prone to be degraded by the high expressed hyaluronidase (HAase) to release inner PAMAM dendrimers and regained a small scale (5.77±0.25nm) with positive charge. After employing tumor spheroids penetration assay on A549 3D tumor spheroids for 8h, the fluorescein isothiocyanate (FITC) labeled multi-scale HA/PAMAM-FITC nanoparticles could penetrate deeply into these tumor spheroids with the degradation of HAase. Moreover, small animal imaging technology in male nude mice bearing H22 tumor showed HA/PAMAM-FITC nanoparticles possess higher prolonged systematic circulation compared with both PAMAM-FITC nanoparticles and free FITC. In addition, after intravenous administration in mice bearing H22 tumors, methotrexate (MTX) loaded multi-scale HA/PAMAM-MTX nanoparticles exhibited a 2.68-fold greater antitumor activity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. ScreenBEAM: a novel meta-analysis algorithm for functional genomics screens via Bayesian hierarchical modeling | Office of Cancer Genomics

    Cancer.gov

    Functional genomics (FG) screens, using RNAi or CRISPR technology, have become a standard tool for systematic, genome-wide loss-of-function studies for therapeutic target discovery. As in many large-scale assays, however, off-target effects, variable reagents' potency and experimental noise must be accounted for appropriately control for false positives.

  12. Demonstration of reduced-order urban scale building energy models

    DOE PAGES

    Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew; ...

    2017-09-08

    The aim of this study is to demonstrate a developed framework to rapidly create urban scale reduced-order building energy models using a systematic summary of the simplifications required for the representation of building exterior and thermal zones. These urban scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies only up to 10% withmore » the selection of different thermal zones. In addition, to assess complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify complexity of the building energy models. Consequently, this study empowers the building energy modelers to quantify their building energy model systematically in order to report the model complexity alongside the building energy model accuracy. An exhaustive analysis on four university campuses suggests that the urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can utilize the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One main benefits of this developed framework is the opportunity for different models including airflow and solar radiation models to share the same exterior representation, allowing a unifying exchange data. Altogether, the results of this study have implications for a large-scale modeling of buildings in support of urban energy consumption analyses or assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.« less

  13. Demonstration of reduced-order urban scale building energy models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew

    The aim of this study is to demonstrate a developed framework to rapidly create urban scale reduced-order building energy models using a systematic summary of the simplifications required for the representation of building exterior and thermal zones. These urban scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies only up to 10% withmore » the selection of different thermal zones. In addition, to assess complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify complexity of the building energy models. Consequently, this study empowers the building energy modelers to quantify their building energy model systematically in order to report the model complexity alongside the building energy model accuracy. An exhaustive analysis on four university campuses suggests that the urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can utilize the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One main benefits of this developed framework is the opportunity for different models including airflow and solar radiation models to share the same exterior representation, allowing a unifying exchange data. Altogether, the results of this study have implications for a large-scale modeling of buildings in support of urban energy consumption analyses or assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.« less

  14. A Systematic Multi-Time Scale Solution for Regional Power Grid Operation

    NASA Astrophysics Data System (ADS)

    Zhu, W. J.; Liu, Z. G.; Cheng, T.; Hu, B. Q.; Liu, X. Z.; Zhou, Y. F.

    2017-10-01

    Many aspects need to be taken into consideration in a regional grid while making schedule plans. In this paper, a systematic multi-time scale solution for regional power grid operation considering large scale renewable energy integration and Ultra High Voltage (UHV) power transmission is proposed. In the time scale aspect, we discuss the problem from month, week, day-ahead, within-day to day-behind, and the system also contains multiple generator types including thermal units, hydro-plants, wind turbines and pumped storage stations. The 9 subsystems of the scheduling system are described, and their functions and relationships are elaborated. The proposed system has been constructed in a provincial power grid in Central China, and the operation results further verified the effectiveness of the system.

  15. Asylum Seekers, Violence and Health: A Systematic Review of Research in High-Income Host Countries

    PubMed Central

    Hossain, Mazeda; Kiss, Ligia; Zimmerman, Cathy

    2013-01-01

    We performed a systematic review of literature on violence and related health concerns among asylum seekers in high-income host countries. We extracted data from 23 peer-reviewed studies. Prevalence of torture, variably defined, was above 30% across all studies. Torture history in clinic populations correlated with hunger and posttraumatic stress disorder, although in small, nonrepresentative samples. One study observed that previous exposure to interpersonal violence interacted with longer immigration detention periods, resulting in higher depression scores. Limited evidence suggests that asylum seekers frequently experience violence and health problems, but large-scale studies are needed to inform policies and services for this vulnerable group often at the center of political debate. PMID:23327250

  16. Drought Persistence Errors in Global Climate Models

    NASA Astrophysics Data System (ADS)

    Moon, H.; Gudmundsson, L.; Seneviratne, S. I.

    2018-04-01

    The persistence of drought events largely determines the severity of socioeconomic and ecological impacts, but the capability of current global climate models (GCMs) to simulate such events is subject to large uncertainties. In this study, the representation of drought persistence in GCMs is assessed by comparing state-of-the-art GCM model simulations to observation-based data sets. For doing so, we consider dry-to-dry transition probabilities at monthly and annual scales as estimates for drought persistence, where a dry status is defined as negative precipitation anomaly. Though there is a substantial spread in the drought persistence bias, most of the simulations show systematic underestimation of drought persistence at global scale. Subsequently, we analyzed to which degree (i) inaccurate observations, (ii) differences among models, (iii) internal climate variability, and (iv) uncertainty of the employed statistical methods contribute to the spread in drought persistence errors using an analysis of variance approach. The results show that at monthly scale, model uncertainty and observational uncertainty dominate, while the contribution from internal variability is small in most cases. At annual scale, the spread of the drought persistence error is dominated by the statistical estimation error of drought persistence, indicating that the partitioning of the error is impaired by the limited number of considered time steps. These findings reveal systematic errors in the representation of drought persistence in current GCMs and suggest directions for further model improvement.

  17. Epidemiological considerations for the use of databases in transfusion research: a Scandinavian perspective.

    PubMed

    Edgren, Gustaf; Hjalgrim, Henrik

    2010-11-01

    At current safety levels, with adverse events from transfusions being relatively rare, further progress in risk reductions will require large-scale investigations. Thus, truly prospective studies may prove unfeasible and other alternatives deserve consideration. In this review, we will try to give an overview of recent and historical developments in the use of blood donation and transfusion databases in research. In addition, we will go over important methodological issues. There are at least three nationwide or near-nationwide donation/transfusion databases with the possibility for long-term follow-up of donors and recipients. During the past few years, a large number of reports have been published utilizing such data sources to investigate transfusion-associated risks. In addition, numerous clinics systematically collect and use such data on a smaller scale. Combining systematically recorded donation and transfusion data with long-term health follow-up opens up exciting opportunities for transfusion medicine research. However, the correct analysis of such data requires close attention to methodological issues, especially including the indication for transfusion and reverse causality.

  18. Decoupling local mechanics from large-scale structure in modular metamaterials.

    PubMed

    Yang, Nan; Silverberg, Jesse L

    2017-04-04

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such "inverse design" is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module's design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  19. Decoupling local mechanics from large-scale structure in modular metamaterials

    NASA Astrophysics Data System (ADS)

    Yang, Nan; Silverberg, Jesse L.

    2017-04-01

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such “inverse design” is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module’s design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  20. The role of the airline transportation network in the prediction and predictability of global epidemics.

    PubMed

    Colizza, Vittoria; Barrat, Alain; Barthélemy, Marc; Vespignani, Alessandro

    2006-02-14

    The systematic study of large-scale networks has unveiled the ubiquitous presence of connectivity patterns characterized by large-scale heterogeneities and unbounded statistical fluctuations. These features affect dramatically the behavior of the diffusion processes occurring on networks, determining the ensuing statistical properties of their evolution pattern and dynamics. In this article, we present a stochastic computational framework for the forecast of global epidemics that considers the complete worldwide air travel infrastructure complemented with census population data. We address two basic issues in global epidemic modeling: (i) we study the role of the large scale properties of the airline transportation network in determining the global diffusion pattern of emerging diseases; and (ii) we evaluate the reliability of forecasts and outbreak scenarios with respect to the intrinsic stochasticity of disease transmission and traffic flows. To address these issues we define a set of quantitative measures able to characterize the level of heterogeneity and predictability of the epidemic pattern. These measures may be used for the analysis of containment policies and epidemic risk assessment.

  1. Influence of a large-scale field on energy dissipation in magnetohydrodynamic turbulence

    NASA Astrophysics Data System (ADS)

    Zhdankin, Vladimir; Boldyrev, Stanislav; Mason, Joanne

    2017-07-01

    In magnetohydrodynamic (MHD) turbulence, the large-scale magnetic field sets a preferred local direction for the small-scale dynamics, altering the statistics of turbulence from the isotropic case. This happens even in the absence of a total magnetic flux, since MHD turbulence forms randomly oriented large-scale domains of strong magnetic field. It is therefore customary to study small-scale magnetic plasma turbulence by assuming a strong background magnetic field relative to the turbulent fluctuations. This is done, for example, in reduced models of plasmas, such as reduced MHD, reduced-dimension kinetic models, gyrokinetics, etc., which make theoretical calculations easier and numerical computations cheaper. Recently, however, it has become clear that the turbulent energy dissipation is concentrated in the regions of strong magnetic field variations. A significant fraction of the energy dissipation may be localized in very small volumes corresponding to the boundaries between strongly magnetized domains. In these regions, the reduced models are not applicable. This has important implications for studies of particle heating and acceleration in magnetic plasma turbulence. The goal of this work is to systematically investigate the relationship between local magnetic field variations and magnetic energy dissipation, and to understand its implications for modelling energy dissipation in realistic turbulent plasmas.

  2. Planck intermediate results: XLVI. Reduction of large-scale systematic effects in HFI polarization maps and estimation of the reionization optical depth

    DOE PAGES

    Aghanim, N.; Ashdown, M.; Aumont, J.; ...

    2016-12-12

    This study describes the identification, modelling, and removal of previously unexplained systematic effects in the polarization data of the Planck High Frequency Instrument (HFI) on large angular scales, including new mapmaking and calibration procedures, new and more complete end-to-end simulations, and a set of robust internal consistency checks on the resulting maps. These maps, at 100, 143, 217, and 353 GHz, are early versions of those that will be released in final form later in 2016. The improvements allow us to determine the cosmic reionization optical depth τ using, for the first time, the low-multipole EE data from HFI, reducingmore » significantly the central value and uncertainty, and hence the upper limit. Two different likelihood procedures are used to constrain τ from two estimators of the CMB E- and B-mode angular power spectra at 100 and 143 GHz, after debiasing the spectra from a small remaining systematic contamination. These all give fully consistent results. A further consistency test is performed using cross-correlations derived from the Low Frequency Instrument maps of the Planck 2015 data release and the new HFI data. For this purpose, end-to-end analyses of systematic effects from the two instruments are used to demonstrate the near independence of their dominant systematic error residuals. The tightest result comes from the HFI-based τ posterior distribution using the maximum likelihood power spectrum estimator from EE data only, giving a value 0.055 ± 0.009. Finally, in a companion paper these results are discussed in the context of the best-fit PlanckΛCDM cosmological model and recent models of reionization.« less

  3. Planck intermediate results: XLVI. Reduction of large-scale systematic effects in HFI polarization maps and estimation of the reionization optical depth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aghanim, N.; Ashdown, M.; Aumont, J.

    This study describes the identification, modelling, and removal of previously unexplained systematic effects in the polarization data of the Planck High Frequency Instrument (HFI) on large angular scales, including new mapmaking and calibration procedures, new and more complete end-to-end simulations, and a set of robust internal consistency checks on the resulting maps. These maps, at 100, 143, 217, and 353 GHz, are early versions of those that will be released in final form later in 2016. The improvements allow us to determine the cosmic reionization optical depth τ using, for the first time, the low-multipole EE data from HFI, reducingmore » significantly the central value and uncertainty, and hence the upper limit. Two different likelihood procedures are used to constrain τ from two estimators of the CMB E- and B-mode angular power spectra at 100 and 143 GHz, after debiasing the spectra from a small remaining systematic contamination. These all give fully consistent results. A further consistency test is performed using cross-correlations derived from the Low Frequency Instrument maps of the Planck 2015 data release and the new HFI data. For this purpose, end-to-end analyses of systematic effects from the two instruments are used to demonstrate the near independence of their dominant systematic error residuals. The tightest result comes from the HFI-based τ posterior distribution using the maximum likelihood power spectrum estimator from EE data only, giving a value 0.055 ± 0.009. Finally, in a companion paper these results are discussed in the context of the best-fit PlanckΛCDM cosmological model and recent models of reionization.« less

  4. Synergy of Stochastic and Systematic Energization of Plasmas during Turbulent Reconnection

    NASA Astrophysics Data System (ADS)

    Pisokas, Theophilos; Vlahos, Loukas; Isliker, Heinz

    2018-01-01

    The important characteristic of turbulent reconnection is that it combines large-scale magnetic disturbances (δ B/B∼ 1) with randomly distributed unstable current sheets (UCSs). Many well-known nonlinear MHD structures (strong turbulence, current sheet(s), shock(s)) lead asymptotically to the state of turbulent reconnection. We analyze in this article, for the first time, the energization of electrons and ions in a large-scale environment that combines large-amplitude disturbances propagating with sub-Alfvénic speed with UCSs. The magnetic disturbances interact stochastically (second-order Fermi) with the charged particles and play a crucial role in the heating of the particles, while the UCSs interact systematically (first-order Fermi) and play a crucial role in the formation of the high-energy tail. The synergy of stochastic and systematic acceleration provided by the mixture of magnetic disturbances and UCSs influences the energetics of the thermal and nonthermal particles, the power-law index, and the length of time the particles remain inside the energy release volume. We show that this synergy can explain the observed very fast and impulsive particle acceleration and the slightly delayed formation of a superhot particle population.

  5. XLID-Causing Mutations and Associated Genes Challenged in Light of Data From Large-Scale Human Exome Sequencing

    PubMed Central

    Piton, Amélie; Redin, Claire; Mandel, Jean-Louis

    2013-01-01

    Because of the unbalanced sex ratio (1.3–1.4 to 1) observed in intellectual disability (ID) and the identification of large ID-affected families showing X-linked segregation, much attention has been focused on the genetics of X-linked ID (XLID). Mutations causing monogenic XLID have now been reported in over 100 genes, most of which are commonly included in XLID diagnostic gene panels. Nonetheless, the boundary between true mutations and rare non-disease-causing variants often remains elusive. The sequencing of a large number of control X chromosomes, required for avoiding false-positive results, was not systematically possible in the past. Such information is now available thanks to large-scale sequencing projects such as the National Heart, Lung, and Blood (NHLBI) Exome Sequencing Project, which provides variation information on 10,563 X chromosomes from the general population. We used this NHLBI cohort to systematically reassess the implication of 106 genes proposed to be involved in monogenic forms of XLID. We particularly question the implication in XLID of ten of them (AGTR2, MAGT1, ZNF674, SRPX2, ATP6AP2, ARHGEF6, NXF5, ZCCHC12, ZNF41, and ZNF81), in which truncating variants or previously published mutations are observed at a relatively high frequency within this cohort. We also highlight 15 other genes (CCDC22, CLIC2, CNKSR2, FRMPD4, HCFC1, IGBP1, KIAA2022, KLF8, MAOA, NAA10, NLGN3, RPL10, SHROOM4, ZDHHC15, and ZNF261) for which replication studies are warranted. We propose that similar reassessment of reported mutations (and genes) with the use of data from large-scale human exome sequencing would be relevant for a wide range of other genetic diseases. PMID:23871722

  6. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

    Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single, rather than trade-off, design methodology and the incompatibility of large-scale optimization with single program, single computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop. Full analysis is then performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique, and then embody the definitions of design variables, objective function and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.

  7. A systematic review of Investigator Global Assessment (IGA) in atopic dermatitis (AD) trials: Many options, no standards.

    PubMed

    Futamura, Masaki; Leshem, Yael A; Thomas, Kim S; Nankervis, Helen; Williams, Hywel C; Simpson, Eric L

    2016-02-01

    Investigators often use global assessments to provide a snapshot of overall disease severity in dermatologic clinical trials. Although easy to perform, the frequency of use and standardization of global assessments in studies of atopic dermatitis (AD) is unclear. We sought to assess the frequency, definitions, and methods of analysis of Investigator Global Assessment in randomized controlled trials of AD. We conducted a systematic review using all published randomized controlled trials of AD treatments in the Global Resource of Eczema Trials database (2000-2014). We determined the frequency of global scales application and defining features. Among 317 trials identified, 101 trials (32%) used an investigator-performed global assessment as an outcome measure. There was large variability in global assessments between studies in nomenclature, scale size, definitions, outcome description, and analysis. Both static and dynamic scales were identified that ranged from 4- to 7-point scales. North American studies used global assessments more commonly than studies from other countries. The search was restricted to the Global Resource of Eczema Trials database. Global assessments are used frequently in studies of AD, but their complete lack of standardized definitions and implementation preclude any meaningful comparisons between studies, which in turn impedes data synthesis to inform clinical decision-making. Standardization is urgently required. Copyright © 2015. Published by Elsevier Inc.

  8. An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science.

    PubMed

    2012-11-01

    Reproducibility is a defining feature of science. However, because of strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published. The Reproducibility Project is an open, large-scale, collaborative effort to systematically examine the rate and predictors of reproducibility in psychological science. So far, 72 volunteer researchers from 41 institutions have organized to openly and transparently replicate studies published in three prominent psychological journals in 2008. Multiple methods will be used to evaluate the findings, calculate an empirical rate of replication, and investigate factors that predict reproducibility. Whatever the result, a better understanding of reproducibility will ultimately improve confidence in scientific methodology and findings. © The Author(s) 2012.

  9. An ensemble constrained variational analysis of atmospheric forcing data and its application to evaluate clouds in CAM5: Ensemble 3DCVA and Its Application

    DOE PAGES

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2016-01-05

    Large-scale atmospheric forcing data can greatly impact the simulations of atmospheric process models including Large Eddy Simulations (LES), Cloud Resolving Models (CRMs) and Single-Column Models (SCMs), and impact the development of physical parameterizations in global climate models. This study describes the development of an ensemble variationally constrained objective analysis of atmospheric large-scale forcing data and its application to evaluate the cloud biases in the Community Atmospheric Model (CAM5). Sensitivities of the variational objective analysis to background data, error covariance matrix and constraint variables are described and used to quantify the uncertainties in the large-scale forcing data. Application of the ensemblemore » forcing in the CAM5 SCM during March 2000 intensive operational period (IOP) at the Southern Great Plains (SGP) of the Atmospheric Radiation Measurement (ARM) program shows systematic biases in the model simulations that cannot be explained by the uncertainty of large-scale forcing data, which points to the deficiencies of physical parameterizations. The SCM is shown to overestimate high clouds and underestimate low clouds. These biases are found to also exist in the global simulation of CAM5 when it is compared with satellite data.« less

  10. An ensemble constrained variational analysis of atmospheric forcing data and its application to evaluate clouds in CAM5: Ensemble 3DCVA and Its Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    Large-scale atmospheric forcing data can greatly impact the simulations of atmospheric process models including Large Eddy Simulations (LES), Cloud Resolving Models (CRMs) and Single-Column Models (SCMs), and impact the development of physical parameterizations in global climate models. This study describes the development of an ensemble variationally constrained objective analysis of atmospheric large-scale forcing data and its application to evaluate the cloud biases in the Community Atmospheric Model (CAM5). Sensitivities of the variational objective analysis to background data, error covariance matrix and constraint variables are described and used to quantify the uncertainties in the large-scale forcing data. Application of the ensemblemore » forcing in the CAM5 SCM during March 2000 intensive operational period (IOP) at the Southern Great Plains (SGP) of the Atmospheric Radiation Measurement (ARM) program shows systematic biases in the model simulations that cannot be explained by the uncertainty of large-scale forcing data, which points to the deficiencies of physical parameterizations. The SCM is shown to overestimate high clouds and underestimate low clouds. These biases are found to also exist in the global simulation of CAM5 when it is compared with satellite data.« less

  11. The Cosmology Large Angular Scale Surveyor

    NASA Technical Reports Server (NTRS)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; hide

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  12. Meridional flow in the solar convection zone. I. Measurements from gong data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kholikov, S.; Serebryanskiy, A.; Jackiewicz, J., E-mail: kholikov@noao.edu

    2014-04-01

    Large-scale plasma flows in the Sun's convection zone likely play a major role in solar dynamics on decadal timescales. In particular, quantifying meridional motions is a critical ingredient for understanding the solar cycle and the transport of magnetic flux. Because the signal of such features can be quite small in deep solar layers and be buried in systematics or noise, the true meridional velocity profile has remained elusive. We perform time-distance helioseismology measurements on several years worth of Global Oscillation Network Group Doppler data. A spherical harmonic decomposition technique is applied to a subset of acoustic modes to measure travel-timemore » differences to try to obtain signatures of meridional flows throughout the solar convection zone. Center-to-limb systematics are taken into account in an intuitive yet ad hoc manner. Travel-time differences near the surface that are consistent with a poleward flow in each hemisphere and are similar to previous work are measured. Additionally, measurements in deep layers near the base of the convection zone suggest a possible equatorward flow, as well as partial evidence of a sign change in the travel-time differences at mid-convection zone depths. This analysis on an independent data set using different measurement techniques strengthens recent conclusions that the convection zone may have multiple 'cells' of meridional flow. The results may challenge the common understanding of one large conveyor belt operating in the solar convection zone. Further work with helioseismic inversions and a careful study of systematic effects are needed before firm conclusions of these large-scale flow structures can be made.« less

  13. NeuroCa: integrated framework for systematic analysis of spatiotemporal neuronal activity patterns from large-scale optical recording data

    PubMed Central

    Jang, Min Jee; Nam, Yoonkey

    2015-01-01

    Abstract. Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of ∼1000 neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements. PMID:26229973

  14. The void spectrum in two-dimensional numerical simulations of gravitational clustering

    NASA Technical Reports Server (NTRS)

    Kauffmann, Guinevere; Melott, Adrian L.

    1992-01-01

    An algorithm for deriving a spectrum of void sizes from two-dimensional high-resolution numerical simulations of gravitational clustering is tested, and it is verified that it produces the correct results where those results can be anticipated. The method is used to study the growth of voids as clustering proceeds. It is found that the most stable indicator of the characteristic void 'size' in the simulations is the mean fractional area covered by voids of diameter d, in a density field smoothed at its correlation length. Very accurate scaling behavior is found in power-law numerical models as they evolve. Eventually, this scaling breaks down as the nonlinearity reaches larger scales. It is shown that this breakdown is a manifestation of the undesirable effect of boundary conditions on simulations, even with the very large dynamic range possible here. A simple criterion is suggested for deciding when simulations with modest large-scale power may systematically underestimate the frequency of larger voids.

  15. Measuring the Large-scale Solar Magnetic Field

    NASA Astrophysics Data System (ADS)

    Hoeksema, J. T.; Scherrer, P. H.; Peterson, E.; Svalgaard, L.

    2017-12-01

    The Sun's large-scale magnetic field is important for determining global structure of the corona and for quantifying the evolution of the polar field, which is sometimes used for predicting the strength of the next solar cycle. Having confidence in the determination of the large-scale magnetic field of the Sun is difficult because the field is often near the detection limit, various observing methods all measure something a little different, and various systematic effects can be very important. We compare resolved and unresolved observations of the large-scale magnetic field from the Wilcox Solar Observatory, Heliseismic and Magnetic Imager (HMI), Michelson Doppler Imager (MDI), and Solis. Cross comparison does not enable us to establish an absolute calibration, but it does allow us to discover and compensate for instrument problems, such as the sensitivity decrease seen in the WSO measurements in late 2016 and early 2017.

  16. Effects of forcing time scale on the simulated turbulent flows and turbulent collision statistics of inertial particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosa, B., E-mail: bogdan.rosa@imgw.pl; Parishani, H.; Department of Earth System Science, University of California, Irvine, California 92697-3100

    2015-01-15

    In this paper, we study systematically the effects of forcing time scale in the large-scale stochastic forcing scheme of Eswaran and Pope [“An examination of forcing in direct numerical simulations of turbulence,” Comput. Fluids 16, 257 (1988)] on the simulated flow structures and statistics of forced turbulence. Using direct numerical simulations, we find that the forcing time scale affects the flow dissipation rate and flow Reynolds number. Other flow statistics can be predicted using the altered flow dissipation rate and flow Reynolds number, except when the forcing time scale is made unrealistically large to yield a Taylor microscale flow Reynoldsmore » number of 30 and less. We then study the effects of forcing time scale on the kinematic collision statistics of inertial particles. We show that the radial distribution function and the radial relative velocity may depend on the forcing time scale when it becomes comparable to the eddy turnover time. This dependence, however, can be largely explained in terms of altered flow Reynolds number and the changing range of flow length scales present in the turbulent flow. We argue that removing this dependence is important when studying the Reynolds number dependence of the turbulent collision statistics. The results are also compared to those based on a deterministic forcing scheme to better understand the role of large-scale forcing, relative to that of the small-scale turbulence, on turbulent collision of inertial particles. To further elucidate the correlation between the altered flow structures and dynamics of inertial particles, a conditional analysis has been performed, showing that the regions of higher collision rate of inertial particles are well correlated with the regions of lower vorticity. Regions of higher concentration of pairs at contact are found to be highly correlated with the region of high energy dissipation rate.« less

  17. Pilot study of large-scale production of mutant pigs by ENU mutagenesis.

    PubMed

    Hai, Tang; Cao, Chunwei; Shang, Haitao; Guo, Weiwei; Mu, Yanshuang; Yang, Shulin; Zhang, Ying; Zheng, Qiantao; Zhang, Tao; Wang, Xianlong; Liu, Yu; Kong, Qingran; Li, Kui; Wang, Dayu; Qi, Meng; Hong, Qianlong; Zhang, Rui; Wang, Xiupeng; Jia, Qitao; Wang, Xiao; Qin, Guosong; Li, Yongshun; Luo, Ailing; Jin, Weiwu; Yao, Jing; Huang, Jiaojiao; Zhang, Hongyong; Li, Menghua; Xie, Xiangmo; Zheng, Xuejuan; Guo, Kenan; Wang, Qinghua; Zhang, Shibin; Li, Liang; Xie, Fei; Zhang, Yu; Weng, Xiaogang; Yin, Zhi; Hu, Kui; Cong, Yimei; Zheng, Peng; Zou, Hailong; Xin, Leilei; Xia, Jihan; Ruan, Jinxue; Li, Hegang; Zhao, Weiming; Yuan, Jing; Liu, Zizhan; Gu, Weiwang; Li, Ming; Wang, Yong; Wang, Hongmei; Yang, Shiming; Liu, Zhonghua; Wei, Hong; Zhao, Jianguo; Zhou, Qi; Meng, Anming

    2017-06-22

    N-ethyl-N-nitrosourea (ENU) mutagenesis is a powerful tool to generate mutants on a large scale efficiently, and to discover genes with novel functions at the whole-genome level in Caenorhabditis elegans, flies, zebrafish and mice, but it has never been tried in large model animals. We describe a successful systematic three-generation ENU mutagenesis screening in pigs with the establishment of the Chinese Swine Mutagenesis Consortium. A total of 6,770 G1 and 6,800 G3 pigs were screened, 36 dominant and 91 recessive novel pig families with various phenotypes were established. The causative mutations in 10 mutant families were further mapped. As examples, the mutation of SOX10 (R109W) in pig causes inner ear malfunctions and mimics human Mondini dysplasia, and upregulated expression of FBXO32 is associated with congenital splay legs. This study demonstrates the feasibility of artificial random mutagenesis in pigs and opens an avenue for generating a reservoir of mutants for agricultural production and biomedical research.

  18. Analyzing the cosmic variance limit of remote dipole measurements of the cosmic microwave background using the large-scale kinetic Sunyaev Zel'dovich effect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terrana, Alexandra; Johnson, Matthew C.; Harris, Mary-Jean, E-mail: aterrana@perimeterinstitute.ca, E-mail: mharris8@perimeterinstitute.ca, E-mail: mjohnson@perimeterinstitute.ca

    Due to cosmic variance we cannot learn any more about large-scale inhomogeneities from the primary cosmic microwave background (CMB) alone. More information on large scales is essential for resolving large angular scale anomalies in the CMB. Here we consider cross correlating the large-scale kinetic Sunyaev Zel'dovich (kSZ) effect and probes of large-scale structure, a technique known as kSZ tomography. The statistically anisotropic component of the cross correlation encodes the CMB dipole as seen by free electrons throughout the observable Universe, providing information about long wavelength inhomogeneities. We compute the large angular scale power asymmetry, constructing the appropriate transfer functions, andmore » estimate the cosmic variance limited signal to noise for a variety of redshift bin configurations. The signal to noise is significant over a large range of power multipoles and numbers of bins. We present a simple mode counting argument indicating that kSZ tomography can be used to estimate more modes than the primary CMB on comparable scales. A basic forecast indicates that a first detection could be made with next-generation CMB experiments and galaxy surveys. This paper motivates a more systematic investigation of how close to the cosmic variance limit it will be possible to get with future observations.« less

  19. Modeling motivated misreports to sensitive survey questions.

    PubMed

    Böckenholt, Ulf

    2014-07-01

    Asking sensitive or personal questions in surveys or experimental studies can both lower response rates and increase item non-response and misreports. Although non-response is easily diagnosed, misreports are not. However, misreports cannot be ignored because they give rise to systematic bias. The purpose of this paper is to present a modeling approach that identifies misreports and corrects for them. Misreports are conceptualized as a motivated process under which respondents edit their answers before they report them. For example, systematic bias introduced by overreports of socially desirable behaviors or underreports of less socially desirable ones can be modeled, leading to more-valid inferences. The proposed approach is applied to a large-scale experimental study and shows that respondents who feel powerful tend to overclaim their knowledge.

  20. Topics in geophysical fluid dynamics: Atmospheric dynamics, dynamo theory, and climate dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghil, M.; Childress, S.

    1987-01-01

    This text is the first study to apply systematically the successive bifurcations approach to complex time-dependent processes in large scale atmospheric dynamics, geomagnetism, and theoretical climate dynamics. The presentation of recent results on planetary-scale phenomena in the earth's atmosphere, ocean, cryosphere, mantle and core provides an integral account of mathematical theory and methods together with physical phenomena and processes. The authors address a number of problems in rapidly developing areas of geophysics, bringing into closer contact the modern tools of nonlinear mathematics and the novel problems of global change in the environment.

  1. Disentangling dark energy and cosmic tests of gravity from weak lensing systematics

    NASA Astrophysics Data System (ADS)

    Laszlo, Istvan; Bean, Rachel; Kirk, Donnacha; Bridle, Sarah

    2012-06-01

    We consider the impact of key astrophysical and measurement systematics on constraints on dark energy and modifications to gravity on cosmic scales. We focus on upcoming photometric ‘stage III’ and ‘stage IV’ large-scale structure surveys such as the Dark Energy Survey (DES), the Subaru Measurement of Images and Redshifts survey, the Euclid survey, the Large Synoptic Survey Telescope (LSST) and Wide Field Infra-Red Space Telescope (WFIRST). We illustrate the different redshift dependencies of gravity modifications compared to intrinsic alignments, the main astrophysical systematic. The way in which systematic uncertainties, such as galaxy bias and intrinsic alignments, are modelled can change dark energy equation-of-state parameter and modified gravity figures of merit by a factor of 4. The inclusion of cross-correlations of cosmic shear and galaxy position measurements helps reduce the loss of constraining power from the lensing shear surveys. When forecasts for Planck cosmic microwave background and stage IV surveys are combined, constraints on the dark energy equation-of-state parameter and modified gravity model are recovered, relative to those from shear data with no systematic uncertainties, provided fewer than 36 free parameters in total are used to describe the galaxy bias and intrinsic alignment models as a function of scale and redshift. While some uncertainty in the intrinsic alignment (IA) model can be tolerated, it is going to be important to be able to parametrize IAs well in order to realize the full potential of upcoming surveys. To facilitate future investigations, we also provide a fitting function for the matter power spectrum arising from the phenomenological modified gravity model we consider.

  2. The large-scale organization of metabolic networks

    NASA Astrophysics Data System (ADS)

    Jeong, H.; Tombor, B.; Albert, R.; Oltvai, Z. N.; Barabási, A.-L.

    2000-10-01

    In a cell or microorganism, the processes that generate mass, energy, information transfer and cell-fate specification are seamlessly integrated through a complex network of cellular constituents and reactions. However, despite the key role of these networks in sustaining cellular functions, their large-scale structure is essentially unknown. Here we present a systematic comparative mathematical analysis of the metabolic networks of 43 organisms representing all three domains of life. We show that, despite significant variation in their individual constituents and pathways, these metabolic networks have the same topological scaling properties and show striking similarities to the inherent organization of complex non-biological systems. This may indicate that metabolic organization is not only identical for all living organisms, but also complies with the design principles of robust and error-tolerant scale-free networks, and may represent a common blueprint for the large-scale organization of interactions among all cellular constituents.

  3. A Systematic Review of Barriers and Facilitators to Minority Research Participation Among African Americans, Latinos, Asian Americans, and Pacific Islanders

    PubMed Central

    Duran, Nelida; Norris, Keith

    2014-01-01

    To assess the experienced or perceived barriers and facilitators to health research participation for major US racial/ethnic minority populations, we conducted a systematic review of qualitative and quantitative studies from a search on PubMed and Web of Science from January 2000 to December 2011. With 44 articles included in the review, we found distinct and shared barriers and facilitators. Despite different expressions of mistrust, all groups represented in these studies were willing to participate for altruistic reasons embedded in cultural and community priorities. Greater comparative understanding of barriers and facilitators to racial/ethnic minorities’ research participation can improve population-specific recruitment and retention strategies and could better inform future large-scale prospective quantitative and in-depth ethnographic studies. PMID:24328648

  4. Complexities and Perplexities: A Critical Appraisal of the Evidence for Soil-Transmitted Helminth Infection-Related Morbidity

    PubMed Central

    Nery, Susana V.; Doi, Suhail A.; Gray, Darren J.; Soares Magalhães, Ricardo J.; McCarthy, James S.; Traub, Rebecca J.; Andrews, Ross M.; Clements, Archie C. A.

    2016-01-01

    Background: Soil-transmitted helminths (STH) have acute and chronic manifestations, and can result in lifetime morbidity. Disease burden is difficult to quantify, yet quantitative evidence is required to justify large-scale deworming programmes. A recent Cochrane systematic review, which influences Global Burden of Disease (GBD) estimates for STH, has again called into question the evidence for deworming benefit on morbidity due to STH. In this narrative review, we investigate in detail what the shortfalls in evidence are. Methodology/Principal Findings: We systematically reviewed recent literature that used direct measures to investigate morbidity from STH and we critically appraised systematic reviews, particularly the most recent Cochrane systematic review investigating deworming impact on morbidity. We included six systematic reviews and meta-analyses, 36 literature reviews, 44 experimental or observational studies, and five case series. We highlight where evidence is insufficient and where research needs to be directed to strengthen morbidity evidence, ideally to prove benefits of deworming. Conclusions/Significance: Overall, the Cochrane systematic review and recent studies indicate major shortfalls in evidence for direct morbidity. However, it is questionable whether the systematic review methodology should be applied to STH due to heterogeneity of the prevalence of different species in each setting. Urgent investment in studies powered to detect direct morbidity effects due to STH is required. PMID:27196100

  5. redGEM: Systematic reduction and analysis of genome-scale metabolic reconstructions for development of consistent core metabolic models

    PubMed Central

    Ataman, Meric

    2017-01-01

    Genome-scale metabolic reconstructions have proven to be valuable resources in enhancing our understanding of metabolic networks as they encapsulate all known metabolic capabilities of the organisms from genes to proteins to their functions. However the complexity of these large metabolic networks often hinders their utility in various practical applications. Although reduced models are commonly used for modeling and in integrating experimental data, they are often inconsistent across different studies and laboratories due to different criteria and detail, which can compromise transferability of the findings and also integration of experimental data from different groups. In this study, we have developed a systematic semi-automatic approach to reduce genome-scale models into core models in a consistent and logical manner focusing on the central metabolism or subsystems of interest. The method minimizes the loss of information using an approach that combines graph-based search and optimization methods. The resulting core models are shown to be able to capture key properties of the genome-scale models and preserve consistency in terms of biomass and by-product yields, flux and concentration variability and gene essentiality. The development of these “consistently-reduced” models will help to clarify and facilitate integration of different experimental data to draw new understanding that can be directly extendable to genome-scale models. PMID:28727725

  6. Not a Copernican observer: biased peculiar velocity statistics in the local Universe

    NASA Astrophysics Data System (ADS)

    Hellwing, Wojciech A.; Nusser, Adi; Feix, Martin; Bilicki, Maciej

    2017-05-01

    We assess the effect of the local large-scale structure on the estimation of two-point statistics of the observed radial peculiar velocities of galaxies. A large N-body simulation is used to examine these statistics from the perspective of random observers as well as 'Local Group-like' observers conditioned to reside in an environment resembling the observed Universe within 20 Mpc. The local environment systematically distorts the shape and amplitude of velocity statistics with respect to ensemble-averaged measurements made by a Copernican (random) observer. The Virgo cluster has the most significant impact, introducing large systematic deviations in all the statistics. For a simple 'top-hat' selection function, an idealized survey extending to ˜160 h-1 Mpc or deeper is needed to completely mitigate the effects of the local environment. Using shallower catalogues leads to systematic deviations of the order of 50-200 per cent depending on the scale considered. For a flat redshift distribution similar to the one of the CosmicFlows-3 survey, the deviations are even more prominent in both the shape and amplitude at all separations considered (≲100 h-1 Mpc). Conclusions based on statistics calculated without taking into account the impact of the local environment should be revisited.

  7. Large-scale linear rankSVM.

    PubMed

    Lee, Ching-Pei; Lin, Chih-Jen

    2014-04-01

    Linear rankSVM is one of the widely used methods for learning to rank. Although its performance may be inferior to nonlinear methods such as kernel rankSVM and gradient boosting decision trees, linear rankSVM is useful to quickly produce a baseline model. Furthermore, following its recent development for classification, linear rankSVM may give competitive performance for large and sparse data. A great deal of works have studied linear rankSVM. The focus is on the computational efficiency when the number of preference pairs is large. In this letter, we systematically study existing works, discuss their advantages and disadvantages, and propose an efficient algorithm. We discuss different implementation issues and extensions with detailed experiments. Finally, we develop a robust linear rankSVM tool for public use.

  8. The role of the large-scale coronal magnetic field in the eruption of prominence/cavity systems

    NASA Astrophysics Data System (ADS)

    de Toma, G.; Gibson, S. E.; Fan, Y.; Torok, T.

    2013-12-01

    Prominence/cavity systems are large-scale coronal structures that can live for many weeks and even months and often end their life in the form of large coronal eruptions. We investigate the role of the surrounding ambient coronal field in stabilizing these systems against eruption. In particular, we examine the extent to which the decline with height of the external coronal magnetic field influences the evolution of these coronal systems and their likelihood to erupt. We study prominence/cavity systems during the rising phase of cycle 24 in 2010-2013, when a significant number of CMEs were associated with polar crown or large filament eruptions. We use EUV observations from SDO/AIA to identify stable and eruptive coronal cavities, and SDO/HMI magnetograms as boundary conditions to PFSS extrapolation to derive the ambient coronal field. We compute the decay index of the potential field for the two groups and find that systematic differences exist between eruptive and non-eruptive systems.

  9. International network of cancer genome projects

    PubMed Central

    2010-01-01

    The International Cancer Genome Consortium (ICGC) was launched to coordinate large-scale cancer genome studies in tumors from 50 different cancer types and/or subtypes that are of clinical and societal importance across the globe. Systematic studies of over 25,000 cancer genomes at the genomic, epigenomic, and transcriptomic levels will reveal the repertoire of oncogenic mutations, uncover traces of the mutagenic influences, define clinically-relevant subtypes for prognosis and therapeutic management, and enable the development of new cancer therapies. PMID:20393554

  10. The Cosmology Large Angular Scale Surveyor

    NASA Astrophysics Data System (ADS)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John W.; Bennett, Charles L.; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; Dahal, Sumit; Denis, Kevin; Dünner, Rolando; Eimer, Joseph; Essinger-Hileman, Thomas; Fluxa, Pedro; Halpern, Mark; Hilton, Gene; Hinshaw, Gary F.; Hubmayr, Johannes; Iuliano, Jeffrey; Karakla, John; McMahon, Jeff; Miller, Nathan T.; Moseley, Samuel H.; Palma, Gonzalo; Parker, Lucas; Petroff, Matthew; Pradenas, Bastián.; Rostem, Karwan; Sagliocca, Marco; Valle, Deniz; Watts, Duncan; Wollack, Edward; Xu, Zhilei; Zeng, Lingzhen

    2016-07-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from in ation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  11. The Cosmology Large Angular Scale Surveyor (CLASS)

    NASA Technical Reports Server (NTRS)

    Harrington, Kathleen; Marriange, Tobias; Aamir, Ali; Appel, John W.; Bennett, Charles L.; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; hide

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from in ation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  12. Measurement tools for assessment of older age bipolar disorder: A systematic review of the recent global literature.

    PubMed

    Rej, Soham; Quayle, William; Forester, Brent P; Dols, Annemiek; Gatchel, Jennifer; Chen, Peijun; Gough, Sarah; Fox, Rebecca; Sajatovic, Martha; Strejilevich, Sergio A; Eyler, Lisa T

    2018-06-01

    More than 50% of people with bipolar disorder will be age 60 years or older by 2030. There is a need for more data to guide assessment and treatment in older age bipolar disorder (OABD); however, interpretation of findings from small, single-site studies may not be generalizable and there are few large trials. As a step in the direction of coordinated large-scale OABD data collection, it is critical to identify which measurements are currently used and identify potential gaps in domains typically assessed. An international group of OABD experts performed a systematic literature review to identify studies examining OABD in the past 6 years. Relevant articles were assessed to categorize the types of clinical, cognitive, biomarker, and neuroimaging OABD tools routinely used in OABD studies. A total of 53 papers were identified, with a broad range of assessments. Most studies evaluated demographic and clinical domains, with fewer studies assessing cognition. There are relatively few biomarker and neuroimaging data, and data collection methods were less comprehensively covered. Assessment tools used in the recent OABD literature may help to identify both a minimum and a comprehensive dataset that should be evaluated in OABD. Our review also highlights gaps where key clinical outcomes have not been routinely assessed. Biomarker and neuroimaging assessment could be further developed and standardized. Clinical data could be combined with neuroimaging, genetic, and other biomarkers in large-scale coordinated data collection to further improve our understanding of OABD phenomenology and biology, thereby contributing to research that advances care. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  13. Shifts in tree functional composition amplify the response of forest biomass to climate

    NASA Astrophysics Data System (ADS)

    Zhang, Tao; Niinemets, Ülo; Sheffield, Justin; Lichstein, Jeremy W.

    2018-04-01

    Forests have a key role in global ecosystems, hosting much of the world’s terrestrial biodiversity and acting as a net sink for atmospheric carbon. These and other ecosystem services that are provided by forests may be sensitive to climate change as well as climate variability on shorter time scales (for example, annual to decadal). Previous studies have documented responses of forest ecosystems to climate change and climate variability, including drought-induced increases in tree mortality rates. However, relationships between forest biomass, tree species composition and climate variability have not been quantified across a large region using systematically sampled data. Here we use systematic forest inventories from the 1980s and 2000s across the eastern USA to show that forest biomass responds to decadal-scale changes in water deficit, and that this biomass response is amplified by concurrent changes in community-mean drought tolerance, a functionally important aspect of tree species composition. The amplification of the direct effects of water stress on biomass occurs because water stress tends to induce a shift in tree species composition towards species that are more tolerant to drought but are slower growing. These results demonstrate concurrent changes in forest species composition and biomass carbon storage across a large, systematically sampled region, and highlight the potential for climate-induced changes in forest ecosystems across the world, resulting from both direct effects of climate on forest biomass and indirect effects mediated by shifts in species composition.

  14. Shifts in tree functional composition amplify the response of forest biomass to climate.

    PubMed

    Zhang, Tao; Niinemets, Ülo; Sheffield, Justin; Lichstein, Jeremy W

    2018-04-05

    Forests have a key role in global ecosystems, hosting much of the world's terrestrial biodiversity and acting as a net sink for atmospheric carbon. These and other ecosystem services that are provided by forests may be sensitive to climate change as well as climate variability on shorter time scales (for example, annual to decadal). Previous studies have documented responses of forest ecosystems to climate change and climate variability, including drought-induced increases in tree mortality rates. However, relationships between forest biomass, tree species composition and climate variability have not been quantified across a large region using systematically sampled data. Here we use systematic forest inventories from the 1980s and 2000s across the eastern USA to show that forest biomass responds to decadal-scale changes in water deficit, and that this biomass response is amplified by concurrent changes in community-mean drought tolerance, a functionally important aspect of tree species composition. The amplification of the direct effects of water stress on biomass occurs because water stress tends to induce a shift in tree species composition towards species that are more tolerant to drought but are slower growing. These results demonstrate concurrent changes in forest species composition and biomass carbon storage across a large, systematically sampled region, and highlight the potential for climate-induced changes in forest ecosystems across the world, resulting from both direct effects of climate on forest biomass and indirect effects mediated by shifts in species composition.

  15. The impact of Lyman-α radiative transfer on large-scale clustering in the Illustris simulation

    NASA Astrophysics Data System (ADS)

    Behrens, C.; Byrohl, C.; Saito, S.; Niemeyer, J. C.

    2018-06-01

    Context. Lyman-α emitters (LAEs) are a promising probe of the large-scale structure at high redshift, z ≳ 2. In particular, the Hobby-Eberly Telescope Dark Energy Experiment aims at observing LAEs at 1.9 < z < 3.5 to measure the baryon acoustic oscillation (BAO) scale and the redshift-space distortion (RSD). However, it has been pointed out that the complicated radiative transfer (RT) of the resonant Lyman-α emission line generates an anisotropic selection bias in the LAE clustering on large scales, s ≳ 10 Mpc. This effect could potentially induce a systematic error in the BAO and RSD measurements. Also, there exists a recent claim to have observational evidence of the effect in the Lyman-α intensity map, albeit statistically insignificant. Aims: We aim at quantifying the impact of the Lyman-α RT on the large-scale galaxy clustering in detail. For this purpose, we study the correlations between the large-scale environment and the ratio of an apparent Lyman-α luminosity to an intrinsic one, which we call the "observed fraction", at 2 < z < 6. Methods: We apply our Lyman-α RT code by post-processing the full Illustris simulations. We simply assume that the intrinsic luminosity of the Lyman-α emission is proportional to the star formation rate of galaxies in Illustris, yielding a sufficiently large sample of LAEs to measure the anisotropic selection bias. Results: We find little correlation between large-scale environment and the observed fraction induced by the RT, and hence a smaller anisotropic selection bias than has previously been claimed. We argue that the anisotropy was overestimated in previous work due to insufficient spatial resolution; it is important to keep the resolution such that it resolves the high-density region down to the scale of the interstellar medium, that is, 1 physical kpc. We also find that the correlation can be further enhanced by assumptions in modeling intrinsic Lyman-α emission.

  16. Measuring safety climate in health care.

    PubMed

    Flin, R; Burns, C; Mearns, K; Yule, S; Robertson, E M

    2006-04-01

    To review quantitative studies of safety climate in health care to examine the psychometric properties of the questionnaires designed to measure this construct. A systematic literature review was undertaken to study sample and questionnaire design characteristics (source, no of items, scale type), construct validity (content validity, factor structure and internal reliability, concurrent validity), within group agreement, and level of analysis. Twelve studies were examined. There was a lack of explicit theoretical underpinning for most questionnaires and some instruments did not report standard psychometric criteria. Where this information was available, several questionnaires appeared to have limitations. More consideration should be given to psychometric factors in the design of healthcare safety climate instruments, especially as these are beginning to be used in large scale surveys across healthcare organisations.

  17. SHEAR-DRIVEN DYNAMO WAVES IN THE FULLY NONLINEAR REGIME

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pongkitiwanichakul, P.; Nigro, G.; Cattaneo, F.

    2016-07-01

    Large-scale dynamo action is well understood when the magnetic Reynolds number ( Rm ) is small, but becomes problematic in the astrophysically relevant large Rm limit since the fluctuations may control the operation of the dynamo, obscuring the large-scale behavior. Recent works by Tobias and Cattaneo demonstrated numerically the existence of large-scale dynamo action in the form of dynamo waves driven by strongly helical turbulence and shear. Their calculations were carried out in the kinematic regime in which the back-reaction of the Lorentz force on the flow is neglected. Here, we have undertaken a systematic extension of their work tomore » the fully nonlinear regime. Helical turbulence and large-scale shear are produced self-consistently by prescribing body forces that, in the kinematic regime, drive flows that resemble the original velocity used by Tobias and Cattaneo. We have found four different solution types in the nonlinear regime for various ratios of the fluctuating velocity to the shear and Reynolds numbers. Some of the solutions are in the form of propagating waves. Some solutions show large-scale helical magnetic structure. Both waves and structures are permanent only when the kinetic helicity is non-zero on average.« less

  18. The Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey

    NASA Astrophysics Data System (ADS)

    Squires, Gordon K.; Lubin, L. M.; Gal, R. R.

    2007-05-01

    We present the motivation, design, and latest results from the Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 Mpc around 20 known galaxy clusters at z > 0.6. When complete, the survey will cover nearly 5 square degrees, all targeted at high-density regions, making it complementary and comparable to field surveys such as DEEP2, GOODS, and COSMOS. For the survey, we are using the Large Format Camera on the Palomar 5-m and SuPRIME-Cam on the Subaru 8-m to obtain optical/near-infrared imaging of an approximately 30 arcmin region around previously studied high-redshift clusters. Colors are used to identify likely member galaxies which are targeted for follow-up spectroscopy with the DEep Imaging Multi-Object Spectrograph on the Keck 10-m. This technique has been used to identify successfully the Cl 1604 supercluster at z = 0.9, a large scale structure containing at least eight clusters (Gal & Lubin 2004; Gal, Lubin & Squires 2005). We present the most recent structures to be photometrically and spectroscopically confirmed through this program, discuss the properties of the member galaxies as a function of environment, and describe our planned multi-wavelength (radio, mid-IR, and X-ray) observations of these systems. The goal of this survey is to identify and examine a statistical sample of large scale structures during an active period in the assembly history of the most massive clusters. With such a sample, we can begin to constrain large scale cluster dynamics and determine the effect of the larger environment on galaxy evolution.

  19. Large-scale block adjustment without use of ground control points based on the compensation of geometric calibration for ZY-3 images

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Wang, Mi; Xu, Wen; Li, Deren; Gong, Jianya; Pi, Yingdong

    2017-12-01

    The potential of large-scale block adjustment (BA) without ground control points (GCPs) has long been a concern among photogrammetric researchers, which is of effective guiding significance for global mapping. However, significant problems with the accuracy and efficiency of this method remain to be solved. In this study, we analyzed the effects of geometric errors on BA, and then developed a step-wise BA method to conduct integrated processing of large-scale ZY-3 satellite images without GCPs. We first pre-processed the BA data, by adopting a geometric calibration (GC) method based on the viewing-angle model to compensate for systematic errors, such that the BA input images were of good initial geometric quality. The second step was integrated BA without GCPs, in which a series of technical methods were used to solve bottleneck problems and ensure accuracy and efficiency. The BA model, based on virtual control points (VCPs), was constructed to address the rank deficiency problem caused by lack of absolute constraints. We then developed a parallel matching strategy to improve the efficiency of tie points (TPs) matching, and adopted a three-array data structure based on sparsity to relieve the storage and calculation burden of the high-order modified equation. Finally, we used the conjugate gradient method to improve the speed of solving the high-order equations. To evaluate the feasibility of the presented large-scale BA method, we conducted three experiments on real data collected by the ZY-3 satellite. The experimental results indicate that the presented method can effectively improve the geometric accuracies of ZY-3 satellite images. This study demonstrates the feasibility of large-scale mapping without GCPs.

  20. Wetlands as large-scale nature-based solutions: status and future challenges for research and management

    NASA Astrophysics Data System (ADS)

    Thorslund, Josefin; Jarsjö, Jerker; Destouni, Georgia

    2017-04-01

    Wetlands are often considered as nature-based solutions that can provide a multitude of services of great social, economic and environmental value to humankind. The services may include recreation, greenhouse gas sequestration, contaminant retention, coastal protection, groundwater level and soil moisture regulation, flood regulation and biodiversity support. Changes in land-use, water use and climate can all impact wetland functions and occur at scales extending well beyond the local scale of an individual wetland. However, in practical applications, management decisions usually regard and focus on individual wetland sites and local conditions. To understand the potential usefulness and services of wetlands as larger-scale nature-based solutions, e.g. for mitigating negative impacts from large-scale change pressures, one needs to understand the combined function multiple wetlands at the relevant large scales. We here systematically investigate if and to what extent research so far has addressed the large-scale dynamics of landscape systems with multiple wetlands, which are likely to be relevant for understanding impacts of regional to global change. Our investigation regards key changes and impacts of relevance for nature-based solutions, such as large-scale nutrient and pollution retention, flow regulation and coastal protection. Although such large-scale knowledge is still limited, evidence suggests that the aggregated functions and effects of multiple wetlands in the landscape can differ considerably from those observed at individual wetlands. Such scale differences may have important implications for wetland function-effect predictability and management under large-scale change pressures and impacts, such as those of climate change.

  1. Systematic review and meta-analysis of genetic association studies in idiopathic recurrent spontaneous abortion.

    PubMed

    Pereza, Nina; Ostojić, Saša; Kapović, Miljenko; Peterlin, Borut

    2017-01-01

    1) To perform the first comprehensive systematic review of genetic association studies (GASs) in idiopathic recurrent spontaneous abortion (IRSA); 2) to analyze studies according to recurrent spontaneous abortion (RSA) definition and selection criteria for patients and control subjects; and 3) to perform meta-analyses for the association of candidate genes with IRSA. Systematic review and meta-analysis. Not applicable. Couples with IRSA and their spontaneously aborted embryos. Summary odds ratios (ORs) were calculated by means of fixed- or random-effects models. Association of genetic variants with IRSA. The systematic review included 428 case-control studies (1990-2015), which differed substantially regarding RSA definition, clinical evaluation of patients, and selection of control subjects. In women, 472 variants in 187 genes were investigated. Meta-analyses were performed for 36 variants in 16 genes. Association with IRSA defined as three or more spontaneous abortions (SAs) was detected for 21 variants in genes involved in immune response (IFNG, IL10, KIR2DS2, KIR2DS3, KIR2DS4, MBL, TNF), coagulation (F2, F5, PAI-1, PROZ), metabolism (GSTT1, MTHFR), and angiogenesis (NOS3, VEGFA). However, ORs were modest (0.51-2.37), with moderate or weak epidemiologic credibility. Minor differences in summary ORs were detected between IRSA defined as two or more and as three or more SAs. Male partners were included in 12.1% of studies, and one study included spontaneously aborted embryos. Candidate gene studies show moderate associations with IRSA. Owing to large differences in RSA definition and selection criteria for participants, consensus is needed. Future GASs should include both partners and spontaneously aborted embryos. Genome-wide association studies and large-scale replications of identified associations are recommended. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Will COBE challenge the inflationary paradigm - Cosmic microwave background anisotropies versus large-scale streaming motions revisited

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorski, K.M.

    1991-03-01

    The relation between cosmic microwave background (CMB) anisotropies and large-scale galaxy streaming motions is examined within the framework of inflationary cosmology. The minimal Sachs and Wolfe (1967) CMB anisotropies at large angular scales in the models with initial Harrison-Zel'dovich spectrum of inhomogeneity normalized to the local large-scale bulk flow, which are independent of the Hubble constant and specific nature of dark matter, are found to be within the anticipated ultimate sensitivity limits of COBE's Differential Microwave Radiometer experiment. For example, the most likely value of the quadrupole coefficient is predicted to be a2 not less than 7 x 10 tomore » the -6th, where equality applies to the limiting minimal model. If (1) COBE's DMR instruments perform well throughout the two-year period; (2) the anisotropy data are not marred by the systematic errors; (3) the large-scale motions retain their present observational status; (4) there is no statistical conspiracy in a sense of the measured bulk flow being of untypically high and the large-scale anisotropy of untypically low amplitudes; and (5) the low-order multipoles in the all-sky primordial fireball temperature map are not detected, the inflationary paradigm will have to be questioned. 19 refs.« less

  3. Intrinsic uncertainty on the nature of dark energy

    NASA Astrophysics Data System (ADS)

    Valkenburg, Wessel; Kunz, Martin; Marra, Valerio

    2013-12-01

    We argue that there is an intrinsic noise on measurements of the equation of state parameter w = p/ρ from large-scale structure around us. The presence of the large-scale structure leads to an ambiguity in the definition of the background universe and thus there is a maximal precision with which we can determine the equation of state of dark energy. To study the uncertainty due to local structure, we model density perturbations stemming from a standard inflationary power spectrum by means of the exact Lemaître-Tolman-Bondi solution of Einstein’s equation, and show that the usual distribution of matter inhomogeneities in a ΛCDM cosmology causes a variation of w - as inferred from distance measures - of several percent. As we observe only one universe, or equivalently because of the cosmic variance, this uncertainty is systematic in nature.

  4. XLID-causing mutations and associated genes challenged in light of data from large-scale human exome sequencing.

    PubMed

    Piton, Amélie; Redin, Claire; Mandel, Jean-Louis

    2013-08-08

    Because of the unbalanced sex ratio (1.3-1.4 to 1) observed in intellectual disability (ID) and the identification of large ID-affected families showing X-linked segregation, much attention has been focused on the genetics of X-linked ID (XLID). Mutations causing monogenic XLID have now been reported in over 100 genes, most of which are commonly included in XLID diagnostic gene panels. Nonetheless, the boundary between true mutations and rare non-disease-causing variants often remains elusive. The sequencing of a large number of control X chromosomes, required for avoiding false-positive results, was not systematically possible in the past. Such information is now available thanks to large-scale sequencing projects such as the National Heart, Lung, and Blood (NHLBI) Exome Sequencing Project, which provides variation information on 10,563 X chromosomes from the general population. We used this NHLBI cohort to systematically reassess the implication of 106 genes proposed to be involved in monogenic forms of XLID. We particularly question the implication in XLID of ten of them (AGTR2, MAGT1, ZNF674, SRPX2, ATP6AP2, ARHGEF6, NXF5, ZCCHC12, ZNF41, and ZNF81), in which truncating variants or previously published mutations are observed at a relatively high frequency within this cohort. We also highlight 15 other genes (CCDC22, CLIC2, CNKSR2, FRMPD4, HCFC1, IGBP1, KIAA2022, KLF8, MAOA, NAA10, NLGN3, RPL10, SHROOM4, ZDHHC15, and ZNF261) for which replication studies are warranted. We propose that similar reassessment of reported mutations (and genes) with the use of data from large-scale human exome sequencing would be relevant for a wide range of other genetic diseases. Copyright © 2013 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  5. Disorder in the Disk: The Influence of Accretion Disk Thickness on the Large-scale Magnetic Dynamo.

    NASA Astrophysics Data System (ADS)

    Hogg, J. Drew; Reynolds, Christopher S.

    2018-01-01

    The evolution of the magnetic field from the enigmatic large-scale dynamo is often considered a central feature of the accretion disk around a black hole. The resulting low-frequency oscillations introduced from the growth and decay of the field strength, along with the change in field orientation, are thought to be intimately tied to variability from the disk. Several factors are at play, but the dynamo can either be directly tied to observable signatures through modulation of the heating rate, or indirectly as the source of quasiperiodic oscillations, the driver of nonlinear structure from propagating fluctuations in mass accretion rate, or even the trigger of state transitions. We present a selection of results from a recent study of this process using a suite of four global, high-resolution, MHD accretion disk simulations. We systematically vary the scale height ratio and find the large-scale dynamo fails to develop above a scale height ratio of h/r ≥ 0.2. Using “butterfly” diagrams of the azimuthal magnetic field, we show the large-scale dynamo exists in the thinner accretion disk models, but fails to excite when the scale height ratio is increased, a feature which is also reflected in 2D Fourier transforms. Additionally, we calculate the dynamo α-parameter through correlations in the averaged magnetic field and turbulent electromotive force, and also generate synthetic light curves from the disk cooling. Using our emission proxy, we find the disks have markedly different characters as photometric fluctuations are larger and less ordered when the disk is thicker and the dynamo is absent.

  6. Effect of Diabetes Mellitus Type 2 on Salivary Glucose – A Systematic Review and Meta-Analysis of Observational Studies

    PubMed Central

    Mascarenhas, Paulo; Fatela, Bruno; Barahona, Isabel

    2014-01-01

    Background Early screening of type 2 diabetes mellitus (DM) is essential for improved prognosis and effective delay of clinical complications. However, testing for high glycemia often requires invasive and painful blood testing, limiting its large-scale applicability. We have combined new, unpublished data with published data comparing salivary glucose levels in type 2 DM patients and controls and/or looked at the correlation between salivary glucose and glycemia/HbA1c to systematically review the effectiveness of salivary glucose to estimate glycemia and HbA1c. We further discuss salivary glucose as a biomarker for large-scale screening of diabetes or developing type 2 DM. Methods and Findings We conducted a meta-analysis of peer-reviewed published articles that reported data regarding mean salivary glucose levels and/or correlation between salivary glucose levels and glycemia or HbA1c for type 2 DM and non-diabetic individuals and combined them with our own unpublished results. Our global meta-analysis of standardized mean differences on salivary glucose levels shows an overall large positive effect of type 2 DM over salivary glucose (Hedge's g = 1.37). The global correlation coefficient (r) between salivary glucose and glycemia was large (r = 0.49), with subgroups ranging from medium (r = 0.30 in non-diabetics) to very large (r = 0.67 in diabetics). Meta-analysis of the global correlation between salivary glucose and HbA1c showed an overall association of medium strength (r = 0.37). Conclusions Our systematic review reports an overall meaningful salivary glucose concentration increase in type 2 DM and a significant overall relationship between salivary glucose concentration and associated glycemia/HbA1c values, with the strength of the correlation increasing for higher glycemia/HbA1c values. These results support the potential of salivary glucose levels as a biomarker for type 2 DM, providing a less painful/invasive method for screening type 2 DM, as well as for monitoring blood glucose levels in large cohorts of DM patients. PMID:25025218

  7. Electrodynamics of the middle atmosphere: Superpressure balloon program

    NASA Technical Reports Server (NTRS)

    Holzworth, Robert H.

    1987-01-01

    In this experiment a comprehensive set of electrical parameters were measured during eight long duration flights in the southern hemisphere stratosphere. These flight resulted in the largest data set ever collected from the stratosphere. The stratosphere has never been electrodynamically sampled in the systematic manner before. New discoveries include short term variability in the planetary scale electric current system, the unexpected observation of stratospheric conductivity variations over thunderstorms and the observation of direct stratospheric conductivity variations following a relatively small solar flare. Major statistical studies were conducted of the large scale current systems, the stratospheric conductivity and the neutral gravity waves (from pressure and temperature data) using the entire data set.

  8. Spatially distributed potential evapotranspiration modeling and climate projections.

    PubMed

    Gharbia, Salem S; Smullen, Trevor; Gill, Laurence; Johnston, Paul; Pilla, Francesco

    2018-08-15

    Evapotranspiration integrates energy and mass transfer between the Earth's surface and atmosphere and is the most active mechanism linking the atmosphere, hydrosphsophere, lithosphere and biosphere. This study focuses on the fine resolution modeling and projection of spatially distributed potential evapotranspiration on the large catchment scale as response to climate change. Six potential evapotranspiration designed algorithms, systematically selected based on a structured criteria and data availability, have been applied and then validated to long-term mean monthly data for the Shannon River catchment with a 50m 2 cell size. The best validated algorithm was therefore applied to evaluate the possible effect of future climate change on potential evapotranspiration rates. Spatially distributed potential evapotranspiration projections have been modeled based on climate change projections from multi-GCM ensembles for three future time intervals (2020, 2050 and 2080) using a range of different Representative Concentration Pathways producing four scenarios for each time interval. Finally, seasonal results have been compared to baseline results to evaluate the impact of climate change on the potential evapotranspiration and therefor on the catchment dynamical water balance. The results present evidence that the modeled climate change scenarios would have a significant impact on the future potential evapotranspiration rates. All the simulated scenarios predicted an increase in potential evapotranspiration for each modeled future time interval, which would significantly affect the dynamical catchment water balance. This study addresses the gap in the literature of using GIS-based algorithms to model fine-scale spatially distributed potential evapotranspiration on the large catchment systems based on climatological observations and simulations in different climatological zones. Providing fine-scale potential evapotranspiration data is very crucial to assess the dynamical catchment water balance to setup management scenarios for the water abstractions. This study illustrates a transferable systematic method to design GIS-based algorithms to simulate spatially distributed potential evapotranspiration on the large catchment systems. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. North Atlantic weather regimes: A synoptic study of phase space. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Orrhede, Anna Karin

    1990-01-01

    In the phase space of weather, low frequency variability (LFV) of the atmosphere can be captured in a large scale subspace, where a trajectory connects consecutive large scale weather maps, thus revealing flow changes and recurrences. Using this approach, Vautard applied the trajectory speed minimization method (Vautard and Legras) to atmospheric data. From 37 winters of 700 mb geopotential height anomalies over the North Atlantic and the adjacent land masses, four persistent and recurrent weather patterns, interpreted as weather regimes, were discernable: a blocking regime, a zonal regime, a Greenland anticyclone regime, and an Atlantic regime. These regimes are studied further in terms of maintenance and transitions. A regime survey unveils preferences regarding event durations and precursors for the onset or break of an event. The transition frequencies between regimes vary, and together with the transition times, suggest the existence of easier transition routes. These matters are more systematically studied using complete synoptic map sequences from a number of events.

  10. A systematic review of interventions for anxiety, depression, and PTSD in adult offenders.

    PubMed

    Leigh-Hunt, Nicholas; Perry, Amanda

    2015-06-01

    There is a high prevalence of anxiety and depression in offender populations but with no recent systematic review of interventions to identify what is effective. This systematic review was undertaken to identify randomised controlled trials of pharmacological and non-pharmacological interventions in adult offenders in prison or community settings. A search of five databases identified 14 studies meeting inclusion criteria, which considered the impact of psychological interventions, pharmacological agents, or exercise on levels of depression and anxiety. A narrative synthesis was undertaken and Hedges g effect sizes calculated to allow comparison between studies. Effect sizes for depression interventions ranged from 0.17 to 1.41, for anxiety 0.61 to 0.71 and for posttraumatic stress disorder 0 to 1.41. Cognitive behavioural therapy interventions for the reduction of depression and anxiety in adult offenders appear effective in the short term, though a large-scale trial of sufficient duration is needed to confirm this finding. © The Author(s) 2014.

  11. Similarity spectra analysis of high-performance jet aircraft noise.

    PubMed

    Neilsen, Tracianne B; Gee, Kent L; Wall, Alan T; James, Michael M

    2013-04-01

    Noise measured in the vicinity of an F-22A Raptor has been compared to similarity spectra found previously to represent mixing noise from large-scale and fine-scale turbulent structures in laboratory-scale jet plumes. Comparisons have been made for three engine conditions using ground-based sideline microphones, which covered a large angular aperture. Even though the nozzle geometry is complex and the jet is nonideally expanded, the similarity spectra do agree with large portions of the measured spectra. Toward the sideline, the fine-scale similarity spectrum is used, while the large-scale similarity spectrum provides a good fit to the area of maximum radiation. Combinations of the two similarity spectra are shown to match the data in between those regions. Surprisingly, a combination of the two is also shown to match the data at the farthest aft angle. However, at high frequencies the degree of congruity between the similarity and the measured spectra changes with engine condition and angle. At the higher engine conditions, there is a systematically shallower measured high-frequency slope, with the largest discrepancy occurring in the regions of maximum radiation.

  12. Dark matter, long-range forces, and large-scale structure

    NASA Technical Reports Server (NTRS)

    Gradwohl, Ben-Ami; Frieman, Joshua A.

    1992-01-01

    If the dark matter in galaxies and clusters is nonbaryonic, it can interact with additional long-range fields that are invisible to experimental tests of the equivalence principle. We discuss the astrophysical and cosmological implications of a long-range force coupled only to the dark matter and find rather tight constraints on its strength. If the force is repulsive (attractive), the masses of galaxy groups and clusters (and the mean density of the universe inferred from them) have been systematically underestimated (overestimated). We explore the consequent effects on the two-point correlation function, large-scale velocity flows, and microwave background anisotropies, for models with initial scale-invariant adiabatic perturbations and cold dark matter.

  13. Chromatin as active matter

    NASA Astrophysics Data System (ADS)

    Agrawal, Ankit; Ganai, Nirmalendu; Sengupta, Surajit; Menon, Gautam I.

    2017-01-01

    Active matter models describe a number of biophysical phenomena at the cell and tissue scale. Such models explore the macroscopic consequences of driving specific soft condensed matter systems of biological relevance out of equilibrium through ‘active’ processes. Here, we describe how active matter models can be used to study the large-scale properties of chromosomes contained within the nuclei of human cells in interphase. We show that polymer models for chromosomes that incorporate inhomogeneous activity reproduce many general, yet little understood, features of large-scale nuclear architecture. These include: (i) the spatial separation of gene-rich, low-density euchromatin, predominantly found towards the centre of the nucleus, vis a vis. gene-poor, denser heterochromatin, typically enriched in proximity to the nuclear periphery, (ii) the differential positioning of individual gene-rich and gene-poor chromosomes, (iii) the formation of chromosome territories, as well as (iv), the weak size-dependence of the positions of individual chromosome centres-of-mass relative to the nuclear centre that is seen in some cell types. Such structuring is induced purely by the combination of activity and confinement and is absent in thermal equilibrium. We systematically explore active matter models for chromosomes, discussing how our model can be generalized to study variations in chromosome positioning across different cell types. The approach and model we outline here represent a preliminary attempt towards a quantitative, first-principles description of the large-scale architecture of the cell nucleus.

  14. Pilot study of large-scale production of mutant pigs by ENU mutagenesis

    PubMed Central

    Hai, Tang; Cao, Chunwei; Shang, Haitao; Guo, Weiwei; Mu, Yanshuang; Yang, Shulin; Zhang, Ying; Zheng, Qiantao; Zhang, Tao; Wang, Xianlong; Liu, Yu; Kong, Qingran; Li, Kui; Wang, Dayu; Qi, Meng; Hong, Qianlong; Zhang, Rui; Wang, Xiupeng; Jia, Qitao; Wang, Xiao; Qin, Guosong; Li, Yongshun; Luo, Ailing; Jin, Weiwu; Yao, Jing; Huang, Jiaojiao; Zhang, Hongyong; Li, Menghua; Xie, Xiangmo; Zheng, Xuejuan; Guo, Kenan; Wang, Qinghua; Zhang, Shibin; Li, Liang; Xie, Fei; Zhang, Yu; Weng, Xiaogang; Yin, Zhi; Hu, Kui; Cong, Yimei; Zheng, Peng; Zou, Hailong; Xin, Leilei; Xia, Jihan; Ruan, Jinxue; Li, Hegang; Zhao, Weiming; Yuan, Jing; Liu, Zizhan; Gu, Weiwang; Li, Ming; Wang, Yong; Wang, Hongmei; Yang, Shiming; Liu, Zhonghua; Wei, Hong; Zhao, Jianguo; Zhou, Qi; Meng, Anming

    2017-01-01

    N-ethyl-N-nitrosourea (ENU) mutagenesis is a powerful tool to generate mutants on a large scale efficiently, and to discover genes with novel functions at the whole-genome level in Caenorhabditis elegans, flies, zebrafish and mice, but it has never been tried in large model animals. We describe a successful systematic three-generation ENU mutagenesis screening in pigs with the establishment of the Chinese Swine Mutagenesis Consortium. A total of 6,770 G1 and 6,800 G3 pigs were screened, 36 dominant and 91 recessive novel pig families with various phenotypes were established. The causative mutations in 10 mutant families were further mapped. As examples, the mutation of SOX10 (R109W) in pig causes inner ear malfunctions and mimics human Mondini dysplasia, and upregulated expression of FBXO32 is associated with congenital splay legs. This study demonstrates the feasibility of artificial random mutagenesis in pigs and opens an avenue for generating a reservoir of mutants for agricultural production and biomedical research. DOI: http://dx.doi.org/10.7554/eLife.26248.001 PMID:28639938

  15. Numerical study of anomalous dynamic scaling behaviour of (1+1)-dimensional Das Sarma-Tamborenea model

    NASA Astrophysics Data System (ADS)

    Xun, Zhi-Peng; Tang, Gang; Han, Kui; Hao, Da-Peng; Xia, Hui; Zhou, Wei; Yang, Xi-Quan; Wen, Rong-Ji; Chen, Yu-Ling

    2010-07-01

    In order to discuss the finite-size effect and the anomalous dynamic scaling behaviour of Das Sarma-Tamborenea growth model, the (1+1)-dimensional Das Sarma-Tamborenea model is simulated on a large length scale by using the kinetic Monte-Carlo method. In the simulation, noise reduction technique is used in order to eliminate the crossover effect. Our results show that due to the existence of the finite-size effect, the effective global roughness exponent of the (1+1)-dimensional Das Sarma-Tamborenea model systematically decreases with system size L increasing when L > 256. This finding proves the conjecture by Aarao Reis[Aarao Reis F D A 2004 Phys. Rev. E 70 031607]. In addition, our simulation results also show that the Das Sarma-Tamborenea model in 1+1 dimensions indeed exhibits intrinsic anomalous scaling behaviour.

  16. How arbitrary is language?

    PubMed Central

    Monaghan, Padraic; Shillcock, Richard C.; Christiansen, Morten H.; Kirby, Simon

    2014-01-01

    It is a long established convention that the relationship between sounds and meanings of words is essentially arbitrary—typically the sound of a word gives no hint of its meaning. However, there are numerous reported instances of systematic sound–meaning mappings in language, and this systematicity has been claimed to be important for early language development. In a large-scale corpus analysis of English, we show that sound–meaning mappings are more systematic than would be expected by chance. Furthermore, this systematicity is more pronounced for words involved in the early stages of language acquisition and reduces in later vocabulary development. We propose that the vocabulary is structured to enable systematicity in early language learning to promote language acquisition, while also incorporating arbitrariness for later language in order to facilitate communicative expressivity and efficiency. PMID:25092667

  17. Eating Disorders in Non-Dance Performing Artists: A Systematic Literature Review.

    PubMed

    Kapsetaki, Marianna E; Easmon, Charlie

    2017-12-01

    Previous literature on dancers and athletes has shown a large impact of eating disorders (EDs) on these individuals, but there is limited research on EDs affecting non-dance performing artists (i.e., musicians, actors, etc.). This systematic review aimed to identify and evaluate the literature on EDs in non-dance performing artists. A systematic review of the literature was performed on 24 databases, using search terms related to EDs and non-dance performing artists. All results from the databases were systematically screened for inclusion and exclusion criteria. The initial search returned 86,383 total articles, which after screening and removal of duplicates and irrelevant papers yielded 129 results. After screening the 129 full-text results for eligibility, 10 studies met criteria for inclusion: 6 papers addressed EDs in musicians, and 4 papers addressed EDs in theatre performers. Most studies used questionnaires and body mass index (BMI) as diagnostic tools for EDs. Most were small-scale studies and participants were mostly students. Because of the studies' heterogeneity and varying quality, the results obtained were often contradictory and questionable. Although there has been a lot of literature in dancers, we found relatively few studies associating EDs with other performing artists, and most were inconsistent in their information.

  18. Identification of a basic helix-loop-helix-type transcription regulator gene in Aspergillus oryzae by systematically deleting large chromosomal segments.

    PubMed

    Jin, Feng Jie; Takahashi, Tadashi; Machida, Masayuki; Koyama, Yasuji

    2009-09-01

    We previously developed two methods (loop-out and replacement-type recombination) for generating large-scale chromosomal deletions that can be applied to more effective chromosomal engineering in Aspergillus oryzae. In this study, the replacement-type method is used to systematically delete large chromosomal DNA segments to identify essential and nonessential regions in chromosome 7 (2.93 Mb), which is the smallest A. oryzae chromosome and contains a large number of nonsyntenic blocks. We constructed 12 mutants harboring deletions that spanned 16- to 150-kb segments of chromosome 7 and scored phenotypic changes in the resulting mutants. Among the deletion mutants, strains designated Delta5 and Delta7 displayed clear phenotypic changes involving growth and conidiation. In particular, the Delta5 mutant exhibited vigorous growth and conidiation, potentially beneficial characteristics for certain industrial applications. Further deletion analysis allowed identification of the AO090011000215 gene as the gene responsible for the Delta5 mutant phenotype. The AO090011000215 gene was predicted to encode a helix-loop-helix binding protein belonging to the bHLH family of transcription factors. These results illustrate the potential of the approach for identifying novel functional genes.

  19. Systematic Observations of the Slip-pulse Properties of Large Earthquake Ruptures

    NASA Astrophysics Data System (ADS)

    Melgar, D.; Hayes, G. P.

    2017-12-01

    In earthquake dynamics there are two end member models of rupture: propagating cracks and self-healing pulses. These arise due to different properties of ruptures and have implications for seismic hazard; rupture mode controls near-field strong ground motions. Past studies favor the pulse-like mode of rupture, however, due to a variety of limitations, it has proven difficult to systematically establish their kinematic properties. Here we synthesize observations from a database of >150 rupture models of earthquakes spanning M7-M9 processed in a uniform manner and show the magnitude scaling properties (rise time, pulse width, and peak slip rate) of these slip pulses indicates self-similarity. Self similarity suggests a weak form of rupture determinism, where early on in the source process broader, higher amplitude slip pulses will distinguish between events of icnreasing magnitude. Indeed, we find by analyzing the moment rate functions that large and very large events are statistically distinguishable relatively early (at 15 seconds) in the rupture process. This suggests that with dense regional geophysical networks strong ground motions from a large rupture can be identified before their onset across the source region.

  20. Newspaper coverage of controversies about large-scale swine facilities in rural communities in Illinois.

    PubMed

    Reisner, A E

    2005-11-01

    The building and expansion of large-scale swine facilities has created considerable controversy in many neighboring communities, but to date, no systematic analysis has been done of the types of claims made during these conflicts. This study examined how local newspapers in one state covered the transition from the dominance of smaller, diversified swine operations to large, single-purpose pig production facilities. To look at publicly made statements concerning large-scale swine facilities (LSSF), the study collected all articles related to LSSF from 22 daily Illinois newspapers over a 3-yr period (a total of 1,737 articles). The most frequent sets of claims used by proponents of LSSF were that the environment was not harmed, that state regulations were sufficiently strict, and that the state economically needed this type of agriculture. The most frequent claims made by opponents were that LSSF harmed the environment and neighboring communities and that stricter regulations were needed. Proponents' claims were primarily defensive and, to some degree, underplayed the advantages of LSSF. Pro-and anti-LSSF groups were talking at cross-purposes, to some degree. Even across similar themes, those in favor of LSSF and those opposed were addressing different sets of concerns. The newspaper claims did not indicate any effective alliances forming between local anti-LSSF groups and national environmental or animal rights groups.

  1. A Systematic Review of Biomarkers and Risk of Incident Type 2 Diabetes: An Overview of Epidemiological, Prediction and Aetiological Research Literature

    PubMed Central

    Sahlqvist, Anna-Stina; Lotta, Luca; Brosnan, Julia M.; Vollenweider, Peter; Giabbanelli, Philippe; Nunez, Derek J.; Waterworth, Dawn; Scott, Robert A.; Langenberg, Claudia; Wareham, Nicholas J.

    2016-01-01

    Background Blood-based or urinary biomarkers may play a role in quantifying the future risk of type 2 diabetes (T2D) and in understanding possible aetiological pathways to disease. However, no systematic review has been conducted that has identified and provided an overview of available biomarkers for incident T2D. We aimed to systematically review the associations of biomarkers with risk of developing T2D and to highlight evidence gaps in the existing literature regarding the predictive and aetiological value of these biomarkers and to direct future research in this field. Methods and Findings We systematically searched PubMed MEDLINE (January 2000 until March 2015) and Embase (until January 2016) databases for observational studies of biomarkers and incident T2D according to the 2009 PRISMA guidelines. We also searched availability of meta-analyses, Mendelian randomisation and prediction research for the identified biomarkers. We reviewed 3910 titles (705 abstracts) and 164 full papers and included 139 papers from 69 cohort studies that described the prospective relationships between 167 blood-based or urinary biomarkers and incident T2D. Only 35 biomarkers were reported in large scale studies with more than 1000 T2D cases, and thus the evidence for association was inconclusive for the majority of biomarkers. Fourteen biomarkers have been investigated using Mendelian randomisation approaches. Only for one biomarker was there strong observational evidence of association and evidence from genetic association studies that was compatible with an underlying causal association. In additional search for T2D prediction, we found only half of biomarkers were examined with formal evidence of predictive value for a minority of these biomarkers. Most biomarkers did not enhance the strength of prediction, but the strongest evidence for prediction was for biomarkers that quantify measures of glycaemia. Conclusions This study presents an extensive review of the current state of the literature to inform the strategy for future interrogation of existing and newly described biomarkers for T2D. Many biomarkers have been reported to be associated with the risk of developing T2D. The evidence of their value in adding to understanding of causal pathways to disease is very limited so far. The utility of most biomarkers remains largely unknown in clinical prediction. Future research should focus on providing good genetic instruments across consortia for possible biomarkers in Mendelian randomisation, prioritising biomarkers for measurement in large-scale cohort studies and examining predictive utility of biomarkers for a given context. PMID:27788146

  2. Large-scale atomistic simulations demonstrate dominant alloy disorder effects in GaBixAs1 -x/GaAs multiple quantum wells

    NASA Astrophysics Data System (ADS)

    Usman, Muhammad

    2018-04-01

    Bismide semiconductor materials and heterostructures are considered a promising candidate for the design and implementation of photonic, thermoelectric, photovoltaic, and spintronic devices. This work presents a detailed theoretical study of the electronic and optical properties of strongly coupled GaBixAs1 -x /GaAs multiple quantum well (MQW) structures. Based on a systematic set of large-scale atomistic tight-binding calculations, our results reveal that the impact of atomic-scale fluctuations in alloy composition is stronger than the interwell coupling effect, and plays an important role in the electronic and optical properties of the investigated MQW structures. Independent of QW geometry parameters, alloy disorder leads to a strong confinement of charge carriers, a large broadening of the hole energies, and a red-shift in the ground-state transition wavelength. Polarization-resolved optical transition strengths exhibit a striking effect of disorder, where the inhomogeneous broadening could exceed an order of magnitude for MQWs, in comparison to a factor of about 3 for single QWs. The strong influence of alloy disorder effects persists when small variations in the size and composition of MQWs typically expected in a realistic experimental environment are considered. The presented results highlight the limited scope of continuum methods and emphasize on the need for large-scale atomistic approaches to design devices with tailored functionalities based on the novel properties of bismide materials.

  3. Planetesimal Formation through the Streaming Instability

    NASA Astrophysics Data System (ADS)

    Yang, Chao-Chin; Johansen, Anders; Schäfer, Urs

    2015-12-01

    The streaming instability is a promising mechanism to circumvent the barriers in direct dust growth and lead to the formation of planetesimals, as demonstrated by many previous studies. In order to resolve the thin layer of solids, however, most of these studies were focused on a local region of a protoplanetary disk with a limited simulation domain. It remains uncertain how the streaming instability is affected by the disk gas on large scales, and models that have sufficient dynamical range to capture both the thin particle layer and the large-scale disk dynamics are required.We hereby systematically push the limits of the computational domain up to more than the gas scale height, and study the particle-gas interaction on large scales in the saturated state of the streaming instability and the initial mass function of the resulting planetesimals. To overcome the numerical challenges posed by this kind of models, we have developed a new technique to simultaneously relieve the stringent time step constraints due to small-sized particles and strong local solid concentrations. Using these models, we demonstrate that the streaming instability can drive multiple radial, filamentary concentrations of solids, implying that planetesimals are born in well separated belt-like structures. We also find that the initial mass function of planetesimals via the streaming instability has a characteristic exponential form, which is robust against computational domain as well as resolution. These findings will help us further constrain the cosmochemical history of the Solar system as well as the planet formation theory in general.

  4. Inflationary magnetogenesis without the strong coupling problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira, Ricardo J.Z.; Jain, Rajeev Kumar; Sloth, Martin S., E-mail: ferreira@cp3.dias.sdu.dk, E-mail: jain@cp3.dias.sdu.dk, E-mail: sloth@cp3.dias.sdu.dk

    2013-10-01

    The simplest gauge invariant models of inflationary magnetogenesis are known to suffer from the problems of either large backreaction or strong coupling, which make it difficult to self-consistently achieve cosmic magnetic fields from inflation with a field strength larger than 10{sup −32}G today on the Mpc scale. Such a strength is insufficient to act as seed for the galactic dynamo effect, which requires a magnetic field larger than 10{sup −20}G. In this paper we analyze simple extensions of the minimal model, which avoid both the strong coupling and back reaction problems, in order to generate sufficiently large magnetic fields onmore » the Mpc scale today. First we study the possibility that the coupling function which breaks the conformal invariance of electromagnetism is non-monotonic with sharp features. Subsequently, we consider the effect of lowering the energy scale of inflation jointly with a scenario of prolonged reheating where the universe is dominated by a stiff fluid for a short period after inflation. In the latter case, a systematic study shows upper bounds for the magnetic field strength today on the Mpc scale of 10{sup −13}G for low scale inflation and 10{sup −25}G for high scale inflation, thus improving on the previous result by 7-19 orders of magnitude. These results are consistent with the strong coupling and backreaction constraints.« less

  5. Initial eccentricity and constituent quark number scaling of elliptic flow in ideal and viscous dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chaudhuri, A. K.

    2010-04-15

    In the Israel-Stewart theory of dissipative hydrodynamics, the scaling properties of elliptic flow in Au+Au collisions are studied. The initial energy density of the fluid was fixed to reproduce STAR data on phi-meson multiplicity in 0-5% Au+Au collisions such that, irrespective of fluid viscosity, entropy at the freeze-out is similar in ideal or in viscous evolution. The initial eccentricity or constituent quark number scaling is only approximate in ideal or minimally viscous (eta/s=1/4pi) fluid. Eccentricity scaling becomes nearly exact in more viscous fluid (eta/s>=0.12). However, in more viscous fluid, constituent quark number scaled elliptic flow for mesons and baryons splitsmore » into separate scaling functions. Simulated flows also do not exhibit 'universal scaling'; that is, elliptic flow scaled by the constituent quark number and charged particles v{sub 2} is not a single function of transverse kinetic energy scaled by the quark number. From a study of the violation of universal scaling, we obtain an estimate of quark-gluon plasma viscosity, eta/s=0.12+-0.03. The error is statistical only. The systematic error in eta/s could be as large.« less

  6. Measuring discharge with ADCPs: Inferences from synthetic velocity profiles

    USGS Publications Warehouse

    Rehmann, C.R.; Mueller, D.S.; Oberg, K.A.

    2009-01-01

    Synthetic velocity profiles are used to determine guidelines for sampling discharge with acoustic Doppler current profilers (ADCPs). The analysis allows the effects of instrument characteristics, sampling parameters, and properties of the flow to be studied systematically. For mid-section measurements, the averaging time required for a single profile measurement always exceeded the 40 s usually recommended for velocity measurements, and it increased with increasing sample interval and increasing time scale of the large eddies. Similarly, simulations of transect measurements show that discharge error decreases as the number of large eddies sampled increases. The simulations allow sampling criteria that account for the physics of the flow to be developed. ?? 2009 ASCE.

  7. Galaxy bias and primordial non-Gaussianity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Assassi, Valentin; Baumann, Daniel; Schmidt, Fabian, E-mail: assassi@ias.edu, E-mail: D.D.Baumann@uva.nl, E-mail: fabians@MPA-Garching.MPG.DE

    2015-12-01

    We present a systematic study of galaxy biasing in the presence of primordial non-Gaussianity. For a large class of non-Gaussian initial conditions, we define a general bias expansion and prove that it is closed under renormalization, thereby showing that the basis of operators in the expansion is complete. We then study the effects of primordial non-Gaussianity on the statistics of galaxies. We show that the equivalence principle enforces a relation between the scale-dependent bias in the galaxy power spectrum and that in the dipolar part of the bispectrum. This provides a powerful consistency check to confirm the primordial origin ofmore » any observed scale-dependent bias. Finally, we also discuss the imprints of anisotropic non-Gaussianity as motivated by recent studies of higher-spin fields during inflation.« less

  8. Spatial Variability of Snowpack Properties On Small Slopes

    NASA Astrophysics Data System (ADS)

    Pielmeier, C.; Kronholm, K.; Schneebeli, M.; Schweizer, J.

    The spatial variability of alpine snowpacks is created by a variety of parameters like deposition, wind erosion, sublimation, melting, temperature, radiation and metamor- phism of the snow. Spatial variability is thought to strongly control the avalanche initi- ation and failure propagation processes. Local snowpack measurements are currently the basis for avalanche warning services and there exist contradicting hypotheses about the spatial continuity of avalanche active snow layers and interfaces. Very little about the spatial variability of the snowpack is known so far, therefore we have devel- oped a systematic and objective method to measure the spatial variability of snowpack properties, layering and its relation to stability. For a complete coverage, the analysis of the spatial variability has to entail all scales from mm to km. In this study the small to medium scale spatial variability is investigated, i.e. the range from centimeters to tenths of meters. During the winter 2000/2001 we took systematic measurements in lines and grids on a flat snow test field with grid distances from 5 cm to 0.5 m. Fur- thermore, we measured systematic grids with grid distances between 0.5 m and 2 m in undisturbed flat fields and on small slopes above the tree line at the Choerbschhorn, in the region of Davos, Switzerland. On 13 days we measured the spatial pattern of the snowpack stratigraphy with more than 110 snow micro penetrometer measure- ments at slopes and flat fields. Within this measuring grid we placed 1 rutschblock and 12 stuffblock tests to measure the stability of the snowpack. With the large num- ber of measurements we are able to use geostatistical methods to analyse the spatial variability of the snowpack. Typical correlation lengths are calculated from semivari- ograms. Discerning the systematic trends from random spatial variability is analysed using statistical models. Scale dependencies are shown and recurring scaling patterns are outlined. The importance of the small and medium scale spatial variability for the larger (kilometer) scale spatial variability as well as for the avalanche formation are discussed. Finally, an outlook on spatial models for the snowpack variability is given.

  9. Assessing Hydrological and Energy Budgets in Amazonia through Regional Downscaling, and Comparisons with Global Reanalysis Products

    NASA Astrophysics Data System (ADS)

    Nunes, A.; Ivanov, V. Y.

    2014-12-01

    Although current global reanalyses provide reasonably accurate large-scale features of the atmosphere, systematic errors are still found in the hydrological and energy budgets of such products. In the tropics, precipitation is particularly challenging to model, which is also adversely affected by the scarcity of hydrometeorological datasets in the region. With the goal of producing downscaled analyses that are appropriate for a climate assessment at regional scales, a regional spectral model has used a combination of precipitation assimilation with scale-selective bias correction. The latter is similar to the spectral nudging technique, which prevents the departure of the regional model's internal states from the large-scale forcing. The target area in this study is the Amazon region, where large errors are detected in reanalysis precipitation. To generate the downscaled analysis, the regional climate model used NCEP/DOE R2 global reanalysis as the initial and lateral boundary conditions, and assimilated NOAA's Climate Prediction Center (CPC) MORPHed precipitation (CMORPH), available at 0.25-degree resolution, every 3 hours. The regional model's precipitation was successfully brought closer to the observations, in comparison to the NCEP global reanalysis products, as a result of the impact of a precipitation assimilation scheme on cumulus-convection parameterization, and improved boundary forcing achieved through a new version of scale-selective bias correction. Water and energy budget terms were also evaluated against global reanalyses and other datasets.

  10. Alexithymia in eating disorders: Systematic review and meta-analyses of studies using the Toronto Alexithymia Scale.

    PubMed

    Westwood, Heather; Kerr-Gaffney, Jess; Stahl, Daniel; Tchanturia, Kate

    2017-08-01

    The aim of this review was to synthesise the literature on the use of the Toronto Alexithymia Scale (TAS) in eating disorder populations and Healthy Controls (HCs) and to compare TAS scores in these groups. Electronic databases were searched systematically for studies using the TAS and meta-analyses were performed to statistically compare scores on the TAS between individuals with eating disorders and HCs. Forty-eight studies using the TAS with both a clinical eating disorder group and HCs were identified. Of these, 44 were included in the meta-analyses, separated into: Anorexia Nervosa; Anorexia Nervosa, Restricting subtype; Anorexia Nervosa, Binge-Purge subtype, Bulimia Nervosa and Binge Eating Disorder. For all groups, there were significant differences with medium or large effect sizes between the clinical group and HCs, with the clinical group scoring significantly higher on the TAS, indicating greater difficulty with identifying and labelling emotions. Across the spectrum of eating disorders, individuals report having difficulties recognising or describing their emotions. Given the self-report design of the TAS, research to develop and evaluate treatments and clinician-administered assessments of alexithymia is warranted. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Scaling Theory of Entanglement at the Many-Body Localization Transition.

    PubMed

    Dumitrescu, Philipp T; Vasseur, Romain; Potter, Andrew C

    2017-09-15

    We study the universal properties of eigenstate entanglement entropy across the transition between many-body localized (MBL) and thermal phases. We develop an improved real space renormalization group approach that enables numerical simulation of large system sizes and systematic extrapolation to the infinite system size limit. For systems smaller than the correlation length, the average entanglement follows a subthermal volume law, whose coefficient is a universal scaling function. The full distribution of entanglement follows a universal scaling form, and exhibits a bimodal structure that produces universal subleading power-law corrections to the leading volume law. For systems larger than the correlation length, the short interval entanglement exhibits a discontinuous jump at the transition from fully thermal volume law on the thermal side, to pure area law on the MBL side.

  12. Physical consistency of subgrid-scale models for large-eddy simulation of incompressible turbulent flows

    NASA Astrophysics Data System (ADS)

    Silvis, Maurits H.; Remmerswaal, Ronald A.; Verstappen, Roel

    2017-01-01

    We study the construction of subgrid-scale models for large-eddy simulation of incompressible turbulent flows. In particular, we aim to consolidate a systematic approach of constructing subgrid-scale models, based on the idea that it is desirable that subgrid-scale models are consistent with the mathematical and physical properties of the Navier-Stokes equations and the turbulent stresses. To that end, we first discuss in detail the symmetries of the Navier-Stokes equations, and the near-wall scaling behavior, realizability and dissipation properties of the turbulent stresses. We furthermore summarize the requirements that subgrid-scale models have to satisfy in order to preserve these important mathematical and physical properties. In this fashion, a framework of model constraints arises that we apply to analyze the behavior of a number of existing subgrid-scale models that are based on the local velocity gradient. We show that these subgrid-scale models do not satisfy all the desired properties, after which we explain that this is partly due to incompatibilities between model constraints and limitations of velocity-gradient-based subgrid-scale models. However, we also reason that the current framework shows that there is room for improvement in the properties and, hence, the behavior of existing subgrid-scale models. We furthermore show how compatible model constraints can be combined to construct new subgrid-scale models that have desirable properties built into them. We provide a few examples of such new models, of which a new model of eddy viscosity type, that is based on the vortex stretching magnitude, is successfully tested in large-eddy simulations of decaying homogeneous isotropic turbulence and turbulent plane-channel flow.

  13. Are recreational SCUBA divers with asthma at increased risk?

    PubMed

    Ustrup, Amalie S; Ulrik, Charlotte S

    2017-10-01

    Asthma has traditionally been regarded as a contraindication to self-contained underwater breathing apparatus (SCUBA) diving, although large numbers of patients with asthma dive. The aim of the review is to provide an update on current knowledge on potential disease-related hazards in SCUBA divers with asthma. Systematic literature review based on the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. Seven studies met the criteria for inclusion in the review (comprising a total of 560 subjects). Five studies reported an increased risk for developing diving-related injuries in divers with asthma, based on case reports (n = 1), case history combined with objective assessment (n = 1), and dives and/or simulated dives (n = 3). The remaining studies (n = 2) were based on self-reported diving habits in divers suffering from asthma, obtained from anonymous questionnaires in diving magazines, reported no diving-related injuries among respondents. Due to limited evidence it is difficult to draw valid conclusions, but there are indications that recreational divers with asthma may be at increased risk for diving-related injuries compared to non-asthmatic divers. However, it is of at most importance to obtain further evidence from large-scale, well-designed studies.

  14. A Systematic Evaluation of Blood Serum and Plasma Pre-Analytics for Metabolomics Cohort Studies

    PubMed Central

    Jobard, Elodie; Trédan, Olivier; Postoly, Déborah; André, Fabrice; Martin, Anne-Laure; Elena-Herrmann, Bénédicte; Boyault, Sandrine

    2016-01-01

    The recent thriving development of biobanks and associated high-throughput phenotyping studies requires the elaboration of large-scale approaches for monitoring biological sample quality and compliance with standard protocols. We present a metabolomic investigation of human blood samples that delineates pitfalls and guidelines for the collection, storage and handling procedures for serum and plasma. A series of eight pre-processing technical parameters is systematically investigated along variable ranges commonly encountered across clinical studies. While metabolic fingerprints, as assessed by nuclear magnetic resonance, are not significantly affected by altered centrifugation parameters or delays between sample pre-processing (blood centrifugation) and storage, our metabolomic investigation highlights that both the delay and storage temperature between blood draw and centrifugation are the primary parameters impacting serum and plasma metabolic profiles. Storing the blood drawn at 4 °C is shown to be a reliable routine to confine variability associated with idle time prior to sample pre-processing. Based on their fine sensitivity to pre-analytical parameters and protocol variations, metabolic fingerprints could be exploited as valuable ways to determine compliance with standard procedures and quality assessment of blood samples within large multi-omic clinical and translational cohort studies. PMID:27929400

  15. Large scale nanoparticle screening for small molecule analysis in laser desorption ionization mass spectrometry

    DOE PAGES

    Yagnik, Gargey B.; Hansen, Rebecca L.; Korte, Andrew R.; ...

    2016-08-30

    Nanoparticles (NPs) have been suggested as efficient matrixes for small molecule profiling and imaging by laser-desorption ionization mass spectrometry (LDI-MS), but so far there has been no systematic study comparing different NPs in the analysis of various classes of small molecules. Here, we present a large scale screening of 13 NPs for the analysis of two dozen small metabolite molecules. Many NPs showed much higher LDI efficiency than organic matrixes in positive mode and some NPs showed comparable efficiencies for selected analytes in negative mode. Our results suggest that a thermally driven desorption process is a key factor for metalmore » oxide NPs, but chemical interactions are also very important, especially for other NPs. Furthermore, the screening results provide a useful guideline for the selection of NPs in the LDI-MS analysis of small molecules.« less

  16. Large scale nanoparticle screening for small molecule analysis in laser desorption ionization mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yagnik, Gargey B.; Hansen, Rebecca L.; Korte, Andrew R.

    Nanoparticles (NPs) have been suggested as efficient matrixes for small molecule profiling and imaging by laser-desorption ionization mass spectrometry (LDI-MS), but so far there has been no systematic study comparing different NPs in the analysis of various classes of small molecules. Here, we present a large scale screening of 13 NPs for the analysis of two dozen small metabolite molecules. Many NPs showed much higher LDI efficiency than organic matrixes in positive mode and some NPs showed comparable efficiencies for selected analytes in negative mode. Our results suggest that a thermally driven desorption process is a key factor for metalmore » oxide NPs, but chemical interactions are also very important, especially for other NPs. Furthermore, the screening results provide a useful guideline for the selection of NPs in the LDI-MS analysis of small molecules.« less

  17. Control of fluxes in metabolic networks

    PubMed Central

    Basler, Georg; Nikoloski, Zoran; Larhlimi, Abdelhalim; Barabási, Albert-László; Liu, Yang-Yu

    2016-01-01

    Understanding the control of large-scale metabolic networks is central to biology and medicine. However, existing approaches either require specifying a cellular objective or can only be used for small networks. We introduce new coupling types describing the relations between reaction activities, and develop an efficient computational framework, which does not require any cellular objective for systematic studies of large-scale metabolism. We identify the driver reactions facilitating control of 23 metabolic networks from all kingdoms of life. We find that unicellular organisms require a smaller degree of control than multicellular organisms. Driver reactions are under complex cellular regulation in Escherichia coli, indicating their preeminent role in facilitating cellular control. In human cancer cells, driver reactions play pivotal roles in malignancy and represent potential therapeutic targets. The developed framework helps us gain insights into regulatory principles of diseases and facilitates design of engineering strategies at the interface of gene regulation, signaling, and metabolism. PMID:27197218

  18. Quantitative Analysis of Tissue Samples by Combining iTRAQ Isobaric Labeling with Selected/Multiple Reaction Monitoring (SRM/MRM).

    PubMed

    Narumi, Ryohei; Tomonaga, Takeshi

    2016-01-01

    Mass spectrometry-based phosphoproteomics is an indispensible technique used in the discovery and quantification of phosphorylation events on proteins in biological samples. The application of this technique to tissue samples is especially useful for the discovery of biomarkers as well as biological studies. We herein describe the application of a large-scale phosphoproteome analysis and SRM/MRM-based quantitation to develop a strategy for the systematic discovery and validation of biomarkers using tissue samples.

  19. Excess electron localization in solvated DNA bases.

    PubMed

    Smyth, Maeve; Kohanoff, Jorge

    2011-06-10

    We present a first-principles molecular dynamics study of an excess electron in condensed phase models of solvated DNA bases. Calculations on increasingly large microsolvated clusters taken from liquid phase simulations show that adiabatic electron affinities increase systematically upon solvation, as for optimized gas-phase geometries. Dynamical simulations after vertical attachment indicate that the excess electron, which is initially found delocalized, localizes around the nucleobases within a 15 fs time scale. This transition requires small rearrangements in the geometry of the bases.

  20. Excess Electron Localization in Solvated DNA Bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smyth, Maeve; Kohanoff, Jorge

    2011-06-10

    We present a first-principles molecular dynamics study of an excess electron in condensed phase models of solvated DNA bases. Calculations on increasingly large microsolvated clusters taken from liquid phase simulations show that adiabatic electron affinities increase systematically upon solvation, as for optimized gas-phase geometries. Dynamical simulations after vertical attachment indicate that the excess electron, which is initially found delocalized, localizes around the nucleobases within a 15 fs time scale. This transition requires small rearrangements in the geometry of the bases.

  1. Bias towards dementia: are hip fracture trials excluding too many patients? A systematic review.

    PubMed

    Hebert-Davies, Jonah; Laflamme, G-Yves; Rouleau, Dominique

    2012-12-01

    Patients with hip fractures are older and often present many co-morbidities, including dementia. These patients cannot answer quality of life questionnaires and are generally excluded from trials. We hypothesized that a significant number of patients are being excluded from these studies and this may impact outcomes. This was a two part study; the first analyzing databases of two ongoing large-scale multi-centred hip fracture trials and the second being a systematic review. The FAITH and HEALTH studies were analyzed for exclusion incidence directly related to dementia. The second part consisted of a systematic search of all relevant studies within the last 20 years. In the FAITH study, a total of 1690 subjects were excluded, 375 (22.2%) of which were due to dementia or cognitive impairment. In the HEALTH study, 575 were excluded with dementia/cognitive impairment representing 207 patients (36%). Following the systematic review, 251 articles were identified 17 of which were retained. The overall prevalence of dementia was 27.9% (range 2-51%). Only two studies compared demented and non-demented groups. In these studies significant increases in both mortality and complications were found. In summary, when investigating hip fractures, choosing appropriate objective endpoints is essential to ensure results are also applicable to patients with dementia. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. A Methodology for Integrated, Multiregional Life Cycle Assessment Scenarios under Large-Scale Technological Change.

    PubMed

    Gibon, Thomas; Wood, Richard; Arvesen, Anders; Bergesen, Joseph D; Suh, Sangwon; Hertwich, Edgar G

    2015-09-15

    Climate change mitigation demands large-scale technological change on a global level and, if successfully implemented, will significantly affect how products and services are produced and consumed. In order to anticipate the life cycle environmental impacts of products under climate mitigation scenarios, we present the modeling framework of an integrated hybrid life cycle assessment model covering nine world regions. Life cycle assessment databases and multiregional input-output tables are adapted using forecasted changes in technology and resources up to 2050 under a 2 °C scenario. We call the result of this modeling "technology hybridized environmental-economic model with integrated scenarios" (THEMIS). As a case study, we apply THEMIS in an integrated environmental assessment of concentrating solar power. Life-cycle greenhouse gas emissions for this plant range from 33 to 95 g CO2 eq./kWh across different world regions in 2010, falling to 30-87 g CO2 eq./kWh in 2050. Using regional life cycle data yields insightful results. More generally, these results also highlight the need for systematic life cycle frameworks that capture the actual consequences and feedback effects of large-scale policies in the long term.

  3. Accuracy of the Alberta Infant Motor Scale (AIMS) to detect developmental delay of gross motor skills in preterm infants: a systematic review.

    PubMed

    de Albuquerque, Plínio Luna; Lemos, Andrea; Guerra, Miriam Queiroz de Farias; Eickmann, Sophie Helena

    2015-02-01

    To assess, through a systematic review, the ability of Alberta Infant Motor Scale (AIMS) to diagnose delayed motor development in preterm infants. Systematic searches identified five studies meeting inclusion criteria. These studies were evaluated in terms of: participants' characteristics, main results and risk of bias. The risk of bias was assessed with the Quality Assessment of Diagnostic Accuracy Studies--second edition (QUADAS-2). All five studies included a high risk of bias in at least one of the assessed fields. The most frequent biases included were presented in patient selection and lost follow up. All studies used the Pearson correlation coefficient to assess the diagnostic capability of the Alberta Infant Motor Scale. None of the assessed studies used psychometric measures to analyze the data. Given the evidence, the research supporting the ability of Alberta Infant Motor Scale to diagnose delayed motor development in preterm infants presents limitations. Further studies are suggested in order to avoid the above-mentioned biases to assess the Alberta Infant Motor Scale accuracy in preterm babies.

  4. Joint analysis of galaxy-galaxy lensing and galaxy clustering: Methodology and forecasts for Dark Energy Survey

    DOE PAGES

    Park, Y.; Krause, E.; Dodelson, S.; ...

    2016-09-30

    The joint analysis of galaxy-galaxy lensing and galaxy clustering is a promising method for inferring the growth function of large scale structure. Our analysis will be carried out on data from the Dark Energy Survey (DES), with its measurements of both the distribution of galaxies and the tangential shears of background galaxies induced by these foreground lenses. We develop a practical approach to modeling the assumptions and systematic effects affecting small scale lensing, which provides halo masses, and large scale galaxy clustering. Introducing parameters that characterize the halo occupation distribution (HOD), photometric redshift uncertainties, and shear measurement errors, we studymore » how external priors on different subsets of these parameters affect our growth constraints. Degeneracies within the HOD model, as well as between the HOD and the growth function, are identified as the dominant source of complication, with other systematic effects sub-dominant. The impact of HOD parameters and their degeneracies necessitate the detailed joint modeling of the galaxy sample that we employ. Finally, we conclude that DES data will provide powerful constraints on the evolution of structure growth in the universe, conservatively/optimistically constraining the growth function to 7.9%/4.8% with its first-year data that covered over 1000 square degrees, and to 3.9%/2.3% with its full five-year data that will survey 5000 square degrees, including both statistical and systematic uncertainties.« less

  5. Joint analysis of galaxy-galaxy lensing and galaxy clustering: Methodology and forecasts for Dark Energy Survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.; Krause, E.; Dodelson, S.

    The joint analysis of galaxy-galaxy lensing and galaxy clustering is a promising method for inferring the growth function of large scale structure. Our analysis will be carried out on data from the Dark Energy Survey (DES), with its measurements of both the distribution of galaxies and the tangential shears of background galaxies induced by these foreground lenses. We develop a practical approach to modeling the assumptions and systematic effects affecting small scale lensing, which provides halo masses, and large scale galaxy clustering. Introducing parameters that characterize the halo occupation distribution (HOD), photometric redshift uncertainties, and shear measurement errors, we studymore » how external priors on different subsets of these parameters affect our growth constraints. Degeneracies within the HOD model, as well as between the HOD and the growth function, are identified as the dominant source of complication, with other systematic effects sub-dominant. The impact of HOD parameters and their degeneracies necessitate the detailed joint modeling of the galaxy sample that we employ. Finally, we conclude that DES data will provide powerful constraints on the evolution of structure growth in the universe, conservatively/optimistically constraining the growth function to 7.9%/4.8% with its first-year data that covered over 1000 square degrees, and to 3.9%/2.3% with its full five-year data that will survey 5000 square degrees, including both statistical and systematic uncertainties.« less

  6. Effectiveness of Virtual Reality in Children With Cerebral Palsy: A Systematic Review and Meta-Analysis of Randomized Controlled Trials.

    PubMed

    Chen, Yuping; Fanchiang, HsinChen D; Howard, Ayanna

    2018-01-01

    Researchers recently investigated the effectiveness of virtual reality (VR) in helping children with cerebral palsy (CP) to improve motor function. A systematic review of randomized controlled trials (RCTs) using a meta-analytic method to examine the effectiveness of VR in children with CP was thus needed. The purpose of this study was to update the current evidence about VR by systematically examining the research literature. A systematic literature search of PubMed, CINAHL, Cochrane Central Register of Controlled Trials, ERIC, PsycINFO, and Web of Science up to December 2016 was conducted. Studies with an RCT design, children with CP, comparisons of VR with other interventions, and movement-related outcomes were included. A template was created to systematically code the demographic, methodological, and miscellaneous variables of each RCT. The Physiotherapy Evidence Database (PEDro) scale was used to evaluate the study quality. Effect size was computed and combined using meta-analysis software. Moderator analyses were also used to explain the heterogeneity of the effect sizes in all RCTs. . The literature search yielded 19 RCT studies with fair to good methodological quality. Overall, VR provided a large effect size (d = 0.861) when compared with other interventions. A large effect of VR on arm function (d = 0.835) and postural control (d = 1.003) and a medium effect on ambulation (d = 0.755) were also found. Only the VR type affected the overall VR effect: an engineer-built system was more effective than a commercial system. The RCTs included in this study were of fair to good quality, had a high level of heterogeneity and small sample sizes, and used various intervention protocols. Then compared with other interventions, VR seems to be an effective intervention for improving motor function in children with CP. © 2017 American Physical Therapy Association

  7. Topology of Large-Scale Structures of Galaxies in two Dimensions—Systematic Effects

    NASA Astrophysics Data System (ADS)

    Appleby, Stephen; Park, Changbom; Hong, Sungwook E.; Kim, Juhan

    2017-02-01

    We study the two-dimensional topology of galactic distribution when projected onto two-dimensional spherical shells. Using the latest Horizon Run 4 simulation data, we construct the genus of the two-dimensional field and consider how this statistic is affected by late-time nonlinear effects—principally gravitational collapse and redshift space distortion (RSD). We also consider systematic and numerical artifacts, such as shot noise, galaxy bias, and finite pixel effects. We model the systematics using a Hermite polynomial expansion and perform a comprehensive analysis of known effects on the two-dimensional genus, with a view toward using the statistic for cosmological parameter estimation. We find that the finite pixel effect is dominated by an amplitude drop and can be made less than 1% by adopting pixels smaller than 1/3 of the angular smoothing length. Nonlinear gravitational evolution introduces time-dependent coefficients of the zeroth, first, and second Hermite polynomials, but the genus amplitude changes by less than 1% between z = 1 and z = 0 for smoothing scales {R}{{G}}> 9 {Mpc}/{{h}}. Non-zero terms are measured up to third order in the Hermite polynomial expansion when studying RSD. Differences in the shapes of the genus curves in real and redshift space are small when we adopt thick redshift shells, but the amplitude change remains a significant ˜ { O }(10 % ) effect. The combined effects of galaxy biasing and shot noise produce systematic effects up to the second Hermite polynomial. It is shown that, when sampling, the use of galaxy mass cuts significantly reduces the effect of shot noise relative to random sampling.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abolhasani, Ali Akbar; School of Physics, Institute for Research in Fundamental Sciences; Mirbabayi, Mehrdad

    A perturbative description of Large Scale Structure is a cornerstone of our understanding of the observed distribution of matter in the universe. Renormalization is an essential and defining step to make this description physical and predictive. Here we introduce a systematic renormalization procedure, which neatly associates counterterms to the UV-sensitive diagrams order by order, as it is commonly done in quantum field theory. As a concrete example, we renormalize the one-loop power spectrum and bispectrum of both density and velocity. In addition, we present a series of results that are valid to all orders in perturbation theory. First, we showmore » that while systematic renormalization requires temporally non-local counterterms, in practice one can use an equivalent basis made of local operators. We give an explicit prescription to generate all counterterms allowed by the symmetries. Second, we present a formal proof of the well-known general argument that the contribution of short distance perturbations to large scale density contrast δ and momentum density π(k) scale as k{sup 2} and k, respectively. Third, we demonstrate that the common practice of introducing counterterms only in the Euler equation when one is interested in correlators of δ is indeed valid to all orders.« less

  9. Search for subgrid scale parameterization by projection pursuit regression

    NASA Technical Reports Server (NTRS)

    Meneveau, C.; Lund, T. S.; Moin, Parviz

    1992-01-01

    The dependence of subgrid-scale stresses on variables of the resolved field is studied using direct numerical simulations of isotropic turbulence, homogeneous shear flow, and channel flow. The projection pursuit algorithm, a promising new regression tool for high-dimensional data, is used to systematically search through a large collection of resolved variables, such as components of the strain rate, vorticity, velocity gradients at neighboring grid points, etc. For the case of isotropic turbulence, the search algorithm recovers the linear dependence on the rate of strain (which is necessary to transfer energy to subgrid scales) but is unable to determine any other more complex relationship. For shear flows, however, new systematic relations beyond eddy viscosity are found. For the homogeneous shear flow, the results suggest that products of the mean rotation rate tensor with both the fluctuating strain rate and fluctuating rotation rate tensors are important quantities in parameterizing the subgrid-scale stresses. A model incorporating these terms is proposed. When evaluated with direct numerical simulation data, this model significantly increases the correlation between the modeled and exact stresses, as compared with the Smagorinsky model. In the case of channel flow, the stresses are found to correlate with products of the fluctuating strain and rotation rate tensors. The mean rates of rotation or strain do not appear to be important in this case, and the model determined for homogeneous shear flow does not perform well when tested with channel flow data. Many questions remain about the physical mechanisms underlying these findings, about possible Reynolds number dependence, and, given the low level of correlations, about their impact on modeling. Nevertheless, demonstration of the existence of causal relations between sgs stresses and large-scale characteristics of turbulent shear flows, in addition to those necessary for energy transfer, provides important insight into the relation between scales in turbulent flows.

  10. The effect of the geomagnetic field on cosmic ray energy estimates and large scale anisotropy searches on data from the Pierre Auger Observatory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abreu, P.; /Lisbon, IST; Aglietta, M.

    2011-11-01

    We present a comprehensive study of the influence of the geomagnetic field on the energy estimation of extensive air showers with a zenith angle smaller than 60{sup o}, detected at the Pierre Auger Observatory. The geomagnetic field induces an azimuthal modulation of the estimated energy of cosmic rays up to the {approx} 2% level at large zenith angles. We present a method to account for this modulation of the reconstructed energy. We analyse the effect of the modulation on large scale anisotropy searches in the arrival direction distributions of cosmic rays. At a given energy, the geomagnetic effect is shownmore » to induce a pseudo-dipolar pattern at the percent level in the declination distribution that needs to be accounted for. In this work, we have identified and quantified a systematic uncertainty affecting the energy determination of cosmic rays detected by the surface detector array of the Pierre Auger Observatory. This systematic uncertainty, induced by the influence of the geomagnetic field on the shower development, has a strength which depends on both the zenith and the azimuthal angles. Consequently, we have shown that it induces distortions of the estimated cosmic ray event rate at a given energy at the percent level in both the azimuthal and the declination distributions, the latter of which mimics an almost dipolar pattern. We have also shown that the induced distortions are already at the level of the statistical uncertainties for a number of events N {approx_equal} 32 000 (we note that the full Auger surface detector array collects about 6500 events per year with energies above 3 EeV). Accounting for these effects is thus essential with regard to the correct interpretation of large scale anisotropy measurements taking explicitly profit from the declination distribution.« less

  11. Landau damping of electrostatic waves in arbitrarily degenerate quantum plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rightley, Shane, E-mail: shane.rightley@colorado.edu; Uzdensky, Dmitri, E-mail: uzdensky@colorado.edu

    2016-03-15

    We carry out a systematic study of the dispersion relation for linear electrostatic waves in an arbitrarily degenerate quantum electron plasma. We solve for the complex frequency spectrum for arbitrary values of wavenumber k and level of degeneracy μ. Our finding is that for large k and high μ the real part of the frequency ω{sub r} grows linearly with k and scales with μ, only because of the scaling of the Fermi energy. In this regime, the relative Landau damping rate γ/ω{sub r} becomes independent of k and varies inversely with μ. Thus, damping is weak but finite atmore » moderate levels of degeneracy for short wavelengths.« less

  12. Critical gravitational collapse with angular momentum. II. Soft equations of state

    NASA Astrophysics Data System (ADS)

    Gundlach, Carsten; Baumgarte, Thomas W.

    2018-03-01

    We study critical phenomena in the collapse of rotating ultrarelativistic perfect fluids, in which the pressure P is related to the total energy density ρ by P =κ ρ , where κ is a constant. We generalize earlier results for radiation fluids with κ =1 /3 to other values of κ , focusing on κ <1 /9 . For 1 /9 <κ ≲0.49 , the critical solution has only one unstable, growing mode, which is spherically symmetric. For supercritical data it controls the black-hole mass, while for subcritical data it controls the maximum density. For κ <1 /9 , an additional axial l =1 mode becomes unstable. This controls either the black-hole angular momentum, or the maximum angular velocity. In theory, the additional unstable l =1 mode changes the nature of the black-hole threshold completely: at sufficiently large initial rotation rates Ω and sufficient fine-tuning of the initial data to the black-hole threshold we expect to observe nontrivial universal scaling functions (familiar from critical phase transitions in thermodynamics) governing the black-hole mass and angular momentum, and, with further fine-tuning, eventually a finite black-hole mass almost everywhere on the threshold. In practice, however, the second unstable mode grows so slowly that we do not observe this breakdown of scaling at the level of fine-tuning we can achieve, nor systematic deviations from the leading-order power-law scalings of the black-hole mass. We do see systematic effects in the black-hole angular momentum, but it is not clear yet if these are due to the predicted nontrivial scaling functions, or to nonlinear effects at sufficiently large initial angular momentum (which we do not account for in our theoretical model).

  13. Endovascular Mechanical Thrombectomy in Large-Vessel Occlusion Ischemic Stroke Presenting with Low National Institutes of Health Stroke Scale: Systematic Review and Meta-Analysis.

    PubMed

    Griessenauer, Christoph J; Medin, Caroline; Maingard, Julian; Chandra, Ronil V; Ng, Wyatt; Brooks, Duncan Mark; Asadi, Hamed; Killer-Oberpfalzer, Monika; Schirmer, Clemens M; Moore, Justin M; Ogilvy, Christopher S; Thomas, Ajith J; Phan, Kevin

    2018-02-01

    Mechanical thrombectomy has become the standard of care for management of most large vessel occlusion (LVO) strokes. When patients with LVO present with minor stroke symptomatology, no consensus on the role of mechanical thrombectomy exists. A systematic review and meta-analysis were performed to identify studies that focused on mechanical thrombectomy, either as a standalone treatment or with intravenous tissue plasminogen activator (IV tPA), in patients with mild strokes with LVO, defined as a baseline National Institutes of Health Stroke Scale score ≤5 at presentation. Data on methodology, quality criteria, and outcome measures were extracted, and outcomes were compared using odds ratio as a summary statistic. Five studies met the selection criteria and were included. When compared with medical therapy without IV tPA, mechanical thrombectomy and medical therapy with IV tPA were associated with improved 90-day modified Rankin Scale (mRS) score. Among medical patients who were not eligible for IV tPA, those who underwent mechanical thrombectomy were more likely to experience good 90-day mRS than those who were not. There was no significant difference in functional outcome between mechanical thrombectomy and medical therapy with IV tPA, and no treatment subgroup was associated with intracranial hemorrhage or death. In patients with mild strokes due to LVO, mechanical thrombectomy and medical therapy with IV tPA led to better 90-day functional outcome. Mechanical thrombectomy plays an important role in the management of these patients, particularly in those not eligible for IV tPA. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Multiplex Networks of Cortical and Hippocampal Neurons Revealed at Different Timescales

    PubMed Central

    Timme, Nicholas; Ito, Shinya; Myroshnychenko, Maxym; Yeh, Fang-Chin; Hiolski, Emma; Hottowy, Pawel; Beggs, John M.

    2014-01-01

    Recent studies have emphasized the importance of multiplex networks – interdependent networks with shared nodes and different types of connections – in systems primarily outside of neuroscience. Though the multiplex properties of networks are frequently not considered, most networks are actually multiplex networks and the multiplex specific features of networks can greatly affect network behavior (e.g. fault tolerance). Thus, the study of networks of neurons could potentially be greatly enhanced using a multiplex perspective. Given the wide range of temporally dependent rhythms and phenomena present in neural systems, we chose to examine multiplex networks of individual neurons with time scale dependent connections. To study these networks, we used transfer entropy – an information theoretic quantity that can be used to measure linear and nonlinear interactions – to systematically measure the connectivity between individual neurons at different time scales in cortical and hippocampal slice cultures. We recorded the spiking activity of almost 12,000 neurons across 60 tissue samples using a 512-electrode array with 60 micrometer inter-electrode spacing and 50 microsecond temporal resolution. To the best of our knowledge, this preparation and recording method represents a superior combination of number of recorded neurons and temporal and spatial recording resolutions to any currently available in vivo system. We found that highly connected neurons (“hubs”) were localized to certain time scales, which, we hypothesize, increases the fault tolerance of the network. Conversely, a large proportion of non-hub neurons were not localized to certain time scales. In addition, we found that long and short time scale connectivity was uncorrelated. Finally, we found that long time scale networks were significantly less modular and more disassortative than short time scale networks in both tissue types. As far as we are aware, this analysis represents the first systematic study of temporally dependent multiplex networks among individual neurons. PMID:25536059

  15. A quasi-experimental feasibility study to determine the effect of a systematic treatment programme on the scores of the Nottingham Adjustment Scale of individuals with visual field deficits following stroke.

    PubMed

    Taylor, Lisa; Poland, Fiona; Harrison, Peter; Stephenson, Richard

    2011-01-01

    To evaluate a systematic treatment programme developed by the researcher that targeted aspects of visual functioning affected by visual field deficits following stroke. The study design was a non-equivalent control (conventional) group pretest-posttest quasi-experimental feasibility design, using multisite data collection methods at specified stages. The study was undertaken within three acute hospital settings as outpatient follow-up sessions. Individuals who had visual field deficits three months post stroke were studied. A treatment group received routine occupational therapy and an experimental group received, in addition, a systematic treatment programme. The treatment phase of both groups lasted six weeks. The Nottingham Adjustment Scale, a measure developed specifically for visual impairment, was used as the primary outcome measure. The change in Nottingham Adjustment Scale score was compared between the experimental (n = 7) and conventional (n = 8) treatment groups using the Wilcoxon signed ranks test. The result of Z = -2.028 (P = 0.043) showed that there was a statistically significant difference between the change in Nottingham Adjustment Scale score between both groups. The introduction of the systematic treatment programme resulted in a statistically significant change in the scores of the Nottingham Adjustment Scale.

  16. How systematic age underestimation can impede understanding of fish population dynamics: Lessons learned from a Lake Superior cisco stock

    USGS Publications Warehouse

    Yule, D.L.; Stockwell, J.D.; Black, J.A.; Cullis, K.I.; Cholwek, G.A.; Myers, J.T.

    2008-01-01

    Systematic underestimation of fish age can impede understanding of recruitment variability and adaptive strategies (like longevity) and can bias estimates of survivorship. We suspected that previous estimates of annual survival (S; range = 0.20-0.44) for Lake Superior ciscoes Coregonus artedi developed from scale ages were biased low. To test this hypothesis, we estimated the total instantaneous mortality rate of adult ciscoes from the Thunder Bay, Ontario, stock by use of cohort-based catch curves developed from commercial gill-net catches and otolith-aged fish. Mean S based on otolith ages was greater for adult females (0.80) than for adult males (0.75), but these differences were not significant. Applying the results of a study of agreement between scale and otolith ages, we modeled a scale age for each otolith-aged fish to reconstruct catch curves. Using modeled scale ages, estimates of S (0.42 for females, 0.36 for males) were comparable with those reported in past studies. We conducted a November 2005 acoustic and midwater trawl survey to estimate the abundance of ciscoes when the fish were being harvested for roe. Estimated exploitation rates were 0.085 for females and 0.025 for males, and the instantaneous rates of fishing mortality were 0.089 for females and 0.025 for males. The instantaneous rates of natural mortality were 0.131 and 0.265 for females and males, respectively. Using otolith ages, we found that strong year-classes at large during November 2005 were caught in high numbers as age-1 fish in previous annual bottom trawl surveys, whereas weak or absent year-classes were not. For decades, large-scale fisheries on the Great Lakes were allowed to operate because ciscoes were assumed to be short lived and to have regular recruitment. We postulate that the collapse of these fisheries was linked in part to a misunderstanding of cisco biology driven by scale-ageing error. ?? Copyright by the American Fisheries Society 2008.

  17. Constraints on the Origin of Cosmic Rays above 1018 eV from Large-scale Anisotropy Searches in Data of the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Pierre Auger Collaboration; Abreu, P.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Antiči'c, T.; Aramo, C.; Arganda, E.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Badescu, A. M.; Balzer, M.; Barber, K. B.; Barbosa, A. F.; Bardenet, R.; Barroso, S. L. C.; Baughman, B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellétoile, A.; Bellido, J. A.; BenZvi, S.; Berat, C.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, F.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Brogueira, P.; Brown, W. C.; Bruijn, R.; Buchholz, P.; Bueno, A.; Buroker, L.; Burton, R. E.; Caballero-Mora, K. S.; Caccianiga, B.; Caramete, L.; Caruso, R.; Castellina, A.; Catalano, O.; Cataldi, G.; Cazon, L.; Cester, R.; Chauvin, J.; Cheng, S. H.; Chiavassa, A.; Chinellato, J. A.; Chirinos Diaz, J.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cook, H.; Cooper, M. J.; Coppens, J.; Cordier, A.; Coutu, S.; Covault, C. E.; Creusot, A.; Criss, A.; Cronin, J.; Curutiu, A.; Dagoret-Campagne, S.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; De Donato, C.; de Jong, S. J.; De La Vega, G.; de Mello Junior, W. J. M.; de Mello Neto, J. R. T.; De Mitri, I.; de Souza, V.; de Vries, K. D.; del Peral, L.; del Río, M.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Díaz Castro, M. L.; Diep, P. N.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dong, P. N.; Dorofeev, A.; dos Anjos, J. C.; Dova, M. T.; D'Urso, D.; Dutan, I.; Ebr, J.; Engel, R.; Erdmann, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fliescher, S.; Fracchiolla, C. E.; Fraenkel, E. D.; Fratu, O.; Fröhlich, U.; Fuchs, B.; Gaior, R.; Gamarra, R. F.; Gambetta, S.; García, B.; Garcia Roca, S. T.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gemmeke, H.; Ghia, P. L.; Giller, M.; Gitto, J.; Glass, H.; Gold, M. S.; Golup, G.; Gomez Albarracin, F.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; Gookin, B.; Gorgi, A.; Gouffon, P.; Grashorn, E.; Grebe, S.; Griffith, N.; Grillo, A. F.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Hansen, P.; Harari, D.; Harrison, T. A.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holmes, V. C.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Ionita, F.; Italiano, A.; Jansen, S.; Jarne, C.; Jiraskova, S.; Josebachuili, M.; Kadija, K.; Kampert, K. H.; Karhan, P.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kelley, J. L.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Knapp, J.; Koang, D.-H.; Kotera, K.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kulbartz, J. K.; Kunka, N.; La Rosa, G.; Lachaud, C.; LaHurd, D.; Latronico, L.; Lauer, R.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Lyberis, H.; Maccarone, M. C.; Macolino, C.; Maldera, S.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, J.; Marin, V.; Maris, I. C.; Marquez Falcon, H. R.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mazur, P. O.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Mertsch, P.; Messina, S.; Meurer, C.; Meyhandan, R.; Mi'canovi'c, S.; Micheletti, M. I.; Minaya, I. A.; Miramonti, L.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morales, B.; Morello, C.; Moreno, E.; Moreno, J. C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navarro, J. L.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nhung, P. T.; Niechciol, M.; Niemietz, L.; Nierstenhoefer, N.; Nitz, D.; Nosek, D.; Nožka, L.; Oehlschläger, J.; Olinto, A.; Ortiz, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Parente, G.; Parizot, E.; Parra, A.; Pastor, S.; Paul, T.; Pech, M.; Peķala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Petrera, S.; Petrolini, A.; Petrov, Y.; Pfendner, C.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Ponce, V. H.; Pontz, M.; Porcelli, A.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rivera, H.; Rizi, V.; Roberts, J.; Rodrigues de Carvalho, W.; Rodriguez, G.; Rodriguez Cabo, I.; Rodriguez Martino, J.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Rouillé-d'Orfeuil, B.; Roulet, E.; Rovero, A. C.; Rühle, C.; Saftoiu, A.; Salamida, F.; Salazar, H.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarkar, S.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, A.; Scholten, O.; Schoorlemmer, H.; Schovancova, J.; Schovánek, P.; Schröder, F.; Schuster, D.; Sciutto, S. J.; Scuderi, M.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Silva Lopez, H. H.; Sima, O.; 'Smiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Spinka, H.; Squartini, R.; Srivastava, Y. N.; Stanic, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Šuša, T.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Tapia, A.; Tartare, M.; Taşcău, O.; Tcaciuc, R.; Thao, N. T.; Thomas, D.; Tiffenberg, J.; Timmermans, C.; Tkaczyk, W.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Travnicek, P.; Tridapalli, D. B.; Tristram, G.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Wahlberg, H.; Wahrlich, P.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Westerhoff, S.; Whelan, B. J.; Widom, A.; Wieczorek, G.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Wommer, M.; Wundheiler, B.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano Garcia, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.

    2013-01-01

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 1018 eV at the Pierre Auger Observatory is reported. For the first time, these large-scale anisotropy searches are performed as a function of both the right ascension and the declination and expressed in terms of dipole and quadrupole moments. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Upper limits on dipole and quadrupole amplitudes are derived under the hypothesis that any cosmic ray anisotropy is dominated by such moments in this energy range. These upper limits provide constraints on the production of cosmic rays above 1018 eV, since they allow us to challenge an origin from stationary galactic sources densely distributed in the galactic disk and emitting predominantly light particles in all directions.

  18. Impact of compressibility on heat transport characteristics of large terrestrial planets

    NASA Astrophysics Data System (ADS)

    Čížková, Hana; van den Berg, Arie; Jacobs, Michel

    2017-07-01

    We present heat transport characteristics for mantle convection in large terrestrial exoplanets (M ⩽ 8M⊕) . Our thermal convection model is based on a truncated anelastic liquid approximation (TALA) for compressible fluids and takes into account a selfconsistent thermodynamic description of material properties derived from mineral physics based on a multi-Einstein vibrational approach. We compare heat transport characteristics in compressible models with those obtained with incompressible models based on the classical- and extended Boussinesq approximation (BA and EBA respectively). Our scaling analysis shows that heat flux scales with effective dissipation number as Nu ∼Dieff-0.71 and with Rayleigh number as Nu ∼Raeff0.27. The surface heat flux of the BA models strongly overestimates the values from the corresponding compressible models, whereas the EBA models systematically underestimate the heat flux by ∼10%-15% with respect to a corresponding compressible case. Compressible models are also systematically warmer than the EBA models. Compressibility effects are therefore important for mantle dynamic processes, especially for large rocky exoplanets and consequently also for formation of planetary atmospheres, through outgassing, and the existence of a magnetic field, through thermal coupling of mantle and core dynamic systems.

  19. Genetic studies of type 2 diabetes in South Asians: a systematic overview.

    PubMed

    Chowdhury, Ritam; Narayan, Kabayam M Venkat; Zabetian, Azadeh; Raj, Suraja; Tabassum, Rubina

    2014-01-01

    Diabetes Mellitus, which affects 366 million people worldwide, is a leading cause of mortality, morbidity, and loss of quality of life. South Asians, comprising 24% of the world's population, suffer a large burden of type 2 diabetes. With intriguing risk phenotypes, unique environmental triggers, and potential genetic predisposition, South Asians offer a valuable resource for investigating the pathophysiology of type 2 diabetes. Genomics has proven its potential to underpin some of the etiology of type 2 diabetes by identifying a number of susceptibility genes, but such data are scarce and unclear in South Asians. We present a systematic review of studies on the genetic basis of type 2 diabetes or its complications in South Asians published between 1987-2012, and discuss the findings and limitations of the available data. Of the 91 eligible studies meeting our inclusion criteria, a vast majority included Indian populations, followed by a few in those of Pakistani origin, while other South Asian countries were generally under-represented. Though a large number of studies focused on the replication of findings from genome-wide association studies (GWAS) in European populations, a few studies explored new genes and pathways along with GWAS in South Asians and suggested the potential to unravel population- specific susceptibility genes in this population. We find encouraging improvements in study designs, sample sizes and the numbers of genetic variants investigated over the last five years, which reflect the existing capacity and scope for large-scale genetic studies in South Asians.

  20. Systematic simulations of modified gravity: chameleon models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brax, Philippe; Davis, Anne-Christine; Li, Baojiu

    2013-04-01

    In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference withmore » the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 hMpc{sup −1}, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as a guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future.« less

  1. An adaptive response surface method for crashworthiness optimization

    NASA Astrophysics Data System (ADS)

    Shi, Lei; Yang, Ren-Jye; Zhu, Ping

    2013-11-01

    Response surface-based design optimization has been commonly used for optimizing large-scale design problems in the automotive industry. However, most response surface models are built by a limited number of design points without considering data uncertainty. In addition, the selection of a response surface in the literature is often arbitrary. This article uses a Bayesian metric to systematically select the best available response surface among several candidates in a library while considering data uncertainty. An adaptive, efficient response surface strategy, which minimizes the number of computationally intensive simulations, was developed for design optimization of large-scale complex problems. This methodology was demonstrated by a crashworthiness optimization example.

  2. Work-related stress as a cardiovascular risk factor in police officers: a systematic review of evidence.

    PubMed

    Magnavita, N; Capitanelli, I; Garbarino, S; Pira, E

    2018-05-01

    Several studies suggest that work-related stress in police officers may be associated with an increased risk of cardiovascular diseases. A systematic review of studies is, however, still lacking. According to PRISMA statement, a systematic search of PubMed, ISI Web of Science, Cinahl and PsychInfo electronic databases was undertaken. Studies published in English between 1/1/2000 and 31/12/2016 were included. A studies quality assessment was performed using the Newcastle Ottawa scale (NOS). The preliminary search retrieved 752 records. After selection, 16 studies (total population 17,698) were retrieved. The average quality of studies was low. Exposure to stress in cross-sectional studies was inconstantly associated with hypertension, obesity, dyslipidaemia, and impaired glucose metabolism. In addition, there was a prevalence of positive studies showing an association between stress and cardiovascular disease morbidity. Studies of higher quality, such as longitudinal studies on large sample size, were more supportive of a significant positive association between stress and cardiovascular risk factors. Results were, however, often conflicting and inconsistent with regard to definitions and measurement of stress, features of individual study design, study conduct, and conclusions drawn. A sound precautionary principle would be to adopt worksite health promotion programs designed to implement stress management strategies in this category of workers.

  3. Spatial Covariability of Temperature and Hydroclimate as a Function of Timescale During the Common Era

    NASA Astrophysics Data System (ADS)

    McKay, N.

    2017-12-01

    As timescale increases from years to centuries, the spatial scale of covariability in the climate system is hypothesized to increase as well. Covarying spatial scales are larger for temperature than for hydroclimate, however, both aspects of the climate system show systematic changes on large-spatial scales on orbital to tectonic timescales. The extent to which this phenomenon is evident in temperature and hydroclimate at centennial timescales is largely unknown. Recent syntheses of multidecadal to century-scale variability in hydroclimate during the past 2k in the Arctic, North America, and Australasia show little spatial covariability in hydroclimate during the Common Era. To determine 1) the evidence for systematic relationships between the spatial scale of climate covariability as a function of timescale, and 2) whether century-scale hydroclimate variability deviates from the relationship between spatial covariability and timescale, we quantify this phenomenon during the Common Era by calculating the e-folding distance in large instrumental and paleoclimate datasets. We calculate this metric of spatial covariability, at different timescales (1, 10 and 100-yr), for a large network of temperature and precipitation observations from the Global Historical Climatology Network (n=2447), from v2.0.0 of the PAGES2k temperature database (n=692), and from moisture-sensitive paleoclimate records North America, the Arctic, and the Iso2k project (n = 328). Initial results support the hypothesis that the spatial scale of covariability is larger for temperature, than for precipitation or paleoclimate hydroclimate indicators. Spatially, e-folding distances for temperature are largest at low latitudes and over the ocean. Both instrumental and proxy temperature data show clear evidence for increasing spatial extent as a function of timescale, but this phenomenon is very weak in the hydroclimate data analyzed here. In the proxy hydroclimate data, which are predominantly indicators of effective moisture, e-folding distance increases from annual to decadal timescales, but does not continue to increase to centennial timescales. Future work includes examining additional instrumental and proxy datasets of moisture variability, and extending the analysis to millennial timescales of variability.

  4. Cosmology from Cosmic Microwave Background and large- scale structure

    NASA Astrophysics Data System (ADS)

    Xu, Yongzhong

    2003-10-01

    This dissertation consists of a series of studies, constituting four published papers, involving the Cosmic Microwave Background and the large scale structure, which help constrain Cosmological parameters and potential systematic errors. First, we present a method for comparing and combining maps with different resolutions and beam shapes, and apply it to the Saskatoon, QMAP and COBE/DMR data sets. Although the Saskatoon and QMAP maps detect signal at the 21σ and 40σ, levels, respectively, their difference is consistent with pure noise, placing strong limits on possible systematic errors. In particular, we obtain quantitative upper limits on relative calibration and pointing errors. Splitting the combined data by frequency shows similar consistency between the Ka- and Q-bands, placing limits on foreground contamination. The visual agreement between the maps is equally striking. Our combined QMAP+Saskatoon map, nicknamed QMASK, is publicly available at www.hep.upenn.edu/˜xuyz/qmask.html together with its 6495 x 6495 noise covariance matrix. This thoroughly tested data set covers a large enough area (648 square degrees—at the time, the largest degree-scale map available) to allow a statistical comparison with LOBE/DMR, showing good agreement. By band-pass-filtering the QMAP and Saskatoon maps, we are also able to spatially compare them scale-by-scale to check for beam- and pointing-related systematic errors. Using the QMASK map, we then measure the cosmic microwave background (CMB) power spectrum on angular scales ℓ ˜ 30 200 (1° 6°), and we test it for non-Gaussianity using morphological statistics known as Minkowski functionals. We conclude that the QMASK map is neither a very typical nor a very exceptional realization of a Gaussian random field. At least about 20% of the 1000 Gaussian Monte Carlo maps differ more than the QMASK map from the mean morphological parameters of the Gaussian fields. Finally, we compute the real-space power spectrum and the redshift-space distortions of galaxies in the 2dF 100k galaxy redshift survey using pseudo-Karhunen-Loève eigenmodes and the stochastic bias formalism. Our results agree well with those published by the 2dFGRS team, and have the added advantage of producing easy-to-interpret uncorrelated minimum-variance measurements of the galaxy- galaxy, galaxy-velocity and velocity-velocity power spectra in 27 k-bands, with narrow and well-behaved window functions in the range 0.01 h /Mpc < k < 0.8 h/Mpc. We find no significant detection of baryonic wiggles. We measure the galaxy-matter correlation coefficient r > 0.4 and the redshift-distortion parameter β = 0.49 ± 0.16 for r = 1.

  5. SQDFT: Spectral Quadrature method for large-scale parallel O(N) Kohn-Sham calculations at high temperature

    NASA Astrophysics Data System (ADS)

    Suryanarayana, Phanish; Pratapa, Phanisri P.; Sharma, Abhiraj; Pask, John E.

    2018-03-01

    We present SQDFT: a large-scale parallel implementation of the Spectral Quadrature (SQ) method for O(N) Kohn-Sham Density Functional Theory (DFT) calculations at high temperature. Specifically, we develop an efficient and scalable finite-difference implementation of the infinite-cell Clenshaw-Curtis SQ approach, in which results for the infinite crystal are obtained by expressing quantities of interest as bilinear forms or sums of bilinear forms, that are then approximated by spatially localized Clenshaw-Curtis quadrature rules. We demonstrate the accuracy of SQDFT by showing systematic convergence of energies and atomic forces with respect to SQ parameters to reference diagonalization results, and convergence with discretization to established planewave results, for both metallic and insulating systems. We further demonstrate that SQDFT achieves excellent strong and weak parallel scaling on computer systems consisting of tens of thousands of processors, with near perfect O(N) scaling with system size and wall times as low as a few seconds per self-consistent field iteration. Finally, we verify the accuracy of SQDFT in large-scale quantum molecular dynamics simulations of aluminum at high temperature.

  6. Non-Hookean statistical mechanics of clamped graphene ribbons

    NASA Astrophysics Data System (ADS)

    Bowick, Mark J.; Košmrlj, Andrej; Nelson, David R.; Sknepnek, Rastko

    2017-03-01

    Thermally fluctuating sheets and ribbons provide an intriguing forum in which to investigate strong violations of Hooke's Law: Large distance elastic parameters are in fact not constant but instead depend on the macroscopic dimensions. Inspired by recent experiments on free-standing graphene cantilevers, we combine the statistical mechanics of thin elastic plates and large-scale numerical simulations to investigate the thermal renormalization of the bending rigidity of graphene ribbons clamped at one end. For ribbons of dimensions W ×L (with L ≥W ), the macroscopic bending rigidity κR determined from cantilever deformations is independent of the width when W <ℓth , where ℓth is a thermal length scale, as expected. When W >ℓth , however, this thermally renormalized bending rigidity begins to systematically increase, in agreement with the scaling theory, although in our simulations we were not quite able to reach the system sizes necessary to determine the fully developed power law dependence on W . When the ribbon length L >ℓp , where ℓp is the W -dependent thermally renormalized ribbon persistence length, we observe a scaling collapse and the beginnings of large scale random walk behavior.

  7. Dynamically Consistent Parameterization of Mesoscale Eddies This work aims at parameterization of eddy effects for use in non-eddy-resolving ocean models and focuses on the effect of the stochastic part of the eddy forcing that backscatters and induces eastward jet extension of the western boundary currents and its adjacent recirculation zones.

    NASA Astrophysics Data System (ADS)

    Berloff, P. S.

    2016-12-01

    This work aims at developing a framework for dynamically consistent parameterization of mesoscale eddy effects for use in non-eddy-resolving ocean circulation models. The proposed eddy parameterization framework is successfully tested on the classical, wind-driven double-gyre model, which is solved both with explicitly resolved vigorous eddy field and in the non-eddy-resolving configuration with the eddy parameterization replacing the eddy effects. The parameterization focuses on the effect of the stochastic part of the eddy forcing that backscatters and induces eastward jet extension of the western boundary currents and its adjacent recirculation zones. The parameterization locally approximates transient eddy flux divergence by spatially localized and temporally periodic forcing, referred to as the plunger, and focuses on the linear-dynamics flow solution induced by it. The nonlinear self-interaction of this solution, referred to as the footprint, characterizes and quantifies the induced eddy forcing exerted on the large-scale flow. We find that spatial pattern and amplitude of each footprint strongly depend on the underlying large-scale flow, and the corresponding relationships provide the basis for the eddy parameterization and its closure on the large-scale flow properties. Dependencies of the footprints on other important parameters of the problem are also systematically analyzed. The parameterization utilizes the local large-scale flow information, constructs and scales the corresponding footprints, and then sums them up over the gyres to produce the resulting eddy forcing field, which is interactively added to the model as an extra forcing. Thus, the assumed ensemble of plunger solutions can be viewed as a simple model for the cumulative effect of the stochastic eddy forcing. The parameterization framework is implemented in the simplest way, but it provides a systematic strategy for improving the implementation algorithm.

  8. Cutaneous lichen planus: A systematic review of treatments.

    PubMed

    Fazel, Nasim

    2015-06-01

    Various treatment modalities are available for cutaneous lichen planus. Pubmed, EMBASE, Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, Database of Abstracts of Reviews of Effects, and Health Technology Assessment Database were searched for all the systematic reviews and randomized controlled trials related to cutaneous lichen planus. Two systematic reviews and nine relevant randomized controlled trials were identified. Acitretin, griseofulvin, hydroxychloroquine and narrow band ultraviolet B are demonstrated to be effective in the treatment of cutaneous lichen planus. Sulfasalazine is effective, but has an unfavorable safety profile. KH1060, a vitamin D analogue, is not beneficial in the management of cutaneous lichen planus. Evidence from large scale randomized trials demonstrating the safety and efficacy for many other treatment modalities used to treat cutaneous lichen planus is simply not available.

  9. Final Results from a Large-Scale National Study of General Education Astronomy Students' Learning Difficulties with Cosmology

    NASA Astrophysics Data System (ADS)

    Wallace, Colin; Prather, Edward; Duncan, Douglas

    2011-10-01

    We recently completed a large-scale, systematic study of general education introductory astronomy students' conceptual and reasoning difficulties related to cosmology. As part of this study, we analyzed a total of 4359 surveys (pre- and post-instruction) containing students' responses to questions about the Big Bang, the evolution and expansion of the universe, using Hubble plots to reason about the age and expansion rate of the universe, and using galaxy rotation curves to infer the presence of dark matter. We also designed, piloted, and validated a new suite of five cosmology Lecture-Tutorials. We found that students who use the new Lecture-Tutorials can achieve larger learning gains than their peers who did not. This material is based in part upon work supported by the National Science Foundation under Grant Nos. 0833364 and 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

  10. Final Results from a Large-Scale National Study of General Education Astronomy Students’ Learning Difficulties with Cosmology

    NASA Astrophysics Data System (ADS)

    Wallace, Colin Scott; Prather, E. E.; Duncan, D. K.; Collaboration of Astronomy Teaching Scholars CATS

    2012-01-01

    We recently completed a large-scale, systematic study of general education introductory astronomy students’ conceptual and reasoning difficulties related to cosmology. As part of this study, we analyzed a total of 4359 surveys (pre- and post-instruction) containing students’ responses to questions about the Big Bang, the evolution and expansion of the universe, using Hubble plots to reason about the age and expansion rate of the universe, and using galaxy rotation curves to infer the presence of dark matter. We also designed, piloted, and validated a new suite of five cosmology Lecture-Tutorials. We found that students who use the new Lecture-Tutorials can achieve larger learning gains than their peers who did not. This material is based in part upon work supported by the National Science Foundation under Grant Nos. 0833364 and 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

  11. Planck intermediate results. XLVI. Reduction of large-scale systematic effects in HFI polarization maps and estimation of the reionization optical depth

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Aghanim, N.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Ballardini, M.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battye, R.; Benabed, K.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Carron, J.; Challinor, A.; Chiang, H. C.; Colombo, L. P. L.; Combet, C.; Comis, B.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Di Valentino, E.; Dickinson, C.; Diego, J. M.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Falgarone, E.; Fantaye, Y.; Finelli, F.; Forastieri, F.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frolov, A.; Galeotta, S.; Galli, S.; Ganga, K.; Génova-Santos, R. T.; Gerbino, M.; Ghosh, T.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Helou, G.; Henrot-Versillé, S.; Herranz, D.; Hivon, E.; Huang, Z.; Ilić, S.; Jaffe, A. H.; Jones, W. C.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Knox, L.; Krachmalnicoff, N.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamarre, J.-M.; Langer, M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Le Jeune, M.; Leahy, J. P.; Levrier, F.; Liguori, M.; Lilje, P. B.; López-Caniego, M.; Ma, Y.-Z.; Macías-Pérez, J. F.; Maggio, G.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Matarrese, S.; Mauri, N.; McEwen, J. D.; Meinhold, P. R.; Melchiorri, A.; Mennella, A.; Migliaccio, M.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Moss, A.; Mottet, S.; Naselsky, P.; Natoli, P.; Oxborrow, C. A.; Pagano, L.; Paoletti, D.; Partridge, B.; Patanchon, G.; Patrizii, L.; Perdereau, O.; Perotto, L.; Pettorino, V.; Piacentini, F.; Plaszczynski, S.; Polastri, L.; Polenta, G.; Puget, J.-L.; Rachen, J. P.; Racine, B.; Reinecke, M.; Remazeilles, M.; Renzi, A.; Rocha, G.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Ruiz-Granados, B.; Salvati, L.; Sandri, M.; Savelainen, M.; Scott, D.; Sirri, G.; Sunyaev, R.; Suur-Uski, A.-S.; Tauber, J. A.; Tenti, M.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Valiviita, J.; Van Tent, F.; Vibert, L.; Vielva, P.; Villa, F.; Vittorio, N.; Wandelt, B. D.; Watson, R.; Wehus, I. K.; White, M.; Zacchei, A.; Zonca, A.

    2016-12-01

    This paper describes the identification, modelling, and removal of previously unexplained systematic effects in the polarization data of the Planck High Frequency Instrument (HFI) on large angular scales, including new mapmaking and calibration procedures, new and more complete end-to-end simulations, and a set of robust internal consistency checks on the resulting maps. These maps, at 100, 143, 217, and 353 GHz, are early versions of those that will be released in final form later in 2016. The improvements allow us to determine the cosmic reionization optical depth τ using, for the first time, the low-multipole EE data from HFI, reducing significantly the central value and uncertainty, and hence the upper limit. Two different likelihood procedures are used to constrain τ from two estimators of the CMB E- and B-mode angular power spectra at 100 and 143 GHz, after debiasing the spectra from a small remaining systematic contamination. These all give fully consistent results. A further consistency test is performed using cross-correlations derived from the Low Frequency Instrument maps of the Planck 2015 data release and the new HFI data. For this purpose, end-to-end analyses of systematic effects from the two instruments are used to demonstrate the near independence of their dominant systematic error residuals. The tightest result comes from the HFI-based τ posterior distribution using the maximum likelihood power spectrum estimator from EE data only, giving a value 0.055 ± 0.009. In a companion paper these results are discussed in the context of the best-fit PlanckΛCDM cosmological model and recent models of reionization.

  12. The Pilot Lunar Geologic Mapping Project: Summary Results and Recommendations from the Copernicus Quadrangle

    NASA Technical Reports Server (NTRS)

    Skinner, J. A., Jr.; Gaddis, L. R.; Hagerty, J. J.

    2010-01-01

    The first systematic lunar geologic maps were completed at 1:1M scale for the lunar near side during the 1960s using telescopic and Lunar Orbiter (LO) photographs [1-3]. The program under which these maps were completed established precedents for map base, scale, projection, and boundaries in order to avoid widely discrepant products. A variety of geologic maps were subsequently produced for various purposes, including 1:5M scale global maps [4-9] and large scale maps of high scientific interest (including the Apollo landing sites) [10]. Since that time, lunar science has benefitted from an abundance of surface information, including high resolution images and diverse compositional data sets, which have yielded a host of topical planetary investigations. The existing suite of lunar geologic maps and topical studies provide exceptional context in which to unravel the geologic history of the Moon. However, there has been no systematic approach to lunar geologic mapping since the flight of post-Apollo scientific orbiters. Geologic maps provide a spatial and temporal framework wherein observations can be reliably benchmarked and compared. As such, a lack of a systematic mapping program means that modern (post- Apollo) data sets, their scientific ramifications, and the lunar scientists who investigate these data, are all marginalized in regard to geologic mapping. Marginalization weakens the overall understanding of the geologic evolution of the Moon and unnecessarily partitions lunar research. To bridge these deficiencies, we began a pilot geologic mapping project in 2005 as a means to assess the interest, relevance, and technical methods required for a renewed lunar geologic mapping program [11]. Herein, we provide a summary of the pilot geologic mapping project, which focused on the geologic materials and stratigraphic relationships within the Copernicus quadrangle (0-30degN, 0-45degW).

  13. Complex Genetics of Behavior: BXDs in the Automated Home-Cage.

    PubMed

    Loos, Maarten; Verhage, Matthijs; Spijker, Sabine; Smit, August B

    2017-01-01

    This chapter describes a use case for the genetic dissection and automated analysis of complex behavioral traits using the genetically diverse panel of BXD mouse recombinant inbred strains. Strains of the BXD resource differ widely in terms of gene and protein expression in the brain, as well as in their behavioral repertoire. A large mouse resource opens the possibility for gene finding studies underlying distinct behavioral phenotypes, however, such a resource poses a challenge in behavioral phenotyping. To address the specifics of large-scale screening we describe how to investigate: (1) how to assess mouse behavior systematically in addressing a large genetic cohort, (2) how to dissect automation-derived longitudinal mouse behavior into quantitative parameters, and (3) how to map these quantitative traits to the genome, deriving loci underlying aspects of behavior.

  14. Observational tests of convective core overshooting in stars of intermediate to high mass in the Galaxy

    NASA Technical Reports Server (NTRS)

    Stothers, Richard B.

    1991-01-01

    This study presents the results of 14 tests for the presence of convective overshooting in large convecting stellar cores for stars with masses of 4-17 solar masses which are members of detached close binary systems and of open clusters in the Galaxy. A large body of theoretical and observational data is scrutinized and subjected to averaging in order to minimize accidental and systematic errors. A conservative upper limit of d/HP less than 0.4 is found from at least four tests, as well as a tighter upper limit of d/HP less than 0.2 from one good test that is subject to only mild restrictions and is based on the maximum observed effective temperature of evolved blue supergiants. It is concluded that any current uncertainty about the distance scale for these stars is unimportant in conducting the present tests for convective core overshooting. The correct effective temperature scale for the B0.5-B2 stars is almost certainly close to one of the proposed hot scales.

  15. Protein docking by the interface structure similarity: how much structure is needed?

    PubMed

    Sinha, Rohita; Kundrotas, Petras J; Vakser, Ilya A

    2012-01-01

    The increasing availability of co-crystallized protein-protein complexes provides an opportunity to use template-based modeling for protein-protein docking. Structure alignment techniques are useful in detection of remote target-template similarities. The size of the structure involved in the alignment is important for the success in modeling. This paper describes a systematic large-scale study to find the optimal definition/size of the interfaces for the structure alignment-based docking applications. The results showed that structural areas corresponding to the cutoff values <12 Å across the interface inadequately represent structural details of the interfaces. With the increase of the cutoff beyond 12 Å, the success rate for the benchmark set of 99 protein complexes, did not increase significantly for higher accuracy models, and decreased for lower-accuracy models. The 12 Å cutoff was optimal in our interface alignment-based docking, and a likely best choice for the large-scale (e.g., on the scale of the entire genome) applications to protein interaction networks. The results provide guidelines for the docking approaches, including high-throughput applications to modeled structures.

  16. Interrogation of Mammalian Protein Complex Structure, Function, and Membership Using Genome-Scale Fitness Screens.

    PubMed

    Pan, Joshua; Meyers, Robin M; Michel, Brittany C; Mashtalir, Nazar; Sizemore, Ann E; Wells, Jonathan N; Cassel, Seth H; Vazquez, Francisca; Weir, Barbara A; Hahn, William C; Marsh, Joseph A; Tsherniak, Aviad; Kadoch, Cigall

    2018-05-23

    Protein complexes are assemblies of subunits that have co-evolved to execute one or many coordinated functions in the cellular environment. Functional annotation of mammalian protein complexes is critical to understanding biological processes, as well as disease mechanisms. Here, we used genetic co-essentiality derived from genome-scale RNAi- and CRISPR-Cas9-based fitness screens performed across hundreds of human cancer cell lines to assign measures of functional similarity. From these measures, we systematically built and characterized functional similarity networks that recapitulate known structural and functional features of well-studied protein complexes and resolve novel functional modules within complexes lacking structural resolution, such as the mammalian SWI/SNF complex. Finally, by integrating functional networks with large protein-protein interaction networks, we discovered novel protein complexes involving recently evolved genes of unknown function. Taken together, these findings demonstrate the utility of genetic perturbation screens alone, and in combination with large-scale biophysical data, to enhance our understanding of mammalian protein complexes in normal and disease states. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Systematic Review and Meta-analysis of Indirect Protection Afforded by Vaccinating Children Against Seasonal Influenza: Implications for Policy.

    PubMed

    Yin, J Kevin; Heywood, Anita E; Georgousakis, Melina; King, Catherine; Chiu, Clayton; Isaacs, David; Macartney, Kristine K

    2017-09-01

    Universal childhood vaccination is a potential solution to reduce seasonal influenza burden. We reviewed systematically the literature on "herd"/indirect protection from vaccinating children aged 6 months to 17 years against influenza. Of 30 studies included, 14 (including 1 cluster randomized controlled trial [cRCT]) used live attenuated influenza vaccine, 11 (7 cRCTs) used inactivated influenza vaccine, and 5 (1 cRCT) compared both vaccine types. Twenty of 30 studies reported statistically significant indirect protection effectiveness (IPE) with point estimates ranging from 4% to 66%. Meta-regression suggests that studies with high quality and/or sufficiently large sample size are more likely to report significant IPE. In meta-analyses of 6 cRCTs with full randomization (rated as moderate quality overall), significant IPE was found in 1 cRCT in closely connected communities where school-aged children were vaccinated: 60% (95% confidence interval [CI], 41%-72%; I2 = 0%; N = 2326) against laboratory-confirmed influenza, and 3 household cRCTs in which preschool-aged children were vaccinated: 22% (95% CI, 1%-38%; I2 = 0%; N = 1903) against acute respiratory infections or influenza-like illness. Significant IPE was also reported in a large-scale cRCT (N = 8510) that was not fully randomized, and 3 ecological studies (N > 10000) of moderate quality including 36% reduction in influenza-related mortality among the elderly in a Japanese school-based program. Data on IPE in other settings are heterogeneous and lacked power to draw a firm conclusion. The available evidence suggests that influenza vaccination of children confers indirect protection in some but not all settings. Robust, large-scaled studies are required to better quantify the indirect protection from vaccinating children for different settings/endpoints. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  18. Progressive resistance training increases strength after stroke but this may not carry over to activity: a systematic review.

    PubMed

    Dorsch, Simone; Ada, Louise; Alloggia, Daniella

    2018-04-01

    Does progressive resistance training improve strength and activity after stroke? Does any increase in strength carry over to activity? Systematic review of randomised trials with meta-analysis. Adults who have had a stroke. Progressive resistance training compared with no intervention or placebo. The primary outcome was change in strength. This measurement had to be of maximum voluntary force production and performed in muscles congruent with the muscles trained in the intervention. The secondary outcome was change in activity. This measurement had to be a direct measure of performance that produced continuous or ordinal data, or with scales that produced ordinal data. Eleven studies involving 370 participants were included in this systematic review. The overall effect of progressive resistance training on strength was examined by pooling change scores from six studies with a mean PEDro score of 5.8, representing medium quality. The effect size of progressive resistance training on strength was 0.98 (95% CI 0.67 to 1.29, I 2 =0%). The overall effect of progressive resistance training on activity was examined by pooling change scores from the same six studies. The effect size of progressive resistance training on activity was 0.42 (95% CI -0.08 to 0.91, I 2 =54%). After stroke, progressive resistance training has a large effect on strength compared with no intervention or placebo. There is uncertainty about whether these large increases in strength carry over to improvements in activity. PROSPERO CRD42015025401. [Dorsch S, Ada L, Alloggia D (2018) Progressive resistance training increases strength after stroke but this may not carry over to activity: a systematic review. Journal of Physiotherapy 64: 84-90]. Copyright © 2018 Australian Physiotherapy Association. Published by Elsevier B.V. All rights reserved.

  19. Step scaling and the Yang-Mills gradient flow

    NASA Astrophysics Data System (ADS)

    Lüscher, Martin

    2014-06-01

    The use of the Yang-Mills gradient flow in step-scaling studies of lattice QCD is expected to lead to results of unprecedented precision. Step scaling is usually based on the Schrödinger functional, where time ranges over an interval [0 , T] and all fields satisfy Dirichlet boundary conditions at time 0 and T. In these calculations, potentially important sources of systematic errors are boundary lattice effects and the infamous topology-freezing problem. The latter is here shown to be absent if Neumann instead of Dirichlet boundary conditions are imposed on the gauge field at time 0. Moreover, the expectation values of gauge-invariant local fields at positive flow time (and of other well localized observables) that reside in the center of the space-time volume are found to be largely insensitive to the boundary lattice effects.

  20. The relationship between the Early Childhood Environment Rating Scale and its revised form and child outcomes: A systematic review and meta-analysis.

    PubMed

    Brunsek, Ashley; Perlman, Michal; Falenchuk, Olesya; McMullen, Evelyn; Fletcher, Brooke; Shah, Prakesh S

    2017-01-01

    The Early Childhood Environment Rating Scale (ECERS) and its revised version (ECERS-R) were designed as global measures of quality that assess structural and process aspects of Early Childhood Education and Care (ECEC) programs. Despite frequent use of the ECERS/ECERS-R in research and applied settings, associations between it and child outcomes have not been systematically reviewed. The objective of this research was to evaluate the association between the ECERS/ECERS-R and children's wellbeing. Searches of Medline, PsycINFO, ERIC, websites of large datasets and reference sections of all retrieved articles were completed up to July 3, 2015. Eligible studies provided a statistical link between the ECERS/ECERS-R and child outcomes for preschool-aged children in ECEC programs. Of the 823 studies selected for full review, 73 were included in the systematic review and 16 were meta-analyzed. The combined sample across all eligible studies consisted of 33, 318 preschool-aged children. Qualitative systematic review results revealed that ECERS/ECERS-R total scores were more generally associated with positive outcomes than subscales or factors. Seventeen separate meta-analyses were conducted to assess the strength of association between the ECERS/ECERS-R and measures that assessed children's language, math and social-emotional outcomes. Meta-analyses revealed a small number of weak effects (in the expected direction) between the ECERS/ECERS-R total score and children's language and positive behavior outcomes. The Language-Reasoning subscale was weakly related to a language outcome. The enormous heterogeneity in how studies operationalized the ECERS/ECERS-R, the outcomes measured and statistics reported limited our ability to meta-analyze many studies. Greater consistency in study methodology is needed in this area of research. Despite these methodological challenges, the ECERS/ECERS-R does appear to capture aspects of quality that are important for children's wellbeing; however, the strength of association is weak.

  1. The relationship between the Early Childhood Environment Rating Scale and its revised form and child outcomes: A systematic review and meta-analysis

    PubMed Central

    Brunsek, Ashley; Perlman, Michal; Falenchuk, Olesya; McMullen, Evelyn; Fletcher, Brooke; Shah, Prakesh S.

    2017-01-01

    The Early Childhood Environment Rating Scale (ECERS) and its revised version (ECERS-R) were designed as global measures of quality that assess structural and process aspects of Early Childhood Education and Care (ECEC) programs. Despite frequent use of the ECERS/ECERS-R in research and applied settings, associations between it and child outcomes have not been systematically reviewed. The objective of this research was to evaluate the association between the ECERS/ECERS-R and children’s wellbeing. Searches of Medline, PsycINFO, ERIC, websites of large datasets and reference sections of all retrieved articles were completed up to July 3, 2015. Eligible studies provided a statistical link between the ECERS/ECERS-R and child outcomes for preschool-aged children in ECEC programs. Of the 823 studies selected for full review, 73 were included in the systematic review and 16 were meta-analyzed. The combined sample across all eligible studies consisted of 33, 318 preschool-aged children. Qualitative systematic review results revealed that ECERS/ECERS-R total scores were more generally associated with positive outcomes than subscales or factors. Seventeen separate meta-analyses were conducted to assess the strength of association between the ECERS/ECERS-R and measures that assessed children’s language, math and social-emotional outcomes. Meta-analyses revealed a small number of weak effects (in the expected direction) between the ECERS/ECERS-R total score and children’s language and positive behavior outcomes. The Language-Reasoning subscale was weakly related to a language outcome. The enormous heterogeneity in how studies operationalized the ECERS/ECERS-R, the outcomes measured and statistics reported limited our ability to meta-analyze many studies. Greater consistency in study methodology is needed in this area of research. Despite these methodological challenges, the ECERS/ECERS-R does appear to capture aspects of quality that are important for children’s wellbeing; however, the strength of association is weak. PMID:28586399

  2. Wall Modeled Large Eddy Simulation of Airfoil Trailing Edge Noise

    NASA Astrophysics Data System (ADS)

    Kocheemoolayil, Joseph; Lele, Sanjiva

    2014-11-01

    Large eddy simulation (LES) of airfoil trailing edge noise has largely been restricted to low Reynolds numbers due to prohibitive computational cost. Wall modeled LES (WMLES) is a computationally cheaper alternative that makes full-scale Reynolds numbers relevant to large wind turbines accessible. A systematic investigation of trailing edge noise prediction using WMLES is conducted. Detailed comparisons are made with experimental data. The stress boundary condition from a wall model does not constrain the fluctuating velocity to vanish at the wall. This limitation has profound implications for trailing edge noise prediction. The simulation over-predicts the intensity of fluctuating wall pressure and far-field noise. An improved wall model formulation that minimizes the over-prediction of fluctuating wall pressure is proposed and carefully validated. The flow configurations chosen for the study are from the workshop on benchmark problems for airframe noise computations. The large eddy simulation database is used to examine the adequacy of scaling laws that quantify the dependence of trailing edge noise on Mach number, Reynolds number and angle of attack. Simplifying assumptions invoked in engineering approaches towards predicting trailing edge noise are critically evaluated. We gratefully acknowledge financial support from GE Global Research and thank Cascade Technologies Inc. for providing access to their massively-parallel large eddy simulation framework.

  3. Parameter studies on the energy balance closure problem using large-eddy simulation

    NASA Astrophysics Data System (ADS)

    De Roo, Frederik; Banerjee, Tirtha; Mauder, Matthias

    2017-04-01

    The imbalance of the surface energy budget in eddy-covariance measurements is still a pending problem. A possible cause is the presence of land surface heterogeneity. Heterogeneities of the boundary layer scale or larger are most effective in influencing the boundary layer turbulence, and large-eddy simulations have shown that secondary circulations within the boundary layer can affect the surface energy budget. However, the precise influence of the surface characteristics on the energy imbalance and its partitioning is still unknown. To investigate the influence of surface variables on all the components of the flux budget under convective conditions, we set up a systematic parameter study by means of large-eddy simulation. For the study we use a virtual control volume approach, and we focus on idealized heterogeneity by considering spatially variable surface fluxes. The surface fluxes vary locally in intensity and these patches have different length scales. The main focus lies on heterogeneities of length scales of the kilometer scale and one decade smaller. For each simulation, virtual measurement towers are positioned at functionally different positions. We discriminate between the locally homogeneous towers, located within land use patches, with respect to the more heterogeneous towers, and find, among others, that the flux-divergence and the advection are strongly linearly related within each class. Furthermore, we seek correlators for the energy balance ratio and the energy residual in the simulations. Besides the expected correlation with measurable atmospheric quantities such as the friction velocity, boundary-layer depth and temperature and moisture gradients, we have also found an unexpected correlation with the temperature difference between sonic temperature and surface temperature. In additional simulations with a large number of virtual towers, we investigate higher order correlations, which can be linked to secondary circulations. In a companion presentation (EGU2017-2130) these correlations are investigated and confirmed with the help of micrometeorological measurements from the TERENO sites where the effects of landscape scale surface heterogeneities are deemed to be important.

  4. Atmospheric gravity waves with small vertical-to-horizotal wavelength ratios

    NASA Astrophysics Data System (ADS)

    Song, I. S.; Jee, G.; Kim, Y. H.; Chun, H. Y.

    2017-12-01

    Gravity wave modes with small vertical-to-horizontal wavelength ratios of an order of 10-3 are investigated through the systematic scale analysis of governing equations for gravity wave perturbations embedded in the quasi-geostrophic large-scale flow. These waves can be categorized as acoustic gravity wave modes because their total energy is given by the sum of kinetic, potential, and elastic parts. It is found that these waves can be forced by density fluctuations multiplied by the horizontal gradients of the large-scale pressure (geopotential) fields. These theoretical findings are evaluated using the results of a high-resolution global model (Specified Chemistry WACCM with horizontal resolution of 25 km and vertical resolution of 600 m) by computing the density-related gravity-wave forcing terms from the modeling results.

  5. Radial scaling in inclusive jet production at hadron colliders

    NASA Astrophysics Data System (ADS)

    Taylor, Frank E.

    2018-03-01

    Inclusive jet production in p-p and p ¯ -p collisions shows many of the same kinematic systematics as observed in single-particle inclusive production at much lower energies. In an earlier study (1974) a phenomenology, called radial scaling, was developed for the single-particle inclusive cross sections that attempted to capture the essential underlying physics of pointlike parton scattering and the fragmentation of partons into hadrons suppressed by the kinematic boundary. The phenomenology was successful in emphasizing the underlying systematics of the inclusive particle productions. Here we demonstrate that inclusive jet production at the Large Hadron Collider (LHC) in high-energy p-p collisions and at the Tevatron in p ¯ -p inelastic scattering shows similar behavior. The ATLAS inclusive jet production plotted as a function of this scaling variable is studied for √s of 2.76, 7 and 13 TeV and is compared to p ¯ -p inclusive jet production at 1.96 TeV measured at the CDF and D0 at the Tevatron and p-Pb inclusive jet production at the LHC ATLAS at √sNN=5.02 TeV . Inclusive single-particle production at Fermi National Accelerator Laboratory fixed target and Intersecting Storage Rings energies are compared to inclusive J /ψ production at the LHC measured in ATLAS, CMS and LHCb. Striking common features of the data are discussed.

  6. Spectral nudging to eliminate the effects of domain position and geometry in regional climate model simulations

    NASA Astrophysics Data System (ADS)

    Miguez-Macho, Gonzalo; Stenchikov, Georgiy L.; Robock, Alan

    2004-07-01

    It is well known that regional climate simulations are sensitive to the size and position of the domain chosen for calculations. Here we study the physical mechanisms of this sensitivity. We conducted simulations with the Regional Atmospheric Modeling System (RAMS) for June 2000 over North America at 50 km horizontal resolution using a 7500 km × 5400 km grid and NCEP/NCAR reanalysis as boundary conditions. The position of the domain was displaced in several directions, always maintaining the U.S. in the interior, out of the buffer zone along the lateral boundaries. Circulation biases developed a large scale structure, organized by the Rocky Mountains, resulting from a systematic shifting of the synoptic wave trains that crossed the domain. The distortion of the large-scale circulation was produced by interaction of the modeled flow with the lateral boundaries of the nested domain and varied when the position of the grid was altered. This changed the large-scale environment among the different simulations and translated into diverse conditions for the development of the mesoscale processes that produce most of precipitation for the Great Plains in the summer season. As a consequence, precipitation results varied, sometimes greatly, among the experiments with the different grid positions. To eliminate the dependence of results on the position of the domain, we used spectral nudging of waves longer than 2500 km above the boundary layer. Moisture was not nudged at any level. This constrained the synoptic scales to follow reanalysis while allowing the model to develop the small-scale dynamics responsible for the rainfall. Nudging of the large scales successfully eliminated the variation of precipitation results when the grid was moved. We suggest that this technique is necessary for all downscaling studies with regional models with domain sizes of a few thousand kilometers and larger embedded in global models.

  7. Assessing the harms of cannabis cultivation in Belgium.

    PubMed

    Paoli, Letizia; Decorte, Tom; Kersten, Loes

    2015-03-01

    Since the 1990s, a shift from the importation of foreign cannabis to domestic cultivation has taken place in Belgium, as it has in many other countries. This shift has prompted Belgian policy-making bodies to prioritize the repression of cannabis cultivation. Against this background, the article aims to systematically map and assess for the first time ever the harms associated with cannabis cultivation, covering the whole spectrum of growers. This study is based on a web survey primarily targeting small-scale growers (N=1293) and on three interconnected sets of qualitative data on large-scale growers and traffickers (34 closed criminal proceedings, interviews with 32 criminal justice experts, and with 17 large-scale cannabis growers and three traffickers). The study relied on Greenfield and Paoli's (2013) harm assessment framework to identify the harms associated with cannabis cultivation and to assess the incidence, severity and causes of such harms. Cannabis cultivation has become endemic in Belgium. Despite that, it generates, for Belgium, limited harms of medium-low or medium priority. Large-scale growers tend to produce more harms than the small-scale ones. Virtually all the harms associated with cannabis cultivation are the result of the current criminalizing policies. Given the spread of cannabis cultivation and Belgium's position in Europe, reducing the supply of cannabis does not appear to be a realistic policy objective. Given the limited harms generated, there is scarce scientific justification to prioritize cannabis cultivation in Belgian law enforcement strategies. As most harms are generated by large-scale growers, it is this category of cultivator, if any, which should be the focus of law enforcement repression. Given the policy origin of most harms, policy-makers should seek to develop policies likely to reduce such harms. At the same time, further research is needed to comparatively assess the harms associated with cannabis cultivation (and trafficking) with those arising from use. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Mixture model normalization for non-targeted gas chromatography/mass spectrometry metabolomics data.

    PubMed

    Reisetter, Anna C; Muehlbauer, Michael J; Bain, James R; Nodzenski, Michael; Stevens, Robert D; Ilkayeva, Olga; Metzger, Boyd E; Newgard, Christopher B; Lowe, William L; Scholtens, Denise M

    2017-02-02

    Metabolomics offers a unique integrative perspective for health research, reflecting genetic and environmental contributions to disease-related phenotypes. Identifying robust associations in population-based or large-scale clinical studies demands large numbers of subjects and therefore sample batching for gas-chromatography/mass spectrometry (GC/MS) non-targeted assays. When run over weeks or months, technical noise due to batch and run-order threatens data interpretability. Application of existing normalization methods to metabolomics is challenged by unsatisfied modeling assumptions and, notably, failure to address batch-specific truncation of low abundance compounds. To curtail technical noise and make GC/MS metabolomics data amenable to analyses describing biologically relevant variability, we propose mixture model normalization (mixnorm) that accommodates truncated data and estimates per-metabolite batch and run-order effects using quality control samples. Mixnorm outperforms other approaches across many metrics, including improved correlation of non-targeted and targeted measurements and superior performance when metabolite detectability varies according to batch. For some metrics, particularly when truncation is less frequent for a metabolite, mean centering and median scaling demonstrate comparable performance to mixnorm. When quality control samples are systematically included in batches, mixnorm is uniquely suited to normalizing non-targeted GC/MS metabolomics data due to explicit accommodation of batch effects, run order and varying thresholds of detectability. Especially in large-scale studies, normalization is crucial for drawing accurate conclusions from non-targeted GC/MS metabolomics data.

  9. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    NASA Astrophysics Data System (ADS)

    Steiakakis, Chrysanthos; Agioutantis, Zacharias; Apostolou, Evangelia; Papavgeri, Georgia; Tripolitsiotis, Achilles

    2016-01-01

    The geotechnical challenges for safe slope design in large scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden to ore ratio and therefore dramatically improve the economics of the operation, while large scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions. This paper presents an integrated data management system which was developed over a number of years as well as the advantages through a specific application. The presented case study illustrates how the high production slopes of a mine that exceed depths of 100-120 m were successfully mined with an average displacement rate of 10- 20 mm/day, approaching an almost slow to moderate landslide velocity. Monitoring data of the past four years are included in the database and can be analyzed to produce valuable results. Time-series data correlations of movements, precipitation records, etc. are evaluated and presented in this case study. The results can be used to successfully manage mine operations and ensure the safety of the mine and the workforce.

  10. Precomputing upscaled hydraulic conductivity for complex geological structures

    NASA Astrophysics Data System (ADS)

    Mariethoz, G.; Jha, S. K.; George, M.; Maheswarajah, S.; John, V.; De Re, D.; Smith, M.

    2013-12-01

    3D geological models are built to capture the geological heterogeneity at a fine scale. However groundwater modellers are often interested in the hydraulic conductivity (K) values at a much coarser scale to reduce the numerical burden. Upscaling is used to assign conductivity to large volumes, which necessarily causes a loss of information. Recent literature has shown that the connectivity in the channelized structures is an important feature that needs to be taken into account for accurate upscaling. In this work we study the effect of channel parameters, e.g. width, sinuosity, connectivity etc. on the upscaled values of the hydraulic conductivity and the associated uncertainty. We devise a methodology that derives correspondences between a lithological description and the equivalent hydraulic conductivity at a larger scale. The method uses multiple-point geostatistics simulations (MPS) and parameterizes the 3D structures by introducing continuous rotation and affinity parameters. Additional statistical characterization is obtained by transition probabilities and connectivity measures. Equivalent hydraulic conductivity is then estimated by solving a flow problem for the entire heterogeneous domain by applying steady state flow in horizontal and vertical directions. This is systematically performed for many random realisations of the small scale structures to enable a probability distribution for the equivalent upscaled hydraulic conductivity. This process allows deriving systematic relationships between a given depositional environment and precomputed equivalent parameters. A modeller can then exploit the prior knowledge of the depositional environment and expected geological heterogeneity to bypass the step of generating small-scale models, and directly work with upscaled values.

  11. Theory of wavelet-based coarse-graining hierarchies for molecular dynamics.

    PubMed

    Rinderspacher, Berend Christopher; Bardhan, Jaydeep P; Ismail, Ahmed E

    2017-07-01

    We present a multiresolution approach to compressing the degrees of freedom and potentials associated with molecular dynamics, such as the bond potentials. The approach suggests a systematic way to accelerate large-scale molecular simulations with more than two levels of coarse graining, particularly applications of polymeric materials. In particular, we derive explicit models for (arbitrarily large) linear (homo)polymers and iterative methods to compute large-scale wavelet decompositions from fragment solutions. This approach does not require explicit preparation of atomistic-to-coarse-grained mappings, but instead uses the theory of diffusion wavelets for graph Laplacians to develop system-specific mappings. Our methodology leads to a hierarchy of system-specific coarse-grained degrees of freedom that provides a conceptually clear and mathematically rigorous framework for modeling chemical systems at relevant model scales. The approach is capable of automatically generating as many coarse-grained model scales as necessary, that is, to go beyond the two scales in conventional coarse-grained strategies; furthermore, the wavelet-based coarse-grained models explicitly link time and length scales. Furthermore, a straightforward method for the reintroduction of omitted degrees of freedom is presented, which plays a major role in maintaining model fidelity in long-time simulations and in capturing emergent behaviors.

  12. Building work engagement: A systematic review and meta-analysis investigating the effectiveness of work engagement interventions.

    PubMed

    Knight, Caroline; Patterson, Malcolm; Dawson, Jeremy

    2017-07-01

    Low work engagement may contribute towards decreased well-being and work performance. Evaluating, boosting and sustaining work engagement are therefore of interest to many organisations. However, the evidence on which to base interventions has not yet been synthesised. A systematic review with meta-analysis was conducted to assess the evidence for the effectiveness of work engagement interventions. A systematic literature search identified controlled workplace interventions employing a validated measure of work engagement. Most used the Utrecht Work Engagement Scale (UWES). Studies containing the relevant quantitative data underwent random-effects meta-analyses. Results were assessed for homogeneity, systematic sampling error, publication bias and quality. Twenty studies met the inclusion criteria and were categorised into four types of interventions: (i) personal resource building; (ii) job resource building; (iii) leadership training; and (iv) health promotion. The overall effect on work engagement was small, but positive, k  = 14, Hedges g  = 0.29, 95%-CI = 0.12-0.46. Moderator analyses revealed a significant result for intervention style, with a medium to large effect for group interventions. Heterogeneity between the studies was high, and the success of implementation varied. More studies are needed, and researchers are encouraged to collaborate closely with organisations to design interventions appropriate to individual contexts and settings, and include evaluations of intervention implementation. © 2016 The Authors. Journal of Organizational Behavior published by John Wiley & Sons, Ltd.

  13. Measurement properties, feasibility and clinical utility of the Doloplus-2 pain scale in older adults with cognitive impairment: a systematic review.

    PubMed

    Rostad, Hanne Marie; Utne, Inger; Grov, Ellen Karine; Puts, Martine; Halvorsrud, Liv

    2017-11-02

    The Doloplus-2 is a pain assessment scale for assessing pain in older adults with cognitive impairment. It is used in clinical practice and research. However, evidence for its measurement properties, feasibility and clinical utility remain incomplete. This systematic review synthesizes previous research on the measurement properties, feasibility and clinical utility of the scale. We conducted a systematic search in three databases (CINAHL, Medline and PsycINFO) for studies published in English, French, German, Dutch/Flemish or a Scandinavian language between 1990 and April 2017. We also reviewed the Doloplus-2 homepage and reference lists of included studies to supplement our search. Two reviewers independently reviewed titles and abstracts and performed the quality assessment and data abstraction. A total of 24 studies were included in this systematic review. The quality of the studies varied, but many lacked sufficient detail about the samples and response rates. The Doloplus-2 has been studied using diverse samples in a variety of settings; most study participants were in long-term care and in people with dementia. Sixteen studies addressed various aspects of the scale's feasibility and clinical utility, but their results are limited and inconsistent across settings and samples. Support for the scale's reliability, validity and responsiveness varied widely across the studies. Generally, the reliability coefficients reached acceptable benchmarks, but the evidence for different aspects of the scale's validity and responsiveness was incomplete. Additional high-quality studies are warranted to determine in which populations of older adults with cognitive impairment the Doloplus-2 is reliable, valid and feasible. The ability of the Doloplus-2 to meaningfully quantify pain, measure treatment response and improve patient outcomes also needs further investigation. PROSPERO reg. no.: CRD42016049697 registered 20. Oct. 2016.

  14. Optimized spatial priorities for biodiversity conservation in China: a systematic conservation planning perspective.

    PubMed

    Wu, Ruidong; Long, Yongcheng; Malanson, George P; Garber, Paul A; Zhang, Shuang; Li, Diqiang; Zhao, Peng; Wang, Longzhu; Duo, Hairui

    2014-01-01

    By addressing several key features overlooked in previous studies, i.e. human disturbance, integration of ecosystem- and species-level conservation features, and principles of complementarity and representativeness, we present the first national-scale systematic conservation planning for China to determine the optimized spatial priorities for biodiversity conservation. We compiled a spatial database on the distributions of ecosystem- and species-level conservation features, and modeled a human disturbance index (HDI) by aggregating information using several socioeconomic proxies. We ran Marxan with two scenarios (HDI-ignored and HDI-considered) to investigate the effects of human disturbance, and explored the geographic patterns of the optimized spatial conservation priorities. Compared to when HDI was ignored, the HDI-considered scenario resulted in (1) a marked reduction (∼9%) in the total HDI score and a slight increase (∼7%) in the total area of the portfolio of priority units, (2) a significant increase (∼43%) in the total irreplaceable area and (3) more irreplaceable units being identified in almost all environmental zones and highly-disturbed provinces. Thus the inclusion of human disturbance is essential for cost-effective priority-setting. Attention should be targeted to the areas that are characterized as moderately-disturbed, <2,000 m in altitude, and/or intermediately- to extremely-rugged in terrain to identify potentially important regions for implementing cost-effective conservation. We delineated 23 primary large-scale priority areas that are significant for conserving China's biodiversity, but those isolated priority units in disturbed regions are in more urgent need of conservation actions so as to prevent immediate and severe biodiversity loss. This study presents a spatially optimized national-scale portfolio of conservation priorities--effectively representing the overall biodiversity of China while minimizing conflicts with economic development. Our results offer critical insights for current conservation and strategic land-use planning in China. The approach is transferable and easy to implement by end-users, and applicable for national- and local-scale systematic conservation prioritization practices.

  15. Optimized Spatial Priorities for Biodiversity Conservation in China: A Systematic Conservation Planning Perspective

    PubMed Central

    Wu, Ruidong; Long, Yongcheng; Malanson, George P.; Garber, Paul A.; Zhang, Shuang; Li, Diqiang; Zhao, Peng; Wang, Longzhu; Duo, Hairui

    2014-01-01

    By addressing several key features overlooked in previous studies, i.e. human disturbance, integration of ecosystem- and species-level conservation features, and principles of complementarity and representativeness, we present the first national-scale systematic conservation planning for China to determine the optimized spatial priorities for biodiversity conservation. We compiled a spatial database on the distributions of ecosystem- and species-level conservation features, and modeled a human disturbance index (HDI) by aggregating information using several socioeconomic proxies. We ran Marxan with two scenarios (HDI-ignored and HDI-considered) to investigate the effects of human disturbance, and explored the geographic patterns of the optimized spatial conservation priorities. Compared to when HDI was ignored, the HDI-considered scenario resulted in (1) a marked reduction (∼9%) in the total HDI score and a slight increase (∼7%) in the total area of the portfolio of priority units, (2) a significant increase (∼43%) in the total irreplaceable area and (3) more irreplaceable units being identified in almost all environmental zones and highly-disturbed provinces. Thus the inclusion of human disturbance is essential for cost-effective priority-setting. Attention should be targeted to the areas that are characterized as moderately-disturbed, <2,000 m in altitude, and/or intermediately- to extremely-rugged in terrain to identify potentially important regions for implementing cost-effective conservation. We delineated 23 primary large-scale priority areas that are significant for conserving China's biodiversity, but those isolated priority units in disturbed regions are in more urgent need of conservation actions so as to prevent immediate and severe biodiversity loss. This study presents a spatially optimized national-scale portfolio of conservation priorities – effectively representing the overall biodiversity of China while minimizing conflicts with economic development. Our results offer critical insights for current conservation and strategic land-use planning in China. The approach is transferable and easy to implement by end-users, and applicable for national- and local-scale systematic conservation prioritization practices. PMID:25072933

  16. Multiscale modeling of lithium ion batteries: thermal aspects

    PubMed Central

    Zausch, Jochen

    2015-01-01

    Summary The thermal behavior of lithium ion batteries has a huge impact on their lifetime and the initiation of degradation processes. The development of hot spots or large local overpotentials leading, e.g., to lithium metal deposition depends on material properties as well as on the nano- und microstructure of the electrodes. In recent years a theoretical structure emerges, which opens the possibility to establish a systematic modeling strategy from atomistic to continuum scale to capture and couple the relevant phenomena on each scale. We outline the building blocks for such a systematic approach and discuss in detail a rigorous approach for the continuum scale based on rational thermodynamics and homogenization theories. Our focus is on the development of a systematic thermodynamically consistent theory for thermal phenomena in batteries at the microstructure scale and at the cell scale. We discuss the importance of carefully defining the continuum fields for being able to compare seemingly different phenomenological theories and for obtaining rules to determine unknown parameters of the theory by experiments or lower-scale theories. The resulting continuum models for the microscopic and the cell scale are numerically solved in full 3D resolution. The complex very localized distributions of heat sources in a microstructure of a battery and the problems of mapping these localized sources on an averaged porous electrode model are discussed by comparing the detailed 3D microstructure-resolved simulations of the heat distribution with the result of the upscaled porous electrode model. It is shown, that not all heat sources that exist on the microstructure scale are represented in the averaged theory due to subtle cancellation effects of interface and bulk heat sources. Nevertheless, we find that in special cases the averaged thermal behavior can be captured very well by porous electrode theory. PMID:25977870

  17. Accounting for baryonic effects in cosmic shear tomography: Determining a minimal set of nuisance parameters using PCA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eifler, Tim; Krause, Elisabeth; Dodelson, Scott

    2014-05-28

    Systematic uncertainties that have been subdominant in past large-scale structure (LSS) surveys are likely to exceed statistical uncertainties of current and future LSS data sets, potentially limiting the extraction of cosmological information. Here we present a general framework (PCA marginalization) to consistently incorporate systematic effects into a likelihood analysis. This technique naturally accounts for degeneracies between nuisance parameters and can substantially reduce the dimension of the parameter space that needs to be sampled. As a practical application, we apply PCA marginalization to account for baryonic physics as an uncertainty in cosmic shear tomography. Specifically, we use CosmoLike to run simulatedmore » likelihood analyses on three independent sets of numerical simulations, each covering a wide range of baryonic scenarios differing in cooling, star formation, and feedback mechanisms. We simulate a Stage III (Dark Energy Survey) and Stage IV (Large Synoptic Survey Telescope/Euclid) survey and find a substantial bias in cosmological constraints if baryonic physics is not accounted for. We then show that PCA marginalization (employing at most 3 to 4 nuisance parameters) removes this bias. Our study demonstrates that it is possible to obtain robust, precise constraints on the dark energy equation of state even in the presence of large levels of systematic uncertainty in astrophysical processes. We conclude that the PCA marginalization technique is a powerful, general tool for addressing many of the challenges facing the precision cosmology program.« less

  18. Nebula Scale Mixing Between Non-Carbonaceous and Carbonaceous Chondrite Reservoirs: Testing the Grand Tack Model with Almahata Sitta Stones

    NASA Technical Reports Server (NTRS)

    Yin, Q.-Z.; Sanborn, M. E.; Goodrich, C. A.; Zolensky, M.; Fioretti, A. M.; Shaddad, M.; Kohl, I. E.; Young, E. D.

    2018-01-01

    There is an increasing number of Cr-O-Ti isotope studies that show that solar system materials are divided into two main populations, one carbonaceous chondrite (CC)-like and the other is non-carbonaceous (NCC)-like, with minimal mixing between them attributed to a gap opened in the propoplanetary disk due to Jupiter's formation. The Grand Tack model suggests that there should be a particular time in the disk history when this gap is breached and ensuring a subsequent large-scale mixing between S- and C-type asteroids (inner solar system and outer solar system materials), an idea supported by our recent work on chondrule (Delta)17O-(epsilon)54Cr isotope systematics.

  19. Engineering large-scale agent-based systems with consensus

    NASA Technical Reports Server (NTRS)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  20. Large-scale shell-model calculation with core excitations for neutron-rich nuclei beyond 132Sn

    NASA Astrophysics Data System (ADS)

    Jin, Hua; Hasegawa, Munetake; Tazaki, Shigeru; Kaneko, Kazunari; Sun, Yang

    2011-10-01

    The structure of neutron-rich nuclei with a few nucleons beyond 132Sn is investigated by means of large-scale shell-model calculations. For a considerably large model space, including neutron core excitations, a new effective interaction is determined by employing the extended pairing-plus-quadrupole model with monopole corrections. The model provides a systematical description for energy levels of A=133-135 nuclei up to high spins and reproduces available data of electromagnetic transitions. The structure of these nuclei is analyzed in detail, with emphasis of effects associated with core excitations. The results show evidence of hexadecupole correlation in addition to octupole correlation in this mass region. The suggested feature of magnetic rotation in 135Te occurs in the present shell-model calculation.

  1. Motivational interviewing: a systematic review and meta-analysis

    PubMed Central

    Rubak, Sune; Sandbæk, Annelli; Lauritzen, Torsten; Christensen, Bo

    2005-01-01

    Background Motivational Interviewing is a well-known, scientifically tested method of counselling clients developed by Miller and Rollnick and viewed as a useful intervention strategy in the treatment of lifestyle problems and disease. Aim To evaluate the effectiveness of motivational interviewing in different areas of disease and to identify factors shaping outcomes. Design of study A systematic review and meta-analysis of randomised controlled trials using motivational interviewing as the intervention. Method After selection criteria a systematic literature search in 16 databases produced 72 randomised controlled trials the first of which was published in 1991. A quality assessment was made with a validated scale. A meta-analysis was performed as a generic inverse variance meta-analysis. Results Meta-analysis showed a significant effect (95% confidence interval) for motivational interviewing for combined effect estimates for body mass index, total blood cholesterol, systolic blood pressure, blood alcohol concentration and standard ethanol content, while combined effect estimates for cigarettes per day and for HbA1c were not significant. Motivational interviewing had a significant and clinically relevant effect in approximately three out of four studies, with an equal effect on physiological (72%) and psychological (75%) diseases. Psychologists and physicians obtained an effect in approximately 80% of the studies, while other healthcare providers obtained an effect in 46% of the studies. When using motivational interviewing in brief encounters of 15 minutes, 64% of the studies showed an effect. More than one encounter with the patient ensures the effectiveness of motivational interviewing. Conclusion Motivational interviewing in a scientific setting outperforms traditional advice giving in the treatment of a broad range of behavioural problems and diseases. Large-scale studies are now needed to prove that motivational interviewing can be implemented into daily clinical work in primary and secondary health care. PMID:15826439

  2. Final Report: The Influence of Novel Behavioral Strategies in Promoting the Diffusion of Solar Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillingham, Kenneth; Bollinger, Bryan

    This is the final report for a systematic, evidence-based project using an unprecedented series of large-scale field experiments to examine the effectiveness and cost-effectiveness of novel approaches to reduce the soft costs of solar residential photovoltaics. The approaches were based around grassroots marketing campaigns called ‘Solarize’ campaigns, that were designed to lower costs and increase adoption of solar technology. This study quantified the effectiveness and cost-effectiveness of the Solarize programs and tested new approaches to further improve the model.

  3. The Untapped Promise of Secondary Data Sets in International and Comparative Education Policy Research

    ERIC Educational Resources Information Center

    Chudagr, Amita; Luschei, Thomas F.

    2016-01-01

    The objective of this commentary is to call attention to the feasibility and importance of large-scale, systematic, quantitative analysis in international and comparative education research. We contend that although many existing databases are under- or unutilized in quantitative international-comparative research, these resources present the…

  4. School Mental Health: The Impact of State and Local Capacity-Building Training

    ERIC Educational Resources Information Center

    Stephan, Sharon; Paternite, Carl; Grimm, Lindsey; Hurwitz, Laura

    2014-01-01

    Despite a growing number of collaborative partnerships between schools and community-based organizations to expand school mental health (SMH) service capacity in the United States, there have been relatively few systematic initiatives focused on key strategies for large-scale SMH capacity building with state and local education systems. Based on a…

  5. Evidence of evolutionary history and selective sweeps in the genome of Meishan pig reveals its genetic and phenotypic characterization

    USDA-ARS?s Scientific Manuscript database

    Meishan is a famous Chinese indigenous pig breed known for its extremely high fecundity. To explore if Meishan has unique evolutionary process and genome characteristics differing from other pig breeds, we systematically analyzed its genetic divergence, and demographic history by large-scale reseque...

  6. Large-scale mapping of mutations affecting zebrafish development.

    PubMed

    Geisler, Robert; Rauch, Gerd-Jörg; Geiger-Rudolph, Silke; Albrecht, Andrea; van Bebber, Frauke; Berger, Andrea; Busch-Nentwich, Elisabeth; Dahm, Ralf; Dekens, Marcus P S; Dooley, Christopher; Elli, Alexandra F; Gehring, Ines; Geiger, Horst; Geisler, Maria; Glaser, Stefanie; Holley, Scott; Huber, Matthias; Kerr, Andy; Kirn, Anette; Knirsch, Martina; Konantz, Martina; Küchler, Axel M; Maderspacher, Florian; Neuhauss, Stephan C; Nicolson, Teresa; Ober, Elke A; Praeg, Elke; Ray, Russell; Rentzsch, Brit; Rick, Jens M; Rief, Eva; Schauerte, Heike E; Schepp, Carsten P; Schönberger, Ulrike; Schonthaler, Helia B; Seiler, Christoph; Sidi, Samuel; Söllner, Christian; Wehner, Anja; Weiler, Christian; Nüsslein-Volhard, Christiane

    2007-01-09

    Large-scale mutagenesis screens in the zebrafish employing the mutagen ENU have isolated several hundred mutant loci that represent putative developmental control genes. In order to realize the potential of such screens, systematic genetic mapping of the mutations is necessary. Here we report on a large-scale effort to map the mutations generated in mutagenesis screening at the Max Planck Institute for Developmental Biology by genome scanning with microsatellite markers. We have selected a set of microsatellite markers and developed methods and scoring criteria suitable for efficient, high-throughput genome scanning. We have used these methods to successfully obtain a rough map position for 319 mutant loci from the Tübingen I mutagenesis screen and subsequent screening of the mutant collection. For 277 of these the corresponding gene is not yet identified. Mapping was successful for 80 % of the tested loci. By comparing 21 mutation and gene positions of cloned mutations we have validated the correctness of our linkage group assignments and estimated the standard error of our map positions to be approximately 6 cM. By obtaining rough map positions for over 300 zebrafish loci with developmental phenotypes, we have generated a dataset that will be useful not only for cloning of the affected genes, but also to suggest allelism of mutations with similar phenotypes that will be identified in future screens. Furthermore this work validates the usefulness of our methodology for rapid, systematic and inexpensive microsatellite mapping of zebrafish mutations.

  7. Large scale systematic proteomic quantification from non-metastatic to metastatic colorectal cancer

    NASA Astrophysics Data System (ADS)

    Yin, Xuefei; Zhang, Yang; Guo, Shaowen; Jin, Hong; Wang, Wenhai; Yang, Pengyuan

    2015-07-01

    A systematic proteomic quantification of formalin-fixed, paraffin-embedded (FFPE) colorectal cancer tissues from stage I to stage IIIC was performed in large scale. 1017 proteins were identified with 338 proteins in quantitative changes by label free method, while 341 proteins were quantified with significant expression changes among 6294 proteins by iTRAQ method. We found that proteins related to migration expression increased and those for binding and adherent decreased during the colorectal cancer development according to the gene ontology (GO) annotation and ingenuity pathway analysis (IPA). The integrin alpha 5 (ITA5) in integrin family was focused, which was consistent with the metastasis related pathway. The expression level of ITA5 decreased in metastasis tissues and the result has been further verified by Western blotting. Another two cell migration related proteins vitronectin (VTN) and actin-related protein (ARP3) were also proved to be up-regulated by both mass spectrometry (MS) based quantification results and Western blotting. Up to now, our result shows one of the largest dataset in colorectal cancer proteomics research. Our strategy reveals a disease driven omics-pattern for the metastasis colorectal cancer.

  8. Hyaluronic acid in the treatment of knee osteoarthritis: a systematic review and meta-analysis with emphasis on the efficacy of different products.

    PubMed

    Colen, Sascha; van den Bekerom, Michel P J; Mulier, Michiel; Haverkamp, Daniël

    2012-08-01

    Although accepted as a conservative treatment option for knee osteoarthritis, the debate about the effectiveness of intra-articular treatment with hyaluronic acid (HA) is still ongoing because of contrasting outcomes in different clinical studies. Several well designed clinical studies showed a significant improvement in pain at follow-up compared with baseline but no significant improvement comparing the efficacy of HA with placebo (saline) or with other conservative treatment options. Notwithstanding the effectiveness of different types of intra-articular HA products, the question of whether one HA product is better than another is still unanswered. In this systematic review we compare the effects of intra-articularly administered HA with intra-articularly administered placebo in general and, more specifically, the effects of individual HA products with placebo. We also compare the efficacy of different HA products. A systematic review of randomized controlled trials (RCTs) was conducted using databases including MEDLINE, Cochrane Database of Systematic Reviews, Cochrane Clinical Trial Register and EMBASE. Seventy-four RCTs were included in this systematic review. HA improves pain by approximately 40-50% compared with baseline levels. However, when compared with saline the difference in efficacy is not that large. Due to a large 'placebo effect' of saline (approximately 30% pain reduction, persisting for at least 3 months) we determined a weighted mean difference between the efficacy of HA and saline of just 10.20 using the visual analog scale for pain. It is debatable whether this difference reaches the minimum clinically important difference. Comparing the different HA products, which vary in the molecular weight, concentration, and volume of HA, we were not able to conclude that one brand has a better efficacy than another due to the heterogeneity of the studies and outcomes. In the future it will be important to determine the exact mechanism of action of placebo as this may give us an idea of how to treat osteoarthritis more efficiently. Due to the limitations of this review (follow-up of just 3 months and large heterogeneity of the included studies), it is also important to compare the different HA products to determine which product(s), or which molecular weight range, concentration, or volume of HA is the best option to treat osteoarthritis. Our recommendation is to start large (multicenter) RCTs to give us more evidence about the efficacy of the different HA products.

  9. Multifractal spectrum and lacunarity as measures of complexity of osseointegration.

    PubMed

    de Souza Santos, Daniel; Dos Santos, Leonardo Cavalcanti Bezerra; de Albuquerque Tavares Carvalho, Alessandra; Leão, Jair Carneiro; Delrieux, Claudio; Stosic, Tatijana; Stosic, Borko

    2016-07-01

    The goal of this study is to contribute to a better quantitative description of the early stages of osseointegration, by application of fractal, multifractal, and lacunarity analysis. Fractal, multifractal, and lacunarity analysis are performed on scanning electron microscopy (SEM) images of titanium implants that were first subjected to different treatment combinations of i) sand blasting, ii) acid etching, and iii) exposition to calcium phosphate, and were then submersed in a simulated body fluid (SBF) for 30 days. All the three numerical techniques are applied to the implant SEM images before and after SBF immersion, in order to provide a comprehensive set of common quantitative descriptors. It is found that implants subjected to different physicochemical treatments before submersion in SBF exhibit a rather similar level of complexity, while the great variety of crystal forms after SBF submersion reveals rather different quantitative measures (reflecting complexity), for different treatments. In particular, it is found that acid treatment, in most combinations with the other considered treatments, leads to a higher fractal dimension (more uniform distribution of crystals), lower lacunarity (lesser variation in gap sizes), and narrowing of the multifractal spectrum (smaller fluctuations on different scales). The current quantitative description has shown the capacity to capture the main features of complex images of implant surfaces, for several different treatments. Such quantitative description should provide a fundamental tool for future large scale systematic studies, considering the large variety of possible implant treatments and their combinations. Quantitative description of early stages of osseointegration on titanium implants with different treatments should help develop a better understanding of this phenomenon, in general, and provide basis for further systematic experimental studies. Clinical practice should benefit from such studies in the long term, by more ready access to implants of higher quality.

  10. Four-center bubbled BPS solutions with a Gibbons-Hawking base

    NASA Astrophysics Data System (ADS)

    Heidmann, Pierre

    2017-10-01

    We construct four-center bubbled BPS solutions with a Gibbons-Hawking base space. We give a systematic procedure to build scaling solutions: starting from three-supertube configurations and using generalized spectral flows and gauge transformations to extend to solutions with four Gibbons-Hawking centers. This allows us to construct very large families of smooth horizonless solutions that have the same charges and angular momentum as supersymmetric black holes with a macroscopically large horizon area. Our construction reveals that all scaling solutions with four Gibbons Hawking centers have an angular momentum at around 99% of the cosmic censorship bound. We give both an analytical and a numerical explanation for this unexpected feature.

  11. Requirements and principles for the implementation and construction of large-scale geographic information systems

    NASA Technical Reports Server (NTRS)

    Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.

    1987-01-01

    This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.

  12. Earthquake precursors: spatial-temporal gravity changes before the great earthquakes in the Sichuan-Yunnan area

    NASA Astrophysics Data System (ADS)

    Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song

    2018-01-01

    Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur in a large area around the epicenters. The directions of gravity change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect the increase of crustal stress, as well as the significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes with larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the area where the gravity changes greatly. (3) The spatial-temporal gravity changes are very useful in determining the epicenter of coming large earthquakes. The large gravity networks are useful to determine the general areas of coming large earthquakes. However, the local gravity networks with high spatial-temporal resolution are suitable for determining the location of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the location of their epicenters. Based on the above discussions, we emphasize that medium-/long-term potential for large earthquakes might exist nowadays in some areas with significant gravity anomalies in the study region. Thus, the monitoring should be strengthened.

  13. The effect of large-scale model time step and multiscale coupling frequency on cloud climatology, vertical structure, and rainfall extremes in a superparameterized GCM

    DOE PAGES

    Yu, Sungduk; Pritchard, Michael S.

    2015-12-17

    The effect of global climate model (GCM) time step—which also controls how frequently global and embedded cloud resolving scales are coupled—is examined in the Superparameterized Community Atmosphere Model ver 3.0. Systematic bias reductions of time-mean shortwave cloud forcing (~10 W/m 2) and longwave cloud forcing (~5 W/m 2) occur as scale coupling frequency increases, but with systematically increasing rainfall variance and extremes throughout the tropics. An overarching change in the vertical structure of deep tropical convection, favoring more bottom-heavy deep convection as a global model time step is reduced may help orchestrate these responses. The weak temperature gradient approximation ismore » more faithfully satisfied when a high scale coupling frequency (a short global model time step) is used. These findings are distinct from the global model time step sensitivities of conventionally parameterized GCMs and have implications for understanding emergent behaviors of multiscale deep convective organization in superparameterized GCMs. Lastly, the results may also be useful for helping to tune them.« less

  14. The effect of large-scale model time step and multiscale coupling frequency on cloud climatology, vertical structure, and rainfall extremes in a superparameterized GCM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Sungduk; Pritchard, Michael S.

    The effect of global climate model (GCM) time step—which also controls how frequently global and embedded cloud resolving scales are coupled—is examined in the Superparameterized Community Atmosphere Model ver 3.0. Systematic bias reductions of time-mean shortwave cloud forcing (~10 W/m 2) and longwave cloud forcing (~5 W/m 2) occur as scale coupling frequency increases, but with systematically increasing rainfall variance and extremes throughout the tropics. An overarching change in the vertical structure of deep tropical convection, favoring more bottom-heavy deep convection as a global model time step is reduced may help orchestrate these responses. The weak temperature gradient approximation ismore » more faithfully satisfied when a high scale coupling frequency (a short global model time step) is used. These findings are distinct from the global model time step sensitivities of conventionally parameterized GCMs and have implications for understanding emergent behaviors of multiscale deep convective organization in superparameterized GCMs. Lastly, the results may also be useful for helping to tune them.« less

  15. A SYSTEMATIC SEARCH FOR COROTATING INTERACTION REGIONS IN APPARENTLY SINGLE GALACTIC WOLF-RAYET STARS. II. A GLOBAL VIEW OF THE WIND VARIABILITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chene, A.-N.; St-Louis, N., E-mail: achene@astro-udec.cl, E-mail: stlouis@astro.umontreal.ca

    This study is the second part of a survey searching for large-scale spectroscopic variability in apparently single Wolf-Rayet (WR) stars. In a previous paper (Paper I), we described and characterized the spectroscopic variability level of 25 WR stars observable from the northern hemisphere and found 3 new candidates presenting large-scale wind variability, potentially originating from large-scale structures named corotating interaction regions (CIRs). In this second paper, we discuss an additional 39 stars observable from the southern hemisphere. For each star in our sample, we obtained 4-5 high-resolution spectra with a signal-to-noise ratio of {approx}100 and determined its variability level usingmore » the approach described in Paper I. In total, 10 new stars are found to show large-scale spectral variability of which 7 present CIR-type changes (WR 8, WR 44, WR55, WR 58, WR 61, WR 63, WR 100). Of the remaining stars, 20 were found to show small-amplitude changes and 9 were found to show no spectral variability as far as can be concluded from the data on hand. Also, we discuss the spectroscopic variability level of all single galactic WR stars that are brighter than v {approx} 12.5, and some WR stars with 12.5 < v {<=} 13.5, i.e., all the stars presented in our two papers and four more stars for which spectra have already been published in the literature. We find that 23/68 stars (33.8%) present large-scale variability, but only 12/54 stars ({approx}22.1%) are potentially of CIR type. Also, we find that 31/68 stars (45.6%) only show small-scale variability, most likely due to clumping in the wind. Finally, no spectral variability is detected based on the data on hand for 14/68 (20.6%) stars. Interestingly, the variability with the highest amplitude also has the widest mean velocity dispersion.« less

  16. Systematic identification of proteins that elicit drug side effects

    PubMed Central

    Kuhn, Michael; Al Banchaabouchi, Mumna; Campillos, Monica; Jensen, Lars Juhl; Gross, Cornelius; Gavin, Anne-Claude; Bork, Peer

    2013-01-01

    Side effect similarities of drugs have recently been employed to predict new drug targets, and networks of side effects and targets have been used to better understand the mechanism of action of drugs. Here, we report a large-scale analysis to systematically predict and characterize proteins that cause drug side effects. We integrated phenotypic data obtained during clinical trials with known drug–target relations to identify overrepresented protein–side effect combinations. Using independent data, we confirm that most of these overrepresentations point to proteins which, when perturbed, cause side effects. Of 1428 side effects studied, 732 were predicted to be predominantly caused by individual proteins, at least 137 of them backed by existing pharmacological or phenotypic data. We prove this concept in vivo by confirming our prediction that activation of the serotonin 7 receptor (HTR7) is responsible for hyperesthesia in mice, which, in turn, can be prevented by a drug that selectively inhibits HTR7. Taken together, we show that a large fraction of complex drug side effects are mediated by individual proteins and create a reference for such relations. PMID:23632385

  17. Investigation of low-latitude hydrogen emission in terms of a two-component interstellar gas model

    NASA Technical Reports Server (NTRS)

    Baker, P. L.; Burton, W. B.

    1975-01-01

    High-resolution 21-cm hydrogen line observations at low galactic latitude are analyzed to determine the large-scale distribution of galactic hydrogen. Distribution parameters are found by model fitting, optical depth effects are computed using a two-component gas model suggested by the observations, and calculations are made for a one-component uniform spin-temperature gas model to show the systematic departures between this model and data obtained by incorrect treatment of the optical depth effects. Synthetic 21-cm line profiles are computed from the two-component model, and the large-scale trends of the observed emission profiles are reproduced together with the magnitude of the small-scale emission irregularities. Values are determined for the thickness of the galactic hydrogen disk between half density points, the total observed neutral hydrogen mass of the galaxy, and the central number density of the intercloud hydrogen atoms. It is shown that typical hydrogen clouds must be between 1 and 13 pc in diameter and that optical thinness exists on large-scale despite the presence of optically thin gas.

  18. Just enough inflation: power spectrum modifications at large scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cicoli, Michele; Downes, Sean; Dutta, Bhaskar

    2014-12-01

    We show that models of 'just enough' inflation, where the slow-roll evolution lasted only 50- 60 e-foldings, feature modifications of the CMB power spectrum at large angular scales. We perform a systematic analytic analysis in the limit of a sudden transition between any possible non-slow-roll background evolution and the final stage of slow-roll inflation. We find a high degree of universality since most common backgrounds like fast-roll evolution, matter or radiation-dominance give rise to a power loss at large angular scales and a peak together with an oscillatory behaviour at scales around the value of the Hubble parameter at themore » beginning of slow-roll inflation. Depending on the value of the equation of state parameter, different pre-inflationary epochs lead instead to an enhancement of power at low ℓ, and so seem disfavoured by recent observational hints for a lack of CMB power at ℓ∼< 40. We also comment on the importance of initial conditions and the possibility to have multiple pre-inflationary stages.« less

  19. Supernova explosions in magnetized, primordial dark matter haloes

    NASA Astrophysics Data System (ADS)

    Seifried, D.; Banerjee, R.; Schleicher, D.

    2014-05-01

    The first supernova explosions are potentially relevant sources for the production of the first large-scale magnetic fields. For this reason, we present a set of high-resolution simulations studying the effect of supernova explosions on magnetized, primordial haloes. We focus on the evolution of an initially small-scale magnetic field formed during the collapse of the halo. We vary the degree of magnetization, the halo mass, and the amount of explosion energy in order to account for expected variations as well as to infer systematical dependences of the results on initial conditions. Our simulations suggest that core collapse supernovae with an explosion energy of 1051 erg and more violent pair instability supernovae with 1053 erg are able to disrupt haloes with masses up to about 106 and 107 M⊙, respectively. The peak of the magnetic field spectra shows a continuous shift towards smaller k-values, i.e. larger length scales, over time reaching values as low as k = 4. On small scales, the magnetic energy decreases at the cost of the energy on large scales resulting in a well-ordered magnetic field with a strength up to ˜10-8 G depending on the initial conditions. The coherence length of the magnetic field inferred from the spectra reaches values up to 250 pc in agreement with those obtained from autocorrelation functions. We find the coherence length to be as large as 50 per cent of the radius of the supernova bubble. Extrapolating this relation to later stages, we suggest that significantly strong magnetic fields with coherence lengths as large as 1.5 kpc could be created. We discuss possible implications of our results on processes like recollapse of the halo, first galaxy formation, and the magnetization of the intergalactic medium.

  20. Beneficial Effects of Pre-operative Exercise Therapy in Patients with an Abdominal Aortic Aneurysm: A Systematic Review.

    PubMed

    Pouwels, S; Willigendael, E M; van Sambeek, M R H M; Nienhuijs, S W; Cuypers, P W M; Teijink, J A W

    2015-01-01

    The impact of post-operative complications in abdominal aortic aneurysm (AAA) surgery is substantial, and increases with age and concomitant co-morbidities. This systematic review focuses on the possible effects of pre-operative exercise therapy (PET) in patients with AAA on post-operative complications,aerobic capacity, physical fitness, and recovery. A systematic search on PET prior to AAA surgery was conducted. The methodological quality of the included studies was rated using the Physiotherapy Evidence Database scale. The agreement between the reviewers was assessed with Cohen's kappa. Five studies were included, with a methodological quality ranging from moderate to good. Cohen's kappa was 0.79. Three studies focused on patients with an AAA (without indication for surgical repair) with physical fitness as the outcome measure. One study focused on PET in patients awaiting AAA surgery and one study focused on the effects of PET on post-operative complications, length of stay, and recovery. PET has beneficial effects on various physical fitness variables of patients with an AAA. Whether this leads to less complications or faster recovery remains unclear. In view of the large impact of post-operative complications, it is valuable to explore the possible benefits of a PET program in AAA surgery.

  1. A scalable double-barcode sequencing platform for characterization of dynamic protein-protein interactions.

    PubMed

    Schlecht, Ulrich; Liu, Zhimin; Blundell, Jamie R; St Onge, Robert P; Levy, Sasha F

    2017-05-25

    Several large-scale efforts have systematically catalogued protein-protein interactions (PPIs) of a cell in a single environment. However, little is known about how the protein interactome changes across environmental perturbations. Current technologies, which assay one PPI at a time, are too low throughput to make it practical to study protein interactome dynamics. Here, we develop a highly parallel protein-protein interaction sequencing (PPiSeq) platform that uses a novel double barcoding system in conjunction with the dihydrofolate reductase protein-fragment complementation assay in Saccharomyces cerevisiae. PPiSeq detects PPIs at a rate that is on par with current assays and, in contrast with current methods, quantitatively scores PPIs with enough accuracy and sensitivity to detect changes across environments. Both PPI scoring and the bulk of strain construction can be performed with cell pools, making the assay scalable and easily reproduced across environments. PPiSeq is therefore a powerful new tool for large-scale investigations of dynamic PPIs.

  2. Dynamics of oxygen supply and consumption during mainstream large-scale composting in China.

    PubMed

    Zeng, Jianfei; Shen, Xiuli; Han, Lujia; Huang, Guangqun

    2016-11-01

    This study characterized some physicochemical and biological parameters to systematically evaluate the dynamics of oxygen supply and consumption during large-scale trough composting in China. The results showed that long active phases, low maximum temperatures, low organic matter losses and high pore methane concentrations were observed in different composting layers. Pore oxygen concentrations in the top, middle and bottom layers maintained <5vol.% for 40, 42 and 45days, respectively, which accounted for more than 89% of the whole period. After each mechanical turning, oxygen was consumed at a stable respiration rate to a concentration of 5vol.% in no more than 99min and remained anaerobic in the subsequent static condition. The daily percentage of time under aerobic condition was no more than 14% of a single day. Therefore, improving FAS, adjusting aeration interval or combining turning with forced aeration was suggested to provide sufficient oxygen during composting. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Control of fluxes in metabolic networks.

    PubMed

    Basler, Georg; Nikoloski, Zoran; Larhlimi, Abdelhalim; Barabási, Albert-László; Liu, Yang-Yu

    2016-07-01

    Understanding the control of large-scale metabolic networks is central to biology and medicine. However, existing approaches either require specifying a cellular objective or can only be used for small networks. We introduce new coupling types describing the relations between reaction activities, and develop an efficient computational framework, which does not require any cellular objective for systematic studies of large-scale metabolism. We identify the driver reactions facilitating control of 23 metabolic networks from all kingdoms of life. We find that unicellular organisms require a smaller degree of control than multicellular organisms. Driver reactions are under complex cellular regulation in Escherichia coli, indicating their preeminent role in facilitating cellular control. In human cancer cells, driver reactions play pivotal roles in malignancy and represent potential therapeutic targets. The developed framework helps us gain insights into regulatory principles of diseases and facilitates design of engineering strategies at the interface of gene regulation, signaling, and metabolism. © 2016 Basler et al.; Published by Cold Spring Harbor Laboratory Press.

  4. Mechanism of Arachidonic Acid Accumulation during Aging in Mortierella alpina: A Large-Scale Label-Free Comparative Proteomics Study.

    PubMed

    Yu, Yadong; Li, Tao; Wu, Na; Ren, Lujing; Jiang, Ling; Ji, Xiaojun; Huang, He

    2016-11-30

    Arachidonic acid (ARA) is an important polyunsaturated fatty acid having various beneficial physiological effects on the human body. The aging of Mortierella alpina has long been known to significantly improve ARA yield, but the exact mechanism is still elusive. Herein, multiple approaches including large-scale label-free comparative proteomics were employed to systematically investigate the mechanism mentioned above. Upon ultrastructural observation, abnormal mitochondria were found to aggregate around shrunken lipid droplets. Proteomics analysis revealed a total of 171 proteins with significant alterations of expression during aging. Pathway analysis suggested that reactive oxygen species (ROS) were accumulated and stimulated the activation of the malate/pyruvate cycle and isocitrate dehydrogenase, which might provide additional NADPH for ARA synthesis. EC 4.2.1.17-hydratase might be a key player in ARA accumulation during aging. These findings provide a valuable resource for efforts to further improve the ARA content in the oil produced by aging M. alpina.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pratapa, Phanisri P.; Suryanarayana, Phanish; Pask, John E.

    We present the Clenshaw–Curtis Spectral Quadrature (SQ) method for real-space O(N) Density Functional Theory (DFT) calculations. In this approach, all quantities of interest are expressed as bilinear forms or sums over bilinear forms, which are then approximated by spatially localized Clenshaw–Curtis quadrature rules. This technique is identically applicable to both insulating and metallic systems, and in conjunction with local reformulation of the electrostatics, enables the O(N) evaluation of the electronic density, energy, and atomic forces. The SQ approach also permits infinite-cell calculations without recourse to Brillouin zone integration or large supercells. We employ a finite difference representation in order tomore » exploit the locality of electronic interactions in real space, enable systematic convergence, and facilitate large-scale parallel implementation. In particular, we derive expressions for the electronic density, total energy, and atomic forces that can be evaluated in O(N) operations. We demonstrate the systematic convergence of energies and forces with respect to quadrature order as well as truncation radius to the exact diagonalization result. In addition, we show convergence with respect to mesh size to established O(N 3) planewave results. In conclusion, we establish the efficiency of the proposed approach for high temperature calculations and discuss its particular suitability for large-scale parallel computation.« less

  6. Spectral Quadrature method for accurate O ( N ) electronic structure calculations of metals and insulators

    DOE PAGES

    Pratapa, Phanisri P.; Suryanarayana, Phanish; Pask, John E.

    2015-12-02

    We present the Clenshaw–Curtis Spectral Quadrature (SQ) method for real-space O(N) Density Functional Theory (DFT) calculations. In this approach, all quantities of interest are expressed as bilinear forms or sums over bilinear forms, which are then approximated by spatially localized Clenshaw–Curtis quadrature rules. This technique is identically applicable to both insulating and metallic systems, and in conjunction with local reformulation of the electrostatics, enables the O(N) evaluation of the electronic density, energy, and atomic forces. The SQ approach also permits infinite-cell calculations without recourse to Brillouin zone integration or large supercells. We employ a finite difference representation in order tomore » exploit the locality of electronic interactions in real space, enable systematic convergence, and facilitate large-scale parallel implementation. In particular, we derive expressions for the electronic density, total energy, and atomic forces that can be evaluated in O(N) operations. We demonstrate the systematic convergence of energies and forces with respect to quadrature order as well as truncation radius to the exact diagonalization result. In addition, we show convergence with respect to mesh size to established O(N 3) planewave results. In conclusion, we establish the efficiency of the proposed approach for high temperature calculations and discuss its particular suitability for large-scale parallel computation.« less

  7. Evaluation of the synoptic and mesoscale predictive capabilities of a mesoscale atmospheric simulation system

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K.; Keyser, D. A.; Mccumber, M. C.

    1983-01-01

    The overall performance characteristics of a limited area, hydrostatic, fine (52 km) mesh, primitive equation, numerical weather prediction model are determined in anticipation of satellite data assimilations with the model. The synoptic and mesoscale predictive capabilities of version 2.0 of this model, the Mesoscale Atmospheric Simulation System (MASS 2.0), were evaluated. The two part study is based on a sample of approximately thirty 12h and 24h forecasts of atmospheric flow patterns during spring and early summer. The synoptic scale evaluation results benchmark the performance of MASS 2.0 against that of an operational, synoptic scale weather prediction model, the Limited area Fine Mesh (LFM). The large sample allows for the calculation of statistically significant measures of forecast accuracy and the determination of systematic model errors. The synoptic scale benchmark is required before unsmoothed mesoscale forecast fields can be seriously considered.

  8. Forest-fire model as a supercritical dynamic model in financial systems

    NASA Astrophysics Data System (ADS)

    Lee, Deokjae; Kim, Jae-Young; Lee, Jeho; Kahng, B.

    2015-02-01

    Recently large-scale cascading failures in complex systems have garnered substantial attention. Such extreme events have been treated as an integral part of self-organized criticality (SOC). Recent empirical work has suggested that some extreme events systematically deviate from the SOC paradigm, requiring a different theoretical framework. We shed additional theoretical light on this possibility by studying financial crisis. We build our model of financial crisis on the well-known forest fire model in scale-free networks. Our analysis shows a nontrivial scaling feature indicating supercritical behavior, which is independent of system size. Extreme events in the supercritical state result from bursting of a fat bubble, seeds of which are sown by a protracted period of a benign financial environment with few shocks. Our findings suggest that policymakers can control the magnitude of financial meltdowns by keeping the economy operating within reasonable duration of a benign environment.

  9. Computerised cognitive training in acquired brain injury: A systematic review of outcomes using the International Classification of Functioning (ICF).

    PubMed

    Sigmundsdottir, Linda; Longley, Wendy A; Tate, Robyn L

    2016-10-01

    Computerised cognitive training (CCT) is an increasingly popular intervention for people experiencing cognitive symptoms. This systematic review evaluated the evidence for CCT in adults with acquired brain injury (ABI), focusing on how outcome measures used reflect efficacy across components of the International Classification of Functioning, Disability and Health. Database searches were conducted of studies investigating CCT to treat cognitive symptoms in adult ABI. Scientific quality was rated using the PEDro-P and RoBiNT Scales. Ninety-six studies met the criteria. Most studies examined outcomes using measures of mental functions (93/96, 97%); fewer studies included measures of activities/participation (41/96, 43%) or body structures (8/96, 8%). Only 14 studies (15%) provided Level 1 evidence (randomised controlled trials with a PEDro-P score ≥ 6/10), with these studies suggesting strong evidence for CCT improving processing speed in multiple sclerosis (MS) and moderate evidence for improving memory in MS and brain tumour populations. There is a large body of research examining the efficacy of CCT, but relatively few Level 1 studies and evidence is largely limited to body function outcomes. The routine use of outcome measures of activities/participation would provide more meaningful evidence for the efficacy of CCT. The use of body structure outcome measures (e.g., neuroimaging) is a newly emerging area, with potential to increase understanding of mechanisms of action for CCT.

  10. Carboniferous climate teleconnections archived in coupled bioapatite δ18OPO4 and 87Sr/86Sr records from the epicontinental Donets Basin, Ukraine

    NASA Astrophysics Data System (ADS)

    Montañez, Isabel P.; Osleger, Dillon J.; Chen, Jitao; Wortham, Barbara E.; Stamm, Robert G.; Nemyrovska, Tamara I.; Griffin, Julie M.; Poletaev, Vladislav I.; Wardlaw, Bruce R.

    2018-06-01

    Reconstructions of paleo-seawater chemistry are largely inferred from biogenic records of epicontinental seas. Recent studies provide considerable evidence for large-scale spatial and temporal variability in the environmental dynamics of these semi-restricted seas that leads to the decoupling of epicontinental isotopic records from those of the open ocean. We present conodont apatite δ18OPO4 and 87Sr/86Sr records spanning 24 Myr of the late Mississippian through Pennsylvanian derived from the U-Pb calibrated cyclothemic succession of the Donets Basin, eastern Ukraine. On a 2 to 6 Myr-scale, systematic fluctuations in bioapatite δ18OPO4 and 87Sr/86Sr broadly follow major shifts in the Donets onlap-offlap history and inferred regional climate, but are distinct from contemporaneous more open-water δ18OPO4 and global seawater Sr isotope trends. A -1 to -6‰ offset in Donets δ18OPO4 values from those of more open-water conodonts and greater temporal variability in δ18OPO4 and 87Sr/86Sr records are interpreted to primarily record climatically driven changes in local environmental processes in the Donets sea. Systematic isotopic shifts associated with Myr-scale sea-level fluctuations, however, indicate an extrabasinal driver. We propose a mechanistic link to glacioeustasy through a teleconnection between high-latitude ice changes and atmospheric pCO2 and regional monsoonal circulation in the Donets region. Inferred large-magnitude changes in Donets seawater salinity and temperature, not archived in the more open-water or global contemporaneous records, indicate a modification of the global climate signal in the epicontinental sea through amplification or dampening of the climate signal by local and regional environmental processes. This finding of global climate change filtered through local processes has implications for the use of conodont δ18OPO4 and 87Sr/86Sr values as proxies of paleo-seawater composition, mean temperature, and glacioeustasy.

  11. Carboniferous climate teleconnections archived in coupled bioapatite δ18OPO4 and 87Sr/86Sr records from the epicontinental Donets Basin, Ukraine

    USGS Publications Warehouse

    Montanez, Isabel P.; Osleger, Dillon J.; Chen, J.-H.; Wortham, Barbara E.; Stamm, Robert G.; Nemyrovska, Tamara I.; Griffin, Julie M.; Poletaev, Vladislav I.; Wardlaw, Bruce R.

    2018-01-01

    Reconstructions of paleo-seawater chemistry are largely inferred from biogenic records of epicontinental seas. Recent studies provide considerable evidence for large-scale spatial and temporal variability in the environmental dynamics of these semi-restricted seas that leads to the decoupling of epicontinental isotopic records from those of the open ocean. We present conodont apatite δ18OPO4 and 87Sr/86Sr records spanning 24 Myr of the late Mississippian through Pennsylvanian derived from the U–Pb calibrated cyclothemic succession of the Donets Basin, eastern Ukraine. On a 2 to 6 Myr-scale, systematic fluctuations in bioapatite δ18OPO4 and 87Sr/86Sr broadly follow major shifts in the Donets onlap–offlap history and inferred regional climate, but are distinct from contemporaneous more open-water δ18OPO4 and global seawater Sr isotope trends. A −1 to −6‰ offset in Donets δ18OPO4 values from those of more open-water conodonts and greater temporal variability in δ18OPO4 and 87Sr/86Sr records are interpreted to primarily record climatically driven changes in local environmental processes in the Donets sea. Systematic isotopic shifts associated with Myr-scale sea-level fluctuations, however, indicate an extrabasinal driver. We propose a mechanistic link to glacioeustasy through a teleconnection between high-latitude ice changes and atmospheric pCO2 and regional monsoonal circulation in the Donets region. Inferred large-magnitude changes in Donets seawater salinity and temperature, not archived in the more open-water or global contemporaneous records, indicate a modification of the global climate signal in the epicontinental sea through amplification or dampening of the climate signal by local and regional environmental processes. This finding of global climate change filtered through local processes has implications for the use of conodont δ18OPO4 and 87Sr/86Sr values as proxies of paleo-seawater composition, mean temperature, and glacioeustasy.

  12. Observations of vertical winds and the origin of thermospheric gravity waves launched by auroral substorms and westward travelling surges

    NASA Technical Reports Server (NTRS)

    Rees, D.

    1986-01-01

    Several sequences of observations of strong vertical winds in the upper thermosphere are discussed, in conjunction with models of the generation of such winds. In the auroral oval, the strongest upward winds are observed in or close to regions of intense auroral precipitation and strong ionospheric currents. The strongest winds, of the order of 100 to 200 m/sec are usually upward, and are both localized and of relatively short duration (10 to 20 min). In regions adjacent to those displaying strong upward winds, and following periods of upward winds, downward winds of rather lower magnitude (40 to about 80 m/sec) may be observed. Strong and rapid changes of horizontal winds are correlated with these rapid vertical wind variations. Considered from a large scale viewpoint, this class of strongly time dependent winds propagate globally, and may be considered to be gravity waves launched from an auroral source. During periods of very disturbed geomagnetic activity, there may be regions within and close to the auroral oval where systematic vertical winds of the order of 50 m/sec will occur for periods of several hours. Such persistent winds are part of a very strong large scale horizontal wind circulation set up in the polar regions during a major geomagnetic disturbance. This second class of strong horizontal and vertical winds corresponds more to a standing wave than to a gravity wave, and it is not as effective as the first class in generating large scale propagating gravity waves and correlated horizontal and vertical oscillations. A third class of significant (10 to 30 m/sec) vertical winds can be associated with systematic features of the average geomagnetic energy and momentum input to the polar thermosphere, and appear in statistical studies of the average vertical wind as a function of Universal Time at a given location.

  13. North American Extreme Temperature Events and Related Large Scale Meteorological Patterns: A Review of Statistical Methods, Dynamics, Modeling, and Trends

    NASA Technical Reports Server (NTRS)

    Grotjahn, Richard; Black, Robert; Leung, Ruby; Wehner, Michael F.; Barlow, Mathew; Bosilovich, Michael G.; Gershunov, Alexander; Gutowski, William J., Jr.; Gyakum, John R.; Katz, Richard W.; hide

    2015-01-01

    The objective of this paper is to review statistical methods, dynamics, modeling efforts, and trends related to temperature extremes, with a focus upon extreme events of short duration that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). The statistics, dynamics, and modeling sections of this paper are written to be autonomous and so can be read separately. Methods to define extreme events statistics and to identify and connect LSMPs to extreme temperature events are presented. Recent advances in statistical techniques connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. Various LSMPs, ranging from synoptic to planetary scale structures, are associated with extreme temperature events. Current knowledge about the synoptics and the dynamical mechanisms leading to the associated LSMPs is incomplete. Systematic studies of: the physics of LSMP life cycles, comprehensive model assessment of LSMP-extreme temperature event linkages, and LSMP properties are needed. Generally, climate models capture observed properties of heat waves and cold air outbreaks with some fidelity. However they overestimate warm wave frequency and underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Modeling studies have identified the impact of large-scale circulation anomalies and landatmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs to more specifically understand the role of LSMPs on past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated. The paper concludes with unresolved issues and research questions.

  14. SQDFT: Spectral Quadrature method for large-scale parallel O ( N ) Kohn–Sham calculations at high temperature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suryanarayana, Phanish; Pratapa, Phanisri P.; Sharma, Abhiraj

    We present SQDFT: a large-scale parallel implementation of the Spectral Quadrature (SQ) method formore » $$\\mathscr{O}(N)$$ Kohn–Sham Density Functional Theory (DFT) calculations at high temperature. Specifically, we develop an efficient and scalable finite-difference implementation of the infinite-cell Clenshaw–Curtis SQ approach, in which results for the infinite crystal are obtained by expressing quantities of interest as bilinear forms or sums of bilinear forms, that are then approximated by spatially localized Clenshaw–Curtis quadrature rules. We demonstrate the accuracy of SQDFT by showing systematic convergence of energies and atomic forces with respect to SQ parameters to reference diagonalization results, and convergence with discretization to established planewave results, for both metallic and insulating systems. Here, we further demonstrate that SQDFT achieves excellent strong and weak parallel scaling on computer systems consisting of tens of thousands of processors, with near perfect $$\\mathscr{O}(N)$$ scaling with system size and wall times as low as a few seconds per self-consistent field iteration. Finally, we verify the accuracy of SQDFT in large-scale quantum molecular dynamics simulations of aluminum at high temperature.« less

  15. Time-sliced perturbation theory for large scale structure I: general formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blas, Diego; Garny, Mathias; Sibiryakov, Sergey

    2016-07-01

    We present a new analytic approach to describe large scale structure formation in the mildly non-linear regime. The central object of the method is the time-dependent probability distribution function generating correlators of the cosmological observables at a given moment of time. Expanding the distribution function around the Gaussian weight we formulate a perturbative technique to calculate non-linear corrections to cosmological correlators, similar to the diagrammatic expansion in a three-dimensional Euclidean quantum field theory, with time playing the role of an external parameter. For the physically relevant case of cold dark matter in an Einstein-de Sitter universe, the time evolution ofmore » the distribution function can be found exactly and is encapsulated by a time-dependent coupling constant controlling the perturbative expansion. We show that all building blocks of the expansion are free from spurious infrared enhanced contributions that plague the standard cosmological perturbation theory. This paves the way towards the systematic resummation of infrared effects in large scale structure formation. We also argue that the approach proposed here provides a natural framework to account for the influence of short-scale dynamics on larger scales along the lines of effective field theory.« less

  16. Cosmological measurements with general relativistic galaxy correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raccanelli, Alvise; Montanari, Francesco; Durrer, Ruth

    We investigate the cosmological dependence and the constraining power of large-scale galaxy correlations, including all redshift-distortions, wide-angle, lensing and gravitational potential effects on linear scales. We analyze the cosmological information present in the lensing convergence and in the gravitational potential terms describing the so-called ''relativistic effects'', and we find that, while smaller than the information contained in intrinsic galaxy clustering, it is not negligible. We investigate how neglecting them does bias cosmological measurements performed by future spectroscopic and photometric large-scale surveys such as SKA and Euclid. We perform a Fisher analysis using the CLASS code, modified to include scale-dependent galaxymore » bias and redshift-dependent magnification and evolution bias. Our results show that neglecting relativistic terms, especially lensing convergence, introduces an error in the forecasted precision in measuring cosmological parameters of the order of a few tens of percent, in particular when measuring the matter content of the Universe and primordial non-Gaussianity parameters. The analysis suggests a possible substantial systematic error in cosmological parameter constraints. Therefore, we argue that radial correlations and integrated relativistic terms need to be taken into account when forecasting the constraining power of future large-scale number counts of galaxy surveys.« less

  17. SQDFT: Spectral Quadrature method for large-scale parallel O ( N ) Kohn–Sham calculations at high temperature

    DOE PAGES

    Suryanarayana, Phanish; Pratapa, Phanisri P.; Sharma, Abhiraj; ...

    2017-12-07

    We present SQDFT: a large-scale parallel implementation of the Spectral Quadrature (SQ) method formore » $$\\mathscr{O}(N)$$ Kohn–Sham Density Functional Theory (DFT) calculations at high temperature. Specifically, we develop an efficient and scalable finite-difference implementation of the infinite-cell Clenshaw–Curtis SQ approach, in which results for the infinite crystal are obtained by expressing quantities of interest as bilinear forms or sums of bilinear forms, that are then approximated by spatially localized Clenshaw–Curtis quadrature rules. We demonstrate the accuracy of SQDFT by showing systematic convergence of energies and atomic forces with respect to SQ parameters to reference diagonalization results, and convergence with discretization to established planewave results, for both metallic and insulating systems. Here, we further demonstrate that SQDFT achieves excellent strong and weak parallel scaling on computer systems consisting of tens of thousands of processors, with near perfect $$\\mathscr{O}(N)$$ scaling with system size and wall times as low as a few seconds per self-consistent field iteration. Finally, we verify the accuracy of SQDFT in large-scale quantum molecular dynamics simulations of aluminum at high temperature.« less

  18. A systematic review of systematic reviews on interventions for caregivers of people with chronic conditions.

    PubMed

    Corry, Margarita; While, Alison; Neenan, Kathleen; Smith, Valerie

    2015-04-01

    To evaluate the effectiveness of interventions to support caregivers of people with selected chronic conditions. Informal caregivers provide millions of care hours each week contributing to significant healthcare savings. Despite much research evaluating a range of interventions for caregivers, their impact remains unclear. A systematic review of systematic reviews of interventions to support caregivers of people with selected chronic conditions. The electronic databases of PubMed, CINAHL, British Nursing Index, PsycINFO, Social Science Index (January 1990-May 2014) and The Cochrane Library (Issue 6, June 2014), were searched using Medical Subject Heading and index term combinations of the keywords caregiver, systematic review, intervention and named chronic conditions. Papers were included if they reported a systematic review of interventions for caregivers of people with chronic conditions. The methodological quality of the included reviews was independently assessed by two reviewers using R-AMSTAR. Data were independently extracted by two reviewers using a pre-designed data extraction form. Narrative synthesis of review findings was used to present the results. Eight systematic reviews were included. There was evidence that education and support programme interventions improved caregiver quality of life. Information-giving interventions improved caregiver knowledge for stroke caregivers. Education, support and information-giving interventions warrant further investigation across caregiver groups. A large-scale funded programme for caregiver research is required to ensure that studies are of high quality to inform service development across settings. © 2014 John Wiley & Sons Ltd.

  19. Large-scale imputation of epigenomic datasets for systematic annotation of diverse human tissues.

    PubMed

    Ernst, Jason; Kellis, Manolis

    2015-04-01

    With hundreds of epigenomic maps, the opportunity arises to exploit the correlated nature of epigenetic signals, across both marks and samples, for large-scale prediction of additional datasets. Here, we undertake epigenome imputation by leveraging such correlations through an ensemble of regression trees. We impute 4,315 high-resolution signal maps, of which 26% are also experimentally observed. Imputed signal tracks show overall similarity to observed signals and surpass experimental datasets in consistency, recovery of gene annotations and enrichment for disease-associated variants. We use the imputed data to detect low-quality experimental datasets, to find genomic sites with unexpected epigenomic signals, to define high-priority marks for new experiments and to delineate chromatin states in 127 reference epigenomes spanning diverse tissues and cell types. Our imputed datasets provide the most comprehensive human regulatory region annotation to date, and our approach and the ChromImpute software constitute a useful complement to large-scale experimental mapping of epigenomic information.

  20. Method for revealing biases in precision mass measurements

    NASA Astrophysics Data System (ADS)

    Vabson, V.; Vendt, R.; Kübarsepp, T.; Noorma, M.

    2013-02-01

    A practical method for the quantification of systematic errors of large-scale automatic comparators is presented. This method is based on a comparison of the performance of two different comparators. First, the differences of 16 equal partial loads of 1 kg are measured with a high-resolution mass comparator featuring insignificant bias and 1 kg maximum load. At the second stage, a large-scale comparator is tested by using combined loads with known mass differences. Comparing the different results, the biases of any comparator can be easily revealed. These large-scale comparator biases are determined over a 16-month period, and for the 1 kg loads, a typical pattern of biases in the range of ±0.4 mg is observed. The temperature differences recorded inside the comparator concurrently with mass measurements are found to remain within a range of ±30 mK, which obviously has a minor effect on the detected biases. Seasonal variations imply that the biases likely arise mainly due to the functioning of the environmental control at the measurement location.

  1. Integrated water and renewable energy management: the Acheloos-Peneios region case study

    NASA Astrophysics Data System (ADS)

    Koukouvinos, Antonios; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Tegos, Aristotelis; Rozos, Evangelos; Papalexiou, Simon-Michael; Dimitriadis, Panayiotis; Markonis, Yiannis; Kossieris, Panayiotis; Tyralis, Christos; Karakatsanis, Georgios; Tzouka, Katerina; Christofides, Antonis; Karavokiros, George; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Within the ongoing research project "Combined Renewable Systems for Sustainable Energy Development" (CRESSENDO), we have developed a novel stochastic simulation framework for optimal planning and management of large-scale hybrid renewable energy systems, in which hydropower plays the dominant role. The methodology and associated computer tools are tested in two major adjacent river basins in Greece (Acheloos, Peneios) extending over 15 500 km2 (12% of Greek territory). River Acheloos is characterized by very high runoff and holds ~40% of the installed hydropower capacity of Greece. On the other hand, the Thessaly plain drained by Peneios - a key agricultural region for the national economy - usually suffers from water scarcity and systematic environmental degradation. The two basins are interconnected through diversion projects, existing and planned, thus formulating a unique large-scale hydrosystem whose future has been the subject of a great controversy. The study area is viewed as a hypothetically closed, energy-autonomous, system, in order to evaluate the perspectives for sustainable development of its water and energy resources. In this context we seek an efficient configuration of the necessary hydraulic and renewable energy projects through integrated modelling of the water and energy balance. We investigate several scenarios of energy demand for domestic, industrial and agricultural use, assuming that part of the demand is fulfilled via wind and solar energy, while the excess or deficit of energy is regulated through large hydroelectric works that are equipped with pumping storage facilities. The overall goal is to examine under which conditions a fully renewable energy system can be technically and economically viable for such large spatial scale.

  2. Spatial heterogeneity in zooplankton summer distribution in the eastern Chukchi Sea in 2012-2013 as a result of large-scale interactions of water masses

    NASA Astrophysics Data System (ADS)

    Pinchuk, Alexei I.; Eisner, Lisa B.

    2017-01-01

    Interest in the Arctic shelf ecosystems has increased in recent years as the climate has rapidly warmed and sea ice declined. These changing conditions prompted the broad-scale multidisciplinary Arctic Ecosystem integrated survey (Arctic Eis) aimed at systematic, comparative analyses of interannual variability of the shelf ecosystem. In this study, we compared zooplankton composition and geographical distribution in relation to water properties on the eastern Chukchi and northern Bering Sea shelves during the summers of 2012 and 2013. In 2012, waters of Pacific origin prevailed over the study area carrying expatriate oceanic species (e.g. copepods Neocalanus spp., Eucalanus bungii) from the Bering Sea outer shelf well onto the northeastern Chukchi shelf. In contrast, in 2013, zooplankton of Pacific origin was mainly distributed over the southern Chukchi shelf, suggesting a change of advection pathways into the Arctic. These changes also manifested in the emergence of large lipid-rich Arctic zooplankton (e.g. Calanus hyperboreus) on the northeastern Chukchi shelf in 2013. The predominant copepod Calanus glacialis was composed of two distinct populations originating from the Bering Sea and from the Arctic, with the Arctic population expanding over a broader range in 2013. The observed interannual variability in zooplankton distribution on the Chukchi Sea shelf may be explained by previously described systematic oceanographic patterns derived from long-term observations. Variability in oceanic circulation and related zooplankton distributions (e.g. changes in southwestward advection of C. hyperboreus) may impact keystone predators such as Arctic Cod (Boreogadus saida) that feed on energy-rich zooplankton.

  3. Basic Scale on Insomnia complaints and Quality of Sleep (BaSIQS): reliability, initial validity and normative scores in higher education students.

    PubMed

    Allen Gomes, Ana; Ruivo Marques, Daniel; Meia-Via, Ana Maria; Meia-Via, Mariana; Tavares, José; Fernandes da Silva, Carlos; Pinto de Azevedo, Maria Helena

    2015-04-01

    Based on successive samples totaling more than 5000 higher education students, we scrutinized the reliability, structure, initial validity and normative scores of a brief self-report seven-item scale to screen for the continuum of nighttime insomnia complaints/perceived sleep quality, used by our team for more than a decade, henceforth labeled the Basic Scale on Insomnia complaints and Quality of Sleep (BaSIQS). In study/sample 1 (n = 1654), the items were developed based on part of a larger survey on higher education sleep-wake patterns. The test-retest study was conducted in an independent small group (n = 33) with a 2-8 week gap. In study/sample 2 (n = 360), focused mainly on validity, the BaSIQS was completed together with the Pittsburgh Sleep Quality Index (PSQI). In study 3, a large recent sample of students from universities all over the country (n = 2995) answered the BaSIQS items, based on which normative scores were determined, and an additional question on perceived sleep problems in order to further analyze the scale's validity. Regarding reliability, Cronbach alpha coefficients were systematically higher than 0.7, and the test-retest correlation coefficient was greater than 0.8. Structure analyses revealed consistently satisfactory two-factor and single-factor solutions. Concerning validity analyses, BaSIQS scores were significantly correlated with PSQI component scores and overall score (r = 0.652 corresponding to a large association); mean scores were significantly higher in those students classifying themselves as having sleep problems (p < 0.0001, d = 0.99 corresponding to a large effect size). In conclusion, the BaSIQS is very easy to administer, and appears to be a reliable and valid scale in higher education students. It might be a convenient short tool in research and applied settings to rapidly assess sleep quality or screen for insomnia complaints, and it may be easily used in other populations with minor adaptations.

  4. Reconstruction of halo power spectrum from redshift-space galaxy distribution: cylinder-grouping method and halo exclusion effect

    NASA Astrophysics Data System (ADS)

    Okumura, Teppei; Takada, Masahiro; More, Surhud; Masaki, Shogo

    2017-07-01

    The peculiar velocity field measured by redshift-space distortions (RSD) in galaxy surveys provides a unique probe of the growth of large-scale structure. However, systematic effects arise when including satellite galaxies in the clustering analysis. Since satellite galaxies tend to reside in massive haloes with a greater halo bias, the inclusion boosts the clustering power. In addition, virial motions of the satellite galaxies cause a significant suppression of the clustering power due to non-linear RSD effects. We develop a novel method to recover the redshift-space power spectrum of haloes from the observed galaxy distribution by minimizing the contamination of satellite galaxies. The cylinder-grouping method (CGM) we study effectively excludes satellite galaxies from a galaxy sample. However, we find that this technique produces apparent anisotropies in the reconstructed halo distribution over all the scales which mimic RSD. On small scales, the apparent anisotropic clustering is caused by exclusion of haloes within the anisotropic cylinder used by the CGM. On large scales, the misidentification of different haloes in the large-scale structures, aligned along the line of sight, into the same CGM group causes the apparent anisotropic clustering via their cross-correlation with the CGM haloes. We construct an empirical model for the CGM halo power spectrum, which includes correction terms derived using the CGM window function at small scales as well as the linear matter power spectrum multiplied by a simple anisotropic function at large scales. We apply this model to a mock galaxy catalogue at z = 0.5, designed to resemble Sloan Digital Sky Survey-III Baryon Oscillation Spectroscopic Survey (BOSS) CMASS galaxies, and find that our model can predict both the monopole and quadrupole power spectra of the host haloes up to k < 0.5 {{h Mpc^{-1}}} to within 5 per cent.

  5. Investigation of rock samples by neutron diffraction and ultrasonic sounding

    NASA Astrophysics Data System (ADS)

    Burilichev, D. E.; Ivankina, T. I.; Klima, K.; Locajicek, T.; Nikitin, A. N.; Pros, Z.

    2000-03-01

    The interpretation of large-scale geophysical anisotropies largely depends upon the knowledge of rock anisotropies of any kind (compositions, foliations, grain shape, physical properties). Almost all physical rock properties (e.g. elastic, thermal, magnetic properties) are related to the textures of the rock constituents since they are anisotropic for the single crystal. Although anisotropy determinations are numerous, systematic investigations are scarce. Therefore, several rock samples with different microfabrics were selected for texture analysis and to determine its P-wave distributions at various confining pressures.

  6. Hierarchical coarse-graining strategy for protein-membrane systems to access mesoscopic scales

    PubMed Central

    Ayton, Gary S.; Lyman, Edward

    2014-01-01

    An overall multiscale simulation strategy for large scale coarse-grain simulations of membrane protein systems is presented. The protein is modeled as a heterogeneous elastic network, while the lipids are modeled using the hybrid analytic-systematic (HAS) methodology, where in both cases atomistic level information obtained from molecular dynamics simulation is used to parameterize the model. A feature of this approach is that from the outset liposome length scales are employed in the simulation (i.e., on the order of ½ a million lipids plus protein). A route to develop highly coarse-grained models from molecular-scale information is proposed and results for N-BAR domain protein remodeling of a liposome are presented. PMID:20158037

  7. Effect of inventory method on niche models: random versus systematic error

    Treesearch

    Heather E. Lintz; Andrew N. Gray; Bruce McCune

    2013-01-01

    Data from large-scale biological inventories are essential for understanding and managing Earth's ecosystems. The Forest Inventory and Analysis Program (FIA) of the U.S. Forest Service is the largest biological inventory in North America; however, the FIA inventory recently changed from an amalgam of different approaches to a nationally-standardized approach in...

  8. What Works to Improve Reading Outcomes in Latin-America? A Systematic Review of the Evidence

    ERIC Educational Resources Information Center

    de Hoop, Thomas; Klochikin, Evgeny; Stone, Rebecca

    2016-01-01

    Improvements in students' learning achievement have lagged behind in low-and middle-income countries despite significant progress in school enrollment numbers. Large-scale early grade reading assessments (e.g., "Annual Status of Education Report" [ASER], 2013; EdData II, n.d.) have shown low reading rates and worryingly high…

  9. Guide for preparing active solar heating systems operation and maintenance manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    This book presents a systematic and standardized approach to the preparation of operation and maintenance manuals for active solar heating systems. Provides an industry consensus of the best operating and maintenance procedures for large commercial-scale solar service water and space heating systems. A sample O M manual is included. 3-ring binder included.

  10. Structural similitude and design of scaled down laminated models

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Rezaeepazhand, J.

    1993-01-01

    The excellent mechanical properties of laminated composite structures make them prime candidates for wide variety of applications in aerospace, mechanical and other branches of engineering. The enormous design flexibility of advanced composites is obtained at the cost of large number of design parameters. Due to complexity of the systems and lack of complete design based informations, designers tend to be conservative in their design. Furthermore, any new design is extensively evaluated experimentally until it achieves the necessary reliability, performance and safety. However, the experimental evaluation of composite structures are costly and time consuming. Consequently, it is extremely useful if a full-scale structure can be replaced by a similar scaled-down model which is much easier to work with. Furthermore, a dramatic reduction in cost and time can be achieved, if available experimental data of a specific structure can be used to predict the behavior of a group of similar systems. This study investigates problems associated with the design of scaled models. Such study is important since it provides the necessary scaling laws, and the factors which affect the accuracy of the scale models. Similitude theory is employed to develop the necessary similarity conditions (scaling laws). Scaling laws provide relationship between a full-scale structure and its scale model, and can be used to extrapolate the experimental data of a small, inexpensive, and testable model into design information for a large prototype. Due to large number of design parameters, the identification of the principal scaling laws by conventional method (dimensional analysis) is tedious. Similitude theory based on governing equations of the structural system is more direct and simpler in execution. The difficulty of making completely similar scale models often leads to accept certain type of distortion from exact duplication of the prototype (partial similarity). Both complete and partial similarity are discussed. The procedure consists of systematically observing the effect of each parameter and corresponding scaling laws. Then acceptable intervals and limitations for these parameters and scaling laws are discussed. In each case, a set of valid scaling factors and corresponding response scaling laws that accurately predict the response of prototypes from experimental models is introduced. The examples used include rectangular laminated plates under destabilizing loads, applied individually, vibrational characteristics of same plates, as well as cylindrical bending of beam-plates.

  11. From cosmology to cold atoms: observation of Sakharov oscillations in a quenched atomic superfluid.

    PubMed

    Hung, Chen-Lung; Gurarie, Victor; Chin, Cheng

    2013-09-13

    Predicting the dynamics of many-body systems far from equilibrium is a challenging theoretical problem. A long-predicted phenomenon in hydrodynamic nonequilibrium systems is the occurrence of Sakharov oscillations, which manifest in the anisotropy of the cosmic microwave background and the large-scale correlations of galaxies. Here, we report the observation of Sakharov oscillations in the density fluctuations of a quenched atomic superfluid through a systematic study in both space and time domains and with tunable interaction strengths. Our work suggests a different approach to the study of nonequilibrium dynamics of quantum many-body systems and the exploration of their analogs in cosmology and astrophysics.

  12. Lack of genetic association between TREM2 and Alzheimer's disease in East Asian population: a systematic review and meta-analysis.

    PubMed

    Huang, Man; Wang, Dejun; Xu, Zhijun; Xu, Yongshan; Xu, Xiaoping; Ma, Yuefeng; Xia, Zheng

    2015-09-01

    Large-scale genome-wide association studies have identified TREM2 variants to be significantly associated with Alzheimer's disease (AD) in caucasian population. The goal of this systematic study and meta-analysis was to assess the association between Triggering receptor expressed on myeloid cells 2 (TREM2) variants and AD in East Asian population. In this study, literatures were searched in PubMed, MEDLINE, EMBASE, and the Cochrane library to screen citations from January 1990 to June 2014. Data analysis was done by using the Stata 12 software. Twelve studies were considered for analysis. A total of 13 535 patients with AD and 22 976 healthy controls were studied. The results showed that rs75932628 variant was significantly associated with AD in caucasian population (P < .001, odds ratio ¼ 3.17, 95% confidence interval 2.45-4.09). However, the association was not found in East Asian population. In our study, we found that TREM2 variant is likely not associated with AD in East Asian population.

  13. Scaling analysis of cloud and rain water in marine stratocumulus and implications for scale-aware microphysical parameterizations

    NASA Astrophysics Data System (ADS)

    Witte, M.; Morrison, H.; Jensen, J. B.; Bansemer, A.; Gettelman, A.

    2017-12-01

    The spatial covariance of cloud and rain water (or in simpler terms, small and large drops, respectively) is an important quantity for accurate prediction of the accretion rate in bulk microphysical parameterizations that account for subgrid variability using assumed probability density functions (pdfs). Past diagnoses of this covariance from remote sensing, in situ measurements and large eddy simulation output have implicitly assumed that the magnitude of the covariance is insensitive to grain size (i.e. horizontal resolution) and averaging length, but this is not the case because both cloud and rain water exhibit scale invariance across a wide range of scales - from tens of centimeters to tens of kilometers in the case of cloud water, a range that we will show is primarily limited by instrumentation and sampling issues. Since the individual variances systematically vary as a function of spatial scale, it should be expected that the covariance follows a similar relationship. In this study, we quantify the scaling properties of cloud and rain water content and their covariability from high frequency in situ aircraft measurements of marine stratocumulus taken over the southeastern Pacific Ocean aboard the NSF/NCAR C-130 during the VOCALS-REx field experiment of October-November 2008. First we confirm that cloud and rain water scale in distinct manners, indicating that there is a statistically and potentially physically significant difference in the spatial structure of the two fields. Next, we demonstrate that the covariance is a strong function of spatial scale, which implies important caveats regarding the ability of limited-area models with domains smaller than a few tens of kilometers across to accurately reproduce the spatial organization of precipitation. Finally, we present preliminary work on the development of a scale-aware parameterization of cloud-rain water subgrid covariability based in multifractal analysis intended for application in large-scale model microphysics schemes.

  14. A proposed method to investigate reliability throughout a questionnaire.

    PubMed

    Wentzel-Larsen, Tore; Norekvål, Tone M; Ulvik, Bjørg; Nygård, Ottar; Pripp, Are H

    2011-10-05

    Questionnaires are used extensively in medical and health care research and depend on validity and reliability. However, participants may differ in interest and awareness throughout long questionnaires, which can affect reliability of their answers. A method is proposed for "screening" of systematic change in random error, which could assess changed reliability of answers. A simulation study was conducted to explore whether systematic change in reliability, expressed as changed random error, could be assessed using unsupervised classification of subjects by cluster analysis (CA) and estimation of intraclass correlation coefficient (ICC). The method was also applied on a clinical dataset from 753 cardiac patients using the Jalowiec Coping Scale. The simulation study showed a relationship between the systematic change in random error throughout a questionnaire and the slope between the estimated ICC for subjects classified by CA and successive items in a questionnaire. This slope was proposed as an awareness measure--to assessing if respondents provide only a random answer or one based on a substantial cognitive effort. Scales from different factor structures of Jalowiec Coping Scale had different effect on this awareness measure. Even though assumptions in the simulation study might be limited compared to real datasets, the approach is promising for assessing systematic change in reliability throughout long questionnaires. Results from a clinical dataset indicated that the awareness measure differed between scales.

  15. Measurement Scales of Suicidal Ideation and Attitudes: A Systematic Review Article

    PubMed Central

    Ghasemi, Parvin; Shaghaghi, Abdolreza; Allahverdipour, Hamid

    2015-01-01

    Background: The main aim of this study was to accumulate research evidence that introduce validated scales to measure suicidal attitudes and ideation and provide an empirical framework for adopting a relevant assessment tool in studies on suicide and suicidal behaviors. Methods: Medical Subject Headings’ (MeSH) terms were used to search Ovid Medline, PROQUEST, Wiley online library, Science Direct and PubMed for the published articles in English that reported application of an scale to measure suicidal attitudes and ideation from January 1974 onward. Results: Fourteen suicidal attitude scale and 15 scales for assessing suicidal ideation were identified in this systematic review. No gold standard approach was recognized to study suicide related attitudes and ideations. Conclusion: Special focus on generally agreed dimensions of suicidal ideation and attitudes and cross-cultural validation of the introduced scales to be applicable in different ethnic and socially diverse populations could be a promising area of research for scholars. PMID:26634193

  16. Progressive Mid-latitude Afforestation: Local and Remote Climate Impacts in the Framework of Two Coupled Earth System Models

    NASA Astrophysics Data System (ADS)

    Lague, Marysa

    Vegetation influences the atmosphere in complex and non-linear ways, such that large-scale changes in vegetation cover can drive changes in climate on both local and global scales. Large-scale land surface changes have been shown to introduce excess energy to one hemisphere, causing a shift in atmospheric circulation on a global scale. However, past work has not quantified how the climate response scales with the area of vegetation. Here, we systematically evaluate the response of climate to linearly increasing the area of forest cover over the northern mid-latitudes. We show that the magnitude of afforestation of the northern mid-latitudes determines the climate response in a non-linear fashion, and identify a threshold in vegetation-induced cloud feedbacks - a concept not previously addressed by large-scale vegetation manipulation experiments. Small increases in tree cover drive compensating cloud feedbacks, while latent heat fluxes reach a threshold after sufficiently large increases in tree cover, causing the troposphere to warm and dry, subsequently reducing cloud cover. Increased absorption of solar radiation at the surface is driven by both surface albedo changes and cloud feedbacks. We identify how vegetation-induced changes in cloud cover further feedback on changes in the global energy balance. We also show how atmospheric cross-equatorial energy transport changes as the area of afforestation is incrementally increased (a relationship which has not previously been demonstrated). This work demonstrates that while some climate effects (such as energy transport) of large scale mid-latitude afforestation scale roughly linearly across a wide range of afforestation areas, others (such as the local partitioning of the surface energy budget) are non-linear, and sensitive to the particular magnitude of mid-latitude forcing. Our results highlight the importance of considering both local and remote climate responses to large-scale vegetation change, and explore the scaling relationship between changes in vegetation cover and the resulting climate impacts.

  17. From Fibrils to Toughness: Multi-Scale Mechanics of Fibrillating Interfaces in Stretchable Electronics

    PubMed Central

    van der Sluis, Olaf; Vossen, Bart; Geers, Marc

    2018-01-01

    Metal-elastomer interfacial systems, often encountered in stretchable electronics, demonstrate remarkably high interface fracture toughness values. Evidently, a large gap exists between the rather small adhesion energy levels at the microscopic scale (‘intrinsic adhesion’) and the large measured macroscopic work-of-separation. This energy gap is closed here by unravelling the underlying dissipative mechanisms through a systematic numerical/experimental multi-scale approach. This self-containing contribution collects and reviews previously published results and addresses the remaining open questions by providing new and independent results obtained from an alternative experimental set-up. In particular, the experimental studies on Cu-PDMS (Poly(dimethylsiloxane)) samples conclusively reveal the essential role of fibrillation mechanisms at the micro-meter scale during the metal-elastomer delamination process. The micro-scale numerical analyses on single and multiple fibrils show that the dynamic release of the stored elastic energy by multiple fibril fracture, including the interaction with the adjacent deforming bulk PDMS and its highly nonlinear behaviour, provide a mechanistic understanding of the high work-of-separation. An experimentally validated quantitative relation between the macroscopic work-of-separation and peel front height is established from the simulation results. Finally, it is shown that a micro-mechanically motivated shape of the traction-separation law in cohesive zone models is essential to describe the delamination process in fibrillating metal-elastomer systems in a physically meaningful way. PMID:29393908

  18. Towards national-scale greenhouse gas emissions evaluation with robust uncertainty estimates

    NASA Astrophysics Data System (ADS)

    Rigby, Matthew; Swallow, Ben; Lunt, Mark; Manning, Alistair; Ganesan, Anita; Stavert, Ann; Stanley, Kieran; O'Doherty, Simon

    2016-04-01

    Through the Deriving Emissions related to Climate Change (DECC) network and the Greenhouse gAs Uk and Global Emissions (GAUGE) programme, the UK's greenhouse gases are now monitored by instruments mounted on telecommunications towers and churches, on a ferry that performs regular transects of the North Sea, on-board a research aircraft and from space. When combined with information from high-resolution chemical transport models such as the Met Office Numerical Atmospheric dispersion Modelling Environment (NAME), these measurements are allowing us to evaluate emissions more accurately than has previously been possible. However, it has long been appreciated that current methods for quantifying fluxes using atmospheric data suffer from uncertainties, primarily relating to the chemical transport model, that have been largely ignored to date. Here, we use novel model reduction techniques for quantifying the influence of a set of potential systematic model errors on the outcome of a national-scale inversion. This new technique has been incorporated into a hierarchical Bayesian framework, which can be shown to reduce the influence of subjective choices on the outcome of inverse modelling studies. Using estimates of the UK's methane emissions derived from DECC and GAUGE tall-tower measurements as a case study, we will show that such model systematic errors have the potential to significantly increase the uncertainty on national-scale emissions estimates. Therefore, we conclude that these factors must be incorporated in national emissions evaluation efforts, if they are to be credible.

  19. Chandra Early Type Galaxy Atals

    NASA Astrophysics Data System (ADS)

    Kim, Dong-Woo; Anderson, Craig; Burke, Douglas J.; Fabbiano, Giuseppina; Fruscione, Antonella; Lauer, Jennifer; McCollough, Michael; Morgan, Douglas; Mossman, Amy; O'Sullivan, Ewan; Paggi, Alessandro; Vrtilek, Saeqa Dil; Trinchieri, Ginevra

    2017-08-01

    The hot gas in early type galaxies (ETGs) plays a crucial role in understanding their formation and evolution. As the hot gas is often extended to the outskirts beyond the optical size, the large scale structural features identified by Chandra (including jets, cavities, cold fronts, filaments and tails) point to key evolutionary mechanisms, e.g., AGN feedback, merging history, accretion, stripping and star formation and its quenching. We have systematically analyzed the archival Chandra data of ~100 ETGs to study the hot ISM. We produce the uniformly derived data products with spatially resolved spectral information and will make them accessible via a public web site. With 2D spectral infomation, we further discuss gas morphology, scaling relations, X-ray based mass profiles and their implications related to various physical mechanisms (e.g., stellar and AGN feedback).

  20. [Silhouette scales and body satisfaction in adolescents: a systematic literature review].

    PubMed

    Côrtes, Marcela Guimarães; Meireles, Adriana Lúcia; Friche, Amélia Augusta de Lima; Caiaffa, Waleska Teixeira; Xavier, César Coelho

    2013-03-01

    The purpose of this study was to summarize studies on adolescents' body satisfaction, focusing on the use of silhouette scales. A systematic review was carried out on MEDLINE, LILACS, SciELO, and in unpublished papers. The final analysis included 36 studies. The majority adopted the scale proposed by Stunkard et al., self-administered, presented in ascending order and on a single sheet of paper. Most studies compared characteristics on satisfaction and dissatisfaction, used the chi-square test, and did not test for confounding. Among 18 studies included in the meta-analysis, prevalence of body dissatisfaction ranged from 32.2% to 83%. The review showed wide heterogeneity between studies (p-value = 0.000; I(2) = 87.39) even after sub-group analysis and the absence of relevant information for proper comparison of studies. The article concludes by recommending greater rigor in application of the scales and presentation of study methods on body satisfaction assessed by silhouette scales, in addition to new methodological studies and those that elucidate factors related to body satisfaction.

  1. Building work engagement: A systematic review and meta‐analysis investigating the effectiveness of work engagement interventions

    PubMed Central

    Patterson, Malcolm; Dawson, Jeremy

    2016-01-01

    Summary Low work engagement may contribute towards decreased well‐being and work performance. Evaluating, boosting and sustaining work engagement are therefore of interest to many organisations. However, the evidence on which to base interventions has not yet been synthesised. A systematic review with meta‐analysis was conducted to assess the evidence for the effectiveness of work engagement interventions. A systematic literature search identified controlled workplace interventions employing a validated measure of work engagement. Most used the Utrecht Work Engagement Scale (UWES). Studies containing the relevant quantitative data underwent random‐effects meta‐analyses. Results were assessed for homogeneity, systematic sampling error, publication bias and quality. Twenty studies met the inclusion criteria and were categorised into four types of interventions: (i) personal resource building; (ii) job resource building; (iii) leadership training; and (iv) health promotion. The overall effect on work engagement was small, but positive, k = 14, Hedges g = 0.29, 95%‐CI = 0.12–0.46. Moderator analyses revealed a significant result for intervention style, with a medium to large effect for group interventions. Heterogeneity between the studies was high, and the success of implementation varied. More studies are needed, and researchers are encouraged to collaborate closely with organisations to design interventions appropriate to individual contexts and settings, and include evaluations of intervention implementation. © 2016 The Authors. Journal of Organizational Behavior published by John Wiley & Sons, Ltd. PMID:28781428

  2. Systematic effects of foreground removal in 21-cm surveys of reionization

    NASA Astrophysics Data System (ADS)

    Petrovic, Nada; Oh, S. Peng

    2011-05-01

    21-cm observations have the potential to revolutionize our understanding of the high-redshift Universe. Whilst extremely bright radio continuum foregrounds exist at these frequencies, their spectral smoothness can be exploited to allow efficient foreground subtraction. It is well known that - regardless of other instrumental effects - this removes power on scales comparable to the survey bandwidth. We investigate associated systematic biases. We show that removing line-of-sight fluctuations on large scales aliases into suppression of the 3D power spectrum across a broad range of scales. This bias can be dealt with by correctly marginalizing over small wavenumbers in the 1D power spectrum; however, the unbiased estimator will have unavoidably larger variance. We also show that Gaussian realizations of the power spectrum permit accurate and extremely rapid Monte Carlo simulations for error analysis; repeated realizations of the fully non-Gaussian field are unnecessary. We perform Monte Carlo maximum likelihood simulations of foreground removal which yield unbiased, minimum variance estimates of the power spectrum in agreement with Fisher matrix estimates. Foreground removal also distorts the 21-cm probability distribution function (PDF), reducing the contrast between neutral and ionized regions, with potentially serious consequences for efforts to extract information from the PDF. We show that it is the subtraction of large-scale modes which is responsible for this distortion, and that it is less severe in the earlier stages of reionization. It can be reduced by using larger bandwidths. In the late stages of reionization, identification of the largest ionized regions (which consist of foreground emission only) provides calibration points which potentially allow recovery of large-scale modes. Finally, we also show that (i) the broad frequency response of synchrotron and free-free emission will smear out any features in the electron momentum distribution and ensure spectrally smooth foregrounds and (ii) extragalactic radio recombination lines should be negligible foregrounds.

  3. An engineering closure for heavily under-resolved coarse-grid CFD in large applications

    NASA Astrophysics Data System (ADS)

    Class, Andreas G.; Yu, Fujiang; Jordan, Thomas

    2016-11-01

    Even though high performance computation allows very detailed description of a wide range of scales in scientific computations, engineering simulations used for design studies commonly merely resolve the large scales thus speeding up simulation time. The coarse-grid CFD (CGCFD) methodology is developed for flows with repeated flow patterns as often observed in heat exchangers or porous structures. It is proposed to use inviscid Euler equations on a very coarse numerical mesh. This coarse mesh needs not to conform to the geometry in all details. To reinstall physics on all smaller scales cheap subgrid models are employed. Subgrid models are systematically constructed by analyzing well-resolved generic representative simulations. By varying the flow conditions in these simulations correlations are obtained. These comprehend for each individual coarse mesh cell a volume force vector and volume porosity. Moreover, for all vertices, surface porosities are derived. CGCFD is related to the immersed boundary method as both exploit volume forces and non-body conformal meshes. Yet, CGCFD differs with respect to the coarser mesh and the use of Euler equations. We will describe the methodology based on a simple test case and the application of the method to a 127 pin wire-wrap fuel bundle.

  4. Algorithm sensitivity analysis and parameter tuning for tissue image segmentation pipelines

    PubMed Central

    Kurç, Tahsin M.; Taveira, Luís F. R.; Melo, Alba C. M. A.; Gao, Yi; Kong, Jun; Saltz, Joel H.

    2017-01-01

    Abstract Motivation: Sensitivity analysis and parameter tuning are important processes in large-scale image analysis. They are very costly because the image analysis workflows are required to be executed several times to systematically correlate output variations with parameter changes or to tune parameters. An integrated solution with minimum user interaction that uses effective methodologies and high performance computing is required to scale these studies to large imaging datasets and expensive analysis workflows. Results: The experiments with two segmentation workflows show that the proposed approach can (i) quickly identify and prune parameters that are non-influential; (ii) search a small fraction (about 100 points) of the parameter search space with billions to trillions of points and improve the quality of segmentation results (Dice and Jaccard metrics) by as much as 1.42× compared to the results from the default parameters; (iii) attain good scalability on a high performance cluster with several effective optimizations. Conclusions: Our work demonstrates the feasibility of performing sensitivity analyses, parameter studies and auto-tuning with large datasets. The proposed framework can enable the quantification of error estimations and output variations in image segmentation pipelines. Availability and Implementation: Source code: https://github.com/SBU-BMI/region-templates/. Contact: teodoro@unb.br Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28062445

  5. Algorithm sensitivity analysis and parameter tuning for tissue image segmentation pipelines.

    PubMed

    Teodoro, George; Kurç, Tahsin M; Taveira, Luís F R; Melo, Alba C M A; Gao, Yi; Kong, Jun; Saltz, Joel H

    2017-04-01

    Sensitivity analysis and parameter tuning are important processes in large-scale image analysis. They are very costly because the image analysis workflows are required to be executed several times to systematically correlate output variations with parameter changes or to tune parameters. An integrated solution with minimum user interaction that uses effective methodologies and high performance computing is required to scale these studies to large imaging datasets and expensive analysis workflows. The experiments with two segmentation workflows show that the proposed approach can (i) quickly identify and prune parameters that are non-influential; (ii) search a small fraction (about 100 points) of the parameter search space with billions to trillions of points and improve the quality of segmentation results (Dice and Jaccard metrics) by as much as 1.42× compared to the results from the default parameters; (iii) attain good scalability on a high performance cluster with several effective optimizations. Our work demonstrates the feasibility of performing sensitivity analyses, parameter studies and auto-tuning with large datasets. The proposed framework can enable the quantification of error estimations and output variations in image segmentation pipelines. Source code: https://github.com/SBU-BMI/region-templates/ . teodoro@unb.br. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  6. Large-scale, low-cost synthesis of monodispersed gold nanorods using a gemini surfactant

    NASA Astrophysics Data System (ADS)

    Xu, Yong; Zhao, Yang; Chen, Lei; Wang, Xuchun; Sun, Jianxia; Wu, Haihua; Bao, Feng; Fan, Jian; Zhang, Qiao

    2015-04-01

    In this work, we demonstrate that monodispersed gold nanorods (AuNRs) can be obtained in a large-scale and cost-effective way. By using an industrial grade gemini surfactant (P16-8-16), the cost of the synthesis of high-quality AuNRs can be significantly reduced by 90%. The synthesis can be scaled up to over 4 L. The aspect ratio of AuNRs can be well tuned from ~2.4 to ~6.3, resulting in a wide tunability of the SPR properties. Systematic studies reveal that P16-8-16 could have a dual function: it can not only act as a capping ligand to stabilize AuNRs but also it can pre-reduce Au3+ to Au+ by the unsaturated C&z.dbd;C bond. Furthermore, the shape of AuNRs can be tailored from straight nanorods to ``dog-bones'' by simply varying the concentration of the surfactant. A mechanistic study shows that the shape change can be attributed to the presence of excess bromide ions because of the complex effect between bromide ions and gold ions. This work will not only help to achieve the industrial production of AuNRs, but also promote research into practical applications of various nanomaterials.In this work, we demonstrate that monodispersed gold nanorods (AuNRs) can be obtained in a large-scale and cost-effective way. By using an industrial grade gemini surfactant (P16-8-16), the cost of the synthesis of high-quality AuNRs can be significantly reduced by 90%. The synthesis can be scaled up to over 4 L. The aspect ratio of AuNRs can be well tuned from ~2.4 to ~6.3, resulting in a wide tunability of the SPR properties. Systematic studies reveal that P16-8-16 could have a dual function: it can not only act as a capping ligand to stabilize AuNRs but also it can pre-reduce Au3+ to Au+ by the unsaturated C&z.dbd;C bond. Furthermore, the shape of AuNRs can be tailored from straight nanorods to ``dog-bones'' by simply varying the concentration of the surfactant. A mechanistic study shows that the shape change can be attributed to the presence of excess bromide ions because of the complex effect between bromide ions and gold ions. This work will not only help to achieve the industrial production of AuNRs, but also promote research into practical applications of various nanomaterials. Electronic supplementary information (ESI) available: Digital pictures during the growth process of AuNRs, TEM images of nanoparticles obtained without P16-8-16 or silver, and HRTEM image and SAED patterns of quadrupeds. See DOI: 10.1039/c5nr00343a

  7. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving, has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long range perspective. Long range planning has a great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of management of large scale systems are broadly covered here. Due to the focus and application of research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  8. Renormalization group analysis of turbulence

    NASA Technical Reports Server (NTRS)

    Smith, Leslie M.

    1989-01-01

    The objective is to understand and extend a recent theory of turbulence based on dynamic renormalization group (RNG) techniques. The application of RNG methods to hydrodynamic turbulence was explored most extensively by Yakhot and Orszag (1986). An eddy viscosity was calculated which was consistent with the Kolmogorov inertial range by systematic elimination of the small scales in the flow. Further, assumed smallness of the nonlinear terms in the redefined equations for the large scales results in predictions for important flow constants such as the Kolmogorov constant. It is emphasized that no adjustable parameters are needed. The parameterization of the small scales in a self-consistent manner has important implications for sub-grid modeling.

  9. Evaluating the Impact of Conceptual Knowledge Engineering on the Design and Usability of a Clinical and Translational Science Collaboration Portal

    PubMed Central

    Payne, Philip R.O.; Borlawsky, Tara B.; Rice, Robert; Embi, Peter J.

    2010-01-01

    With the growing prevalence of large-scale, team science endeavors in the biomedical and life science domains, the impetus to implement platforms capable of supporting asynchronous interaction among multidisciplinary groups of collaborators has increased commensurately. However, there is a paucity of literature describing systematic approaches to identifying the information needs of targeted end-users for such platforms, and the translation of such requirements into practicable software component design criteria. In previous studies, we have reported upon the efficacy of employing conceptual knowledge engineering (CKE) techniques to systematically address both of the preceding challenges in the context of complex biomedical applications. In this manuscript we evaluate the impact of CKE approaches relative to the design of a clinical and translational science collaboration portal, and report upon the preliminary qualitative users satisfaction as reported for the resulting system. PMID:21347146

  10. Mindfulness Meditation for Chronic Pain: Systematic Review and Meta-analysis.

    PubMed

    Hilton, Lara; Hempel, Susanne; Ewing, Brett A; Apaydin, Eric; Xenakis, Lea; Newberry, Sydne; Colaiaco, Ben; Maher, Alicia Ruelaz; Shanman, Roberta M; Sorbero, Melony E; Maglione, Margaret A

    2017-04-01

    Chronic pain patients increasingly seek treatment through mindfulness meditation. This study aims to synthesize evidence on efficacy and safety of mindfulness meditation interventions for the treatment of chronic pain in adults. We conducted a systematic review on randomized controlled trials (RCTs) with meta-analyses using the Hartung-Knapp-Sidik-Jonkman method for random-effects models. Quality of evidence was assessed using the GRADE approach. Outcomes included pain, depression, quality of life, and analgesic use. Thirty-eight RCTs met inclusion criteria; seven reported on safety. We found low-quality evidence that mindfulness meditation is associated with a small decrease in pain compared with all types of controls in 30 RCTs. Statistically significant effects were also found for depression symptoms and quality of life. While mindfulness meditation improves pain and depression symptoms and quality of life, additional well-designed, rigorous, and large-scale RCTs are needed to decisively provide estimates of the efficacy of mindfulness meditation for chronic pain.

  11. Black hole mass measurement using molecular gas kinematics: what ALMA can do

    NASA Astrophysics Data System (ADS)

    Yoon, Ilsang

    2017-04-01

    We study the limits of the spatial and velocity resolution of radio interferometry to infer the mass of supermassive black holes (SMBHs) in galactic centres using the kinematics of circum-nuclear molecular gas, by considering the shapes of the galaxy surface brightness profile, signal-to-noise ratios (S/Ns) of the position-velocity diagram (PVD) and systematic errors due to the spatial and velocity structure of the molecular gas. We argue that for fixed galaxy stellar mass and SMBH mass, the spatial and velocity scales that need to be resolved increase and decrease, respectively, with decreasing Sérsic index of the galaxy surface brightness profile. We validate our arguments using simulated PVDs for varying beam size and velocity channel width. Furthermore, we consider the systematic effects on the inference of the SMBH mass by simulating PVDs including the spatial and velocity structure of the molecular gas, which demonstrates that their impacts are not significant for a PVD with good S/N unless the spatial and velocity scale associated with the systematic effects are comparable to or larger than the angular resolution and velocity channel width of the PVD from pure circular motion. Also, we caution that a bias in a galaxy surface brightness profile owing to the poor resolution of a galaxy photometric image can largely bias the SMBH mass by an order of magnitude. This study shows the promise and the limits of ALMA observations for measuring SMBH mass using molecular gas kinematics and provides a useful technical justification for an ALMA proposal with the science goal of measuring SMBH mass.

  12. Is Implicit Theory of Mind a Real and Robust Phenomenon? Results From a Systematic Replication Study.

    PubMed

    Kulke, Louisa; von Duhn, Britta; Schneider, Dana; Rakoczy, Hannes

    2018-06-01

    Recently, theory-of-mind research has been revolutionized by findings from novel implicit tasks suggesting that at least some aspects of false-belief reasoning develop earlier in ontogeny than previously assumed and operate automatically throughout adulthood. Although these findings are the empirical basis for far-reaching theories, systematic replications are still missing. This article reports a preregistered large-scale attempt to replicate four influential anticipatory-looking implicit theory-of-mind tasks using original stimuli and procedures. Results showed that only one of the four paradigms was reliably replicated. A second set of studies revealed, further, that this one paradigm was no longer replicated once confounds were removed, which calls its validity into question. There were also no correlations between paradigms, and thus, no evidence for their convergent validity. In conclusion, findings from anticipatory-looking false-belief paradigms seem less reliable and valid than previously assumed, thus limiting the conclusions that can be drawn from them.

  13. Large-scale systematic analysis of 2D fingerprint methods and parameters to improve virtual screening enrichments.

    PubMed

    Sastry, Madhavi; Lowrie, Jeffrey F; Dixon, Steven L; Sherman, Woody

    2010-05-24

    A systematic virtual screening study on 11 pharmaceutically relevant targets has been conducted to investigate the interrelation between 8 two-dimensional (2D) fingerprinting methods, 13 atom-typing schemes, 13 bit scaling rules, and 12 similarity metrics using the new cheminformatics package Canvas. In total, 157 872 virtual screens were performed to assess the ability of each combination of parameters to identify actives in a database screen. In general, fingerprint methods, such as MOLPRINT2D, Radial, and Dendritic that encode information about local environment beyond simple linear paths outperformed other fingerprint methods. Atom-typing schemes with more specific information, such as Daylight, Mol2, and Carhart were generally superior to more generic atom-typing schemes. Enrichment factors across all targets were improved considerably with the best settings, although no single set of parameters performed optimally on all targets. The size of the addressable bit space for the fingerprints was also explored, and it was found to have a substantial impact on enrichments. Small bit spaces, such as 1024, resulted in many collisions and in a significant degradation in enrichments compared to larger bit spaces that avoid collisions.

  14. Effects of geometry and fluid elasticity during polymeric droplet pinch-off in microfluidic environments

    NASA Astrophysics Data System (ADS)

    Steinhaus, Ben; Shen, Amy; Sureshkumar, Radhakrishna

    2006-11-01

    We investigate the effects of fluid elasticity and channel geometry on polymeric droplet pinch-off by performing systematic experiments using viscoelastic polymer solutions which possess practically shear rate-independent viscosity (Boger fluids). Four different geometric sizes (width and depth are scaled up proportionally at the ratio of 0.5, 1, 2, 20) are used to study the effect of the length scale, which in turn influences the ratio of elastic to viscous forces as well as the Rayleigh time scale associated with the interfacial instability of a cylindrical column of liquid. We observe a power law relationship between the dimensionless (scaled with respect to the Rayleigh time scale) capillary pinch-off time, T, and the elasticity number, E, defined as the ratio of the fluid relaxation time to the time scale of viscous diffusion. In general, T increases dramatically with increasing E. The inhibition of ``bead-on-a-string'' formation is observed for flows with effective Deborah number, De, defined as the ratio of the fluid relaxation time to the Rayleigh time scale becomes greater than 10. For sufficiently large values of De, the Rayleigh instability may be modified substantially by fluid elasticity.

  15. Statistical detection of systematic election irregularities

    PubMed Central

    Klimek, Peter; Yegorov, Yuri; Hanel, Rudolf; Thurner, Stefan

    2012-01-01

    Democratic societies are built around the principle of free and fair elections, and that each citizen’s vote should count equally. National elections can be regarded as large-scale social experiments, where people are grouped into usually large numbers of electoral districts and vote according to their preferences. The large number of samples implies statistical consequences for the polling results, which can be used to identify election irregularities. Using a suitable data representation, we find that vote distributions of elections with alleged fraud show a kurtosis substantially exceeding the kurtosis of normal elections, depending on the level of data aggregation. As an example, we show that reported irregularities in recent Russian elections are, indeed, well-explained by systematic ballot stuffing. We develop a parametric model quantifying the extent to which fraudulent mechanisms are present. We formulate a parametric test detecting these statistical properties in election results. Remarkably, this technique produces robust outcomes with respect to the resolution of the data and therefore, allows for cross-country comparisons. PMID:23010929

  16. Statistical detection of systematic election irregularities.

    PubMed

    Klimek, Peter; Yegorov, Yuri; Hanel, Rudolf; Thurner, Stefan

    2012-10-09

    Democratic societies are built around the principle of free and fair elections, and that each citizen's vote should count equally. National elections can be regarded as large-scale social experiments, where people are grouped into usually large numbers of electoral districts and vote according to their preferences. The large number of samples implies statistical consequences for the polling results, which can be used to identify election irregularities. Using a suitable data representation, we find that vote distributions of elections with alleged fraud show a kurtosis substantially exceeding the kurtosis of normal elections, depending on the level of data aggregation. As an example, we show that reported irregularities in recent Russian elections are, indeed, well-explained by systematic ballot stuffing. We develop a parametric model quantifying the extent to which fraudulent mechanisms are present. We formulate a parametric test detecting these statistical properties in election results. Remarkably, this technique produces robust outcomes with respect to the resolution of the data and therefore, allows for cross-country comparisons.

  17. Accuracy of Prediction Instruments for Diagnosing Large Vessel Occlusion in Individuals With Suspected Stroke: A Systematic Review for the 2018 Guidelines for the Early Management of Patients With Acute Ischemic Stroke.

    PubMed

    Smith, Eric E; Kent, David M; Bulsara, Ketan R; Leung, Lester Y; Lichtman, Judith H; Reeves, Mathew J; Towfighi, Amytis; Whiteley, William N; Zahuranec, Darin B

    2018-03-01

    Endovascular thrombectomy is a highly efficacious treatment for large vessel occlusion (LVO). LVO prediction instruments, based on stroke signs and symptoms, have been proposed to identify stroke patients with LVO for rapid transport to endovascular thrombectomy-capable hospitals. This evidence review committee was commissioned by the American Heart Association/American Stroke Association to systematically review evidence for the accuracy of LVO prediction instruments. Medline, Embase, and Cochrane databases were searched on October 27, 2016. Study quality was assessed with the Quality Assessment of Diagnostic Accuracy-2 tool. Thirty-six relevant studies were identified. Most studies (21 of 36) recruited patients with ischemic stroke, with few studies in the prehospital setting (4 of 36) and in populations that included hemorrhagic stroke or stroke mimics (12 of 36). The most frequently studied prediction instrument was the National Institutes of Health Stroke Scale. Most studies had either some risk of bias or unclear risk of bias. Reported discrimination of LVO mostly ranged from 0.70 to 0.85, as measured by the C statistic. In meta-analysis, sensitivity was as high as 87% and specificity was as high as 90%, but no threshold on any instruments predicted LVO with both high sensitivity and specificity. With a positive LVO prediction test, the probability of LVO could be 50% to 60% (depending on the LVO prevalence in the population), but the probability of LVO with a negative test could still be ≥10%. No scale predicted LVO with both high sensitivity and high specificity. Systems that use LVO prediction instruments for triage will miss some patients with LVO and milder stroke. More prospective studies are needed to assess the accuracy of LVO prediction instruments in the prehospital setting in all patients with suspected stroke, including patients with hemorrhagic stroke and stroke mimics. © 2018 American Heart Association, Inc.

  18. Controls on fallen leaf chemistry and forest floor element masses in native and novel forests across a tropical island

    Treesearch

    H.E. Erickson; E.H. Helmer; T.J. Brandeis; A.E. Lugo

    2014-01-01

    Litter chemistry varies across landscapes according to factors rarely examined simultaneously. We analyzed 11 elements in forest floor (fallen) leaves and additional litter components from 143 forest inventory plots systematically located across Puerto Rico, a tropical island recovering from large-scale forest clearing. We assessed whether three existing, independently...

  19. Using Systematic Item Selection Methods to Improve Universal Design of Assessments. Policy Directions. Number 18

    ERIC Educational Resources Information Center

    Johnstone, Christopher; Thurlow, Martha; Moore, Michael; Altman, Jason

    2006-01-01

    The No Child Left Behind Act of 2001 (NCLB) and other recent changes in federal legislation have placed greater emphasis on accountability in large-scale testing. Included in this emphasis are regulations that require assessments to be accessible. States are accountable for the success of all students, and tests should be designed in a way that…

  20. Lessons Learned from PISA: A Systematic Review of Peer-Reviewed Articles on the Programme for International Student Assessment

    ERIC Educational Resources Information Center

    Hopfenbeck, Therese N.; Lenkeit, Jenny; El Masri, Yasmine; Cantrell, Kate; Ryan, Jeanne; Baird, Jo-Anne

    2018-01-01

    International large-scale assessments are on the rise, with the Programme for International Student Assessment (PISA) seen by many as having strategic prominence in education policy debates. The present article reviews PISA-related English-language peer-reviewed articles from the programme's first cycle in 2000 to its most current in 2015. Five…

  1. Disrupted Topological Patterns of Large-Scale Network in Conduct Disorder

    PubMed Central

    Jiang, Yali; Liu, Weixiang; Ming, Qingsen; Gao, Yidian; Ma, Ren; Zhang, Xiaocui; Situ, Weijun; Wang, Xiang; Yao, Shuqiao; Huang, Bingsheng

    2016-01-01

    Regional abnormalities in brain structure and function, as well as disrupted connectivity, have been found repeatedly in adolescents with conduct disorder (CD). Yet, the large-scale brain topology associated with CD is not well characterized, and little is known about the systematic neural mechanisms of CD. We employed graphic theory to investigate systematically the structural connectivity derived from cortical thickness correlation in a group of patients with CD (N = 43) and healthy controls (HCs, N = 73). Nonparametric permutation tests were applied for between-group comparisons of graphical metrics. Compared with HCs, network measures including global/local efficiency and modularity all pointed to hypo-functioning in CD, despite of preserved small-world organization in both groups. The hubs distribution is only partially overlapped with each other. These results indicate that CD is accompanied by both impaired integration and segregation patterns of brain networks, and the distribution of highly connected neural network ‘hubs’ is also distinct between groups. Such misconfiguration extends our understanding regarding how structural neural network disruptions may underlie behavioral disturbances in adolescents with CD, and potentially, implicates an aberrant cytoarchitectonic profiles in the brain of CD patients. PMID:27841320

  2. An objective algorithm for estimating maximum oceanic mixed layer depth using seasonality indices derived from Argo temperature/salinity profiles

    NASA Astrophysics Data System (ADS)

    Chen, Ge; Yu, Fangjie

    2015-01-01

    In this study, we propose a new algorithm for estimating the annual maximum mixed layer depth (M2LD) analogous to a full range of local "ventilation" depth, and corresponding to the deepest surface to which atmospheric influence can be "felt." Two "seasonality indices" are defined, respectively, for temperature and salinity through Fourier analysis of their time series using Argo data, on the basis of which a significant local minimum of the index corresponding to a maximum penetration depth can be identified. A final M2LD is then determined by maximizing the thermal and haline effects. Unlike most of the previous schemes which use arbitrary thresholds or subjective criteria, the new algorithm is objective, robust, and property adaptive provided a significant periodic geophysical forcing such as annual cycle is available. The validity of our methodology is confirmed by the spatial correlation of the tropical dominance of saline effect (mainly related to rainfall cycle) and the extratropical dominance of thermal effect (mainly related to solar cycle). It is also recognized that the M2LD distribution is characterized by the coexistence of basin-scale zonal structures and eddy-scale local patches. In addition to the fundamental buoyancy forcing caused mainly by latitude-dependent solar radiation, the impressive two-scale pattern is found to be primarily attributable to (1) large-wave climate due to extreme winds (large scale) and (2) systematic eddy shedding as a result of persistent winds (mesoscale). Moreover, a general geographical consistency and a good quantitative agreement are found between the new algorithm and those published in the literature. However, a major discrepancy in our result is the existence of a constantly deeper M2LD band compared with other results in the midlatitude oceans of both hemispheres. Given the better correspondence of our M2LDs with the depth of the oxygen saturation limit, it is argued that there might be a systematic underestimation with existing criteria in these regions. Our results demonstrate that the M2LD may serve as an integrated proxy for studying the coherent multidisciplinary variabilities of the coupled ocean-atmosphere system.

  3. Systematic review of empowerment measures in health promotion.

    PubMed

    Cyril, Sheila; Smith, Ben J; Renzaho, Andre M N

    2016-12-01

    Empowerment, a multi-level construct comprising individual, community and organizational domains, is a fundamental value and goal in health promotion. While a range of scales have been developed for the measurement of empowerment, the qualities of these have not been rigorously assessed. The aim of this study was to evaluate the measurement properties of quantitative empowerment scales and their applicability in health promotion programs. A systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines was done to evaluate empowerment scales across three dimensions: item development, reliability and validity. This was followed by assessment of measurement properties using a ratings scale with criteria addressing an a priori explicit theoretical framework, assessment of content validity, internal consistency and factor analysis to test structural validity. Of the 20 studies included in this review, only 8 (40%) used literature reviews, expert panels and empirical studies to develop scale items and 9 (45%) of studies fulfilled ≥5 criteria on the ratings scale. Two studies (10%) measured community empowerment and one study measured organizational empowerment, the rest (85%) measured individual empowerment. This review highlights important gaps in the measurement of community and organizational domains of empowerment using quantitative scales. A priority for future empowerment research is to investigate and explore approaches such as mixed methods to enable adequate measurement of empowerment across all three domains. This would help health promotion practitioners to effectively measure empowerment as a driver of change and an outcome in health promotion programs. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. Internalized Homonegativity: A Systematic Mapping Review of Empirical Research

    PubMed Central

    Berg, Rigmor C.; Munthe-Kaas, Heather M.; Ross, Michael W.

    2016-01-01

    ABSTRACT Internalized homonegativity (IH) is an important variable affecting the wellbeing of lesbian, gay, and bisexual (LGB) persons. We included 201 studies in a systematic mapping review of IH. Most studies were conducted in North America and examined IH as a predictor of poor health. The primary focus of 14 studies was IH scale measurement, and, in total, these studies detailed nine distinct scales. Eighteen studies compared levels of IH in LGB populations, four described prevention programs, and one investigated IH using qualitative methods. Our review indicates that further research is needed, particularly qualitative research and ways to ameliorate IH. PMID:26436322

  5. Applying systematic review search methods to the grey literature: a case study examining guidelines for school-based breakfast programs in Canada.

    PubMed

    Godin, Katelyn; Stapleton, Jackie; Kirkpatrick, Sharon I; Hanning, Rhona M; Leatherdale, Scott T

    2015-10-22

    Grey literature is an important source of information for large-scale review syntheses. However, there are many characteristics of grey literature that make it difficult to search systematically. Further, there is no 'gold standard' for rigorous systematic grey literature search methods and few resources on how to conduct this type of search. This paper describes systematic review search methods that were developed and applied to complete a case study systematic review of grey literature that examined guidelines for school-based breakfast programs in Canada. A grey literature search plan was developed to incorporate four different searching strategies: (1) grey literature databases, (2) customized Google search engines, (3) targeted websites, and (4) consultation with contact experts. These complementary strategies were used to minimize the risk of omitting relevant sources. Since abstracts are often unavailable in grey literature documents, items' abstracts, executive summaries, or table of contents (whichever was available) were screened. Screening of publications' full-text followed. Data were extracted on the organization, year published, who they were developed by, intended audience, goal/objectives of document, sources of evidence/resources cited, meals mentioned in the guidelines, and recommendations for program delivery. The search strategies for identifying and screening publications for inclusion in the case study review was found to be manageable, comprehensive, and intuitive when applied in practice. The four search strategies of the grey literature search plan yielded 302 potentially relevant items for screening. Following the screening process, 15 publications that met all eligibility criteria remained and were included in the case study systematic review. The high-level findings of the case study systematic review are briefly described. This article demonstrated a feasible and seemingly robust method for applying systematic search strategies to identify web-based resources in the grey literature. The search strategy we developed and tested is amenable to adaptation to identify other types of grey literature from other disciplines and answering a wide range of research questions. This method should be further adapted and tested in future research syntheses.

  6. Wind power for the electric-utility industry: Policy incentives for fuel conservation

    NASA Astrophysics Data System (ADS)

    March, F.; Dlott, E. H.; Korn, D. H.; Madio, F. R.; McArthur, R. C.; Vachon, W. A.

    1982-06-01

    A systematic method for evaluating the economics of solar-electric/conservation technologies as fuel-savings investments for electric utilities in the presence of changing federal incentive policies is presented. The focus is on wind energy conversion systems (WECS) as the solar technology closest to near-term large scale implementation. Commercially available large WECS are described, along with computer models to calculate the economic impact of the inclusion of WECS as 10% of the base-load generating capacity on a grid. A guide to legal structures and relationships which impinge on large-scale WECS utilization is developed, together with a quantitative examination of the installation of 1000 MWe of WECS capacity by a utility in the northeast states. Engineering and financial analyses were performed, with results indicating government policy changes necessary to encourage the entrance of utilities into the field of windpower utilization.

  7. Lagrangian or Eulerian; real or Fourier? Not all approaches to large-scale structure are created equal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tassev, Svetlin, E-mail: tassev@astro.princeton.edu

    We present a pedagogical systematic investigation of the accuracy of Eulerian and Lagrangian perturbation theories of large-scale structure. We show that significant differences exist between them especially when trying to model the Baryon Acoustic Oscillations (BAO). We find that the best available model of the BAO in real space is the Zel'dovich Approximation (ZA), giving an accuracy of ∼<3% at redshift of z = 0 in modelling the matter 2-pt function around the acoustic peak. All corrections to the ZA around the BAO scale are perfectly perturbative in real space. Any attempt to achieve better precision requires calibrating the theorymore » to simulations because of the need to renormalize those corrections. In contrast, theories which do not fully preserve the ZA as their solution, receive O(1) corrections around the acoustic peak in real space at z = 0, and are thus of suspicious convergence at low redshift around the BAO. As an example, we find that a similar accuracy of 3% for the acoustic peak is achieved by Eulerian Standard Perturbation Theory (SPT) at linear order only at z ≈ 4. Thus even when SPT is perturbative, one needs to include loop corrections for z∼<4 in real space. In Fourier space, all models perform similarly, and are controlled by the overdensity amplitude, thus recovering standard results. However, that comes at a price. Real space cleanly separates the BAO signal from non-linear dynamics. In contrast, Fourier space mixes signal from short mildly non-linear scales with the linear signal from the BAO to the level that non-linear contributions from short scales dominate. Therefore, one has little hope in constructing a systematic theory for the BAO in Fourier space.« less

  8. Systems Science and Childhood Obesity: A Systematic Review and New Directions

    PubMed Central

    Foster, E. Michael

    2013-01-01

    As a public health problem, childhood obesity operates at multiple levels, ranging from individual health behaviors to school and community characteristics to public policies. Examining obesity, particularly childhood obesity, from any single perspective is likely to fail, and systems science methods offer a possible solution. We systematically reviewed studies that examined the causes and/or consequences of obesity from a systems science perspective. The 21 included studies addressed four general areas of systems science in obesity: (1) translating interventions to a large scale, (2) the effect of obesity on other health or economic outcomes, (3) the effect of geography on obesity, and (4) the effect of social networks on obesity. In general, little research addresses obesity from a true, integrated systems science perspective, and the available research infrequently focuses on children. This shortcoming limits the ability of that research to inform public policy. However, we believe that the largely incremental approaches used in current systems science lay a foundation for future work and present a model demonstrating the system of childhood obesity. Systems science perspective and related methods are particularly promising in understanding the link between childhood obesity and adult outcomes. Systems models emphasize the evolution of agents and their interactions; such evolution is particularly salient in the context of a developing child. PMID:23710344

  9. The Center for Optimized Structural Studies (COSS) platform for automation in cloning, expression, and purification of single proteins and protein-protein complexes.

    PubMed

    Mlynek, Georg; Lehner, Anita; Neuhold, Jana; Leeb, Sarah; Kostan, Julius; Charnagalov, Alexej; Stolt-Bergner, Peggy; Djinović-Carugo, Kristina; Pinotsis, Nikos

    2014-06-01

    Expression in Escherichia coli represents the simplest and most cost effective means for the production of recombinant proteins. This is a routine task in structural biology and biochemistry where milligrams of the target protein are required in high purity and monodispersity. To achieve these criteria, the user often needs to screen several constructs in different expression and purification conditions in parallel. We describe a pipeline, implemented in the Center for Optimized Structural Studies, that enables the systematic screening of expression and purification conditions for recombinant proteins and relies on a series of logical decisions. We first use bioinformatics tools to design a series of protein fragments, which we clone in parallel, and subsequently screen in small scale for optimal expression and purification conditions. Based on a scoring system that assesses soluble expression, we then select the top ranking targets for large-scale purification. In the establishment of our pipeline, emphasis was put on streamlining the processes such that it can be easily but not necessarily automatized. In a typical run of about 2 weeks, we are able to prepare and perform small-scale expression screens for 20-100 different constructs followed by large-scale purification of at least 4-6 proteins. The major advantage of our approach is its flexibility, which allows for easy adoption, either partially or entirely, by any average hypothesis driven laboratory in a manual or robot-assisted manner.

  10. A proposed method to investigate reliability throughout a questionnaire

    PubMed Central

    2011-01-01

    Background Questionnaires are used extensively in medical and health care research and depend on validity and reliability. However, participants may differ in interest and awareness throughout long questionnaires, which can affect reliability of their answers. A method is proposed for "screening" of systematic change in random error, which could assess changed reliability of answers. Methods A simulation study was conducted to explore whether systematic change in reliability, expressed as changed random error, could be assessed using unsupervised classification of subjects by cluster analysis (CA) and estimation of intraclass correlation coefficient (ICC). The method was also applied on a clinical dataset from 753 cardiac patients using the Jalowiec Coping Scale. Results The simulation study showed a relationship between the systematic change in random error throughout a questionnaire and the slope between the estimated ICC for subjects classified by CA and successive items in a questionnaire. This slope was proposed as an awareness measure - to assessing if respondents provide only a random answer or one based on a substantial cognitive effort. Scales from different factor structures of Jalowiec Coping Scale had different effect on this awareness measure. Conclusions Even though assumptions in the simulation study might be limited compared to real datasets, the approach is promising for assessing systematic change in reliability throughout long questionnaires. Results from a clinical dataset indicated that the awareness measure differed between scales. PMID:21974842

  11. Factors that predict remission of infant atopic dermatitis: a systematic review.

    PubMed

    von Kobyletzki, Laura; Svensson, Åke; Apfelbacher, Christian; Schmitt, Jochen

    2015-04-01

    The individual prognosis of infants with atopic dermatitis (AD) is important for parents, healthcare professionals, and society. The aim of this study was to investigate predictors for remission of infant AD until school age. A systematic review was carried out of clinical and epidemiological studies investigating the effect of filaggrin gene (FLG) loss-of-function mutations, sex, exposure to pets, topical anti-inflammatory treatment, disease severity, and atopic sensitization during infancy on complete remission of infant-onset AD until 6-7 years of age. Systematic electronic searches until September 2013, data abstraction, and study quality assessment (Newcastle-Ottawa Scale) were performed. From 3,316 abstracts identified, 2 studies of good study quality were included. Parental allergies and sex did not significantly affect remission. For non-remission of AD, the included articles reported an association with any atopic sensitization at 2 years old (adjusted odds ratio [aOR] 2.76; 95% confidence interval (CI) 1.29-5.91), frequent scratching with early AD (aOR 5.86; 95% CI 3.04-11.29), objective severity score at 2 years old (aOR 1.10; 95% CI 1.07-1.14), and exposure to pets (cat OR 2.33; 95% CI 0.85-6.38). It is largely unknown which factors predict remission of infant AD. This is a highly relevant research gap that hinders patient information on the prognosis of infant-onset AD.

  12. Transport Coefficients from Large Deviation Functions

    NASA Astrophysics Data System (ADS)

    Gao, Chloe; Limmer, David

    2017-10-01

    We describe a method for computing transport coefficients from the direct evaluation of large deviation function. This method is general, relying on only equilibrium fluctuations, and is statistically efficient, employing trajectory based importance sampling. Equilibrium fluctuations of molecular currents are characterized by their large deviation functions, which is a scaled cumulant generating function analogous to the free energy. A diffusion Monte Carlo algorithm is used to evaluate the large deviation functions, from which arbitrary transport coefficients are derivable. We find significant statistical improvement over traditional Green-Kubo based calculations. The systematic and statistical errors of this method are analyzed in the context of specific transport coefficient calculations, including the shear viscosity, interfacial friction coefficient, and thermal conductivity.

  13. A non-perturbative exploration of the high energy regime in Nf=3 QCD. ALPHA Collaboration

    NASA Astrophysics Data System (ADS)

    Dalla Brida, Mattia; Fritzsch, Patrick; Korzec, Tomasz; Ramos, Alberto; Sint, Stefan; Sommer, Rainer

    2018-05-01

    Using continuum extrapolated lattice data we trace a family of running couplings in three-flavour QCD over a large range of scales from about 4 to 128 GeV. The scale is set by the finite space time volume so that recursive finite size techniques can be applied, and Schrödinger functional (SF) boundary conditions enable direct simulations in the chiral limit. Compared to earlier studies we have improved on both statistical and systematic errors. Using the SF coupling to implicitly define a reference scale 1/L_0≈ 4 GeV through \\bar{g}^2(L_0) =2.012, we quote L_0 Λ ^{N_f=3}_{{\\overline{MS}}} =0.0791(21). This error is dominated by statistics; in particular, the remnant perturbative uncertainty is negligible and very well controlled, by connecting to infinite renormalization scale from different scales 2^n/L_0 for n=0,1,\\ldots ,5. An intermediate step in this connection may involve any member of a one-parameter family of SF couplings. This provides an excellent opportunity for tests of perturbation theory some of which have been published in a letter (ALPHA collaboration, M. Dalla Brida et al. in Phys Rev Lett 117(18):182001, 2016). The results indicate that for our target precision of 3 per cent in L_0 Λ ^{N_f=3}_{{\\overline{MS}}}, a reliable estimate of the truncation error requires non-perturbative data for a sufficiently large range of values of α _s=\\bar{g}^2/(4π ). In the present work we reach this precision by studying scales that vary by a factor 2^5= 32, reaching down to α _s≈ 0.1. We here provide the details of our analysis and an extended discussion.

  14. Assessing Communication Skills of Medical Students in Objective Structured Clinical Examinations (OSCE) - A Systematic Review of Rating Scales

    PubMed Central

    Cömert, Musa; Zill, Jördis Maria; Christalle, Eva; Dirmaier, Jörg; Härter, Martin; Scholl, Isabelle

    2016-01-01

    Background Teaching and assessment of communication skills have become essential in medical education. The Objective Structured Clinical Examination (OSCE) has been found as an appropriate means to assess communication skills within medical education. Studies have demonstrated the importance of a valid assessment of medical students’ communication skills. Yet, the validity of the performance scores depends fundamentally on the quality of the rating scales used in an OSCE. Thus, this systematic review aimed at providing an overview of existing rating scales, describing their underlying definition of communication skills, determining the methodological quality of psychometric studies and the quality of psychometric properties of the identified rating scales. Methods We conducted a systematic review to identify psychometrically tested rating scales, which have been applied in OSCE settings to assess communication skills of medical students. Our search strategy comprised three databases (EMBASE, PsycINFO, and PubMed), reference tracking and consultation of experts. We included studies that reported psychometric properties of communication skills assessment rating scales used in OSCEs by examiners only. The methodological quality of included studies was assessed using the COnsensus based Standards for the selection of health status Measurement INstruments (COSMIN) checklist. The quality of psychometric properties was evaluated using the quality criteria of Terwee and colleagues. Results Data of twelve studies reporting on eight rating scales on communication skills assessment in OSCEs were included. Five of eight rating scales were explicitly developed based on a specific definition of communication skills. The methodological quality of studies was mainly poor. The psychometric quality of the eight rating scales was mainly intermediate. Discussion Our results reveal that future psychometric evaluation studies focusing on improving the methodological quality are needed in order to yield psychometrically sound results of the OSCEs assessing communication skills. This is especially important given that most OSCE rating scales are used for summative assessment, and thus have an impact on medical students’ academic success. PMID:27031506

  15. Assessing Communication Skills of Medical Students in Objective Structured Clinical Examinations (OSCE)--A Systematic Review of Rating Scales.

    PubMed

    Cömert, Musa; Zill, Jördis Maria; Christalle, Eva; Dirmaier, Jörg; Härter, Martin; Scholl, Isabelle

    2016-01-01

    Teaching and assessment of communication skills have become essential in medical education. The Objective Structured Clinical Examination (OSCE) has been found as an appropriate means to assess communication skills within medical education. Studies have demonstrated the importance of a valid assessment of medical students' communication skills. Yet, the validity of the performance scores depends fundamentally on the quality of the rating scales used in an OSCE. Thus, this systematic review aimed at providing an overview of existing rating scales, describing their underlying definition of communication skills, determining the methodological quality of psychometric studies and the quality of psychometric properties of the identified rating scales. We conducted a systematic review to identify psychometrically tested rating scales, which have been applied in OSCE settings to assess communication skills of medical students. Our search strategy comprised three databases (EMBASE, PsycINFO, and PubMed), reference tracking and consultation of experts. We included studies that reported psychometric properties of communication skills assessment rating scales used in OSCEs by examiners only. The methodological quality of included studies was assessed using the COnsensus based Standards for the selection of health status Measurement INstruments (COSMIN) checklist. The quality of psychometric properties was evaluated using the quality criteria of Terwee and colleagues. Data of twelve studies reporting on eight rating scales on communication skills assessment in OSCEs were included. Five of eight rating scales were explicitly developed based on a specific definition of communication skills. The methodological quality of studies was mainly poor. The psychometric quality of the eight rating scales was mainly intermediate. Our results reveal that future psychometric evaluation studies focusing on improving the methodological quality are needed in order to yield psychometrically sound results of the OSCEs assessing communication skills. This is especially important given that most OSCE rating scales are used for summative assessment, and thus have an impact on medical students' academic success.

  16. A subcontinental view of forest plant invasions

    Treesearch

    Christopher M. Oswalt; Songlin Fei; Qinfeng Guo; Basil V. Iannone III; Sonja N. Oswalt; Bryan C. Pijanowski; Kevin M. Potter

    2015-01-01

    Over the last few decades, considerable attention has focused on small-scale studies of invasive plants and invaded systems. Unfortunately, small scale studies rarely provide comprehensive insight into the complexities of biological invasions at macroscales. Systematic and repeated monitoring of biological invasions at broad scales are rare. In this report, we...

  17. A Global Survey of Oceanic Mesoscale Convective Systems in Association with the Large-scale Water Vapor and Vertical Wind Shear

    NASA Astrophysics Data System (ADS)

    Yuan, J.; Zhan, T.

    2017-12-01

    Sizes and organizations of mesoscale scale convective systems (MCSs) usually are related to both their precipitation characteristics and anvil productivity, which are crucial but not well-represented in current climate models. This study aims to further our knowledge about MCSs by documenting the relationship between MCSs and their associated large-scale environmental moisture and wind shear in different phases of large-scale convection. A dataset derived from MODIS and AMSR-E and TRMM, CMOPH and ERA-Interim reanalysis are used. Larger and merged systems tend to occur more frequently when the large-scale convection is stronger. At the occurrence time of MCSs, the middle troposphere relative humidity (MRH, 800-400hPa) shows large increases ( 15%) from the suppressed to the active phases. Differences of the MRH across phases appear in a large area and reaches its maximum at 650 850 km away from the center of MCSs. Higher MRH is found within 650 km around the center of merged and large MCSs in all phases. This distance is much larger than the size of any single MCSs. The MRH shows larger spatial gradients around merged MCSs, indicating that moisture tends to cluster around merged systems. Similar spatial differences of MRH appear at all phases 1-3 days before the MCSs occur. In lower troposphere (1000-850hPa), differences in the relative humidity are much smaller than that of MRH. In all phases around all MCSs the oceanic boundary layer is always effectively moisturized (RH>92%). Temporally the lower troposphere relative humidity is dominated by diurnal variations. No clear difference across systems of the wind shear is found when the domain-wide upward motion is dominated. In all cases there are always large low-level (1000-750hPa) wind shear (7-9m/s) and middle level (1000-750hPa) wind shear (11-15m/s) occurring at large distances (>500km) away from MCSs. However, both the low-level and the middle level wind shear closely around the MCSs converge to moderate values of 3-4.2m/s and 5-7m/s, respectively. Indicating that weak or moderate wind shear conditions favor developments of MCSs. Small but systematical differences in wind shear across phases are found. This study provides an observational reference for both cloud resovling or climate models to diagnose and improve their representaions of organized convection.

  18. Effect of Home Exercise Program in Patients With Knee Osteoarthritis: A Systematic Review and Meta-analysis.

    PubMed

    Anwer, Shahnawaz; Alghadir, Ahmad; Brismée, Jean-Michel

    2016-01-01

    The Osteoarthritis Research Society International recommended that nonpharmacological methods include patient education programs, weight reduction, coping strategies, and exercise programs for the management of knee osteoarthritis (OA). However, neither a systematic review nor a meta-analysis has been published regarding the effectiveness of home exercise programs for the management of knee OA. The purpose of this systematic review was to examine the evidence regarding the effect of home exercise programs with and without supervised clinic-based exercises in the management of knee OA. We searched PubMed, CINAHL, Embase, Scopus, and PEDro for research articles published prior to September 2014 using key words such as pain, exercise, home exercise program, rehabilitation, supervised exercise program, and physiotherapy in combination with Medical Subject Headings "Osteoarthritis knee." We selected randomized and case-controlled trials published in English language. To verify the quality of the selected studies, we applied the PEDro Scale. Two evaluators individually selected the studies based on titles, excluding those articles that were not related to the objectives of this review. One evaluator extracted data from the included studies. A second evaluator independently verified extracted data for accuracy. A total of 31 studies were found in the search. Of these, 19 studies met the inclusion criteria and were further analyzed. Seventeen of these 19 studies reached high methodological quality on the PEDro scale. Although the methods and home exercise program interventions varied widely in these studies, most found significant improvements in pain and function in individuals with knee OA. The analysis indicated that both home exercise programs with and without supervised clinic-based exercises were beneficial in the management of knee OA. The large evidence of high-quality trials supports the effectiveness of home exercise programs with and without supervised clinic-based exercises in the rehabilitation of knee OA. In addition, small but growing evidence supports the effectiveness of other types of exercise such as tai chi, balance, and proprioceptive training for individuals with knee OA.

  19. Robust phenotyping strategies for evaluation of stem non-structural carbohydrates (NSC) in rice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Diane R.; Wolfrum, Edward J.; Virk, Parminder

    Rice plants ( Oryza sativa) accumulate excess photoassimilates in the form of non-structural carbohydrates (NSCs) in their stems prior to heading that can later be mobilized to supplement photosynthate production during grain-filling. Despite longstanding interest in stem NSC for rice improvement, the dynamics of NSC accumulation, remobilization, and re-accumulation that have genetic potential for optimization have not been systematically investigated. Here we conducted three pilot experiments to lay the groundwork for large-scale diversity studies on rice stem NSC. We assessed the relationship of stem NSC components with 21 agronomic traits in large-scale, tropical yield trials using 33 breeder-nominated lines, establishedmore » an appropriate experimental design for future genetic studies using a Bayesian framework to sample sub-datasets from highly replicated greenhouse data using 36 genetically diverse genotypes, and used 434 phenotypically divergent rice stem samples to develop two partial least-squares (PLS) models using near-infrared (NIR) spectra for accurate, rapid prediction of rice stem starch, sucrose, and total non-structural carbohydrates. Lastly, we find evidence that stem reserves are most critical for short-duration varieties and suggest that pre-heading stem NSC is worthy of further experimentation for breeding early maturing rice.« less

  20. Efficient processing of fluorescence images using directional multiscale representations.

    PubMed

    Labate, D; Laezza, F; Negi, P; Ozcan, B; Papadakis, M

    2014-01-01

    Recent advances in high-resolution fluorescence microscopy have enabled the systematic study of morphological changes in large populations of cells induced by chemical and genetic perturbations, facilitating the discovery of signaling pathways underlying diseases and the development of new pharmacological treatments. In these studies, though, due to the complexity of the data, quantification and analysis of morphological features are for the vast majority handled manually, slowing significantly data processing and limiting often the information gained to a descriptive level. Thus, there is an urgent need for developing highly efficient automated analysis and processing tools for fluorescent images. In this paper, we present the application of a method based on the shearlet representation for confocal image analysis of neurons. The shearlet representation is a newly emerged method designed to combine multiscale data analysis with superior directional sensitivity, making this approach particularly effective for the representation of objects defined over a wide range of scales and with highly anisotropic features. Here, we apply the shearlet representation to problems of soma detection of neurons in culture and extraction of geometrical features of neuronal processes in brain tissue, and propose it as a new framework for large-scale fluorescent image analysis of biomedical data.

  1. Robust phenotyping strategies for evaluation of stem non-structural carbohydrates (NSC) in rice

    PubMed Central

    Wang, Diane R.; Wolfrum, Edward J.; Virk, Parminder; Ismail, Abdelbagi; Greenberg, Anthony J.; McCouch, Susan R.

    2016-01-01

    Rice plants (Oryza sativa) accumulate excess photoassimilates in the form of non-structural carbohydrates (NSCs) in their stems prior to heading that can later be mobilized to supplement photosynthate production during grain-filling. Despite longstanding interest in stem NSC for rice improvement, the dynamics of NSC accumulation, remobilization, and re-accumulation that have genetic potential for optimization have not been systematically investigated. Here we conducted three pilot experiments to lay the groundwork for large-scale diversity studies on rice stem NSC. We assessed the relationship of stem NSC components with 21 agronomic traits in large-scale, tropical yield trials using 33 breeder-nominated lines, established an appropriate experimental design for future genetic studies using a Bayesian framework to sample sub-datasets from highly replicated greenhouse data using 36 genetically diverse genotypes, and used 434 phenotypically divergent rice stem samples to develop two partial least-squares (PLS) models using near-infrared (NIR) spectra for accurate, rapid prediction of rice stem starch, sucrose, and total non-structural carbohydrates. We find evidence that stem reserves are most critical for short-duration varieties and suggest that pre-heading stem NSC is worthy of further experimentation for breeding early maturing rice. PMID:27707775

  2. Robust phenotyping strategies for evaluation of stem non-structural carbohydrates (NSC) in rice

    DOE PAGES

    Wang, Diane R.; Wolfrum, Edward J.; Virk, Parminder; ...

    2016-10-05

    Rice plants ( Oryza sativa) accumulate excess photoassimilates in the form of non-structural carbohydrates (NSCs) in their stems prior to heading that can later be mobilized to supplement photosynthate production during grain-filling. Despite longstanding interest in stem NSC for rice improvement, the dynamics of NSC accumulation, remobilization, and re-accumulation that have genetic potential for optimization have not been systematically investigated. Here we conducted three pilot experiments to lay the groundwork for large-scale diversity studies on rice stem NSC. We assessed the relationship of stem NSC components with 21 agronomic traits in large-scale, tropical yield trials using 33 breeder-nominated lines, establishedmore » an appropriate experimental design for future genetic studies using a Bayesian framework to sample sub-datasets from highly replicated greenhouse data using 36 genetically diverse genotypes, and used 434 phenotypically divergent rice stem samples to develop two partial least-squares (PLS) models using near-infrared (NIR) spectra for accurate, rapid prediction of rice stem starch, sucrose, and total non-structural carbohydrates. Lastly, we find evidence that stem reserves are most critical for short-duration varieties and suggest that pre-heading stem NSC is worthy of further experimentation for breeding early maturing rice.« less

  3. Efficient processing of fluorescence images using directional multiscale representations

    PubMed Central

    Labate, D.; Laezza, F.; Negi, P.; Ozcan, B.; Papadakis, M.

    2017-01-01

    Recent advances in high-resolution fluorescence microscopy have enabled the systematic study of morphological changes in large populations of cells induced by chemical and genetic perturbations, facilitating the discovery of signaling pathways underlying diseases and the development of new pharmacological treatments. In these studies, though, due to the complexity of the data, quantification and analysis of morphological features are for the vast majority handled manually, slowing significantly data processing and limiting often the information gained to a descriptive level. Thus, there is an urgent need for developing highly efficient automated analysis and processing tools for fluorescent images. In this paper, we present the application of a method based on the shearlet representation for confocal image analysis of neurons. The shearlet representation is a newly emerged method designed to combine multiscale data analysis with superior directional sensitivity, making this approach particularly effective for the representation of objects defined over a wide range of scales and with highly anisotropic features. Here, we apply the shearlet representation to problems of soma detection of neurons in culture and extraction of geometrical features of neuronal processes in brain tissue, and propose it as a new framework for large-scale fluorescent image analysis of biomedical data. PMID:28804225

  4. Metabolomic Modularity Analysis (MMA) to Quantify Human Liver Perfusion Dynamics.

    PubMed

    Sridharan, Gautham Vivek; Bruinsma, Bote Gosse; Bale, Shyam Sundhar; Swaminathan, Anandh; Saeidi, Nima; Yarmush, Martin L; Uygun, Korkut

    2017-11-13

    Large-scale -omics data are now ubiquitously utilized to capture and interpret global responses to perturbations in biological systems, such as the impact of disease states on cells, tissues, and whole organs. Metabolomics data, in particular, are difficult to interpret for providing physiological insight because predefined biochemical pathways used for analysis are inherently biased and fail to capture more complex network interactions that span multiple canonical pathways. In this study, we introduce a nov-el approach coined Metabolomic Modularity Analysis (MMA) as a graph-based algorithm to systematically identify metabolic modules of reactions enriched with metabolites flagged to be statistically significant. A defining feature of the algorithm is its ability to determine modularity that highlights interactions between reactions mediated by the production and consumption of cofactors and other hub metabolites. As a case study, we evaluated the metabolic dynamics of discarded human livers using time-course metabolomics data and MMA to identify modules that explain the observed physiological changes leading to liver recovery during subnormothermic machine perfusion (SNMP). MMA was performed on a large scale liver-specific human metabolic network that was weighted based on metabolomics data and identified cofactor-mediated modules that would not have been discovered by traditional metabolic pathway analyses.

  5. Sequencing and annotation of mitochondrial genomes from individual parasitic helminths.

    PubMed

    Jex, Aaron R; Littlewood, D Timothy; Gasser, Robin B

    2015-01-01

    Mitochondrial (mt) genomics has significant implications in a range of fundamental areas of parasitology, including evolution, systematics, and population genetics as well as explorations of mt biochemistry, physiology, and function. Mt genomes also provide a rich source of markers to aid molecular epidemiological and ecological studies of key parasites. However, there is still a paucity of information on mt genomes for many metazoan organisms, particularly parasitic helminths, which has often related to challenges linked to sequencing from tiny amounts of material. The advent of next-generation sequencing (NGS) technologies has paved the way for low cost, high-throughput mt genomic research, but there have been obstacles, particularly in relation to post-sequencing assembly and analyses of large datasets. In this chapter, we describe protocols for the efficient amplification and sequencing of mt genomes from small portions of individual helminths, and highlight the utility of NGS platforms to expedite mt genomics. In addition, we recommend approaches for manual or semi-automated bioinformatic annotation and analyses to overcome the bioinformatic "bottleneck" to research in this area. Taken together, these approaches have demonstrated applicability to a range of parasites and provide prospects for using complete mt genomic sequence datasets for large-scale molecular systematic and epidemiological studies. In addition, these methods have broader utility and might be readily adapted to a range of other medium-sized molecular regions (i.e., 10-100 kb), including large genomic operons, and other organellar (e.g., plastid) and viral genomes.

  6. Evaluating large-scale health programmes at a district level in resource-limited countries.

    PubMed

    Svoronos, Theodore; Mate, Kedar S

    2011-11-01

    Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool, called a driver diagram, traditionally used in implementation that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely-used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts.

  7. The accurate particle tracer code

    NASA Astrophysics Data System (ADS)

    Wang, Yulei; Liu, Jian; Qin, Hong; Yu, Zhi; Yao, Yicun

    2017-11-01

    The Accurate Particle Tracer (APT) code is designed for systematic large-scale applications of geometric algorithms for particle dynamical simulations. Based on a large variety of advanced geometric algorithms, APT possesses long-term numerical accuracy and stability, which are critical for solving multi-scale and nonlinear problems. To provide a flexible and convenient I/O interface, the libraries of Lua and Hdf5 are used. Following a three-step procedure, users can efficiently extend the libraries of electromagnetic configurations, external non-electromagnetic forces, particle pushers, and initialization approaches by use of the extendible module. APT has been used in simulations of key physical problems, such as runaway electrons in tokamaks and energetic particles in Van Allen belt. As an important realization, the APT-SW version has been successfully distributed on the world's fastest computer, the Sunway TaihuLight supercomputer, by supporting master-slave architecture of Sunway many-core processors. Based on large-scale simulations of a runaway beam under parameters of the ITER tokamak, it is revealed that the magnetic ripple field can disperse the pitch-angle distribution significantly and improve the confinement of energetic runaway beam on the same time.

  8. Past and present cosmic structure in the SDSS DR7 main sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jasche, J.; Leclercq, F.; Wandelt, B.D., E-mail: jasche@iap.fr, E-mail: florent.leclercq@polytechnique.org, E-mail: wandelt@iap.fr

    2015-01-01

    We present a chrono-cosmography project, aiming at the inference of the four dimensional formation history of the observed large scale structure from its origin to the present epoch. To do so, we perform a full-scale Bayesian analysis of the northern galactic cap of the Sloan Digital Sky Survey (SDSS) Data Release 7 main galaxy sample, relying on a fully probabilistic, physical model of the non-linearly evolved density field. Besides inferring initial conditions from observations, our methodology naturally and accurately reconstructs non-linear features at the present epoch, such as walls and filaments, corresponding to high-order correlation functions generated by late-time structuremore » formation. Our inference framework self-consistently accounts for typical observational systematic and statistical uncertainties such as noise, survey geometry and selection effects. We further account for luminosity dependent galaxy biases and automatic noise calibration within a fully Bayesian approach. As a result, this analysis provides highly-detailed and accurate reconstructions of the present density field on scales larger than ∼ 3 Mpc/h, constrained by SDSS observations. This approach also leads to the first quantitative inference of plausible formation histories of the dynamic large scale structure underlying the observed galaxy distribution. The results described in this work constitute the first full Bayesian non-linear analysis of the cosmic large scale structure with the demonstrated capability of uncertainty quantification. Some of these results will be made publicly available along with this work. The level of detail of inferred results and the high degree of control on observational uncertainties pave the path towards high precision chrono-cosmography, the subject of simultaneously studying the dynamics and the morphology of the inhomogeneous Universe.« less

  9. Effectiveness of information and communication technologies interventions to increase mental health literacy: A systematic review.

    PubMed

    Tay, Jing Ling; Tay, Yi Fen; Klainin-Yobas, Piyanee

    2018-06-13

    Most mental health conditions affect adolescent and young adults. The onset of many mental disorders occurs in the young age. This is a critical period to implement interventions to enhance mental health literacy (MHL) and to prevent the occurrence of mental health problems. This systematic review examined the effectiveness of information and communication technologies interventions on MHL (recognition of conditions, stigma and help-seeking). The authors searched for both published and unpublished studies. Nineteen studies were included with 9 randomized controlled trials and 10 quasi-experimental studies. Informational interventions were useful to enhance MHL of less-known disorders such as anxiety disorder and anorexia, but not depression. Interventions that were effective in enhancing depression MHL comprised active component such as videos or quizzes. Interventions that successfully elevated MHL also reduced stigma. Elevated MHL levels did not improve help-seeking, and reduction in stigma levels did not enhance help-seeking behaviours. Future good quality, large-scale, multi-sites randomized controlled trials are necessary to evaluate MHL interventions. © 2018 John Wiley & Sons Australia, Ltd.

  10. Atmospheric energetics in regions of intense convective activity

    NASA Technical Reports Server (NTRS)

    Fuelberg, H. E.

    1977-01-01

    Synoptic-scale budgets of kinetic and total potential energy are computed using 3- and 6-h data at nine times from NASA's fourth Atmospheric Variability Experiment (AVE IV). Two intense squall lines occurred during the period. Energy budgets for areas that enclose regions of intense convection are shown to have systematic changes that relate to the life cycles of the convection. Some of the synoptic-scale energy processes associated with the convection are found to be larger than those observed in the vicinity of mature cyclones. Volumes enclosing intense convection are found to have large values of cross-contour conversion of potential to kinetic energy and large horizontal export of kinetic energy. Although small net vertical transport of kinetic energy is observed, values at individual layers indicate large upward transport. Transfer of kinetic energy from grid to subgrid scales of motion occurs in the volumes. Latent heat release is large in the middle and upper troposphere and is thought to be the cause of the observed cyclic changes in the budget terms. Total potential energy is found to be imported horizontally in the lower half of the atmosphere, transported aloft, and then exported horizontally. Although local changes of kinetic energy and total potential energy are small, interaction between volumes enclosing convection with surrounding larger volumes is quite large.

  11. DGDFT: A massively parallel method for large scale density functional theory calculations.

    PubMed

    Hu, Wei; Lin, Lin; Yang, Chao

    2015-09-28

    We describe a massively parallel implementation of the recently developed discontinuous Galerkin density functional theory (DGDFT) method, for efficient large-scale Kohn-Sham DFT based electronic structure calculations. The DGDFT method uses adaptive local basis (ALB) functions generated on-the-fly during the self-consistent field iteration to represent the solution to the Kohn-Sham equations. The use of the ALB set provides a systematic way to improve the accuracy of the approximation. By using the pole expansion and selected inversion technique to compute electron density, energy, and atomic forces, we can make the computational complexity of DGDFT scale at most quadratically with respect to the number of electrons for both insulating and metallic systems. We show that for the two-dimensional (2D) phosphorene systems studied here, using 37 basis functions per atom allows us to reach an accuracy level of 1.3 × 10(-4) Hartree/atom in terms of the error of energy and 6.2 × 10(-4) Hartree/bohr in terms of the error of atomic force, respectively. DGDFT can achieve 80% parallel efficiency on 128,000 high performance computing cores when it is used to study the electronic structure of 2D phosphorene systems with 3500-14 000 atoms. This high parallel efficiency results from a two-level parallelization scheme that we will describe in detail.

  12. Friction in debris flows: inferences from large-scale flume experiments

    USGS Publications Warehouse

    Iverson, Richard M.; LaHusen, Richard G.; ,

    1993-01-01

    A recently constructed flume, 95 m long and 2 m wide, permits systematic experimentation with unsteady, nonuniform flows of poorly sorted geological debris. Preliminary experiments with water-saturated mixtures of sand and gravel show that they flow in a manner consistent with Coulomb frictional behavior. The Coulomb flow model of Savage and Hutter (1989, 1991), modified to include quasi-static pore-pressure effects, predicts flow-front velocities and flow depths reasonably well. Moreover, simple scaling analyses show that grain friction, rather than liquid viscosity or grain collisions, probably dominates shear resistance and momentum transport in the experimental flows. The same scaling indicates that grain friction is also important in many natural debris flows.

  13. Development of renormalization group analysis of turbulence

    NASA Technical Reports Server (NTRS)

    Smith, L. M.

    1990-01-01

    The renormalization group (RG) procedure for nonlinear, dissipative systems is now quite standard, and its applications to the problem of hydrodynamic turbulence are becoming well known. In summary, the RG method isolates self similar behavior and provides a systematic procedure to describe scale invariant dynamics in terms of large scale variables only. The parameterization of the small scales in a self consistent manner has important implications for sub-grid modeling. This paper develops the homogeneous, isotropic turbulence and addresses the meaning and consequence of epsilon-expansion. The theory is then extended to include a weak mean flow and application of the RG method to a sequence of models is shown to converge to the Navier-Stokes equations.

  14. Combining states without scale hierarchies with ordered parton showers

    DOE PAGES

    Fischer, Nadine; Prestel, Stefan

    2017-09-12

    Here, we present a parameter-free scheme to combine fixed-order multi-jet results with parton-shower evolution. The scheme produces jet cross sections with leading-order accuracy in the complete phase space of multiple emissions, resumming large logarithms when appropriate, while not arbitrarily enforcing ordering on momentum configurations beyond the reach of the parton-shower evolution equation. This then requires the development of a matrix-element correction scheme for complex phase-spaces including ordering conditions as well as a systematic scale-setting procedure for unordered phase-space points. Our algorithm does not require a merging-scale parameter. We implement the new method in the Vincia framework and compare to LHCmore » data.« less

  15. What Googling Trends Tell Us About Public Interest in Earthquakes

    NASA Astrophysics Data System (ADS)

    Tan, Y. J.; Maharjan, R.

    2017-12-01

    Previous studies have shown that immediately after large earthquakes, there is a period of increased public interest. This represents a window of opportunity for science communication and disaster relief fundraising efforts to reach more people. However, how public interest varies for different earthquakes has not been quantified systematically on a global scale. We analyze how global search interest for the term "earthquake" on Google varies following earthquakes of magnitude ≥ 5.5 from 2004 to 2016. We find that there is a spike in search interest after large earthquakes followed by an exponential temporal decay. Preliminary results suggest that the period of increased search interest scales with death toll and correlates with the period of increased media coverage. This suggests that the relationship between the period of increased public interest in earthquakes and death toll might be an effect of differences in media coverage. However, public interest never remains elevated for more than three weeks. Therefore, to take advantage of this short period of increased public interest, science communication and disaster relief fundraising efforts have to act promptly following devastating earthquakes.

  16. Frailty Defined by FRAIL Scale as a Predictor of Mortality: A Systematic Review and Meta-analysis.

    PubMed

    Kojima, Gotaro

    2018-06-01

    To conduct a systematic review of the literature on prospective cohort studies examining mortality risk according to frailty defined by FRAIL scale, and to perform a meta-analysis to synthesize the pooled risk estimates. Systematic review and meta-analysis. Embase, Scopus, MEDLINE, CINAHL, and PsycINFO were systematically searched in March 2018. References of included studies were reviewed and a forward citation tracking was performed on relevant review papers for additional studies. Additional data necessary for a meta-analysis were requested from corresponding authors. Community-dwelling middle-aged and older adults. Mortality risk due to frailty as defined by the FRAIL scale. After removing duplicates, there are 81 citations for title, abstract, and full-text screening. Eight studies were included in this review. Four studies calculated the area under the receiver operating characteristic curve, which ranged from 0.54 to 0.70. A random-effects meta-analysis was conducted on 3 studies that provided adjusted hazard ratios (HRs) of mortality risk according to 3 frailty groups (robust, prefrail, and frail) defined by FRAIL scale. Both frailty and prefrailty were significantly associated with higher mortality risk than robustness [pooled HR = 3.53, 95% confidence interval (CI) = 1.66-7.49, P = .001; pooled HR = 1.75, 95% CI = 1.14-2.70, P = .01, respectively]. No evidence of publication bias was observed. This study demonstrated that FRAIL scale is a tool that can effectively identify frailty/prefrailty status, as well as quantify frailty status in a graded manner in relation to mortality risk. Although its feasibility is of note, not many studies are yet using this relatively new tool. More studies are warranted regarding mortality and other health outcomes. Copyright © 2018 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  17. Investigating precipitation changes of anthropic origin: data and methodological issues

    NASA Astrophysics Data System (ADS)

    de Lima, Isabel; Lovejoy, Shaun

    2017-04-01

    There is much concern about the social, environmental and economic impacts of climate change that could result directly from changes in temperature and precipitation. For temperature, the situation is better understood; but despite the many studies that have been already dedicated to precipitation, change in this process - that could be associated to the transition to the Anthropocene - has not yet been convincingly proven. A large fraction of those studies have been exploring temporal (linear) trends in local precipitation, sometimes using records over only a few decades; other fewer studies have been dedicated to investigating global precipitation change. Overall, precipitation change of anthropic origin has showed to be difficult to establish with high statistical significance and, moreover, different data and products have displayed important discrepancies; this is valid even for global precipitation. We argue that the inadequate resolution and length of the data commonly used, as well as methodological issues, are among the main factors limiting the ability to identify the signature of change in precipitation. We propose several ways in which one can hope to improve the situation - or at least - clarify the difficulties. From the point of view of statistical analysis, the problem is one of detecting a low frequency anthropogenic signal in the presence of "noise" - the natural variability (the latter includes both internal dynamics and responses to volcanic, solar or other natural forcings). A consequence is that as one moves to longer and longer time scales, fluctuations are increasingly averaged and at some point, the anthropogenic signal will stand out above the natural variability noise. This approach can be systematized using scaling fluctuation analysis to characterizing different precipitation scaling regimes: weather, macroweather, climate - from higher to lower frequencies; in the anthropocene, the macroweather regime covers the range of time scales from about a month to ≈30 years. We illustrate this using local gauge data and three qualitatively different global scale precipitation products (from gauges, reanalyses and a satellite and gauge hybrid) that allow to investigate precipitation from monthly to centennial scales and in space from planetary down to 5°x5° scales. By systematically characterizing precipitation variability across wide ranges of time and space scales, we show that the anthropogenic signal only exceeded the natural variability at time scales larger than ≈20 years, so that the disagreement in the trends can be traced to these low frequencies.

  18. Improving patient safety through the systematic evaluation of patient outcomes

    PubMed Central

    Forster, Alan J.; Dervin, Geoff; Martin, Claude; Papp, Steven

    2012-01-01

    Despite increased advocacy for patient safety and several large-scale programs designed to reduce preventable harm, most notably surgical checklists, recent data evaluating entire health systems suggests that we are no further ahead in improving patient safety and that hospital complications are no less frequent now than in the 1990s. We suggest that the failure to systematically measure patient safety is the reason for our limited progress. In addition to defining patient safety outcomes and describing their financial and clinical impact, we argue why the failure to implement patient safety measurement systems has compromised the ability to move the agenda forward. We also present an overview of how patient safety can be assessed and the strengths and weaknesses of each method and comment on some of the consequences created by the absence of a systematic measurement system. PMID:23177520

  19. Neutrino footprint in large scale structure

    NASA Astrophysics Data System (ADS)

    Garay, Carlos Peña; Verde, Licia; Jimenez, Raul

    2017-03-01

    Recent constrains on the sum of neutrino masses inferred by analyzing cosmological data, show that detecting a non-zero neutrino mass is within reach of forthcoming cosmological surveys. Such a measurement will imply a direct determination of the absolute neutrino mass scale. Physically, the measurement relies on constraining the shape of the matter power spectrum below the neutrino free streaming scale: massive neutrinos erase power at these scales. However, detection of a lack of small-scale power from cosmological data could also be due to a host of other effects. It is therefore of paramount importance to validate neutrinos as the source of power suppression at small scales. We show that, independent on hierarchy, neutrinos always show a footprint on large, linear scales; the exact location and properties are fully specified by the measured power suppression (an astrophysical measurement) and atmospheric neutrinos mass splitting (a neutrino oscillation experiment measurement). This feature cannot be easily mimicked by systematic uncertainties in the cosmological data analysis or modifications in the cosmological model. Therefore the measurement of such a feature, up to 1% relative change in the power spectrum for extreme differences in the mass eigenstates mass ratios, is a smoking gun for confirming the determination of the absolute neutrino mass scale from cosmological observations. It also demonstrates the synergy between astrophysics and particle physics experiments.

  20. Implication of Taylor's hypothesis on amplitude modulation

    NASA Astrophysics Data System (ADS)

    Howland, Michael; Yang, Xiang

    2017-11-01

    Amplitude modulation is a physical phenomenon which describes the non-linear inter-scale interaction between large and small scales in a turbulent wall-bounded flow. The amplitude of the small scale fluctuations are modulated by the large scale flow structures. Due to the increase of amplitude modulation as a function of Reynolds number (Reτ = δuτ / ν), this phenomenon is frequently studied using experimental temporal 1D signals, taken using hot-wire anemometry. Typically, Taylor's frozen turbulence hypothesis has been invoked where the convection by velocity fluctuations is neglected and the mean velocity is used as the convective velocity. At high Reynolds numbers, turbulent fluctuations are comparable to the mean velocity in the near wall region (y+ O(10)), and as a result, using a constant global convective velocity systematically locally compresses or stretches a velocity signal when converting from temporal to spatial domain given a positive or negative fluctuation respectively. Despite this, temporal hot-wire data from wind tunnel or field experiments of high Reynolds number boundary layer flows can still be used for measuring modulation provided that the local fluid velocity is used as the local convective velocity. MH is funded through the National Science Foundation Graduate Research Fellowship under Grant No. DGE-1656518 and the Stanford Graduate Fellowship. XY is funded by the US AFOSR, Grant No. 1194592-1-TAAHO monitored by Dr. Ivett Leyva.

  1. Train the Trainer Effectiveness Trials of Behavioral Intervention for Individuals with Autism: A Systematic Review

    ERIC Educational Resources Information Center

    Shire, Stephanie Yoshiko; Kasari, Connie

    2014-01-01

    This systematic review examines train the trainer (TTT) effectiveness trials of behavioral interventions for individuals with autism spectrum disorder (ASD). Published methodological quality scales were used to assess studies including participant description, research design, intervention, outcomes, and analysis. Twelve studies including 9 weak…

  2. Quality Assessment of Studies Published in Open Access and Subscription Journals: Results of a Systematic Evaluation.

    PubMed

    Pastorino, Roberta; Milovanovic, Sonja; Stojanovic, Jovana; Efremov, Ljupcho; Amore, Rosarita; Boccia, Stefania

    2016-01-01

    Along with the proliferation of Open Access (OA) publishing, the interest for comparing the scientific quality of studies published in OA journals versus subscription journals has also increased. With our study we aimed to compare the methodological quality and the quality of reporting of primary epidemiological studies and systematic reviews and meta-analyses published in OA and non-OA journals. In order to identify the studies to appraise, we listed all OA and non-OA journals which published in 2013 at least one primary epidemiologic study (case-control or cohort study design), and at least one systematic review or meta-analysis in the field of oncology. For the appraisal, we picked up the first studies published in 2013 with case-control or cohort study design from OA journals (Group A; n = 12), and in the same time period from non-OA journals (Group B; n = 26); the first systematic reviews and meta-analyses published in 2013 from OA journals (Group C; n = 15), and in the same time period from non-OA journals (Group D; n = 32). We evaluated the methodological quality of studies by assessing the compliance of case-control and cohort studies to Newcastle and Ottawa Scale (NOS) scale, and the compliance of systematic reviews and meta-analyses to Assessment of Multiple Systematic Reviews (AMSTAR) scale. The quality of reporting was assessed considering the adherence of case-control and cohort studies to STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) checklist, and the adherence of systematic reviews and meta-analyses to Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) checklist. Among case-control and cohort studies published in OA and non-OA journals, we did not observe significant differences in the median value of NOS score (Group A: 7 (IQR 7-8) versus Group B: 8 (7-9); p = 0.5) and in the adherence to STROBE checklist (Group A, 75% versus Group B, 80%; p = 0.1). The results did not change after adjustment for impact factor. The compliance with AMSTAR and adherence to PRISMA checklist were comparable between systematic reviews and meta-analyses published in OA and non-OA journals (Group C, 46.0% versus Group D, 55.0%; p = 0.06), (Group C, 72.0% versus Group D, 76.0%; p = 0.1), respectively). The epidemiological studies published in OA journals in the field of oncology approach the same methodological quality and quality of reporting as studies published in non-OA journals.

  3. Quality Assessment of Studies Published in Open Access and Subscription Journals: Results of a Systematic Evaluation

    PubMed Central

    Pastorino, Roberta; Milovanovic, Sonja; Stojanovic, Jovana; Efremov, Ljupcho; Amore, Rosarita; Boccia, Stefania

    2016-01-01

    Introduction Along with the proliferation of Open Access (OA) publishing, the interest for comparing the scientific quality of studies published in OA journals versus subscription journals has also increased. With our study we aimed to compare the methodological quality and the quality of reporting of primary epidemiological studies and systematic reviews and meta-analyses published in OA and non-OA journals. Methods In order to identify the studies to appraise, we listed all OA and non-OA journals which published in 2013 at least one primary epidemiologic study (case-control or cohort study design), and at least one systematic review or meta-analysis in the field of oncology. For the appraisal, we picked up the first studies published in 2013 with case-control or cohort study design from OA journals (Group A; n = 12), and in the same time period from non-OA journals (Group B; n = 26); the first systematic reviews and meta-analyses published in 2013 from OA journals (Group C; n = 15), and in the same time period from non-OA journals (Group D; n = 32). We evaluated the methodological quality of studies by assessing the compliance of case-control and cohort studies to Newcastle and Ottawa Scale (NOS) scale, and the compliance of systematic reviews and meta-analyses to Assessment of Multiple Systematic Reviews (AMSTAR) scale. The quality of reporting was assessed considering the adherence of case-control and cohort studies to STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) checklist, and the adherence of systematic reviews and meta-analyses to Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) checklist. Results Among case-control and cohort studies published in OA and non-OA journals, we did not observe significant differences in the median value of NOS score (Group A: 7 (IQR 7–8) versus Group B: 8 (7–9); p = 0.5) and in the adherence to STROBE checklist (Group A, 75% versus Group B, 80%; p = 0.1). The results did not change after adjustment for impact factor. The compliance with AMSTAR and adherence to PRISMA checklist were comparable between systematic reviews and meta-analyses published in OA and non-OA journals (Group C, 46.0% versus Group D, 55.0%; p = 0.06), (Group C, 72.0% versus Group D, 76.0%; p = 0.1), respectively). Conclusion The epidemiological studies published in OA journals in the field of oncology approach the same methodological quality and quality of reporting as studies published in non-OA journals. PMID:27167982

  4. Lepton jets and low-mass sterile neutrinos at hadron colliders

    NASA Astrophysics Data System (ADS)

    Dube, Sourabh; Gadkari, Divya; Thalapillil, Arun M.

    2017-09-01

    Sterile neutrinos, if they exist, are potential harbingers for physics beyond the Standard Model. They have the capacity to shed light on our flavor sector, grand unification frameworks, dark matter sector and origins of baryon antibaryon asymmetry. There have been a few seminal studies that have broached the subject of sterile neutrinos with low, electroweak-scale masses (i.e. ΛQCD≪mNR≪mW± ) and investigated their reach at hadron colliders using lepton jets. These preliminary studies nevertheless assume background-free scenarios after certain selection criteria which are overly optimistic and untenable in realistic situations. These lead to incorrect projections. The unique signal topology and challenging hadronic environment also make this mass-scale regime ripe for a careful investigation. With the above motivations, we attempt to perform the first systematic study of low, electroweak-scale, right-handed neutrinos at hadron colliders, in this unique signal topology. There are currently no active searches at hadron colliders for sterile neutrino states in this mass range, and we frame the study in the context of the 13 TeV high-luminosity Large Hadron Collider and the proposed FCC-hh/SppC 100 TeV p p -collider.

  5. The Development, Validity and Reliability of TPACK-Deep: A Technological Pedagogical Content Knowledge Scale

    ERIC Educational Resources Information Center

    Yurdakul, Isil Kabakci; Odabasi, Hatice Ferhan; Kilicer, Kerem; Coklar, Ahmet Naci; Birinci, Gurkay; Kurt, Adile Askim

    2012-01-01

    The purpose of this study is to develop a TPACK (technological pedagogical content knowledge) scale based on the centered component of TPACK framework in order to measure preservice teachers' TPACK. A systematic and step-by-step approach was followed for the development of the scale. The validity and reliability studies of the scale were carried…

  6. Barium isotopes reveal role of ocean circulation on barium cycling in the Atlantic

    NASA Astrophysics Data System (ADS)

    Bates, Stephanie L.; Hendry, Katharine R.; Pryer, Helena V.; Kinsley, Christopher W.; Pyle, Kimberley M.; Woodward, E. Malcolm S.; Horner, Tristan J.

    2017-05-01

    We diagnose the relative influences of local-scale biogeochemical cycling and regional-scale ocean circulation on Atlantic barium cycling by analysing four new depth profiles of dissolved Ba concentrations and isotope compositions from the South and tropical North Atlantic. These new profiles exhibit systematic vertical, zonal and meridional variations that reflect the influence of both local-scale barite cycling and large-scale ocean circulation. Epipelagic decoupling of dissolved Ba and Si reported previously in the tropics is also found to be associated with significant Ba isotope heterogeneity. As such, we contend that this decoupling originates from the depth segregation of opal and barite formation but is exacerbated by weak vertical mixing. Zonal influence from isotopically-'heavy' water masses in the western North Atlantic evidence the advective inflow of Ba-depleted Upper Labrador Sea Water, which is not seen in the eastern basin or the South Atlantic. Meridional variations in Atlantic Ba isotope systematics below 2000 m appear entirely controlled by conservative mixing. Using an inverse isotopic mixing model, we calculate the Ba isotope composition of the Ba-poor northern end-member as +0.45 ‰ and the Ba-rich southern end-member +0.26 ‰, relative to NIST SRM 3104a. The near-conservative behaviour of Ba below 2000 m indicates that Ba isotopes can serve as an independent tracer of the provenance of northern- versus southern-sourced water masses in the deep Atlantic Ocean. This finding may prove useful in palaeoceanographic studies, should appropriate sedimentary archives be identified, and offers new insights into the processes that cycle Ba in seawater.

  7. Policy Guidance From a Multi-scale Suite of Natural Field and Digital Laboratories of Change: Hydrological Catchment Studies of Nutrient and Pollutant Source Releases, Waterborne Transport-Transformations and Mass Flows in Water Ecosystems

    NASA Astrophysics Data System (ADS)

    Destouni, G.

    2008-12-01

    Continental fresh water transports and loads excess nutrients and pollutants from various land surface sources, through the landscape, into downstream inland and coastal water environments. Our ability to understand, predict and control the eutrophication and the pollution pressures on inland, coastal and marine water ecosystems relies on our ability to quantify these mass flows. This paper synthesizes a series of hydro- biogeochemical studies of nutrient and pollutant sources, transport-transformations and mass flows in catchment areas across a range of scales, from continental, through regional and national, to individual drainage basin scales. Main findings on continental scales include correlations between country/catchment area, population and GDP and associated pollutant and nutrient loading, which differ significantly between world regions with different development levels. On regional scales, essential systematic near-coastal gaps are identified in the national monitoring of nutrient and pollutant loads from land to the sea. Combination of the unmonitored near-coastal area characteristics with the relevant regional nutrient and pollutant load correlations with these characteristics shows that the unmonitored nutrient and pollutant mass loads to the sea may often be as large as, or greater than the monitored river loads. Process studies on individual basin- scales show long-term nutrient and pollutant memories in the soil-groundwater systems of the basins, which may continue to uphold large mass loading to inland and coastal waters long time after mitigation of the sources. Linked hydro-biogeochemical-economic model studies finally demonstrate significant comparative advantages of policies that demand explicit quantitative account of the uncertainties implied by these monitoring gaps and long-term nutrient-pollution memories and time lags, and other knowledge, data and model limitations, instead of the now common neglect or subjective implicit handling of such uncertainties in strategies and practices for combating water pollution and eutrophication.

  8. Large-Scale SRM Screen of Urothelial Bladder Cancer Candidate Biomarkers in Urine.

    PubMed

    Duriez, Elodie; Masselon, Christophe D; Mesmin, Cédric; Court, Magali; Demeure, Kevin; Allory, Yves; Malats, Núria; Matondo, Mariette; Radvanyi, François; Garin, Jérôme; Domon, Bruno

    2017-04-07

    Urothelial bladder cancer is a condition associated with high recurrence and substantial morbidity and mortality. Noninvasive urinary tests that would detect bladder cancer and tumor recurrence are required to significantly improve patient care. Over the past decade, numerous bladder cancer candidate biomarkers have been identified in the context of extensive proteomics or transcriptomics studies. To translate these findings in clinically useful biomarkers, the systematic evaluation of these candidates remains the bottleneck. Such evaluation involves large-scale quantitative LC-SRM (liquid chromatography-selected reaction monitoring) measurements, targeting hundreds of signature peptides by monitoring thousands of transitions in a single analysis. The design of highly multiplexed SRM analyses is driven by several factors: throughput, robustness, selectivity and sensitivity. Because of the complexity of the samples to be analyzed, some measurements (transitions) can be interfered by coeluting isobaric species resulting in biased or inconsistent estimated peptide/protein levels. Thus the assessment of the quality of SRM data is critical to allow flagging these inconsistent data. We describe an efficient and robust method to process large SRM data sets, including the processing of the raw data, the detection of low-quality measurements, the normalization of the signals for each protein, and the estimation of protein levels. Using this methodology, a variety of proteins previously associated with bladder cancer have been assessed through the analysis of urine samples from a large cohort of cancer patients and corresponding controls in an effort to establish a priority list of most promising candidates to guide subsequent clinical validation studies.

  9. Seeing in the Dark: Weak Lensing from the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Huff, Eric Michael

    Statistical weak lensing by large-scale structure { cosmic shear { is a promising cosmological tool, which has motivated the design of several large upcoming astronomical surveys. This Thesis presents a measurement of cosmic shear using coadded Sloan Digital Sky Survey (SDSS) imaging in 168 square degrees of the equatorial region, with r < 23:5 and i < 22:5, a source number density of 2.2 per arcmin2 and median redshift of zmed = 0.52. These coadds were generated using a new rounding kernel method that was intended to minimize systematic errors in the lensing measurement due to coherent PSF anisotropies that are otherwise prevalent in the SDSS imaging data. Measurements of cosmic shear out to angular separations of 2 degrees are presented, along with systematics tests of the catalog generation and shear measurement steps that demonstrate that these results are dominated by statistical rather than systematic errors. Assuming a cosmological model corresponding to WMAP7 (Komatsu et al., 2011) and allowing only the amplitude of matter fluctuations sigma8 to vary, the best-t value of the amplitude of matter fluctuations is sigma 8=0.636+0.109-0.154 (1sigma); without systematic errors this would be sigma8=0.636+0.099 -0.137 (1sigma). Assuming a flat Λ CDM model, the combined constraints with WMAP7 are sigma8=0.784+0.028 -0.026 (1sigma). The 2sigma error range is 14 percent smaller than WMAP7 alone. Aside from the intrinsic value of such cosmological constraints from the growth of structure, some important lessons are identified for upcoming surveys that may face similar issues when combining multi-epoch data to measure cosmic shear. Motivated by the challenges faced in the cosmic shear measurement, two new lensing probes are suggested for increasing the available weak lensing signal. Both use galaxy scaling relations to control for scatter in lensing observables. The first employs a version of the well-known fundamental plane relation for early type galaxies. This modified "photometric fundamental plane" replaces velocity dispersions with photometric galaxy properties, thus obviating the need for spectroscopic data. We present the first detection of magnification using this method by applying it to photometric catalogs from the Sloan Digital Sky Survey. This analysis shows that the derived magnification signal is comparable to that available from conventional methods using gravitational shear. We suppress the dominant sources of systematic error and discuss modest improvements that may allow this method to equal or even surpass the signal-to-noise achievable with shear. Moreover, some of the dominant sources of systematic error are substantially different from those of shear-based techniques. The second outlines an idea for using the optical Tully-Fisher relation to dramatically improve the signal-to-noise and systematic error control for shear measurements. The expected error properties and potential advantages of such a measurement are proposed, and a pilot study is suggested in order to test the viability of Tully-Fisher weak lensing in the context of the forthcoming generation of large spectroscopic surveys.

  10. Multi-scale chromatin state annotation using a hierarchical hidden Markov model

    NASA Astrophysics Data System (ADS)

    Marco, Eugenio; Meuleman, Wouter; Huang, Jialiang; Glass, Kimberly; Pinello, Luca; Wang, Jianrong; Kellis, Manolis; Yuan, Guo-Cheng

    2017-04-01

    Chromatin-state analysis is widely applied in the studies of development and diseases. However, existing methods operate at a single length scale, and therefore cannot distinguish large domains from isolated elements of the same type. To overcome this limitation, we present a hierarchical hidden Markov model, diHMM, to systematically annotate chromatin states at multiple length scales. We apply diHMM to analyse a public ChIP-seq data set. diHMM not only accurately captures nucleosome-level information, but identifies domain-level states that vary in nucleosome-level state composition, spatial distribution and functionality. The domain-level states recapitulate known patterns such as super-enhancers, bivalent promoters and Polycomb repressed regions, and identify additional patterns whose biological functions are not yet characterized. By integrating chromatin-state information with gene expression and Hi-C data, we identify context-dependent functions of nucleosome-level states. Thus, diHMM provides a powerful tool for investigating the role of higher-order chromatin structure in gene regulation.

  11. Analysis of Helium Segregation on Surfaces of Plasma-Exposed Tungsten

    NASA Astrophysics Data System (ADS)

    Maroudas, Dimitrios; Hu, Lin; Hammond, Karl; Wirth, Brian

    2015-11-01

    We report a systematic theoretical and atomic-scale computational study of implanted helium segregation on surfaces of tungsten, which is considered as a plasma facing component in nuclear fusion reactors. We employ a hierarchy of atomic-scale simulations, including molecular statics to understand the origin of helium surface segregation, targeted molecular-dynamics (MD) simulations of near-surface cluster reactions, and large-scale MD simulations of implanted helium evolution in plasma-exposed tungsten. We find that small, mobile helium clusters (of 1-7 He atoms) in the near-surface region are attracted to the surface due to an elastic interaction force. This thermodynamic driving force induces drift fluxes of these mobile clusters toward the surface, facilitating helium segregation. Moreover, the clusters' drift toward the surface enables cluster reactions, most importantly trap mutation, at rates much higher than in the bulk material. This cluster dynamics has significant effects on the surface morphology, near-surface defect structures, and the amount of helium retained in the material upon plasma exposure.

  12. Tight-binding calculation studies of vacancy and adatom defects in graphene

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Wei; Lu, Wen-Cai; Zhang, Hong-Xing

    2016-02-19

    Computational studies of complex defects in graphene usually need to deal with a larger number of atoms than the current first-principles methods can handle. We show a recently developed three-center tight-binding potential for carbon is very efficient for large scale atomistic simulations and can accurately describe the structures and energies of various defects in graphene. Using the three-center tight-binding potential, we have systematically studied the stable structures and formation energies of vacancy and embedded-atom defects of various sizes up to 4 vacancies and 4 embedded atoms in graphene. In conclusion, our calculations reveal low-energy defect structures and provide a moremore » comprehensive understanding of the structures and stability of defects in graphene.« less

  13. Viscous decay of nonlinear oscillations of a spherical bubble at large Reynolds number

    NASA Astrophysics Data System (ADS)

    Smith, W. R.; Wang, Q. X.

    2017-08-01

    The long-time viscous decay of large-amplitude bubble oscillations is considered in an incompressible Newtonian fluid, based on the Rayleigh-Plesset equation. At large Reynolds numbers, this is a multi-scaled problem with a short time scale associated with inertial oscillation and a long time scale associated with viscous damping. A multi-scaled perturbation method is thus employed to solve the problem. The leading-order analytical solution of the bubble radius history is obtained to the Rayleigh-Plesset equation in a closed form including both viscous and surface tension effects. Some important formulae are derived including the following: the average energy loss rate of the bubble system during each cycle of oscillation, an explicit formula for the dependence of the oscillation frequency on the energy, and an implicit formula for the amplitude envelope of the bubble radius as a function of the energy. Our theory shows that the energy of the bubble system and the frequency of oscillation do not change on the inertial time scale at leading order, the energy loss rate on the long viscous time scale being inversely proportional to the Reynolds number. These asymptotic predictions remain valid during each cycle of oscillation whether or not compressibility effects are significant. A systematic parametric analysis is carried out using the above formula for the energy of the bubble system, frequency of oscillation, and minimum/maximum bubble radii in terms of the Reynolds number, the dimensionless initial pressure of the bubble gases, and the Weber number. Our results show that the frequency and the decay rate have substantial variations over the lifetime of a decaying oscillation. The results also reveal that large-amplitude bubble oscillations are very sensitive to small changes in the initial conditions through large changes in the phase shift.

  14. Turbulence as a Problem in Non-equilibrium Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Goldenfeld, Nigel; Shih, Hong-Yan

    2017-05-01

    The transitional and well-developed regimes of turbulent shear flows exhibit a variety of remarkable scaling laws that are only now beginning to be systematically studied and understood. In the first part of this article, we summarize recent progress in understanding the friction factor of turbulent flows in rough pipes and quasi-two-dimensional soap films, showing how the data obey a two-parameter scaling law known as roughness-induced criticality, and exhibit power-law scaling of friction factor with Reynolds number that depends on the precise form of the nature of the turbulent cascade. These results hint at a non-equilibrium fluctuation-dissipation relation that applies to turbulent flows. The second part of this article concerns the lifetime statistics in smooth pipes around the transition, showing how the remarkable super-exponential scaling with Reynolds number reflects deep connections between large deviation theory, extreme value statistics, directed percolation and the onset of coexistence in predator-prey ecosystems. Both these phenomena reflect the way in which turbulence can be fruitfully approached as a problem in non-equilibrium statistical mechanics.

  15. Temporal Organization of Sound Information in Auditory Memory.

    PubMed

    Song, Kun; Luo, Huan

    2017-01-01

    Memory is a constructive and organizational process. Instead of being stored with all the fine details, external information is reorganized and structured at certain spatiotemporal scales. It is well acknowledged that time plays a central role in audition by segmenting sound inputs into temporal chunks of appropriate length. However, it remains largely unknown whether critical temporal structures exist to mediate sound representation in auditory memory. To address the issue, here we designed an auditory memory transferring study, by combining a previously developed unsupervised white noise memory paradigm with a reversed sound manipulation method. Specifically, we systematically measured the memory transferring from a random white noise sound to its locally temporal reversed version on various temporal scales in seven experiments. We demonstrate a U-shape memory-transferring pattern with the minimum value around temporal scale of 200 ms. Furthermore, neither auditory perceptual similarity nor physical similarity as a function of the manipulating temporal scale can account for the memory-transferring results. Our results suggest that sounds are not stored with all the fine spectrotemporal details but are organized and structured at discrete temporal chunks in long-term auditory memory representation.

  16. Surface Rupture Effects on Earthquake Moment-Area Scaling Relations

    NASA Astrophysics Data System (ADS)

    Luo, Yingdi; Ampuero, Jean-Paul; Miyakoshi, Ken; Irikura, Kojiro

    2017-09-01

    Empirical earthquake scaling relations play a central role in fundamental studies of earthquake physics and in current practice of earthquake hazard assessment, and are being refined by advances in earthquake source analysis. A scaling relation between seismic moment ( M 0) and rupture area ( A) currently in use for ground motion prediction in Japan features a transition regime of the form M 0- A 2, between the well-recognized small (self-similar) and very large (W-model) earthquake regimes, which has counter-intuitive attributes and uncertain theoretical underpinnings. Here, we investigate the mechanical origin of this transition regime via earthquake cycle simulations, analytical dislocation models and numerical crack models on strike-slip faults. We find that, even if stress drop is assumed constant, the properties of the transition regime are controlled by surface rupture effects, comprising an effective rupture elongation along-dip due to a mirror effect and systematic changes of the shape factor relating slip to stress drop. Based on this physical insight, we propose a simplified formula to account for these effects in M 0- A scaling relations for strike-slip earthquakes.

  17. SU-E-J-257: A PCA Model to Predict Adaptive Changes for Head&neck Patients Based On Extraction of Geometric Features From Daily CBCT Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chetvertkov, M; Henry Ford Health System, Detroit, MI; Siddiqui, F

    2015-06-15

    Purpose: Using daily cone beam CTs (CBCTs) to develop principal component analysis (PCA) models of anatomical changes in head and neck (H&N) patients and to assess the possibility of using these prospectively in adaptive radiation therapy (ART). Methods: Planning CT (pCT) images of 4 H&N patients were deformed to model several different systematic changes in patient anatomy during the course of the radiation therapy (RT). A Pinnacle plugin was used to linearly interpolate the systematic change in patient for the 35 fraction RT course and to generate a set of 35 synthetic CBCTs. Each synthetic CBCT represents the systematic changemore » in patient anatomy for each fraction. Deformation vector fields (DVFs) were acquired between the pCT and synthetic CBCTs with random fraction-to-fraction changes were superimposed on the DVFs. A patient-specific PCA model was built using these DVFs containing systematic plus random changes. It was hypothesized that resulting eigenDVFs (EDVFs) with largest eigenvalues represent the major anatomical deformations during the course of treatment. Results: For all 4 patients, the PCA model provided different results depending on the type and size of systematic change in patient’s body. PCA was more successful in capturing the systematic changes early in the treatment course when these were of a larger scale with respect to the random fraction-to-fraction changes in patient’s anatomy. For smaller scale systematic changes, random changes in patient could completely “hide” the systematic change. Conclusion: The leading EDVF from the patientspecific PCA models could tentatively be identified as a major systematic change during treatment if the systematic change is large enough with respect to random fraction-to-fraction changes. Otherwise, leading EDVF could not represent systematic changes reliably. This work is expected to facilitate development of population-based PCA models that can be used to prospectively identify significant anatomical changes early in treatment. This work is supported in part by a grant from Varian Medical Systems, Palo Alto, CA.« less

  18. Tigers Need Cover: Multi-Scale Occupancy Study of the Big Cat in Sumatran Forest and Plantation Landscapes

    PubMed Central

    Sunarto, Sunarto; Kelly, Marcella J.; Parakkasi, Karmila; Klenzendorf, Sybille; Septayuda, Eka; Kurniawan, Harry

    2012-01-01

    The critically endangered Sumatran tiger (Panthera tigris sumatrae Pocock, 1929) is generally known as a forest-dependent animal. With large-scale conversion of forests into plantations, however, it is crucial for restoration efforts to understand to what extent tigers use modified habitats. We investigated tiger-habitat relationships at 2 spatial scales: occupancy across the landscape and habitat use within the home range. Across major landcover types in central Sumatra, we conducted systematic detection, non-detection sign surveys in 47, 17×17 km grid cells. Within each cell, we surveyed 40, 1-km transects and recorded tiger detections and habitat variables in 100 m segments totaling 1,857 km surveyed. We found that tigers strongly preferred forest and used plantations of acacia and oilpalm, far less than their availability. Tiger probability of occupancy covaried positively and strongly with altitude, positively with forest area, and negatively with distance-to-forest centroids. At the fine scale, probability of habitat use by tigers across landcover types covaried positively and strongly with understory cover and altitude, and negatively and strongly with human settlement. Within forest areas, tigers strongly preferred sites that are farther from water bodies, higher in altitude, farther from edge, and closer to centroid of large forest block; and strongly preferred sites with thicker understory cover, lower level of disturbance, higher altitude, and steeper slope. These results indicate that to thrive, tigers depend on the existence of large contiguous forest blocks, and that with adjustments in plantation management, tigers could use mosaics of plantations (as additional roaming zones), riparian forests (as corridors) and smaller forest patches (as stepping stones), potentially maintaining a metapopulation structure in fragmented landscapes. This study highlights the importance of a multi-spatial scale analysis and provides crucial information relevant to restoring tigers and other wildlife in forest and plantation landscapes through improvement in habitat extent, quality, and connectivity. PMID:22292063

  19. Tigers need cover: multi-scale occupancy study of the big cat in Sumatran forest and plantation landscapes.

    PubMed

    Sunarto, Sunarto; Kelly, Marcella J; Parakkasi, Karmila; Klenzendorf, Sybille; Septayuda, Eka; Kurniawan, Harry

    2012-01-01

    The critically endangered Sumatran tiger (Panthera tigris sumatrae Pocock, 1929) is generally known as a forest-dependent animal. With large-scale conversion of forests into plantations, however, it is crucial for restoration efforts to understand to what extent tigers use modified habitats. We investigated tiger-habitat relationships at 2 spatial scales: occupancy across the landscape and habitat use within the home range. Across major landcover types in central Sumatra, we conducted systematic detection, non-detection sign surveys in 47, 17×17 km grid cells. Within each cell, we surveyed 40, 1-km transects and recorded tiger detections and habitat variables in 100 m segments totaling 1,857 km surveyed. We found that tigers strongly preferred forest and used plantations of acacia and oilpalm, far less than their availability. Tiger probability of occupancy covaried positively and strongly with altitude, positively with forest area, and negatively with distance-to-forest centroids. At the fine scale, probability of habitat use by tigers across landcover types covaried positively and strongly with understory cover and altitude, and negatively and strongly with human settlement. Within forest areas, tigers strongly preferred sites that are farther from water bodies, higher in altitude, farther from edge, and closer to centroid of large forest block; and strongly preferred sites with thicker understory cover, lower level of disturbance, higher altitude, and steeper slope. These results indicate that to thrive, tigers depend on the existence of large contiguous forest blocks, and that with adjustments in plantation management, tigers could use mosaics of plantations (as additional roaming zones), riparian forests (as corridors) and smaller forest patches (as stepping stones), potentially maintaining a metapopulation structure in fragmented landscapes. This study highlights the importance of a multi-spatial scale analysis and provides crucial information relevant to restoring tigers and other wildlife in forest and plantation landscapes through improvement in habitat extent, quality, and connectivity.

  20. Systematic Review of Chinese Medicine for Miscarriage during Early Pregnancy

    PubMed Central

    Leung, Ping Chung; Chung, Tony Kwok Hung; Wang, Chi Chiu

    2014-01-01

    Background. Miscarriage is a very common complication during early pregnancy. So far, clinical therapies have limitation in preventing the early pregnancy loss. Chinese Medicine, regarded as gentle, effective, and safe, has become popular and common as a complementary and alternative treatment for miscarriages. However, the evidence to support its therapeutic efficacy and safety is still very limited. Objectives and Methods. To summarize the clinical application of Chinese Medicine for pregnancy and provide scientific evidence on the efficacy and safety of Chinese medicines for miscarriage, we located all the relevant pieces of literature on the clinical applications of Chinese Medicine for miscarriage and worked out this systematic review. Results. 339,792 pieces of literature were identified, but no placebo was included and only few studies were selected for systematic review and conducted for meta-analysis. A combination of Chinese medicines and Western medicines was more effective than Chinese medicines alone. No specific safety problem was reported, but potential adverse events by certain medicines were identified. Conclusions. Studies vary considerably in design, interventions, and outcome measures; therefore conclusive results remain elusive. Large scales of randomized controlled trials and more scientific evidences are still necessary to confirm the efficacy and safety of Chinese medicines during early pregnancy. PMID:24648851

  1. Small intestinal submucosa extracellular matrix (CorMatrix®) in cardiovascular surgery: a systematic review

    PubMed Central

    Mosala Nezhad, Zahra; Poncelet, Alain; de Kerchove, Laurent; Gianello, Pierre; Fervaille, Caroline; El Khoury, Gebrine

    2016-01-01

    Extracellular matrix (ECM) derived from small intestinal submucosa (SIS) is widely used in clinical applications as a scaffold for tissue repair. Recently, CorMatrix® porcine SIS-ECM (CorMatrix Cardiovascular, Inc., Roswell, GA, USA) has gained popularity for ‘next-generation’ cardiovascular tissue engineering due to its ease of use, remodelling properties, lack of immunogenicity, absorbability and potential to promote native tissue growth. Here, we provide an overview of the biology of porcine SIS-ECM and systematically review the preclinical and clinical literature on its use in cardiovascular surgery. CorMatrix® has been used in a variety of cardiovascular surgical applications, and since it is the most widely used SIS-ECM, this material is the focus of this review. Since CorMatrix® is a relatively new product for cardiovascular surgery, some clinical and preclinical studies published lack systematic reporting of functional and pathological findings in sufficient numbers of subjects. There are also emerging reports to suggest that, contrary to expectations, an undesirable inflammatory response may occur in CorMatrix® implants in humans and longer-term outcomes at particular sites, such as the heart valves, may be suboptimal. Large-scale clinical studies are needed driven by robust protocols that aim to quantify the pathological process of tissue repair. PMID:26912574

  2. The hubble constant.

    PubMed

    Huchra, J P

    1992-04-17

    The Hubble constant is the constant of proportionality between recession velocity and distance in the expanding universe. It is a fundamental property of cosmology that sets both the scale and the expansion age of the universe. It is determined by measurement of galaxy The Hubble constant is the constant of proportionality between recession velocity and development of new techniques for the measurements of galaxy distances, both calibration uncertainties and debates over systematic errors remain. Current determinations still range over nearly a factor of 2; the higher values favored by most local measurements are not consistent with many theories of the origin of large-scale structure and stellar evolution.

  3. Large scale study on the variation of RF energy absorption in the head & brain regions of adults and children and evaluation of the SAM phantom conservativeness.

    PubMed

    Keshvari, J; Kivento, M; Christ, A; Bit-Babik, G

    2016-04-21

    This paper presents the results of two computational large scale studies using highly realistic exposure scenarios, MRI based human head and hand models, and two mobile phone models. The objectives are (i) to study the relevance of age when people are exposed to RF by comparing adult and child heads and (ii) to analyze and discuss the conservativeness of the SAM phantom for all age groups. Representative use conditions were simulated using detailed CAD models of two mobile phones operating between 900 MHz and 1950 MHz including configurations with the hand holding the phone, which were not considered in most previous studies. The peak spatial-average specific absorption rate (psSAR) in the head and the pinna tissues is assessed using anatomically accurate head and hand models. The first of the two mentioned studies involved nine head-, four hand- and two phone-models, the second study included six head-, four hand- and three simplified phone-models (over 400 configurations in total). In addition, both studies also evaluated the exposure using the SAM phantom. Results show no systematic differences between psSAR induced in the adult and child heads. The exposure level and its variation for different age groups may be different for particular phones, but no correlation between psSAR and model age was found. The psSAR from all exposure conditions was compared to the corresponding configurations using SAM, which was found to be conservative in the large majority of cases.

  4. Large scale study on the variation of RF energy absorption in the head & brain regions of adults and children and evaluation of the SAM phantom conservativeness

    NASA Astrophysics Data System (ADS)

    Keshvari, J.; Kivento, M.; Christ, A.; Bit-Babik, G.

    2016-04-01

    This paper presents the results of two computational large scale studies using highly realistic exposure scenarios, MRI based human head and hand models, and two mobile phone models. The objectives are (i) to study the relevance of age when people are exposed to RF by comparing adult and child heads and (ii) to analyze and discuss the conservativeness of the SAM phantom for all age groups. Representative use conditions were simulated using detailed CAD models of two mobile phones operating between 900 MHz and 1950 MHz including configurations with the hand holding the phone, which were not considered in most previous studies. The peak spatial-average specific absorption rate (psSAR) in the head and the pinna tissues is assessed using anatomically accurate head and hand models. The first of the two mentioned studies involved nine head-, four hand- and two phone-models, the second study included six head-, four hand- and three simplified phone-models (over 400 configurations in total). In addition, both studies also evaluated the exposure using the SAM phantom. Results show no systematic differences between psSAR induced in the adult and child heads. The exposure level and its variation for different age groups may be different for particular phones, but no correlation between psSAR and model age was found. The psSAR from all exposure conditions was compared to the corresponding configurations using SAM, which was found to be conservative in the large majority of cases.

  5. A Single Column Model Ensemble Approach Applied to the TWP-ICE Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davies, Laura; Jakob, Christian; Cheung, K.

    2013-06-27

    Single column models (SCM) are useful testbeds for investigating the parameterisation schemes of numerical weather prediction and climate models. The usefulness of SCM simulations are limited, however, by the accuracy of the best-estimate large-scale data prescribed. One method to address this uncertainty is to perform ensemble simulations of the SCM. This study first derives an ensemble of large-scale data for the Tropical Warm Pool International Cloud Experiment (TWP-ICE) based on an estimate of a possible source of error in the best-estimate product. This data is then used to carry out simulations with 11 SCM and 2 cloud-resolving models (CRM). Best-estimatemore » simulations are also performed. All models show that moisture related variables are close to observations and there are limited differences between the best-estimate and ensemble mean values. The models, however, show different sensitivities to changes in the forcing particularly when weakly forced. The ensemble simulations highlight important differences in the moisture budget between the SCM and CRM. Systematic differences are also apparent in the ensemble mean vertical structure of cloud variables. The ensemble is further used to investigate relations between cloud variables and precipitation identifying large differences between CRM and SCM. This study highlights that additional information can be gained by performing ensemble simulations enhancing the information derived from models using the more traditional single best-estimate simulation.« less

  6. The circuit architecture of whole brains at the mesoscopic scale.

    PubMed

    Mitra, Partha P

    2014-09-17

    Vertebrate brains of even moderate size are composed of astronomically large numbers of neurons and show a great degree of individual variability at the microscopic scale. This variation is presumably the result of phenotypic plasticity and individual experience. At a larger scale, however, relatively stable species-typical spatial patterns are observed in neuronal architecture, e.g., the spatial distributions of somata and axonal projection patterns, probably the result of a genetically encoded developmental program. The mesoscopic scale of analysis of brain architecture is the transitional point between a microscopic scale where individual variation is prominent and the macroscopic level where a stable, species-typical neural architecture is observed. The empirical existence of this scale, implicit in neuroanatomical atlases, combined with advances in computational resources, makes studying the circuit architecture of entire brains a practical task. A methodology has previously been proposed that employs a shotgun-like grid-based approach to systematically cover entire brain volumes with injections of neuronal tracers. This methodology is being employed to obtain mesoscale circuit maps in mouse and should be applicable to other vertebrate taxa. The resulting large data sets raise issues of data representation, analysis, and interpretation, which must be resolved. Even for data representation the challenges are nontrivial: the conventional approach using regional connectivity matrices fails to capture the collateral branching patterns of projection neurons. Future success of this promising research enterprise depends on the integration of previous neuroanatomical knowledge, partly through the development of suitable computational tools that encapsulate such expertise. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. The research and realization of multi-platform real-time message-oriented middleware in large-scale air traffic control system

    NASA Astrophysics Data System (ADS)

    Liang, Haijun; Ren, Jialong; Song, Tao

    2017-05-01

    Operating requirement of air traffic control system, the multi-platform real-time message-oriented middleware was studied and realized, which is composed of CDCC and CDCS. The former provides application process interface, while the latter realizes data synchronism of CDCC and data exchange. MQM, as one important part of it, provides message queue management and, encrypt and compress data during transmitting procedure. The practical system application verifies that the middleware can simplify the development of air traffic control system, enhance its stability, improve its systematic function and make it convenient for maintenance and reuse.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael Schmitt; Juan Deaton; Curt Papke

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL) under the guidance of United States Strategic Command has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept; analyzing basic infrastructuremore » requirements; identifying related infrastructure issues, concerns, and vulnerabilities and offers recommended solutions.« less

  9. Using Markov chains of nucleotide sequences as a possible precursor to predict functional roles of human genome: a case study on inactive chromatin regions.

    PubMed

    Lee, K-E; Lee, E-J; Park, H-S

    2016-08-30

    Recent advances in computational epigenetics have provided new opportunities to evaluate n-gram probabilistic language models. In this paper, we describe a systematic genome-wide approach for predicting functional roles in inactive chromatin regions by using a sequence-based Markovian chromatin map of the human genome. We demonstrate that Markov chains of sequences can be used as a precursor to predict functional roles in heterochromatin regions and provide an example comparing two publicly available chromatin annotations of large-scale epigenomics projects: ENCODE project consortium and Roadmap Epigenomics consortium.

  10. The effect of alcohol consumption on the adolescent brain: A systematic review of MRI and fMRI studies of alcohol-using youth

    PubMed Central

    Feldstein Ewing, Sarah W.; Sakhardande, Ashok; Blakemore, Sarah-Jayne

    2014-01-01

    Background A large proportion of adolescents drink alcohol, with many engaging in high-risk patterns of consumption, including binge drinking. Here, we systematically review and synthesize the existing empirical literature on how consuming alcohol affects the developing human brain in alcohol-using (AU) youth. Methods For this systematic review, we began by conducting a literature search using the PubMED database to identify all available peer-reviewed magnetic resonance imaging (MRI) and functional magnetic resonance imaging (fMRI) studies of AU adolescents (aged 19 and under). All studies were screened against a strict set of criteria designed to constrain the impact of confounding factors, such as co-occurring psychiatric conditions. Results Twenty-one studies (10 MRI and 11 fMRI) met the criteria for inclusion. A synthesis of the MRI studies suggested that overall, AU youth showed regional differences in brain structure as compared with non-AU youth, with smaller grey matter volumes and lower white matter integrity in relevant brain areas. In terms of fMRI outcomes, despite equivalent task performance between AU and non-AU youth, AU youth showed a broad pattern of lower task-relevant activation, and greater task-irrelevant activation. In addition, a pattern of gender differences was observed for brain structure and function, with particularly striking effects among AU females. Conclusions Alcohol consumption during adolescence was associated with significant differences in structure and function in the developing human brain. However, this is a nascent field, with several limiting factors (including small sample sizes, cross-sectional designs, presence of confounding factors) within many of the reviewed studies, meaning that results should be interpreted in light of the preliminary state of the field. Future longitudinal and large-scale studies are critical to replicate the existing findings, and to provide a more comprehensive and conclusive picture of the effect of alcohol consumption on the developing brain. PMID:26958467

  11. The effect of alcohol consumption on the adolescent brain: A systematic review of MRI and fMRI studies of alcohol-using youth.

    PubMed

    Ewing, Sarah W Feldstein; Sakhardande, Ashok; Blakemore, Sarah-Jayne

    2014-01-01

    A large proportion of adolescents drink alcohol, with many engaging in high-risk patterns of consumption, including binge drinking. Here, we systematically review and synthesize the existing empirical literature on how consuming alcohol affects the developing human brain in alcohol-using (AU) youth. For this systematic review, we began by conducting a literature search using the PubMED database to identify all available peer-reviewed magnetic resonance imaging (MRI) and functional magnetic resonance imaging (fMRI) studies of AU adolescents (aged 19 and under). All studies were screened against a strict set of criteria designed to constrain the impact of confounding factors, such as co-occurring psychiatric conditions. Twenty-one studies (10 MRI and 11 fMRI) met the criteria for inclusion. A synthesis of the MRI studies suggested that overall, AU youth showed regional differences in brain structure as compared with non-AU youth, with smaller grey matter volumes and lower white matter integrity in relevant brain areas. In terms of fMRI outcomes, despite equivalent task performance between AU and non-AU youth, AU youth showed a broad pattern of lower task-relevant activation, and greater task-irrelevant activation. In addition, a pattern of gender differences was observed for brain structure and function, with particularly striking effects among AU females. Alcohol consumption during adolescence was associated with significant differences in structure and function in the developing human brain. However, this is a nascent field, with several limiting factors (including small sample sizes, cross-sectional designs, presence of confounding factors) within many of the reviewed studies, meaning that results should be interpreted in light of the preliminary state of the field. Future longitudinal and large-scale studies are critical to replicate the existing findings, and to provide a more comprehensive and conclusive picture of the effect of alcohol consumption on the developing brain.

  12. Hail statistics for Germany derived from single-polarization radar data

    NASA Astrophysics Data System (ADS)

    Puskeiler, Marc; Kunz, Michael; Schmidberger, Manuel

    2016-09-01

    Despite the considerable damage potential related to severe hailstorms, knowledge about the local hail probability in Germany is very limited. Constructing a reliable hail probability map is challenging due largely to the lack of direct hail observations. In our study, we suggest a reasonable method by which to estimate hail signals from 3D radar reflectivity measured by conventional single-polarization radars between 2005 and 2011. Evaluating the radar-derived hail days with loss data from a building and an agricultural insurance company confirmed the reliability of the method and the results as expressed, for example, by a Heidke Skill Score HSS of 0.7. Overall, radar-derived hail days demonstrate very high spatial variability, which reflects the local-scale nature of deep moist convection. Nonetheless, systematic patterns related to climatic conditions and orography can also be observed. On the large scale, the number of hail days substantially increases from north to south, which may plausibly be explained by the higher thermal instability in the south. At regional and local scales, several hot spots with elevated hail frequency can be identified, in most cases downstream of the mountains. Several other characteristics including convective energy related to the events identified, differences in track lengths, and seasonal cycles are discussed.

  13. Spatial embedding of structural similarity in the cerebral cortex

    PubMed Central

    Song, H. Francis; Kennedy, Henry; Wang, Xiao-Jing

    2014-01-01

    Recent anatomical tracing studies have yielded substantial amounts of data on the areal connectivity underlying distributed processing in cortex, yet the fundamental principles that govern the large-scale organization of cortex remain unknown. Here we show that functional similarity between areas as defined by the pattern of shared inputs or outputs is a key to understanding the areal network of cortex. In particular, we report a systematic relation in the monkey, human, and mouse cortex between the occurrence of connections from one area to another and their similarity distance. This characteristic relation is rooted in the wiring distance dependence of connections in the brain. We introduce a weighted, spatially embedded random network model that robustly gives rise to this structure, as well as many other spatial and topological properties observed in cortex. These include features that were not accounted for in any previous model, such as the wide range of interareal connection weights. Connections in the model emerge from an underlying distribution of spatially embedded axons, thereby integrating the two scales of cortical connectivity—individual axons and interareal pathways—into a common geometric framework. These results provide insights into the origin of large-scale connectivity in cortex and have important implications for theories of cortical organization. PMID:25368200

  14. Complementary medicine for treatment of agitation and delirium in older persons: a systematic review and narrative synthesis.

    PubMed

    Levy, Ilana; Attias, Samuel; Ben-Arye, Eran; Bloch, Boaz; Schiff, Elad

    2017-05-01

    Agitation and delirium frequently occur in cognitively impaired older people. We conducted a systematic review with narrative synthesis of the literature aiming to assess effectiveness of complementary and alternative medicine (CAM) modalities to address these conditions. Following preliminary search, we included 40 original researches on CAM treatment of delirium and agitation in older persons. Then, the quality of these studies was assessed using the Downs and Black Checklist and Quality Assessment Tool for Studies with Diverse Designs, and the effect sizes were calculated. We subsequently conducted a narrative synthesis of the main findings, including theory development, preliminary synthesis, exploration of relationships within and between studies, and assessment of synthesis robustness. Forty articles that met the inclusion criteria were analyzed. Sixteen of these were randomized controlled trials. One article specifically addressed CAM treatment of delirium in patients without dementia, and the remaining 39 articles described treatments of agitated older persons with dementia. Thirty-five of the 40 included studies suggested that the investigated CAM therapies may ameliorate the severity of agitation and delirium. The physiological surrogates of agitation assessed in these studies included cortisol level, chromogranin A level, and heart rate variability. Very few of the studies systematically assessed safety issues, although no major adverse effects were reported. Overall, the systematic review of the literature suggests that several CAM modalities are potentially beneficial in the treatment of agitation and delirium among older persons. We suggest that promising CAM modalities should be further explored through large-scale randomized controlled trials in different clinical settings. Copyright © 2017 John Wiley & Sons, Ltd. Copyright © 2017 John Wiley & Sons, Ltd.

  15. Systematic reviews involving complementary and alternative medicine interventions had higher quality of reporting than conventional medicine reviews.

    PubMed

    Lawson, Margaret L; Pham, Ba'; Klassen, Terry P; Moher, David

    2005-08-01

    To compare the quality of systematic reviews reported in English and in languages other than English, and to determine whether there are differences between conventional medicine (CM) and complementary and alternative medicine (CAM) reports. We used the Oxman and Guyatt (OG) scale to assess the quality of reporting in 130 systematic reviews: 50 were language-restricted, 32 were language-inclusive but only English-language (EL) trials contained (inclusive-EL), and 48 were language-inclusive and included trials published in languages other than English (inclusive-LOE). Of the 130 reviews, 105 addressed CM interventions and 25 addressed CAM interventions. Comparison of the systematic reviews showed that the quality of reporting and reporting characteristics are not affected by inclusion or exclusion of LOE; however, the quality of reporting of systematic reviews involving CAM interventions is higher than that of reviews focusing on CM interventions. Informal comparison of the OG scale with the data collected on quality assessments showed that the OG scale performs well overall but may not identify important differences in comprehensiveness of the search strategy and avoidance of bias in study selection. Further research is required to determine the best methods for assessing quality of systematic reviews and whether the effect of language restrictions is dependent on the type of intervention (CM or CAM).

  16. Identifying, characterizing and predicting spatial patterns of lacustrine groundwater discharge

    NASA Astrophysics Data System (ADS)

    Tecklenburg, Christina; Blume, Theresa

    2017-10-01

    Lacustrine groundwater discharge (LGD) can significantly affect lake water balances and lake water quality. However, quantifying LGD and its spatial patterns is challenging because of the large spatial extent of the aquifer-lake interface and pronounced spatial variability. This is the first experimental study to specifically study these larger-scale patterns with sufficient spatial resolution to systematically investigate how landscape and local characteristics affect the spatial variability in LGD. We measured vertical temperature profiles around a 0.49 km2 lake in northeastern Germany with a needle thermistor, which has the advantage of allowing for rapid (manual) measurements and thus, when used in a survey, high spatial coverage and resolution. Groundwater inflow rates were then estimated using the heat transport equation. These near-shore temperature profiles were complemented with sediment temperature measurements with a fibre-optic cable along six transects from shoreline to shoreline and radon measurements of lake water samples to qualitatively identify LGD patterns in the offshore part of the lake. As the hydrogeology of the catchment is sufficiently homogeneous (sandy sediments of a glacial outwash plain; no bedrock control) to avoid patterns being dominated by geological discontinuities, we were able to test the common assumptions that spatial patterns of LGD are mainly controlled by sediment characteristics and the groundwater flow field. We also tested the assumption that topographic gradients can be used as a proxy for gradients of the groundwater flow field. Thanks to the extensive data set, these tests could be carried out in a nested design, considering both small- and large-scale variability in LGD. We found that LGD was concentrated in the near-shore area, but alongshore variability was high, with specific regions of higher rates and higher spatial variability. Median inflow rates were 44 L m-2 d-1 with maximum rates in certain locations going up to 169 L m-2 d-1. Offshore LGD was negligible except for two local hotspots on steep steps in the lake bed topography. Large-scale groundwater inflow patterns were correlated with topography and the groundwater flow field, whereas small-scale patterns correlated with grain size distributions of the lake sediment. These findings confirm results and assumptions of theoretical and modelling studies more systematically than was previously possible with coarser sampling designs. However, we also found that a significant fraction of the variance in LGD could not be explained by these controls alone and that additional processes need to be considered. While regression models using these controls as explanatory variables had limited power to predict LGD rates, the results nevertheless encourage the use of topographic indices and sediment heterogeneity as an aid for targeted campaigns in future studies of groundwater discharge to lakes.

  17. Introducing malaria rapid diagnostic tests in private medicine retail outlets: A systematic literature review

    PubMed Central

    Visser, Theodoor; Bruxvoort, Katia; Maloney, Kathleen; Leslie, Toby; Barat, Lawrence M.; Allan, Richard; Ansah, Evelyn K.; Anyanti, Jennifer; Boulton, Ian; Clarke, Siân E.; Cohen, Jessica L.; Cohen, Justin M.; Cutherell, Andrea; Dolkart, Caitlin; Eves, Katie; Fink, Günther; Goodman, Catherine; Hutchinson, Eleanor; Lal, Sham; Mbonye, Anthony; Onwujekwe, Obinna; Petty, Nora; Pontarollo, Julie; Poyer, Stephen; Schellenberg, David; Streat, Elizabeth; Ward, Abigail; Wiseman, Virginia; Whitty, Christopher J. M.; Yeung, Shunmay; Cunningham, Jane; Chandler, Clare I. R.

    2017-01-01

    Background Many patients with malaria-like symptoms seek treatment in private medicine retail outlets (PMR) that distribute malaria medicines but do not traditionally provide diagnostic services, potentially leading to overtreatment with antimalarial drugs. To achieve universal access to prompt parasite-based diagnosis, many malaria-endemic countries are considering scaling up malaria rapid diagnostic tests (RDTs) in these outlets, an intervention that may require legislative changes and major investments in supporting programs and infrastructures. This review identifies studies that introduced malaria RDTs in PMRs and examines study outcomes and success factors to inform scale up decisions. Methods Published and unpublished studies that introduced malaria RDTs in PMRs were systematically identified and reviewed. Literature published before November 2016 was searched in six electronic databases, and unpublished studies were identified through personal contacts and stakeholder meetings. Outcomes were extracted from publications or provided by principal investigators. Results Six published and six unpublished studies were found. Most studies took place in sub-Saharan Africa and were small-scale pilots of RDT introduction in drug shops or pharmacies. None of the studies assessed large-scale implementation in PMRs. RDT uptake varied widely from 8%-100%. Provision of artemisinin-based combination therapy (ACT) for patients testing positive ranged from 30%-99%, and was more than 85% in five studies. Of those testing negative, provision of antimalarials varied from 2%-83% and was less than 20% in eight studies. Longer provider training, lower RDT retail prices and frequent supervision appeared to have a positive effect on RDT uptake and provider adherence to test results. Performance of RDTs by PMR vendors was generally good, but disposal of medical waste and referral of patients to public facilities were common challenges. Conclusions Expanding services of PMRs to include malaria diagnostic services may hold great promise to improve malaria case management and curb overtreatment with antimalarials. However, doing so will require careful planning, investment and additional research to develop and sustain effective training, supervision, waste-management, referral and surveillance programs beyond the public sector. PMID:28253315

  18. Can we really use available scales for child and adolescent psychopathology across cultures? A systematic review of cross-cultural measurement invariance data.

    PubMed

    Stevanovic, Dejan; Jafari, Peyman; Knez, Rajna; Franic, Tomislav; Atilola, Olayinka; Davidovic, Nikolina; Bagheri, Zahra; Lakic, Aneta

    2017-02-01

    In this systematic review, we assessed available evidence for cross-cultural measurement invariance of assessment scales for child and adolescent psychopathology as an indicator of cross-cultural validity. A literature search was conducted using the Medline, PsychInfo, Scopus, Web of Science, and Google Scholar databases. Cross-cultural measurement invariance data was available for 26 scales. Based on the aggregation of the evidence from the studies under review, none of the evaluated scales have strong evidence for cross-cultural validity and suitability for cross-cultural comparison. A few of the studies showed a moderate level of measurement invariance for some scales (such as the Fear Survey Schedule for Children-Revised, Multidimensional Anxiety Scale for Children, Revised Child Anxiety and Depression Scale, Revised Children's Manifest Anxiety Scale, Mood and Feelings Questionnaire, and Disruptive Behavior Rating Scale), which may make them suitable in cross-cultural comparative studies. The remainder of the scales either showed weak or outright lack of measurement invariance. This review showed only limited testing for measurement invariance across cultural groups of scales for pediatric psychopathology, with evidence of cross-cultural validity for only a few scales. This study also revealed a need to improve practices of statistical analysis reporting in testing measurement invariance. Implications for future research are discussed.

  19. Characterization of Detectors and Instrument Systematics for the SPIDER CMB Polarimeter

    NASA Astrophysics Data System (ADS)

    Tucker, Rebecca Suzanne

    We know from the CMB and observations of large-scale structure that the universe is extremely flat, homogenous, and isotropic. The current favored mechanism for generating these characteristics is inflation, a theorized period of exponential expansion of the universe that occurred shortly after the Big Bang. Most theories of inflation generically predict a background of stochastic gravitational waves. These gravitational waves should leave their unique imprint on the polarization of the CMB via Thompson scattering. Scalar perturbations of the metric will cause a pattern of polarization with no curl (E-mode). Tensor perturbations (gravitational waves) will cause a unique pattern of polarization on the CMB that includes a curl component (B-mode). A measurement of the ratio of the tensor to scalar perturbations (r ) tells us the energy scale of inflation. Recent measurements by the BICEP2 team detect the B-mode spectrum with a tensor-to-scalar ratio of r=0.2 (+0.05, -0.07). An independent confirmation of this result is the next step towards understanding the inflationary universe. This thesis describes my work on a balloon-borne polarimeter called SPIDER, which is designed to illuminate the physics of the early universe through measurements of the cosmic microwave background polarization. SPIDER consists of six single-frequency, on-axis refracting telescopes contained in a shared-vacuum liquid-helium cryostat. Its large format arrays of millimeter-wave detectors and tight control of systematics will give it unprecedented sensitivity. This thesis describes how the SPIDER detectors are characterized and calibrated for flight, as well as how the systematics requirements for the SPIDER system are simulated and measured.

  20. Clinical Research That Matters: Designing Outcome-Based Research for Older Adults to Qualify for Systematic Reviews and Meta-Analyses.

    PubMed

    Lee, Jeannie K; Fosnight, Susan M; Estus, Erica L; Evans, Paula J; Pho, Victoria B; Reidt, Shannon; Reist, Jeffrey C; Ruby, Christine M; Sibicky, Stephanie L; Wheeler, Janel B

    2018-01-01

    Though older adults are more sensitive to the effects of medications than their younger counterparts, they are often excluded from manufacturer-based clinical studies. Practice-based research is a practical method to identify medication-related effects in older patients. This research also highlights the role of a pharmacist in improving care in this population. A single study rarely has strong enough evidence to change geriatric practice, unless it is a large-scale, multisite, randomized controlled trial that specifically targets older adults. It is important to design studies that may be used in systematic reviews or meta-analyses that build a stronger evidence base. Recent literature has documented a gap in advanced pharmacist training pertaining to research skills. In this paper, we hope to fill some of the educational gaps related to research in older adults. We define best practices when deciding on the type of study, inclusion and exclusion criteria, design of the intervention, how outcomes are measured, and how results are reported. Well-designed studies increase the pool of available data to further document the important role that pharmacists have in optimizing care of older patients.

  1. North American extreme temperature events and related large scale meteorological patterns: A review of statistical methods, dynamics, modeling, and trends

    DOE PAGES

    Grotjahn, Richard; Black, Robert; Leung, Ruby; ...

    2015-05-22

    This paper reviews research approaches and open questions regarding data, statistical analyses, dynamics, modeling efforts, and trends in relation to temperature extremes. Our specific focus is upon extreme events of short duration (roughly less than 5 days) that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). Methods used to define extreme events statistics and to identify and connect LSMPs to extreme temperatures are presented. Recent advances in statistical techniques can connect LSMPs to extreme temperatures through appropriately defined covariates that supplements more straightforward analyses. A wide array of LSMPs, ranging from synoptic tomore » planetary scale phenomena, have been implicated as contributors to extreme temperature events. Current knowledge about the physical nature of these contributions and the dynamical mechanisms leading to the implicated LSMPs is incomplete. There is a pressing need for (a) systematic study of the physics of LSMPs life cycles and (b) comprehensive model assessment of LSMP-extreme temperature event linkages and LSMP behavior. Generally, climate models capture the observed heat waves and cold air outbreaks with some fidelity. However they overestimate warm wave frequency and underestimate cold air outbreaks frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Climate models have been used to investigate past changes and project future trends in extreme temperatures. Overall, modeling studies have identified important mechanisms such as the effects of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs more specifically to understand the role of LSMPs on past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated so more research is needed to understand the limitations of climate models and improve model skill in simulating extreme temperatures and their associated LSMPs. Furthermore, the paper concludes with unresolved issues and research questions.« less

  2. Urbanisation, urbanicity, and health: a systematic review of the reliability and validity of urbanicity scales.

    PubMed

    Cyril, Sheila; Oldroyd, John C; Renzaho, Andre

    2013-05-28

    Despite a plethora of studies examining the effect of increased urbanisation on health, no single study has systematically examined the measurement properties of scales used to measure urbanicity. It is critical to distinguish findings from studies that use surrogate measures of urbanicity (e.g. population density) from those that use measures rigorously tested for reliability and validity. The purpose of this study was to assess the measurement reliability and validity of the available urbanicity scales and identify areas where more research is needed to facilitate the development of a standardised measure of urbanicity. Databases searched were MEDLINE with Full Text, CINAHL with Full Text, and PsycINFO (EBSCOhost) as well as Embase (Ovid) covering the period from January 1970 to April 2012. Studies included in this systematic review were those that focused on the development of an urbanicity scale with clearly defined items or the adoption of an existing scale, included at least one outcome measure related to health, published in peer-reviewed journals, the full text was available in English and tested for validity and reliability. Eleven studies met our inclusion criteria which were conducted in Sri Lanka, Austria, China, Nigeria, India and Philippines. They ranged in size from 3327 to 33,404 participants. The number of scale items ranged from 7 to 12 items in 5 studies. One study measured urban area socioeconomic disadvantage instead of urbanicity. The emerging evidence is that increased urbanisation is associated with deleterious health outcomes. It is possible that increased urbanisation is also associated with access and utilisation of health services. However, urbanicity measures differed across studies, and the reliability and validity properties of the used scales were not well established. There is an urgent need for studies to standardise measures of urbanicity. Longitudinal cohort studies to confirm the relationship between increased urbanisation and health outcomes are urgently needed.

  3. Urbanisation, urbanicity, and health: a systematic review of the reliability and validity of urbanicity scales

    PubMed Central

    2013-01-01

    Background Despite a plethora of studies examining the effect of increased urbanisation on health, no single study has systematically examined the measurement properties of scales used to measure urbanicity. It is critical to distinguish findings from studies that use surrogate measures of urbanicity (e.g. population density) from those that use measures rigorously tested for reliability and validity. The purpose of this study was to assess the measurement reliability and validity of the available urbanicity scales and identify areas where more research is needed to facilitate the development of a standardised measure of urbanicity. Methods Databases searched were MEDLINE with Full Text, CINAHL with Full Text, and PsycINFO (EBSCOhost) as well as Embase (Ovid) covering the period from January 1970 to April 2012. Studies included in this systematic review were those that focused on the development of an urbanicity scale with clearly defined items or the adoption of an existing scale, included at least one outcome measure related to health, published in peer-reviewed journals, the full text was available in English and tested for validity and reliability. Results Eleven studies met our inclusion criteria which were conducted in Sri Lanka, Austria, China, Nigeria, India and Philippines. They ranged in size from 3327 to 33,404 participants. The number of scale items ranged from 7 to 12 items in 5 studies. One study measured urban area socioeconomic disadvantage instead of urbanicity. The emerging evidence is that increased urbanisation is associated with deleterious health outcomes. It is possible that increased urbanisation is also associated with access and utilisation of health services. However, urbanicity measures differed across studies, and the reliability and validity properties of the used scales were not well established. Conclusion There is an urgent need for studies to standardise measures of urbanicity. Longitudinal cohort studies to confirm the relationship between increased urbanisation and health outcomes are urgently needed. PMID:23714282

  4. Aromatherapy for managing menopausal symptoms: A protocol for systematic review and meta-analysis.

    PubMed

    Choi, Jiae; Lee, Hye Won; Lee, Ju Ah; Lim, Hyun-Ja; Lee, Myeong Soo

    2018-02-01

    Aromatherapy is often used as a complementary therapy for women's health. This systematic review aims to evaluate the therapeutic effects of aromatherapy as a management for menopausal symptoms. Eleven electronic databases will be searched from inception to February 2018. Randomized controlled trials that evaluated any type of aromatherapy against any type of control in individuals with menopausal symptoms will be eligible. The methodological quality will be assessed using the Cochrane risk of bias tool. Two authors will independently assess each study for eligibility and risk of bias and to extract data. This study will provide a high quality synthesis of current evidence of aromatherapy for menopausal symptoms measured with Menopause Rating Scale, the Kupperman Index, the Greene Climacteric Scale, or other validated questionnaires. The conclusion of our systematic review will provide evidence to judge whether aromatherapy is an effective intervention for patient with menopausal women. Ethical approval will not be required, given that this protocol is for a systematic review. The systematic review will be published in a peer-reviewed journal. The review will also be disseminated electronically and in print. PROSPERO CRD42017079191.

  5. SPIDER OPTIMIZATION. II. OPTICAL, MAGNETIC, AND FOREGROUND EFFECTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Dea, D. T.; Clark, C. N.; Contaldi, C. R.

    2011-09-01

    SPIDER is a balloon-borne instrument designed to map the polarization of the cosmic microwave background (CMB) with degree-scale resolution over a large fraction of the sky. SPIDER's main goal is to measure the amplitude of primordial gravitational waves through their imprint on the polarization of the CMB if the tensor-to-scalar ratio, r, is greater than 0.03. To achieve this goal, instrumental systematic errors must be controlled with unprecedented accuracy. Here, we build on previous work to use simulations of SPIDER observations to examine the impact of several systematic effects that have been characterized through testing and modeling of various instrumentmore » components. In particular, we investigate the impact of the non-ideal spectral response of the half-wave plates, coupling between focal-plane components and Earth's magnetic field, and beam mismatches and asymmetries. We also present a model of diffuse polarized foreground emission based on a three-dimensional model of the Galactic magnetic field and dust, and study the interaction of this foreground emission with our observation strategy and instrumental effects. We find that the expected level of foreground and systematic contamination is sufficiently low for SPIDER to achieve its science goals.« less

  6. Division of Labor in Vocabulary Structure: Insights From Corpus Analyses.

    PubMed

    Christiansen, Morten H; Monaghan, Padraic

    2016-07-01

    Psychologists have used experimental methods to study language for more than a century. However, only with the recent availability of large-scale linguistic databases has a more complete picture begun to emerge of how language is actually used, and what information is available as input to language acquisition. Analyses of such "big data" have resulted in reappraisals of key assumptions about the nature of language. As an example, we focus on corpus-based research that has shed new light on the arbitrariness of the sign: the longstanding assumption that the relationship between the sound of a word and its meaning is arbitrary. The results reveal a systematic relationship between the sound of a word and its meaning, which is stronger for early acquired words. Moreover, the analyses further uncover a systematic relationship between words and their lexical categories-nouns and verbs sound differently from each other-affecting how we learn new words and use them in sentences. Together, these results point to a division of labor between arbitrariness and systematicity in sound-meaning mappings. We conclude by arguing in favor of including "big data" analyses into the language scientist's methodological toolbox. Copyright © 2015 Cognitive Science Society, Inc.

  7. In Situ Operating Room-Based Simulation: A Review.

    PubMed

    Owei, Lily; Neylan, Christopher J; Rao, Raghavendra; Caskey, Robert C; Morris, Jon B; Sensenig, Richard; Brooks, Ari D; Dempsey, Daniel T; Williams, Noel N; Atkins, Joshua H; Baranov, Dimitry Y; Dumon, Kristoffel R

    To systematically review the literature surrounding operating room-based in situ training in surgery. A systematic review was conducted of MEDLINE. The review was conducted based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology, and employed the Population, Intervention, Comparator, Outcome (PICO) structure to define inclusion/exclusion criteria. The Kirkpatrick model was used to further classify the outcome of in situ training when possible. The search returned 308 database hits, and ultimately 19 articles were identified that met the stated PICO inclusion criteria. Operating room-based in situ simulation is used for a variety of purposes and in a variety of settings, and it has the potential to offer unique advantages over other types of simulation. Only one randomized controlled trial was conducted comparing in situ simulation to off-site simulation, which found few significant differences. One large-scale outcome study showed improved perinatal outcomes in obstetrics. Although in situ simulation theoretically offers certain advantages over other types of simulation, especially in addressing system-wide or environmental threats, its efficacy has yet to be clearly demonstrated. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  8. A potential role of anti-poverty programs in health promotion

    PubMed Central

    Silverman, Kenneth; Holtyn, August F.; Jarvis, Brantley

    2016-01-01

    Poverty is one of the most pervasive risk factors underlying poor health, but is rarely targeted to improve health. Research on the effects of anti-poverty interventions on health has been limited, at least in part because funding for that research has been limited. Anti-poverty programs have been applied on a large scale, frequently by governments, but without systematic development and cumulative programmatic experimental studies. Anti-poverty programs that produce lasting effects on poverty have not been developed. Before evaluating the effect of anti-poverty programs on health, programs must be developed that can reduce poverty consistently. Anti-poverty programs require systematic development and cumulative programmatic scientific evaluation. Research on the therapeutic workplace could provide a model for that research and an adaptation of the therapeutic workplace could serve as a foundation of a comprehensive anti-poverty program. Once effective anti-poverty programs are developed, future research could determine if those programs improve health in addition to increasing income. The potential personal, health and economic benefits of effective anti-poverty programs could be substantial, and could justify the major efforts and expenses that would be required to support systematic research to develop such programs. PMID:27235603

  9. A potential role of anti-poverty programs in health promotion.

    PubMed

    Silverman, Kenneth; Holtyn, August F; Jarvis, Brantley P

    2016-11-01

    Poverty is one of the most pervasive risk factors underlying poor health, but is rarely targeted to improve health. Research on the effects of anti-poverty interventions on health has been limited, at least in part because funding for that research has been limited. Anti-poverty programs have been applied on a large scale, frequently by governments, but without systematic development and cumulative programmatic experimental studies. Anti-poverty programs that produce lasting effects on poverty have not been developed. Before evaluating the effect of anti-poverty programs on health, programs must be developed that can reduce poverty consistently. Anti-poverty programs require systematic development and cumulative programmatic scientific evaluation. Research on the therapeutic workplace could provide a model for that research and an adaptation of the therapeutic workplace could serve as a foundation of a comprehensive anti-poverty program. Once effective anti-poverty programs are developed, future research could determine if those programs improve health in addition to increasing income. The potential personal, health and economic benefits of effective anti-poverty programs could be substantial, and could justify the major efforts and expenses that would be required to support systematic research to develop such programs. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. The Future of Stellar Populations Studies in the Milky Way and the Local Group

    NASA Astrophysics Data System (ADS)

    Majewski, Steven R.

    2010-04-01

    The last decade has seen enormous progress in understanding the structure of the Milky Way and neighboring galaxies via the production of large-scale digital surveys of the sky like 2MASS and SDSS, as well as specialized, counterpart imaging surveys of other Local Group systems. Apart from providing snaphots of galaxy structure, these “cartographic” surveys lend insights into the formation and evolution of galaxies when supplemented with additional data (e.g., spectroscopy, astrometry) and when referenced to theoretical models and simulations of galaxy evolution. These increasingly sophisticated simulations are making ever more specific predictions about the detailed chemistry and dynamics of stellar populations in galaxies. To fully exploit, test and constrain these theoretical ventures demands similar commitments of observational effort as has been plied into the previous imaging surveys to fill out other dimensions of parameter space with statistically significant intensity. Fortunately the future of large-scale stellar population studies is bright with a number of grand projects on the horizon that collectively will contribute a breathtaking volume of information on individual stars in Local Group galaxies. These projects include: (1) additional imaging surveys, such as Pan-STARRS, SkyMapper and LSST, which, apart from providing deep, multicolor imaging, yield time series data useful for revealing variable stars (including critical standard candles, like RR Lyrae variables) and creating large-scale, deep proper motion catalogs; (2) higher accuracy, space-based astrometric missions, such as Gaia and SIM-Lite, which stand to provide critical, high precision dynamical data on stars in the Milky Way and its satellites; and (3) large-scale spectroscopic surveys provided by RAVE, APOGEE, HERMES, LAMOST, and the Gaia spectrometer, which will yield not only enormous numbers of stellar radial velocities, but extremely comprehensive views of the chemistry of stellar populations. Meanwhile, previously dust-obscured regions of the Milky Way will continue to be systematically exposed via large infrared surveys underway or on the way, such as the various GLIMPSE surveys from Spitzer's IRAC instrument, UKIDSS, APOGEE, JASMINE and WISE.

  11. Emergency department triage scales and their components: a systematic review of the scientific evidence.

    PubMed

    Farrohknia, Nasim; Castrén, Maaret; Ehrenberg, Anna; Lind, Lars; Oredsson, Sven; Jonsson, Håkan; Asplund, Kjell; Göransson, Katarina E

    2011-06-30

    Emergency department (ED) triage is used to identify patients' level of urgency and treat them based on their triage level. The global advancement of triage scales in the past two decades has generated considerable research on the validity and reliability of these scales. This systematic review aims to investigate the scientific evidence for published ED triage scales. The following questions are addressed: 1. Does assessment of individual vital signs or chief complaints affect mortality during the hospital stay or within 30 days after arrival at the ED?2. What is the level of agreement between clinicians' triage decisions compared to each other or to a gold standard for each scale (reliability)? 3. How valid is each triage scale in predicting hospitalization and hospital mortality? A systematic search of the international literature published from 1966 through March 31, 2009 explored the British Nursing Index, Business Source Premier, CINAHL, Cochrane Library, EMBASE, and PubMed. Inclusion was limited to controlled studies of adult patients (≥ 15 years) visiting EDs for somatic reasons. Outcome variables were death in ED or hospital and need for hospitalization (validity). Methodological quality and clinical relevance of each study were rated as high, medium, or low. The results from the studies that met the inclusion criteria and quality standards were synthesized applying the internationally developed GRADE system. Each conclusion was then assessed as having strong, moderately strong, limited, or insufficient scientific evidence. If studies were not available, this was also noted.We found ED triage scales to be supported, at best, by limited and often insufficient evidence.The ability of the individual vital signs included in the different scales to predict outcome is seldom, if at all, studied in the ED setting. The scientific evidence to assess interrater agreement (reliability) was limited for one triage scale and insufficient or lacking for all other scales. Two of the scales yielded limited scientific evidence, and one scale yielded insufficient evidence, on which to assess the risk of early death or hospitalization in patients assigned to the two lowest triage levels on a 5-level scale (validity).

  12. Emergency Department Triage Scales and Their Components: A Systematic Review of the Scientific Evidence

    PubMed Central

    2011-01-01

    Emergency department (ED) triage is used to identify patients' level of urgency and treat them based on their triage level. The global advancement of triage scales in the past two decades has generated considerable research on the validity and reliability of these scales. This systematic review aims to investigate the scientific evidence for published ED triage scales. The following questions are addressed: 1. Does assessment of individual vital signs or chief complaints affect mortality during the hospital stay or within 30 days after arrival at the ED? 2. What is the level of agreement between clinicians' triage decisions compared to each other or to a gold standard for each scale (reliability)? 3. How valid is each triage scale in predicting hospitalization and hospital mortality? A systematic search of the international literature published from 1966 through March 31, 2009 explored the British Nursing Index, Business Source Premier, CINAHL, Cochrane Library, EMBASE, and PubMed. Inclusion was limited to controlled studies of adult patients (≥15 years) visiting EDs for somatic reasons. Outcome variables were death in ED or hospital and need for hospitalization (validity). Methodological quality and clinical relevance of each study were rated as high, medium, or low. The results from the studies that met the inclusion criteria and quality standards were synthesized applying the internationally developed GRADE system. Each conclusion was then assessed as having strong, moderately strong, limited, or insufficient scientific evidence. If studies were not available, this was also noted. We found ED triage scales to be supported, at best, by limited and often insufficient evidence. The ability of the individual vital signs included in the different scales to predict outcome is seldom, if at all, studied in the ED setting. The scientific evidence to assess interrater agreement (reliability) was limited for one triage scale and insufficient or lacking for all other scales. Two of the scales yielded limited scientific evidence, and one scale yielded insufficient evidence, on which to assess the risk of early death or hospitalization in patients assigned to the two lowest triage levels on a 5-level scale (validity). PMID:21718476

  13. Optimization of large animal MI models; a systematic analysis of control groups from preclinical studies.

    PubMed

    Zwetsloot, P P; Kouwenberg, L H J A; Sena, E S; Eding, J E; den Ruijter, H M; Sluijter, J P G; Pasterkamp, G; Doevendans, P A; Hoefer, I E; Chamuleau, S A J; van Hout, G P J; Jansen Of Lorkeers, S J

    2017-10-27

    Large animal models are essential for the development of novel therapeutics for myocardial infarction. To optimize translation, we need to assess the effect of experimental design on disease outcome and model experimental design to resemble the clinical course of MI. The aim of this study is therefore to systematically investigate how experimental decisions affect outcome measurements in large animal MI models. We used control animal-data from two independent meta-analyses of large animal MI models. All variables of interest were pre-defined. We performed univariable and multivariable meta-regression to analyze whether these variables influenced infarct size and ejection fraction. Our analyses incorporated 246 relevant studies. Multivariable meta-regression revealed that infarct size and cardiac function were influenced independently by choice of species, sex, co-medication, occlusion type, occluded vessel, quantification method, ischemia duration and follow-up duration. We provide strong systematic evidence that commonly used endpoints significantly depend on study design and biological variation. This makes direct comparison of different study-results difficult and calls for standardized models. Researchers should take this into account when designing large animal studies to most closely mimic the clinical course of MI and enable translational success.

  14. Modeling of the response of the POLARBEAR bolometers with a continuously rotating half-wave plate

    NASA Astrophysics Data System (ADS)

    Takakura, Satoru; POLARBEAR Collaboration

    2018-01-01

    The curly pattern, the so-called B-mode, in the polarization anisotropy of the cosmic microwave background (CMB) is a powerful probe to measure primordial gravitational waves from the cosmic inflation, as well as the weak lensing due to the large scale structure of the Universe. At present, ground-based CMB experiments with a few arcminutes resolution such as POLARBEAR, SPTpol, and ACTPol have successfully measured the angular power spectrum of the B-mode only in sub-degree scales, though these experiments also have potential to measure the inflationary B-modes in degree scales in absence of the low-frequency noise (1/f noise). Thus, techniques of polarization signal modulation such as a continuously rotating half-wave plate (CRHWP) are widely investigated to suppress the 1/f noise and also to reduce instrumental systematic errors. In this study, we have implemented a CRHWP placed around the prime focus of the POLARBEAR telescope and operated at ambient temperatures. We construct a comprehensive model including half-wave plate synchronous signals, detector non-linearities, beam imperfections, and all noise sources. Using this model, we show that, in practice, the 1/f noise and instrumental systematics could remain even with the CRHWP. However, we also evaluate those effects from test observations using a prototype CRHWP on the POLARBEAR telescope and find that the residual 1/f noise is sufficiently small for POLARBEAR to probe the multipoles about 40. We will also discuss prospects for future CMB experiments with better sensitivities.

  15. Structure and validity of Family Harmony Scale: An instrument for measuring harmony.

    PubMed

    Kavikondala, Sushma; Stewart, Sunita M; Ni, Michael Y; Chan, Brandford H Y; Lee, Paul H; Li, Kin-Kit; McDowell, Ian; Johnston, Janice M; Chan, Sophia S; Lam, T H; Lam, Wendy W T; Fielding, Richard; Leung, Gabriel M

    2016-03-01

    Culture plays a role in mental health, partly by defining the characteristics that are indicative of positive adjustment. In Chinese cultures, positive family relationships are considered central to well-being. The culturally emphasized characteristic of family harmony may be an important factor associated with psychopathology. This article presents the development and psychometric examination of the Family Harmony Scale (FHS), an indigenously developed 24-item instrument tapping family harmony in 17,461 Hong Kong residents from 7,791 households. A higher-order model with 1 second-order factor and 5 first-order factors fit the data well and showed factorial invariance across sex and participants in different family roles. A 5-item short form (FHS-5) was also developed, with 1 item from each first-order factor. The short scale showed, as expected, a single-factor structure with good fit. Both scales demonstrated high internal consistency, acceptable test-retest reliability, and good convergent and discriminant validity. The 24-item FHS was negatively associated with depressive symptoms after accounting for individual risk factors and general family function. Family harmony moderated the relationship between life stress and depressive symptoms such that those individuals who reported low family harmony had stronger associations between life stress and depressive symptoms. This study adds to the literature a systematically developed, multidimensional measure of family harmony, which may be an important psychological protective factor, in a large urban Chinese sample. The FHS-5 minimizes operational and respondent burdens, making it an attractive tool for large-scale epidemiological studies with Chinese populations in urban settings, where over half of China's 1.4 billion people reside. (c) 2016 APA, all rights reserved).

  16. Risk of malignancy in ankylosing spondylitis: a systematic review and meta-analysis.

    PubMed

    Deng, Chuiwen; Li, Wenli; Fei, Yunyun; Li, Yongzhe; Zhang, Fengchun

    2016-08-18

    Current knowledge about the overall and site-specific risk of malignancy associated with ankylosing spondylitis (AS) is inconsistent. We conducted a systematic review and meta-analysis to address this knowledge gap. Five databases (PubMed, EMBASE, Web of Science, the Cochrane library and the virtual health library) were systematically searched. A manual search of publications within the last 2 years in key journals in the field (Annals of the Rheumatic Diseases, Rheumatology and Arthritis &rheumatology) was also performed. STATA 11.2 software was used to conduct the meta-analysis. After screening, twenty-three studies, of different designs, were eligible for meta-analysis. AS is associated with a 14% (pooled RR 1.14; 95% CI 1.03-1.25) increase in the overall risk for malignancy. Compared to controls, patients with AS are at a specific increased risk for malignancy of the digestive system (pooled RR 1.20; 95% CI 1.01 to 1.42), multiple myelomas (pooled RR 1.92; 95% CI 1.37 to 3.69) and lymphomas (pooled RR 1.32; 95% CI 1.11 to 1.57). On subgroup analysis, evidence from high quality cohort studies indicated that AS patients from Asia are at highest risk for malignancy overall. Confirmation of findings from large-scale longitudinal studies is needed to identify specific risk factors and to evaluate treatment effects.

  17. Systematic review of the multidimensional fatigue symptom inventory-short form.

    PubMed

    Donovan, Kristine A; Stein, Kevin D; Lee, Morgan; Leach, Corinne R; Ilozumba, Onaedo; Jacobsen, Paul B

    2015-01-01

    Fatigue is a subjective complaint that is believed to be multifactorial in its etiology and multidimensional in its expression. Fatigue may be experienced by individuals in different dimensions as physical, mental, and emotional tiredness. The purposes of this study were to review and characterize the use of the 30-item Multidimensional Fatigue Symptom Inventory-Short Form (MFSI-SF) in published studies and to evaluate the available evidence for its psychometric properties. A systematic review was conducted to identify published articles reporting results for the MFSI-SF. Data were analyzed to characterize internal consistency reliability of multi-item MFSI-SF scales and test-retest reliability. Correlation coefficients were summarized to characterize concurrent, convergent, and divergent validity. Standardized effect sizes were calculated to characterize the discriminative validity of the MFSI-SF and its sensitivity to change. Seventy articles were identified. Sample sizes reported ranged from 10 to 529 and nearly half consisted exclusively of females. More than half the samples were composed of cancer patients; of those, 59% were breast cancer patients. Mean alpha coefficients for MFSI-SF fatigue subscales ranged from 0.84 for physical fatigue to 0.93 for general fatigue. The MFSI-SF demonstrated moderate test-retest reliability in a small number of studies. Correlations with other fatigue and vitality measures were moderate to large in size and in the expected direction. The MFSI-SF fatigue subscales were positively correlated with measures of distress, depressive, and anxious symptoms. Effect sizes for discriminative validity ranged from medium to large, while effect sizes for sensitivity to change ranged from small to large. Findings demonstrate the positive psychometric properties of the MFSI-SF, provide evidence for its usefulness in medically ill and nonmedically ill individuals, and support its use in future studies.

  18. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    NASA Astrophysics Data System (ADS)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response plan. Large commercial shopping area, as the typical service system, its emergency evacuation is one of the hot research topics. A systematic methodology based on Cellular Automata with the Dynamic Floor Field and event driven model has been proposed, and the methodology has been examined within context of a case study involving the evacuation within a commercial shopping mall. Pedestrians walking is based on Cellular Automata and event driven model. In this paper, the event driven model is adopted to simulate the pedestrian movement patterns, the simulation process is divided into normal situation and emergency evacuation. The model is composed of four layers: environment layer, customer layer, clerk layer and trajectory layer. For the simulation of movement route of pedestrians, the model takes into account purchase intention of customers and density of pedestrians. Based on evacuation model of Cellular Automata with Dynamic Floor Field and event driven model, we can reflect behavior characteristics of customers and clerks at the situations of normal and emergency evacuation. The distribution of individual evacuation time as a function of initial positions and the dynamics of the evacuation process is studied. Our results indicate that the evacuation model using the combination of Cellular Automata with Dynamic Floor Field and event driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of shopping mall.

  19. Trace Elements and Healthcare: A Bioinformatics Perspective.

    PubMed

    Zhang, Yan

    2017-01-01

    Biological trace elements are essential for human health. Imbalance in trace element metabolism and homeostasis may play an important role in a variety of diseases and disorders. While the majority of previous researches focused on experimental verification of genes involved in trace element metabolism and those encoding trace element-dependent proteins, bioinformatics study on trace elements is relatively rare and still at the starting stage. This chapter offers an overview of recent progress in bioinformatics analyses of trace element utilization, metabolism, and function, especially comparative genomics of several important metals. The relationship between individual elements and several diseases based on recent large-scale systematic studies such as genome-wide association studies and case-control studies is discussed. Lastly, developments of ionomics and its recent application in human health are also introduced.

  20. Photosynthetic production of hydrogen. [Blue-green alga, Anabaena cylindrica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neil, G.; Nicholas, D.J.D.; Bockris, J.O.

    A systematic investigation of photosynthetic hydrogen production using a blue-green alga, Anabaena cylindrica, was carried out. The results indicate that there are two important problems which must be overcome for large-scale hydrogen production using photosynthetic processes. These are (a) the development of a stable system, and (b) attainment of at least a fifty-fold increase in the rate of hydrogen evolution per unit area illuminated.

  1. Principles for Large-Scale Classroom-Based Teacher Assessment of English Learners' Language: An Initial Framework from School-Based Assessment in Hong Kong

    ERIC Educational Resources Information Center

    Hamp-Lyons, Liz

    2009-01-01

    Davison and Leung (this issue) describe the field of teacher-based English language assessment as having "much variability, a lack of systematic principles and procedures and a dearth of information as to the impact of teacher-based assessments on learning and teaching" (p. 389). In this article, the author briefly explores an example of…

  2. Continuous Flow Polymer Synthesis toward Reproducible Large-Scale Production for Efficient Bulk Heterojunction Organic Solar Cells.

    PubMed

    Pirotte, Geert; Kesters, Jurgen; Verstappen, Pieter; Govaerts, Sanne; Manca, Jean; Lutsen, Laurence; Vanderzande, Dirk; Maes, Wouter

    2015-10-12

    Organic photovoltaics (OPV) have attracted great interest as a solar cell technology with appealing mechanical, aesthetical, and economies-of-scale features. To drive OPV toward economic viability, low-cost, large-scale module production has to be realized in combination with increased top-quality material availability and minimal batch-to-batch variation. To this extent, continuous flow chemistry can serve as a powerful tool. In this contribution, a flow protocol is optimized for the high performance benzodithiophene-thienopyrroledione copolymer PBDTTPD and the material quality is probed through systematic solar-cell evaluation. A stepwise approach is adopted to turn the batch process into a reproducible and scalable continuous flow procedure. Solar cell devices fabricated using the obtained polymer batches deliver an average power conversion efficiency of 7.2 %. Upon incorporation of an ionic polythiophene-based cathodic interlayer, the photovoltaic performance could be enhanced to a maximum efficiency of 9.1 %. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Key principles to improve programmes and interventions in complementary feeding.

    PubMed

    Lutter, Chessa K; Iannotti, Lora; Creed-Kanashiro, Hilary; Guyon, Agnes; Daelmans, Bernadette; Robert, Rebecca; Haider, Rukhsana

    2013-09-01

    Although there are some examples of successful complementary feeding programmes to promote healthy growth and prevent stunting at the community level, to date there are few, if any, examples of successful programmes at scale. A lack of systematic process and impact evaluations on pilot projects to generate lessons learned has precluded scaling up of effective programmes. Programmes to effect positive change in nutrition rarely follow systematic planning, implementation, and evaluation (PIE) processes to enhance effectiveness over the long term. As a result a set of programme-oriented key principles to promote healthy growth remains elusive. The purpose of this paper is to fill this gap by proposing a set of principles to improve programmes and interventions to promote healthy growth and development. Identifying such principles for programme success has three requirements: rethinking traditional paradigms used to promote improved infant and young child feeding; ensuring better linkages to delivery platforms; and, improving programming. Following the PIE model for programmes and learning from experiences from four relatively large-scale programmes described in this paper, 10 key principles are identified in the areas of programme planning, programme implementation, programme evaluation, and dissemination, replication, and scaling up. Nonetheless, numerous operational research questions remain, some of which are highlighted in this paper. © 2013 John Wiley & Sons Ltd.

  4. Rehabilitation of vulnerable groups in emergencies and disasters: A systematic review

    PubMed Central

    Sheikhbardsiri, Hojjat; Yarmohammadian, Mohammad H.; Rezaei, Fatemeh; Maracy, Mohammad Reza

    2017-01-01

    BACKGROUND: Natural and man-made disasters, especially those occurring in large scales not only result in human mortality, but also cause physical, psychological, and social disabilities. Providing effective rehabilitation services in time can decrease the frequency of such disabilities. The aim of the current study was to perform a systematic review related to rehabilitation of vulnerable groups in emergencies and disasters. METHODS: The systematic review was conducted according to the preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines. The key words “recovery”, “rehabilitation”, “reconstruction”, “transformation”, “transition”, “emergency”, “disaster”, “crisis”, “hazard”, “catastrophe”, “tragedy”, “mass casualty incident”, “women”, “female”, “children”, “pediatric”, “disable”, “handicap”, “elder”, “old” and “vulnerable” were used in combination with Boolean operators OR and AND. ISI Web of Science, PubMed, Scopus, Science Direct, Ovid, ProQuest, Wiley, Google Scholar were searched. RESULTS: In this study a total of 11 928 articles were considered and 25 articles were selected for final review of rehabilitation of vulnerable groups based on the objective of this study. Twenty-five studies including six qualitative, sixteen cross-sectional and three randomized controlled trials were reviewed for rehabilitation of vulnerable groups in emergencies and disasters. Out of the selected papers, 23 were studied based on rehabilitation after natural disasters and the remaining were man-made disasters. Most types of rehabilitation were physical, social, psychological and economic. CONCLUSION: The review of the papers showed different programs of physical, physiological, economic and social rehabilitations for vulnerable groups after emergencies and disasters. It may help health field managers better implement standard rehabilitation activities for vulnerable groups. PMID:29123602

  5. Parameters and Scales Used to Assess and Report Findings From Stroboscopy: A Systematic Review.

    PubMed

    Bonilha, Heather Shaw; Desjardins, Maude; Garand, Kendrea L; Martin-Harris, Bonnie

    2017-11-02

    Laryngeal endoscopy with stroboscopy, a critical component of the assessment of voice disorders, is rarely used as a treatment outcome measure in the scientific literature. We hypothesized that this is because of the lack of a widely used standardized, validated, and reliable method to assess and report laryngeal anatomy and physiology, and undertook a systematic literature review to determine the extent of the inconsistencies of the parameters and scales used in voice treatment outcome studies. Systematic literature review. We searched PubMed, Ovid, and Cochrane for studies where laryngeal endoscopy with stroboscopy was used as a treatment outcome measure with search terms representing "stroboscopy" and "treatment" guided by Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement standards. In the 62 included articles, we identified 141 terms representing 49 different parameters, which were further classified into 20 broad categories. The six most common parameters were magnitude of glottal gap, mucosal wave amplitude, location or shape of glottal gap, regularity of vibration, phase symmetry, and presence and size of specific lesions. Parameters were assessed on scales ranging from binary to 100 points. The number of scales used for each parameter varied from 1 to 24, with an average of four different scales per parameter. There is a lack of agreement in the scientific literature regarding which parameters should be assessed to measure voice treatment outcomes and which terms and scales should be used for each parameter. This greatly diminishes comparison and clinical implementation of the results of treatment outcomes research in voice disorders. We highlight a previously published tool and recommend it for future use in research and clinical settings. Copyright © 2017. Published by Elsevier Inc.

  6. Large-scale Distribution of Arrival Directions of Cosmic Rays Detected Above 1018 eV at the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Pierre Auger Collaboration; Abreu, P.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Antiči'c, T.; Aramo, C.; Arganda, E.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Badescu, A. M.; Balzer, M.; Barber, K. B.; Barbosa, A. F.; Bardenet, R.; Barroso, S. L. C.; Baughman, B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellétoile, A.; Bellido, J. A.; BenZvi, S.; Berat, C.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, F.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Brogueira, P.; Brown, W. C.; Bruijn, R.; Buchholz, P.; Bueno, A.; Buroker, L.; Burton, R. E.; Caballero-Mora, K. S.; Caccianiga, B.; Caramete, L.; Caruso, R.; Castellina, A.; Catalano, O.; Cataldi, G.; Cazon, L.; Cester, R.; Chauvin, J.; Cheng, S. H.; Chiavassa, A.; Chinellato, J. A.; Chirinos Diaz, J.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cook, H.; Cooper, M. J.; Coppens, J.; Cordier, A.; Coutu, S.; Covault, C. E.; Creusot, A.; Criss, A.; Cronin, J.; Curutiu, A.; Dagoret-Campagne, S.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; De Donato, C.; de Jong, S. J.; De La Vega, G.; de Mello Junior, W. J. M.; de Mello Neto, J. R. T.; De Mitri, I.; de Souza, V.; de Vries, K. D.; del Peral, L.; del Río, M.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Díaz Castro, M. L.; Diep, P. N.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dong, P. N.; Dorofeev, A.; dos Anjos, J. C.; Dova, M. T.; D'Urso, D.; Dutan, I.; Ebr, J.; Engel, R.; Erdmann, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fliescher, S.; Fracchiolla, C. E.; Fraenkel, E. D.; Fratu, O.; Fröhlich, U.; Fuchs, B.; Gaior, R.; Gamarra, R. F.; Gambetta, S.; García, B.; Garcia Roca, S. T.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gemmeke, H.; Ghia, P. L.; Giller, M.; Gitto, J.; Glass, H.; Gold, M. S.; Golup, G.; Gomez Albarracin, F.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; Gookin, B.; Gorgi, A.; Gouffon, P.; Grashorn, E.; Grebe, S.; Griffith, N.; Grillo, A. F.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Hansen, P.; Harari, D.; Harrison, T. A.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holmes, V. C.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Ionita, F.; Italiano, A.; Jansen, S.; Jarne, C.; Jiraskova, S.; Josebachuili, M.; Kadija, K.; Kampert, K. H.; Karhan, P.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kelley, J. L.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Knapp, J.; Koang, D.-H.; Kotera, K.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kulbartz, J. K.; Kunka, N.; La Rosa, G.; Lachaud, C.; LaHurd, D.; Latronico, L.; Lauer, R.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Lyberis, H.; Maccarone, M. C.; Macolino, C.; Maldera, S.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, J.; Marin, V.; Maris, I. C.; Marquez Falcon, H. R.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mazur, P. O.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Mertsch, P.; Messina, S.; Meurer, C.; Meyhandan, R.; Mi'canovi'c, S.; Micheletti, M. I.; Minaya, I. A.; Miramonti, L.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morales, B.; Morello, C.; Moreno, E.; Moreno, J. C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navarro, J. L.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nhung, P. T.; Niechciol, M.; Niemietz, L.; Nierstenhoefer, N.; Nitz, D.; Nosek, D.; Nožka, L.; Oehlschläger, J.; Olinto, A.; Ortiz, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Parente, G.; Parizot, E.; Parra, A.; Pastor, S.; Paul, T.; Pech, M.; Peķala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Petrera, S.; Petrolini, A.; Petrov, Y.; Pfendner, C.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Ponce, V. H.; Pontz, M.; Porcelli, A.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rivera, H.; Rizi, V.; Roberts, J.; Rodrigues de Carvalho, W.; Rodriguez, G.; Rodriguez Cabo, I.; Rodriguez Martino, J.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Rouillé-d'Orfeuil, B.; Roulet, E.; Rovero, A. C.; Rühle, C.; Saftoiu, A.; Salamida, F.; Salazar, H.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarkar, S.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, A.; Scholten, O.; Schoorlemmer, H.; Schovancova, J.; Schovánek, P.; Schröder, F.; Schuster, D.; Sciutto, S. J.; Scuderi, M.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Silva Lopez, H. H.; Sima, O.; 'Smiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Spinka, H.; Squartini, R.; Srivastava, Y. N.; Stanic, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Šuša, T.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Tapia, A.; Tartare, M.; Taşcău, O.; Tcaciuc, R.; Thao, N. T.; Thomas, D.; Tiffenberg, J.; Timmermans, C.; Tkaczyk, W.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Travnicek, P.; Tridapalli, D. B.; Tristram, G.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Wahlberg, H.; Wahrlich, P.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Westerhoff, S.; Whelan, B. J.; Widom, A.; Wieczorek, G.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Wommer, M.; Wundheiler, B.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano Garcia, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.

    2012-12-01

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 1018 eV at the Pierre Auger Observatory is presented. This search is performed as a function of both declination and right ascension in several energy ranges above 1018 eV, and reported in terms of dipolar and quadrupolar coefficients. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Assuming that any cosmic-ray anisotropy is dominated by dipole and quadrupole moments in this energy range, upper limits on their amplitudes are derived. These upper limits allow us to test the origin of cosmic rays above 1018 eV from stationary Galactic sources densely distributed in the Galactic disk and predominantly emitting light particles in all directions.

  7. Grid-based mapping: A method for rapidly determining the spatial distributions of small features over very large areas

    NASA Astrophysics Data System (ADS)

    Ramsdale, Jason D.; Balme, Matthew R.; Conway, Susan J.; Gallagher, Colman; van Gasselt, Stephan A.; Hauber, Ernst; Orgel, Csilla; Séjourné, Antoine; Skinner, James A.; Costard, Francois; Johnsson, Andreas; Losiak, Anna; Reiss, Dennis; Swirad, Zuzanna M.; Kereszturi, Akos; Smith, Isaac B.; Platz, Thomas

    2017-06-01

    The increased volume, spatial resolution, and areal coverage of high-resolution images of Mars over the past 15 years have led to an increased quantity and variety of small-scale landform identifications. Though many such landforms are too small to represent individually on regional-scale maps, determining their presence or absence across large areas helps form the observational basis for developing hypotheses on the geological nature and environmental history of a study area. The combination of improved spatial resolution and near-continuous coverage significantly increases the time required to analyse the data. This becomes problematic when attempting regional or global-scale studies of metre and decametre-scale landforms. Here, we describe an approach for mapping small features (from decimetre to kilometre scale) across large areas, formulated for a project to study the northern plains of Mars, and provide context on how this method was developed and how it can be implemented. Rather than ;mapping; with points and polygons, grid-based mapping uses a ;tick box; approach to efficiently record the locations of specific landforms (we use an example suite of glacial landforms; including viscous flow features, the latitude dependant mantle and polygonised ground). A grid of squares (e.g. 20 km by 20 km) is created over the mapping area. Then the basemap data are systematically examined, grid-square by grid-square at full resolution, in order to identify the landforms while recording the presence or absence of selected landforms in each grid-square to determine spatial distributions. The result is a series of grids recording the distribution of all the mapped landforms across the study area. In some ways, these are equivalent to raster images, as they show a continuous distribution-field of the various landforms across a defined (rectangular, in most cases) area. When overlain on context maps, these form a coarse, digital landform map. We find that grid-based mapping provides an efficient solution to the problems of mapping small landforms over large areas, by providing a consistent and standardised approach to spatial data collection. The simplicity of the grid-based mapping approach makes it extremely scalable and workable for group efforts, requiring minimal user experience and producing consistent and repeatable results. The discrete nature of the datasets, simplicity of approach, and divisibility of tasks, open up the possibility for citizen science in which crowdsourcing large grid-based mapping areas could be applied.

  8. Systematic Construction of Kinetic Models from Genome-Scale Metabolic Networks

    PubMed Central

    Smallbone, Kieran; Klipp, Edda; Mendes, Pedro; Liebermeister, Wolfram

    2013-01-01

    The quantitative effects of environmental and genetic perturbations on metabolism can be studied in silico using kinetic models. We present a strategy for large-scale model construction based on a logical layering of data such as reaction fluxes, metabolite concentrations, and kinetic constants. The resulting models contain realistic standard rate laws and plausible parameters, adhere to the laws of thermodynamics, and reproduce a predefined steady state. These features have not been simultaneously achieved by previous workflows. We demonstrate the advantages and limitations of the workflow by translating the yeast consensus metabolic network into a kinetic model. Despite crudely selected data, the model shows realistic control behaviour, a stable dynamic, and realistic response to perturbations in extracellular glucose concentrations. The paper concludes by outlining how new data can continuously be fed into the workflow and how iterative model building can assist in directing experiments. PMID:24324546

  9. Scaling behavior of knotted random polygons and self-avoiding polygons: Topological swelling with enhanced exponent.

    PubMed

    Uehara, Erica; Deguchi, Tetsuo

    2017-12-07

    We show that the average size of self-avoiding polygons (SAPs) with a fixed knot is much larger than that of no topological constraint if the excluded volume is small and the number of segments is large. We call it topological swelling. We argue an "enhancement" of the scaling exponent for random polygons with a fixed knot. We study them systematically through SAP consisting of hard cylindrical segments with various different values of the radius of segments. Here we mean by the average size the mean-square radius of gyration. Furthermore, we show numerically that the topological balance length of a composite knot is given by the sum of those of all constituent prime knots. Here we define the topological balance length of a knot by such a number of segments that topological entropic repulsions are balanced with the knot complexity in the average size. The additivity suggests the local knot picture.

  10. Scaling behavior of knotted random polygons and self-avoiding polygons: Topological swelling with enhanced exponent

    NASA Astrophysics Data System (ADS)

    Uehara, Erica; Deguchi, Tetsuo

    2017-12-01

    We show that the average size of self-avoiding polygons (SAPs) with a fixed knot is much larger than that of no topological constraint if the excluded volume is small and the number of segments is large. We call it topological swelling. We argue an "enhancement" of the scaling exponent for random polygons with a fixed knot. We study them systematically through SAP consisting of hard cylindrical segments with various different values of the radius of segments. Here we mean by the average size the mean-square radius of gyration. Furthermore, we show numerically that the topological balance length of a composite knot is given by the sum of those of all constituent prime knots. Here we define the topological balance length of a knot by such a number of segments that topological entropic repulsions are balanced with the knot complexity in the average size. The additivity suggests the local knot picture.

  11. Clean fuels for resource-poor settings: A systematic review of barriers and enablers to adoption and sustained use.

    PubMed

    Puzzolo, Elisa; Pope, Daniel; Stanistreet, Debbi; Rehfuess, Eva A; Bruce, Nigel G

    2016-04-01

    Access to, and sustained adoption of, clean household fuels at scale remains an aspirational goal to achieve sufficient reductions in household air pollution (HAP) in order to impact on the substantial global health burden caused by reliance on solid fuels. To systematically appraise the current evidence base to identify: (i) which factors enable or limit adoption and sustained use of clean fuels (namely liquefied petroleum gas (LPG), biogas, solar cooking and alcohol fuels) in low- and middle-income countries; (ii) lessons learnt concerning equitable scaling-up of programmes of cleaner cooking fuels in relation to poverty, urban-rural settings and gender. A mixed-methods systematic review was conducted using established review methodology and extensive searches of published and grey literature sources. Data extraction and quality appraisal of quantitative, qualitative and case studies meeting inclusion criteria were conducted using standardised methods with reliability checking. Forty-four studies from Africa, Asia and Latin America met the inclusion criteria (17 on biogas, 12 on LPG, 9 on solar, 6 on alcohol fuels). A broad range of inter-related enabling and limiting factors were identified for all four types of intervention, operating across seven pre-specified domains (i.e. fuel and technology characteristics, household and setting characteristics, knowledge and perceptions, financial, tax and subsidy aspects, market development, regulation, legislation and standards, and programme and policy mechanisms) and multiple levels (i.e. household, community, national). All domains matter and the majority of factors are common to all clean fuels interventions reviewed although some are fuel and technology-specific. All factors should therefore be taken into account and carefully assessed during planning and implementation of any small- and large-scale initiative aiming at promoting clean fuels for household cooking. Despite limitations in quantity and quality of the evidence this systematic review provides a useful starting point for the design, delivery and evaluation of programmes to ensure more effective adoption and use of LPG, biogas, alcohol fuels and solar cooking. This review was funded by the Department for International Development (DfID) of the United Kingdom. The authors would also like to thank the Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre) for their technical support. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. H0, q0 and the local velocity field. [Hubble and deceleration constants in Big Bang expansion

    NASA Technical Reports Server (NTRS)

    Sandage, A.; Tammann, G. A.

    1982-01-01

    An attempt is made to find a systematic deviation from linearity for distances that are under the control of the Virgo cluster, and to determine the value of the mean random motion about the systematic flow, in order to improve the measurement of the Hubble and the deceleration constants. The velocity-distance relation for large and intermediate distances is studied, and type I supernovae are calibrated relatively as distance indicators and absolutely to obtain a new value for the Hubble constant. Methods of determining the deceleration constant are assessed, including determination from direct measurement, mean luminosity density, virgocentric motion, and the time scale test. The very local velocity field is investigated, and a solution is preferred with a random peculiar radial velocity of very nearby field galaxies of 90-100 km/s, and a Virgocentric motion of the local group of 220 km/s, leading to an underlying expansion rate of 55, in satisfactory agreement with the global value.

  13. Neuroimaging Impaired Response Inhibition and Salience Attribution in Human Drug Addiction: A Systematic Review.

    PubMed

    Zilverstand, Anna; Huang, Anna S; Alia-Klein, Nelly; Goldstein, Rita Z

    2018-06-06

    The impaired response inhibition and salience attribution (iRISA) model proposes that impaired response inhibition and salience attribution underlie drug seeking and taking. To update this model, we systematically reviewed 105 task-related neuroimaging studies (n > 15/group) published since 2010. Results demonstrate specific impairments within six large-scale brain networks (reward, habit, salience, executive, memory, and self-directed networks) during drug cue exposure, decision making, inhibitory control, and social-emotional processing. Addicted individuals demonstrated increased recruitment of these networks during drug-related processing but a blunted response during non-drug-related processing, with the same networks also being implicated during resting state. Associations with real-life drug use, relapse, therapeutic interventions, and the relevance to initiation of drug use during adolescence support the clinical relevance of the results. Whereas the salience and executive networks showed impairments throughout the addiction cycle, the reward network was dysregulated at later stages of abuse. Effects were similar in alcohol, cannabis, and stimulant addiction. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. Systematic Characterization and Analysis of the Taxonomic Drivers of Functional Shifts in the Human Microbiome.

    PubMed

    Manor, Ohad; Borenstein, Elhanan

    2017-02-08

    Comparative analyses of the human microbiome have identified both taxonomic and functional shifts that are associated with numerous diseases. To date, however, microbiome taxonomy and function have mostly been studied independently and the taxonomic drivers of functional imbalances have not been systematically identified. Here, we present FishTaco, an analytical and computational framework that integrates taxonomic and functional comparative analyses to accurately quantify taxon-level contributions to disease-associated functional shifts. Applying FishTaco to several large-scale metagenomic cohorts, we show that shifts in the microbiome's functional capacity can be traced back to specific taxa. Furthermore, the set of taxa driving functional shifts and their contribution levels vary markedly between functions. We additionally find that similar functional imbalances in different diseases are driven by both disease-specific and shared taxa. Such integrated analysis of microbiome ecological and functional dynamics can inform future microbiome-based therapy, pinpointing putative intervention targets for manipulating the microbiome's functional capacity. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. ScaleNet: A literature-based model of scale insect biology and systematics

    USDA-ARS?s Scientific Manuscript database

    Scale insects (Hemiptera: Coccoidea) are small herbivorous insects found in all continents except Antarctica. They are extremely invasive, and many species are serious agricultural pests. They are also emerging models for studies of the evolution of genetic systems, endosymbiosis, and plant-insect i...

  16. The effects of run-of-river hydroelectric power schemes on invertebrate community composition in temperate streams and rivers.

    PubMed

    Bilotta, Gary S; Burnside, Niall G; Turley, Matthew D; Gray, Jeremy C; Orr, Harriet G

    2017-01-01

    Run-of-river (ROR) hydroelectric power (HEP) schemes are often presumed to be less ecologically damaging than large-scale storage HEP schemes. However, there is currently limited scientific evidence on their ecological impact. The aim of this article is to investigate the effects of ROR HEP schemes on communities of invertebrates in temperate streams and rivers, using a multi-site Before-After, Control-Impact (BACI) study design. The study makes use of routine environmental surveillance data collected as part of long-term national and international monitoring programmes at 22 systematically-selected ROR HEP schemes and 22 systematically-selected paired control sites. Five widely-used family-level invertebrate metrics (richness, evenness, LIFE, E-PSI, WHPT) were analysed using a linear mixed effects model. The analyses showed that there was a statistically significant effect (p<0.05) of ROR HEP construction and operation on the evenness of the invertebrate community. However, no statistically significant effects were detected on the four other metrics of community composition. The implications of these findings are discussed in this article and recommendations are made for best-practice study design for future invertebrate community impact studies.

  17. The effects of run-of-river hydroelectric power schemes on invertebrate community composition in temperate streams and rivers

    PubMed Central

    2017-01-01

    Run-of-river (ROR) hydroelectric power (HEP) schemes are often presumed to be less ecologically damaging than large-scale storage HEP schemes. However, there is currently limited scientific evidence on their ecological impact. The aim of this article is to investigate the effects of ROR HEP schemes on communities of invertebrates in temperate streams and rivers, using a multi-site Before-After, Control-Impact (BACI) study design. The study makes use of routine environmental surveillance data collected as part of long-term national and international monitoring programmes at 22 systematically-selected ROR HEP schemes and 22 systematically-selected paired control sites. Five widely-used family-level invertebrate metrics (richness, evenness, LIFE, E-PSI, WHPT) were analysed using a linear mixed effects model. The analyses showed that there was a statistically significant effect (p<0.05) of ROR HEP construction and operation on the evenness of the invertebrate community. However, no statistically significant effects were detected on the four other metrics of community composition. The implications of these findings are discussed in this article and recommendations are made for best-practice study design for future invertebrate community impact studies. PMID:28158282

  18. Adjunctive nutraceuticals with standard pharmacotherapies in bipolar disorder: a systematic review of clinical trials.

    PubMed

    Sarris, Jerome; Mischoulon, David; Schweitzer, Isaac

    2011-01-01

      Studies using augmentation of pharmacotherapies with nutraceuticals in bipolar disorder (BD) have been conducted and preliminary evidence in many cases appears positive. To date, however, no specialized systematic review of this area has been conducted. We present the first systematic review of clinical trials using nutrient-based nutraceuticals in combination with standard pharmacotherapies to treat BD. A subsequent aim of this report was to discuss posited underlying mechanisms of action.   PubMed, CINAHL, Web of Science, and Cochrane Library databases, and grey literature were searched during mid-2010 for human clinical trials in English using nutraceuticals such as omega-3, N-acetyl cysteine (NAC), inositol, and vitamins and minerals, in combination with pharmacotherapies to treat bipolar mania and bipolar depression. A review of the results including an effect size analysis (Cohen's d) was subsequently conducted.   In treating bipolar depression, positive evidence with large effect sizes were found for NAC (d=1.04) and a chelated mineral and vitamin formula (d=1.70). On the outcome of bipolar mania, several nutraceuticals reduced mania with strong clinical effects: a chelated mineral formula (d=0.83), L-tryptophan (d=1.47), magnesium (d=1.44), folic acid (d=0.40), and branched-chain amino acids (d=1.60). Mixed, but mainly positive, evidence was found for omega-3 for bipolar depression, while no evidentiary support was found for use in mania. No significant effect on BD outcome scales was found for inositol (possibly due to small samples).   BD treatment outcomes may potentially be improved by additional use of certain nutraceuticals with conventional pharmacotherapies. However, caution should be extended in interpreting the large effects of several isolated studies, as they have not yet been replicated in larger trials. © 2011 John Wiley and Sons A/S.

  19. Study of muon-induced neutron production using accelerator muon beam at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakajima, Y.; Lin, C. J.; Ochoa-Ricoux, J. P.

    2015-08-17

    Cosmogenic muon-induced neutrons are one of the most problematic backgrounds for various underground experiments for rare event searches. In order to accurately understand such backgrounds, experimental data with high-statistics and well-controlled systematics is essential. We performed a test experiment to measure muon-induced neutron production yield and energy spectrum using a high-energy accelerator muon beam at CERN. We successfully observed neutrons from 160 GeV/c muon interaction on lead, and measured kinetic energy distributions for various production angles. Works towards evaluation of absolute neutron production yield is underway. This work also demonstrates that the setup is feasible for a future large-scale experimentmore » for more comprehensive study of muon-induced neutron production.« less

  20. Interformat reliability of digital psychiatric self-report questionnaires: a systematic review.

    PubMed

    Alfonsson, Sven; Maathz, Pernilla; Hursti, Timo

    2014-12-03

    Research on Internet-based interventions typically use digital versions of pen and paper self-report symptom scales. However, adaptation into the digital format could affect the psychometric properties of established self-report scales. Several studies have investigated differences between digital and pen and paper versions of instruments, but no systematic review of the results has yet been done. This review aims to assess the interformat reliability of self-report symptom scales used in digital or online psychotherapy research. Three databases (MEDLINE, Embase, and PsycINFO) were systematically reviewed for studies investigating the reliability between digital and pen and paper versions of psychiatric symptom scales. From a total of 1504 publications, 33 were included in the review, and interformat reliability of 40 different symptom scales was assessed. Significant differences in mean total scores between formats were found in 10 of 62 analyses. These differences were found in just a few studies, which indicates that the results were due to study effects and sample effects rather than unreliable instruments. The interformat reliability ranged from r=.35 to r=.99; however, the majority of instruments showed a strong correlation between format scores. The quality of the included studies varied, and several studies had insufficient power to detect small differences between formats. When digital versions of self-report symptom scales are compared to pen and paper versions, most scales show high interformat reliability. This supports the reliability of results obtained in psychotherapy research on the Internet and the comparability of the results to traditional psychotherapy research. There are, however, some instruments that consistently show low interformat reliability, suggesting that these conclusions cannot be generalized to all questionnaires. Most studies had at least some methodological issues with insufficient statistical power being the most common issue. Future studies should preferably provide information about the transformation of the instrument into digital format and the procedure for data collection in more detail.

  1. Data-Driven Simulation-Enhanced Optimization of People-Based Print Production Service

    NASA Astrophysics Data System (ADS)

    Rai, Sudhendu

    This paper describes a systematic six-step data-driven simulation-based methodology for optimizing people-based service systems on a large distributed scale that exhibit high variety and variability. The methodology is exemplified through its application within the printing services industry where it has been successfully deployed by Xerox Corporation across small, mid-sized and large print shops generating over 250 million in profits across the customer value chain. Each step of the methodology consisting of innovative concepts co-development and testing in partnership with customers, development of software and hardware tools to implement the innovative concepts, establishment of work-process and practices for customer-engagement and service implementation, creation of training and infrastructure for large scale deployment, integration of the innovative offering within the framework of existing corporate offerings and lastly the monitoring and deployment of the financial and operational metrics for estimating the return-on-investment and the continual renewal of the offering are described in detail.

  2. Volunteer Functions Inventory: A systematic review.

    PubMed

    Chacón, Fernando; Gutiérrez, Gema; Sauto, Verónica; Vecina, María L; Pérez, Alfonso

    2017-08-01

    The objective of this research study was to conduct a systematic review of the research on volunteers using Clary et al.’s VFI (1998). A total of 48 research studies including 67 independent samples met eligibility criteria. The total sample of the studies analyzed ranged from 20375 to 21988 participants, depending on the motivation analyzed. The results show that the Values factor obtained the highest mean score, both overall and in each type of volunteering, whereas the lowest scores were for the Career and Enhancement factors. Studies conducted with samples with a mean age under 40 years obtain higher scores on Career and Understanding scales when compared to studies in older samples. The group of studies with less than 50% women yield higher mean scores on the Social scale than studies with more than 50% women in the sample. All the scales show reliability coefficients between .78 and .84. Only eight of the articles provide data on the reliability of the scale with a mean value of .90. Of the 26 studies that performed factor analysis, 18 confirmed the original structure of six factors.

  3. Herbarium specimens can reveal impacts of climate change on plant phenology; a review of methods and applications.

    PubMed

    Jones, Casey A; Daehler, Curtis C

    2018-01-01

    Studies in plant phenology have provided some of the best evidence for large-scale responses to recent climate change. Over the last decade, more than thirty studies have used herbarium specimens to analyze changes in flowering phenology over time, although studies from tropical environments are thus far generally lacking. In this review, we summarize the approaches and applications used to date. Reproductive plant phenology has primarily been analyzed using two summary statistics, the mean flowering day of year and first-flowering day of year, but mean flowering day has proven to be a more robust statistic. Two types of regression models have been applied to test for associations between flowering, temperature and time: flowering day regressed on year and flowering day regressed on temperature. Most studies analyzed the effect of temperature by averaging temperatures from three months prior to the date of flowering. On average, published studies have used 55 herbarium specimens per species to characterize changes in phenology over time, but in many cases fewer specimens were used. Geospatial grid data are increasingly being used for determining average temperatures at herbarium specimen collection locations, allowing testing for finer scale correspondence between phenology and climate. Multiple studies have shown that inferences from herbarium specimen data are comparable to findings from systematically collected field observations. Understanding phenological responses to climate change is a crucial step towards recognizing implications for higher trophic levels and large-scale ecosystem processes. As herbaria are increasingly being digitized worldwide, more data are becoming available for future studies. As temperatures continue to rise globally, herbarium specimens are expected to become an increasingly important resource for analyzing plant responses to climate change.

  4. Zolpidem for the Treatment of Neurologic Disorders: A Systematic Review.

    PubMed

    Bomalaski, Martin N; Claflin, Edward S; Townsend, Whitney; Peterson, Mark D

    2017-09-01

    Given its selective action on the ω1 subtype of the γ-aminobutyric acid A receptor, zolpidem tartrate presents a potential treatment mechanism for other neurologic disorders. To synthesize studies that used zolpidem to treat neurologic disorders. Eligibility criteria included any published English-language article that examined the use of zolpidem for noninsomnia neurologic disorders in humans for all dates up to March 20, 2015. Searched databases included PubMed, Scopus, Web of Science Core Collection, the Cochrane Library, EMBASE, CENTRAL, and clinicaltrials.gov. Publication bias was mitigated by searching clinicaltrials.gov for unpublished studies. Two rounds of screening were performed based on title and then abstract, and coding was performed by 2 coders. All methods followed the PRISMA Reporting Guidelines for systematic reviews of the literature. The initial search produced 2314 articles after removing duplicates. After exclusion based on a review of abstracts, 67 articles remained for full manuscript review. Thirty-one studies treated movement disorders, 22 treated disorders of consciousness, and 14 treated other neurologic conditions, including stroke, traumatic brain injury, encephalopathy, and dementia. Study designs included case reports (n = 28), case series (n = 8), single-patient interventional (n = 13), pretest and posttest (n = 9), randomized clinical trials (n = 9), and crossover studies (n = 5). Only 11 studies had more than 10 participants. Effects of zolpidem were wide ranging (eg, improvement on the JFK Coma Recovery Scale-Revised, the Unified Parkinson Disease Rating Scale, and the Burke-Fahn-Marsden Dystonia Rating Scale) and generally lasted 1 to 4 hours before the participant returned to baseline. Sedation was the most common adverse effect. Zolpidem has been observed to transiently treat a large variety of neurologic disorders, most often related to movement disorders and disorders of consciousness. Much of what is known comes from case reports and small interventional trials. These findings may represent a new treatment mechanism for these disorders.

  5. Strategies for delivering insecticide-treated nets at scale for malaria control: a systematic review

    PubMed Central

    Paintain, Lucy Smith; Mangham, Lindsay; Car, Josip; Schellenberg, Joanna Armstrong

    2012-01-01

    Abstract Objective To synthesize findings from recent studies of strategies to deliver insecticide-treated nets (ITNs) at scale in malaria-endemic areas. Methods Databases were searched for studies published between January 2000 and December 2010 in which: subjects resided in areas with endemicity for Plasmodium falciparum and Plasmodium vivax malaria; ITN delivery at scale was evaluated; ITN ownership among households, receipt by pregnant women and/or use among children aged < 5 years was evaluated; and the study design was an individual or cluster-randomized controlled design, nonrandomized, quasi-experimental, before-and-after, interrupted time series or cross-sectional without temporal or geographical controls. Papers describing qualitative studies, case studies, process evaluations and cost-effectiveness studies linked to an eligible paper were also included. Study quality was assessed using the Cochrane risk of bias checklist and GRADE criteria. Important influences on scaling up were identified and assessed across delivery strategies. Findings A total of 32 papers describing 20 African studies were reviewed. Many delivery strategies involved health sectors and retail outlets (partial subsidy), antenatal care clinics (full subsidy) and campaigns (full subsidy). Strategies achieving high ownership among households and use among children < 5 delivered ITNs free through campaigns. Costs were largely comparable across strategies; ITNs were the main cost. Cost-effectiveness estimates were most sensitive to the assumed net lifespan and leakage. Common barriers to delivery included cost, stock-outs and poor logistics. Common facilitators were staff training and supervision, cooperation across departments or ministries and stakeholder involvement. Conclusion There is a broad taxonomy of strategies for delivering ITNs at scale. PMID:22984312

  6. Cometary atmospheres: Modeling the spatial distribution of observed neutral radicals

    NASA Technical Reports Server (NTRS)

    Combi, Michael R.

    1986-01-01

    Progress during the second year of a program of research on the modeling of the spatial distributions of cometary radicals is discussed herein in several major areas. New scale length laws for cometary C2 and CN were determined which explain that the previously-held apparent drop of the C2/CN ratio for large heliocentric distances does not exist and that there is no systematic variation. Monte Carlo particle trajectory model (MCPTM) analysis of sunward and anti-sunward brightness profiles of cometary C2 was completed. This analysis implies a lifetime of 31,000 seconds for the C2 parent and an ejection speed for C2 of approximately 0.5 km/sec upon dissociation from the parent. A systematic reanalysis of published C3 and OH data was begun. Preliminary results find a heliocentric distance dependence for C3 scale lengths with a much larger variation than for C2 and CN. Scale lengths for OH are generally somewhat larger than currently accepted values. The MCPTM was updated to include the coma temperature. Finally, the collaborative effort with the University of Arizona programs has yielded some preliminary CCD images of Comet P/Halley.

  7. Sense of competence in dementia care staff (SCIDS) scale: development, reliability, and validity.

    PubMed

    Schepers, Astrid Kristine; Orrell, Martin; Shanahan, Niamh; Spector, Aimee

    2012-07-01

    Sense of competence in dementia care staff (SCIDS) may be associated with more positive attitudes to dementia among care staff and better outcomes for those being cared for. There is a need for a reliable and valid measure of sense of competence specific to dementia care staff. This study describes the development and evaluation of a measure to assess "sense of competence" in dementia care staff and reports on its psychometric properties. The systematic measure development process involved care staff and experts. For item selection and assessment of psychometric properties, a pilot study (N = 37) and a large-scale study (N = 211) with a test-retest reliability (N = 58) sub-study were undertaken. The final measure consists of 17 items across four subscales with acceptable to good internal consistency and moderate to substantial test-retest reliability. As predicted, the measure was positively associated with work experience, job satisfaction, and person-centered approaches to dementia care, giving a first indication for its validity. The SCIDS scale provides a useful and user-friendly means of measuring sense of competence in care staff. It has been developed using a robust process and has adequate psychometric properties. Further exploration of the construct and the scale's validity is warranted. It may be useful to assess the impact of training and perceived abilities and skills in dementia care.

  8. Developing a "Social Presence Scale" for E-Learning Environments

    ERIC Educational Resources Information Center

    Kilic Cakmak, Ebru; Cebi, Ayça; Kan, Adnan

    2014-01-01

    The purpose of the current study is to develop a "social presence scale" for e-learning environments. A systematic approach was followed for developing the scale. The scale was applied to 461 students registered in seven different programs at Gazi University. The sample was split into two subsamples on a random basis (n1 = 261; n2 =…

  9. Stretch-and-release fabrication, testing and optimization of a flexible ceramic armor inspired from fish scales.

    PubMed

    Martini, Roberto; Barthelat, Francois

    2016-10-13

    Protective systems that are simultaneously hard to puncture and compliant in flexion are desirable, but difficult to achieve because hard materials are usually stiff. However, we can overcome this conflicting design requirement by combining plates of a hard material with a softer substrate, and a strategy which is widely found in natural armors such as fish scales or osteoderms. Man-made segmented armors have a long history, but their systematic implementation in a modern and a protective system is still hampered by a limited understanding of the mechanics and the design of optimization guidelines, and by challenges in cost-efficient manufacturing. This study addresses these limitations with a flexible bioinspired armor based on overlapping ceramic scales. The fabrication combines laser engraving and a stretch-and-release method which allows for fine tuning of the size and overlap of the scales, and which is suitable for large scale fabrication. Compared to a continuous layer of uniform ceramic, our fish-scale like armor is not only more flexible, but it is also more resistant to puncture and more damage tolerant. The proposed armor is also about ten times more puncture resistant than soft elastomers, making it a very attractive alternative to traditional protective equipment.

  10. New single-copy nuclear genes for scale insect systematics

    USDA-ARS?s Scientific Manuscript database

    Despite the advent of next-generation sequencing, the polymerase chain reaction (PCR) and Sanger sequencing remain useful tools for molecular identification and systematics. To date, molecular systematics of scale insects has been constrained by the paucity of loci that researchers have been able to...

  11. Comparing NICU teamwork and safety climate across two commonly used survey instruments

    PubMed Central

    Profit, Jochen; Lee, Henry C; Sharek, Paul J; Kan, Peggy; Nisbet, Courtney C; Thomas, Eric J; Etchegaray, Jason M; Sexton, Bryan

    2016-01-01

    Background and objectives Measurement and our understanding of safety culture are still evolving. The objectives of this study were to assess variation in safety and teamwork climate and in the neonatal intensive care unit (NICU) setting, and compare measurement of safety culture scales using two different instruments (Safety Attitudes Questionnaire (SAQ) and Hospital Survey on Patient Safety Culture (HSOPSC)). Methods Cross-sectional survey study of a voluntary sample of 2073 (response rate 62.9%) health professionals in 44 NICUs. To compare survey instruments, we used Spearman's rank correlation coefficients. We also compared similar scales and items across the instruments using t tests and changes in quartile-level performance. Results We found significant variation across NICUs in safety and teamwork climate scales of SAQ and HSOPSC (p<0.001). Safety scales (safety climate and overall perception of safety) and teamwork scales (teamwork climate and teamwork within units) of the two instruments correlated strongly (safety r=0.72, p<0.001; teamwork r=0.67, p<0.001). However, the means and per cent agreements for all scale scores and even seemingly similar item scores were significantly different. In addition, comparisons of scale score quartiles between the two instruments revealed that half of the NICUs fell into different quartiles when translating between the instruments. Conclusions Large variation and opportunities for improvement in patient safety culture exist across NICUs. Important systematic differences exist between SAQ and HSOPSC such that these instruments should not be used interchangeably. PMID:26700545

  12. Debiasing Health-Related Judgments and Decision Making: A Systematic Review.

    PubMed

    Ludolph, Ramona; Schulz, Peter J

    2018-01-01

    Being confronted with uncertainty in the context of health-related judgments and decision making can give rise to the occurrence of systematic biases. These biases may detrimentally affect lay persons and health experts alike. Debiasing aims at mitigating these negative effects by eliminating or reducing the biases. However, little is known about its effectiveness. This study seeks to systematically review the research on health-related debiasing to identify new opportunities and challenges for successful debiasing strategies. A systematic search resulted in 2748 abstracts eligible for screening. Sixty-eight articles reporting 87 relevant studies met the predefined inclusion criteria and were categorized and analyzed with regard to content and quality. All steps were undertaken independently by 2 reviewers, and inconsistencies were resolved through discussion. The majority of debiasing interventions ( n = 60) was at least partially successful. Optimistic biases ( n = 25), framing effects ( n = 14), and base rate neglects ( n = 10) were the main targets of debiasing efforts. Cognitive strategies ( n = 36) such as "consider-the-opposite" and technological interventions ( n = 33) such as visual aids were mainly tested. Thirteen studies aimed at debiasing health care professionals' judgments, while 74 interventions addressed the general population. Studies' methodological quality ranged from 26.2% to 92.9%, with an average rating of 68.7%. In the past, the usefulness of debiasing was often debated. Yet most of the interventions reviewed here are found to be effective, pointing to the utility of debiasing in the health context. In particular, technological strategies offer a novel opportunity to pursue large-scale debiasing outside the laboratory. The need to strengthen the transfer of debiasing interventions to real-life settings and a lack of conceptual rigor are identified as the main challenges requiring further research.

  13. The accurate particle tracer code

    DOE PAGES

    Wang, Yulei; Liu, Jian; Qin, Hong; ...

    2017-07-20

    The Accurate Particle Tracer (APT) code is designed for systematic large-scale applications of geometric algorithms for particle dynamical simulations. Based on a large variety of advanced geometric algorithms, APT possesses long-term numerical accuracy and stability, which are critical for solving multi-scale and nonlinear problems. To provide a flexible and convenient I/O interface, the libraries of Lua and Hdf5 are used. Following a three-step procedure, users can efficiently extend the libraries of electromagnetic configurations, external non-electromagnetic forces, particle pushers, and initialization approaches by use of the extendible module. APT has been used in simulations of key physical problems, such as runawaymore » electrons in tokamaks and energetic particles in Van Allen belt. As an important realization, the APT-SW version has been successfully distributed on the world’s fastest computer, the Sunway TaihuLight supercomputer, by supporting master–slave architecture of Sunway many-core processors. Here, based on large-scale simulations of a runaway beam under parameters of the ITER tokamak, it is revealed that the magnetic ripple field can disperse the pitch-angle distribution significantly and improve the confinement of energetic runaway beam on the same time.« less

  14. The accurate particle tracer code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yulei; Liu, Jian; Qin, Hong

    The Accurate Particle Tracer (APT) code is designed for systematic large-scale applications of geometric algorithms for particle dynamical simulations. Based on a large variety of advanced geometric algorithms, APT possesses long-term numerical accuracy and stability, which are critical for solving multi-scale and nonlinear problems. To provide a flexible and convenient I/O interface, the libraries of Lua and Hdf5 are used. Following a three-step procedure, users can efficiently extend the libraries of electromagnetic configurations, external non-electromagnetic forces, particle pushers, and initialization approaches by use of the extendible module. APT has been used in simulations of key physical problems, such as runawaymore » electrons in tokamaks and energetic particles in Van Allen belt. As an important realization, the APT-SW version has been successfully distributed on the world’s fastest computer, the Sunway TaihuLight supercomputer, by supporting master–slave architecture of Sunway many-core processors. Here, based on large-scale simulations of a runaway beam under parameters of the ITER tokamak, it is revealed that the magnetic ripple field can disperse the pitch-angle distribution significantly and improve the confinement of energetic runaway beam on the same time.« less

  15. Screening for depression in arthritis populations: an assessment of differential item functioning in three self-reported questionnaires.

    PubMed

    Hu, Jinxiang; Ward, Michael M

    2017-09-01

    To determine if persons with arthritis differ systematically from persons without arthritis in how they respond to questions on three depression questionnaires, which include somatic items such as fatigue and sleep disturbance. We extracted data on the Centers for Epidemiological Studies Depression (CES-D) scale, the Patient Health Questionnaire-9 (PHQ-9), and the Kessler-6 (K-6) scale from three large population-based national surveys. We assessed items on these questionnaires for differential item functioning (DIF) between persons with and without self-reported physician-diagnosed arthritis using multiple indicator multiple cause models, which controlled for the underlying level of depression and important confounders. We also examined if DIF by arthritis status was similar between women and men. Although five items of the CES-D, one item of the PHQ-9, and five items of the K-6 scale had evidence of DIF based on statistical comparisons, the magnitude of each difference was less than the threshold of a small effect. The statistical differences were a function of the very large sample sizes in the surveys. Effect sizes for DIF were similar between women and men except for two items on the Patient Health Questionnaire-9. For each questionnaire, DIF accounted for 8% or less of the arthritis-depression association, and excluding items with DIF did not reduce the difference in depression scores between those with and without arthritis. Persons with arthritis respond to items on the CES-D, PHQ-9, and K-6 depression scales similarly to persons without arthritis, despite the inclusion of somatic items in these scales.

  16. A Comprehensive Critique and Review of Published Measures of Acne Severity

    PubMed Central

    Furber, Gareth; Leach, Matthew; Segal, Leonie

    2016-01-01

    Objective: Acne vulgaris is a dynamic, complex condition that is notoriously difficult to evaluate. The authors set out to critically evaluate currently available measures of acne severity, particularly in terms of suitability for use in clinical trials. Design: A systematic review was conducted to identify methods used to measure acne severity, using MEDLINE, CINAHL, Scopus, and Wiley Online. Each method was critically reviewed and given a score out of 13 based on eight quality criteria under two broad groupings of psychometric testing and suitability for research and evaluation. Results: Twenty-four methods for assessing acne severity were identified. Four scales received a quality score of zero, and 11 scored ≤3. The highest rated scales achieved a total score of 6. Six scales reported strong inter-rater reliability (ICC>0.75), and four reported strong intra-rater reliability (ICC>0.75). The poor overall performance of most scales, largely characterized by the absence of reliability testing or evidence for independent assessment and validation indicates that generally, their application in clinical trials is not supported. Conclusion: This review and appraisal of instruments for measuring acne severity supports previously identified concerns regarding the quality of published measures. It highlights the need for a valid and reliable acne severity scale, especially for use in research and evaluation. The ideal scale would demonstrate adequate validation and reliability and be easily implemented for third-party analysis. The development of such a scale is critical to interpreting results of trials and facilitating the pooling of results for systematic reviews and meta-analyses. PMID:27672410

  17. Experimental quiet engine program

    NASA Technical Reports Server (NTRS)

    Cornell, W. G.

    1975-01-01

    Full-scale low-tip-speed fans, a full-scale high-tip-speed fan, scale model versions of fans, and two full-scale high-bypass-ratio turbofan engines, were designed, fabricated, tested, and evaluated. Turbine noise suppression was investigated. Preliminary design studies of flight propulsion system concepts were used in application studies to determine acoustic-economic tradeoffs. Salient results are as follows: tradeoff evaluation of fan tip speed and blade loading; systematic data on source noise characteristics and suppression effectiveness; documentation of high- and low-fan-speed aerodynamic and acoustic technology; aerodynamic and acoustic evaluation of acoustic treatment configurations, casing tip bleed, serrated and variable pitch rotor blades, leaned outlet guide vanes, slotted tip casings, rotor blade shape modifications, and inlet noise suppression; systematic evaluation of aerodynamic and acoustic effects; flyover noise projections of engine test data; turbine noise suppression technology development; and tradeoff evaluation of preliminary design high-fan-speed and low-fan-speed flight engines.

  18. Calibrating First-Order Strong Lensing Mass Estimates in Clusters of Galaxies

    NASA Astrophysics Data System (ADS)

    Reed, Brendan; Remolian, Juan; Sharon, Keren; Li, Nan; SPT Clusters Cooperation

    2018-01-01

    We investigate methods to reduce the statistical and systematic errors inherent to using the Einstein Radius as a first-order mass estimate in strong lensing galaxy clusters. By finding an empirical universal calibration function, we aim to enable a first-order mass estimate of large cluster data sets in a fraction of the time and effort of full-scale strong lensing mass modeling. We use 74 simulated cluster data from the Argonne National Laboratory in a lens redshift slice of [0.159, 0.667] with various source redshifts in the range of [1.23, 2.69]. From the simulated density maps, we calculate the exact mass enclosed within the Einstein Radius. We find that the mass inferred from the Einstein Radius alone produces an error width of ~39% with respect to the true mass. We explore an array of polynomial and exponential correction functions with dependence on cluster redshift and projected radii of the lensed images, aiming to reduce the statistical and systematic uncertainty. We find that the error on the the mass inferred from the Einstein Radius can be reduced significantly by using a universal correction function. Our study has implications for current and future large galaxy cluster surveys aiming to measure cluster mass, and the mass-concentration relation.

  19. Plastome sequences and exploration of tree-space help to resolve the phylogeny of riceflowers (Thymelaeaceae: Pimelea).

    PubMed

    Foster, Charles S P; Henwood, Murray J; Ho, Simon Y W

    2018-05-25

    Data sets comprising small numbers of genetic markers are not always able to resolve phylogenetic relationships. This has frequently been the case in molecular systematic studies of plants, with many analyses being based on sequence data from only two or three chloroplast genes. An example of this comes from the riceflowers Pimelea Banks & Sol. ex Gaertn. (Thymelaeaceae), a large genus of flowering plants predominantly distributed in Australia. Despite the considerable morphological variation in the genus, low sequence divergence in chloroplast markers has led to the phylogeny of Pimelea remaining largely uncertain. In this study, we resolve the backbone of the phylogeny of Pimelea in comprehensive Bayesian and maximum-likelihood analyses of plastome sequences from 41 taxa. However, some relationships received only moderate to poor support, and the Pimelea clade contained extremely short internal branches. By using topology-clustering analyses, we demonstrate that conflicting phylogenetic signals can be found across the trees estimated from individual chloroplast protein-coding genes. A relaxed-clock dating analysis reveals that Pimelea arose in the mid-Miocene, with most divergences within the genus occurring during a subsequent rapid diversification. Our new phylogenetic estimate offers better resolution and is more strongly supported than previous estimates, providing a platform for future taxonomic revisions of both Pimelea and the broader subfamily. Our study has demonstrated the substantial improvements in phylogenetic resolution that can be achieved using plastome-scale data sets in plant molecular systematics. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. The Olympic Regeneration in East London (ORiEL) study: protocol for a prospective controlled quasi-experiment to evaluate the impact of urban regeneration on young people and their families.

    PubMed

    Smith, Neil R; Clark, Charlotte; Fahy, Amanda E; Tharmaratnam, Vanathi; Lewis, Daniel J; Thompson, Claire; Renton, Adrian; Moore, Derek G; Bhui, Kamaldeep S; Taylor, Stephanie J C; Eldridge, Sandra; Petticrew, Mark; Greenhalgh, Tricia; Stansfeld, Stephen A; Cummins, Steven

    2012-01-01

    Recent systematic reviews suggest that there is a dearth of evidence on the effectiveness of large-scale urban regeneration programmes in improving health and well-being and alleviating health inequalities. The development of the Olympic Park in Stratford for the London 2012 Olympic and Paralympic Games provides the opportunity to take advantage of a natural experiment to examine the impact of large-scale urban regeneration on the health and well-being of young people and their families. A prospective school-based survey of adolescents (11-12 years) with parent data collected through face-to-face interviews at home. Adolescents will be recruited from six randomly selected schools in an area receiving large-scale urban regeneration (London Borough of Newham) and compared with adolescents in 18 schools in three comparison areas with no equivalent regeneration (London Boroughs of Tower Hamlets, Hackney and Barking & Dagenham). Baseline data will be completed prior to the start of the London Olympics (July 2012) with follow-up at 6 and 18 months postintervention. Primary outcomes are: pre-post change in adolescent and parent mental health and well-being, physical activity and parental employment status. Secondary outcomes include: pre-post change in social cohesion, smoking, alcohol use, diet and body mass index. The study will account for individual and environmental contextual effects in evaluating changes to identified outcomes. A nested longitudinal qualitative study will explore families' experiences of regeneration in order to unpack the process by which regeneration impacts on health and well-being. The study has approval from Queen Mary University of London Ethics Committee (QMREC2011/40), the Association of Directors of Children's Services (RGE110927) and the London Boroughs Research Governance Framework (CERGF113). Fieldworkers have had advanced Criminal Records Bureau clearance. Findings will be disseminated through peer-reviewed publications, national and international conferences, through participating schools and the study website (http://www.orielproject.co.uk).

  1. Hypothesis exploration with visualization of variance

    PubMed Central

    2014-01-01

    Background The Consortium for Neuropsychiatric Phenomics (CNP) at UCLA was an investigation into the biological bases of traits such as memory and response inhibition phenotypes—to explore whether they are linked to syndromes including ADHD, Bipolar disorder, and Schizophrenia. An aim of the consortium was in moving from traditional categorical approaches for psychiatric syndromes towards more quantitative approaches based on large-scale analysis of the space of human variation. It represented an application of phenomics—wide-scale, systematic study of phenotypes—to neuropsychiatry research. Results This paper reports on a system for exploration of hypotheses in data obtained from the LA2K, LA3C, and LA5C studies in CNP. ViVA is a system for exploratory data analysis using novel mathematical models and methods for visualization of variance. An example of these methods is called VISOVA, a combination of visualization and analysis of variance, with the flavor of exploration associated with ANOVA in biomedical hypothesis generation. It permits visual identification of phenotype profiles—patterns of values across phenotypes—that characterize groups. Visualization enables screening and refinement of hypotheses about variance structure of sets of phenotypes. Conclusions The ViVA system was designed for exploration of neuropsychiatric hypotheses by interdisciplinary teams. Automated visualization in ViVA supports ‘natural selection’ on a pool of hypotheses, and permits deeper understanding of the statistical architecture of the data. Large-scale perspective of this kind could lead to better neuropsychiatric diagnostics. PMID:25097666

  2. Unravelling connections between river flow and large-scale climate: experiences from Europe

    NASA Astrophysics Data System (ADS)

    Hannah, D. M.; Kingston, D. G.; Lavers, D.; Stagge, J. H.; Tallaksen, L. M.

    2016-12-01

    The United Nations has identified better knowledge of large-scale water cycle processes as essential for socio-economic development and global water-food-energy security. In this context, and given the ever-growing concerns about climate change/ variability and human impacts on hydrology, there is an urgent research need: (a) to quantify space-time variability in regional river flow, and (b) to improve hydroclimatological understanding of climate-flow connections as a basis for identifying current and future water-related issues. In this paper, we draw together studies undertaken at the pan-European scale: (1) to evaluate current methods for assessing space-time dynamics for different streamflow metrics (annual regimes, low flows and high flows) and for linking flow variability to atmospheric drivers (circulation indices, air-masses, gridded climate fields and vapour flux); and (2) to propose a plan for future research connecting streamflow and the atmospheric conditions in Europe and elsewhere. We believe this research makes a useful, unique contribution to the literature through a systematic inter-comparison of different streamflow metrics and atmospheric descriptors. In our findings, we highlight the need to consider appropriate atmospheric descriptors (dependent on the target flow metric and region of interest) and to develop analytical techniques that best characterise connections in the ocean-atmosphere-land surface process chain. We call for the need to consider not only atmospheric interactions, but also the role of the river basin-scale terrestrial hydrological processes in modifying the climate signal response of river flows.

  3. Hemispherical and Longitudinal Asymmetries in the Heliospheric Magnetic Field: Flip-flops of a Bashful Ballerina

    NASA Astrophysics Data System (ADS)

    Hiltula, T.; Mursula, K.

    2004-12-01

    Several studies during many decennia have studied possible longitudinal and hemispherical asymmetries in various forms of solar activity. E.g., there are well known periods when one of the solar hemispheres has dominated the other in sunspot numbers, flare occurrence or some other form of solar activity. However, the solar asymmetries have not been found to be very conclusive, or to form any clear systematical patterns (e.g., relation to solar cycle). On the contrary, recent studies of similar longitudinal and hemispherical asymmetries in the heliospheric magnetic field have shown a very clear and systematic behaviour. E.g., it was found recently that the dominance of the two HMF sectors experiences an oscillation with a period of about 3.2 years. This new flip-flop periodicity in the heliospheric magnetic field is most likely related to a similar periodicity recently found in sunspots. Also, it has recently been found that the HMF sector coming from the northern solar hemisphere systematically dominates at 1AU during solar minimum times. This leads to a persistent southward shift or coning of the heliospheric current sheet at these times that can be picturesquely described by the concept of a Bashful Ballerina. This result also implies that the Sun has a large-scale quadrupole magnetic moment. Here we review these recent developments concerning the longitudinal and hemispherical asymmetries in the heliospheric magnetic field and study their inter-connection.

  4. Robust phenotyping strategies for evaluation of stem non-structural carbohydrates (NSC) in rice.

    PubMed

    Wang, Diane R; Wolfrum, Edward J; Virk, Parminder; Ismail, Abdelbagi; Greenberg, Anthony J; McCouch, Susan R

    2016-11-01

    Rice plants (Oryza sativa) accumulate excess photoassimilates in the form of non-structural carbohydrates (NSCs) in their stems prior to heading that can later be mobilized to supplement photosynthate production during grain-filling. Despite longstanding interest in stem NSC for rice improvement, the dynamics of NSC accumulation, remobilization, and re-accumulation that have genetic potential for optimization have not been systematically investigated. Here we conducted three pilot experiments to lay the groundwork for large-scale diversity studies on rice stem NSC. We assessed the relationship of stem NSC components with 21 agronomic traits in large-scale, tropical yield trials using 33 breeder-nominated lines, established an appropriate experimental design for future genetic studies using a Bayesian framework to sample sub-datasets from highly replicated greenhouse data using 36 genetically diverse genotypes, and used 434 phenotypically divergent rice stem samples to develop two partial least-squares (PLS) models using near-infrared (NIR) spectra for accurate, rapid prediction of rice stem starch, sucrose, and total non-structural carbohydrates. We find evidence that stem reserves are most critical for short-duration varieties and suggest that pre-heading stem NSC is worthy of further experimentation for breeding early maturing rice. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  5. Design and methodology of a mixed methods follow-up study to the 2014 Ghana Demographic and Health Survey.

    PubMed

    Staveteig, Sarah; Aryeetey, Richmond; Anie-Ansah, Michael; Ahiadeke, Clement; Ortiz, Ladys

    2017-01-01

    The intended meaning behind responses to standard questions posed in large-scale health surveys are not always well understood. Systematic follow-up studies, particularly those which pose a few repeated questions followed by open-ended discussions, are well positioned to gauge stability and consistency of data and to shed light on the intended meaning behind survey responses. Such follow-up studies require extensive coordination and face challenges in protecting respondent confidentiality during the process of recontacting and reinterviewing participants. We describe practical field strategies for undertaking a mixed methods follow-up study during a large-scale health survey. The study was designed as a mixed methods follow-up study embedded within the 2014 Ghana Demographic and Health Survey (GDHS). The study was implemented in 13 clusters. Android tablets were used to import reference data from the parent survey and to administer the questionnaire, which asked a mixture of closed- and open-ended questions on reproductive intentions, decision-making, and family planning. Despite a number of obstacles related to recontacting respondents and concern about respondent fatigue, over 92 percent of the selected sub-sample were successfully recontacted and reinterviewed; all consented to audio recording. A confidential linkage between GDHS data, follow-up tablet data, and audio transcripts was successfully created for the purpose of analysis. We summarize the challenges in follow-up study design, including ethical considerations, sample size, auditing, filtering, successful use of tablets, and share lessons learned for future such follow-up surveys.

  6. Interplay between Functional Connectivity and Scale-Free Dynamics in Intrinsic fMRI Networks

    PubMed Central

    Ciuciu, Philippe; Abry, Patrice; He, Biyu J.

    2014-01-01

    Studies employing functional connectivity-type analyses have established that spontaneous fluctuations in functional magnetic resonance imaging (fMRI) signals are organized within large-scale brain networks. Meanwhile, fMRI signals have been shown to exhibit 1/f-type power spectra – a hallmark of scale-free dynamics. We studied the interplay between functional connectivity and scale-free dynamics in fMRI signals, utilizing the fractal connectivity framework – a multivariate extension of the univariate fractional Gaussian noise model, which relies on a wavelet formulation for robust parameter estimation. We applied this framework to fMRI data acquired from healthy young adults at rest and performing a visual detection task. First, we found that scale-invariance existed beyond univariate dynamics, being present also in bivariate cross-temporal dynamics. Second, we observed that frequencies within the scale-free range do not contribute evenly to inter-regional connectivity, with a systematically stronger contribution of the lowest frequencies, both at rest and during task. Third, in addition to a decrease of the Hurst exponent and inter-regional correlations, task performance modified cross-temporal dynamics, inducing a larger contribution of the highest frequencies within the scale-free range to global correlation. Lastly, we found that across individuals, a weaker task modulation of the frequency contribution to inter-regional connectivity was associated with better task performance manifesting as shorter and less variable reaction times. These findings bring together two related fields that have hitherto been studied separately – resting-state networks and scale-free dynamics, and show that scale-free dynamics of human brain activity manifest in cross-regional interactions as well. PMID:24675649

  7. Comparison of spatio-temporal resolution of different flow measurement techniques for marine renewable energy applications

    NASA Astrophysics Data System (ADS)

    Lyon, Vincent; Wosnik, Martin

    2013-11-01

    Marine hydrokinetic (MHK) energy conversion devices are subject to a wide range of turbulent scales, either due to upstream bathymetry, obstacles and waves, or from wakes of upstream devices in array configurations. The commonly used, robust Acoustic Doppler Current Profilers (ADCP) are well suited for long term flow measurements in the marine environment, but are limited to low sampling rates due to their operational principle. The resulting temporal and spatial resolution is insufficient to measure all turbulence scales of interest to the device, e.g., ``blade-scale turbulence.'' The present study systematically characterizes the spatial and temporal resolution of ADCP, Acoustic Doppler Velocimetry (ADV), and Particle Image Velocimetry (PIV). Measurements were conducted in a large cross section tow tank (3.7m × 2.4m) for several benchmark cases, including low and high turbulence intensity uniform flow as well as in the wake of a cylinder, to quantitatively investigate the flow scales which each of the instruments can resolve. The purpose of the study is to supply data for mathematical modeling to improve predictions from ADCP measurements, which can help lead to higher-fidelity energy resource assessment and more accurate device evaluation, including wake measurements. Supported by NSF-CBET grant 1150797.

  8. Sensitivity simulations of superparameterised convection in a general circulation model

    NASA Astrophysics Data System (ADS)

    Rybka, Harald; Tost, Holger

    2015-04-01

    Cloud Resolving Models (CRMs) covering a horizontal grid spacing from a few hundred meters up to a few kilometers have been used to explicitly resolve small-scale and mesoscale processes. Special attention has been paid to realistically represent cloud dynamics and cloud microphysics involving cloud droplets, ice crystals, graupel and aerosols. The entire variety of physical processes on the small-scale interacts with the larger-scale circulation and has to be parameterised on the coarse grid of a general circulation model (GCM). Since more than a decade an approach to connect these two types of models which act on different scales has been developed to resolve cloud processes and their interactions with the large-scale flow. The concept is to use an ensemble of CRM grid cells in a 2D or 3D configuration in each grid cell of the GCM to explicitly represent small-scale processes avoiding the use of convection and large-scale cloud parameterisations which are a major source for uncertainties regarding clouds. The idea is commonly known as superparameterisation or cloud-resolving convection parameterisation. This study presents different simulations of an adapted Earth System Model (ESM) connected to a CRM which acts as a superparameterisation. Simulations have been performed with the ECHAM/MESSy atmospheric chemistry (EMAC) model comparing conventional GCM runs (including convection and large-scale cloud parameterisations) with the improved superparameterised EMAC (SP-EMAC) modeling one year with prescribed sea surface temperatures and sea ice content. The sensitivity of atmospheric temperature, precipiation patterns, cloud amount and types is observed changing the embedded CRM represenation (orientation, width, no. of CRM cells, 2D vs. 3D). Additionally, we also evaluate the radiation balance with the new model configuration, and systematically analyse the impact of tunable parameters on the radiation budget and hydrological cycle. Furthermore, the subgrid variability (individual CRM cell output) is analysed in order to illustrate the importance of a highly varying atmospheric structure inside a single GCM grid box. Finally, the convective transport of Radon is observed comparing different transport procedures and their influence on the vertical tracer distribution.

  9. A combinatorial code for pattern formation in Drosophila oogenesis.

    PubMed

    Yakoby, Nir; Bristow, Christopher A; Gong, Danielle; Schafer, Xenia; Lembong, Jessica; Zartman, Jeremiah J; Halfon, Marc S; Schüpbach, Trudi; Shvartsman, Stanislav Y

    2008-11-01

    Two-dimensional patterning of the follicular epithelium in Drosophila oogenesis is required for the formation of three-dimensional eggshell structures. Our analysis of a large number of published gene expression patterns in the follicle cells suggests that they follow a simple combinatorial code based on six spatial building blocks and the operations of union, difference, intersection, and addition. The building blocks are related to the distribution of inductive signals, provided by the highly conserved epidermal growth factor receptor and bone morphogenetic protein signaling pathways. We demonstrate the validity of the code by testing it against a set of patterns obtained in a large-scale transcriptional profiling experiment. Using the proposed code, we distinguish 36 distinct patterns for 81 genes expressed in the follicular epithelium and characterize their joint dynamics over four stages of oogenesis. The proposed combinatorial framework allows systematic analysis of the diversity and dynamics of two-dimensional transcriptional patterns and guides future studies of gene regulation.

  10. Observing the Cosmic Microwave Background Polarization with Variable-delay Polarization Modulators for the Cosmology Large Angular Scale Surveyor

    NASA Astrophysics Data System (ADS)

    Harrington, Kathleen; CLASS Collaboration

    2018-01-01

    The search for inflationary primordial gravitational waves and the optical depth to reionization, both through their imprint on the large angular scale correlations in the polarization of the cosmic microwave background (CMB), has created the need for high sensitivity measurements of polarization across large fractions of the sky at millimeter wavelengths. These measurements are subjected to instrumental and atmospheric 1/f noise, which has motivated the development of polarization modulators to facilitate the rejection of these large systematic effects.Variable-delay polarization modulators (VPMs) are used in the Cosmology Large Angular Scale Surveyor (CLASS) telescopes as the first element in the optical chain to rapidly modulate the incoming polarization. VPMs consist of a linearly polarizing wire grid in front of a moveable flat mirror; varying the distance between the grid and the mirror produces a changing phase shift between polarization states parallel and perpendicular to the grid which modulates Stokes U (linear polarization at 45°) and Stokes V (circular polarization). The reflective and scalable nature of the VPM enables its placement as the first optical element in a reflecting telescope. This simultaneously allows a lock-in style polarization measurement and the separation of sky polarization from any instrumental polarization farther along in the optical chain.The Q-Band CLASS VPM was the first VPM to begin observing the CMB full time in 2016. I will be presenting its design and characterization as well as demonstrating how modulating polarization significantly rejects atmospheric and instrumental long time scale noise.

  11. The Dietary Quality of Food Pantry Users: A Systematic Review of Existing Literature.

    PubMed

    Simmet, Anja; Depa, Julia; Tinnemann, Peter; Stroebele-Benschop, Nanette

    2017-04-01

    Users of food pantries often have a long history of food insecurity and may be vulnerable to nutritional deficiencies. The quality of their diets is not well researched. The purpose of this systematic review was to summarize the published evidence about the dietary quality of food pantry users. Systematic database searches of PubMed, PsycINFO, PsycARTICLES, and Psychology Behavioral Sciences Collection, and hand searches of references were conducted to identify cross-sectional, cohort, and intervention studies reporting baseline data, conducted in high-income countries and published between 1980 and 2015, which reported on the nutritional adequacy of individuals who have used a food pantry at least once in the previous 12 months. All identified citations were screened and independently assessed for eligibility. Results for dietary quality were summarized for overall diet quality, energy, food groups, macro- and micronutrients separately. The risk of bias of included studies was evaluated by using criteria of an adapted Ottawa Scale. The systematic review was reported in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. After applying predefined eligibility criteria, 16 articles were identified for inclusion. The diet quality among included food pantry users was low, as reflected by inadequate mean group intake of energy, fruits and vegetables, dairy products, and calcium. Even if the group mean intake was adequate, large percentages of study populations did not meet the recommendations for vitamins A, C, D, and B vitamins, or iron, magnesium, and zinc. The representativeness of the studies varied widely and none of them were nationally representative. The current evidence suggests that the dietary intake of most food pantry users does not meet recommendations. Future research should draw more representative samples and investigate the impact of food pantries on users' diet. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  12. Understanding Listening Competency: A Systematic Review of Research Scales

    ERIC Educational Resources Information Center

    Fontana, Peter C.; Cohen, Steven D.; Wolvin, Andrew D.

    2015-01-01

    To better understand what constitutes listening competency, we perform a systematic review of listening scales. Our goal was twofold: to determine the most commonly appearing listening traits and to determine if listening scales are similar to one other. As part of our analysis, we identified 53 relevant scales and analyzed the scales…

  13. Deficiency of ''Thin'' Stellar Bars in Seyfert Host Galaxies

    NASA Technical Reports Server (NTRS)

    Shlosman, Isaac; Peletier, Reynier F.; Knapen, Johan

    1999-01-01

    Using all available major samples of Seyfert galaxies and their corresponding control samples of closely matched non-active galaxies, we find that the bar ellipticities (or axial ratios) in Seyfert galaxies are systematically different from those in non-active galaxies. Overall, there is a deficiency of bars with large ellipticities (i.e., 'fat' or 'weak' bars) in Seyferts, compared to non-active galaxies. Accompanied with a large dispersion due to small number statistics, this effect is strictly speaking at the 2 sigma level. To obtain this result, the active galaxy samples of near-infrared surface photometry were matched to those of normal galaxies in type, host galaxy ellipticity, absolute magnitude, and, to some extent, in redshift. We discuss possible theoretical explanations of this phenomenon within the framework of galactic evolution, and, in particular, of radial gas redistribution in barred galaxies. Our conclusions provide further evidence that Seyfert hosts differ systematically from their non-active counterparts on scales of a few kpc.

  14. Complementary codes for odor identity and intensity in olfactory cortex

    PubMed Central

    Bolding, Kevin A; Franks, Kevin M

    2017-01-01

    The ability to represent both stimulus identity and intensity is fundamental for perception. Using large-scale population recordings in awake mice, we find distinct coding strategies facilitate non-interfering representations of odor identity and intensity in piriform cortex. Simply knowing which neurons were activated is sufficient to accurately represent odor identity, with no additional information about identity provided by spike time or spike count. Decoding analyses indicate that cortical odor representations are not sparse. Odorant concentration had no systematic effect on spike counts, indicating that rate cannot encode intensity. Instead, odor intensity can be encoded by temporal features of the population response. We found a subpopulation of rapid, largely concentration-invariant responses was followed by another population of responses whose latencies systematically decreased at higher concentrations. Cortical inhibition transforms olfactory bulb output to sharpen these dynamics. Our data therefore reveal complementary coding strategies that can selectively represent distinct features of a stimulus. DOI: http://dx.doi.org/10.7554/eLife.22630.001 PMID:28379135

  15. Systematic review of torrefied wood economics

    Treesearch

    Robert I. Radics; Ronalds Gonzalez; Edward M. (Ted) Bilek; Stephen S. Kelley

    2017-01-01

    This literature review aims to provide a systematic analysis of studies on the financial aspects of producing torrefied biomass and torrefied pellets. There are substantial differences in the specific technologies, operating conditions, scale of the demonstration, and properties of biomass feedstock. There is a lack of reports that consider the entire supply chain,...

  16. [Rhabdomyosarcoma of soft palate. A case on purpose].

    PubMed

    Arias Marzán, F; De Bonis Redondo, M; Redondo Ventura, F; Betancor Martínez, L; Sanginés Yzzo, M; Arias Marzán, J; De Bonis Braun, C; Zurita Expósito, V; Reig Ripoll, F; De Lucas Carmona, G

    2006-01-01

    The rabdomiosarcoma (RMS) are infrequent tumors. They are principally described in infancy and located in 35% of the cases in head and neck. The nasopharynx localisation is relatively rare, being in these cases the tongue, palate and oral mucosa the preferent places of establishment. Classically the patient presented very low standard healing with surgery and radiotherapy. The introduction in the middle 70 of systematic chimiotherapy as complementary treatment, improved the survival rate in large scale. In this article the case of an adolescent patient, who presented a RMS at the level of the soft palate, the diagnostic procedure and the therapeutic decision adopted, after revision of the last studies at this respect, are described.

  17. Systematic Benchmarking of Diagnostic Technologies for an Electrical Power System

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Jensen, David; Poll, Scott

    2009-01-01

    Automated health management is a critical functionality for complex aerospace systems. A wide variety of diagnostic algorithms have been developed to address this technical challenge. Unfortunately, the lack of support to perform large-scale V&V (verification and validation) of diagnostic technologies continues to create barriers to effective development and deployment of such algorithms for aerospace vehicles. In this paper, we describe a formal framework developed for benchmarking of diagnostic technologies. The diagnosed system is the Advanced Diagnostics and Prognostics Testbed (ADAPT), a real-world electrical power system (EPS), developed and maintained at the NASA Ames Research Center. The benchmarking approach provides a systematic, empirical basis to the testing of diagnostic software and is used to provide performance assessment for different diagnostic algorithms.

  18. The What and How of Prefrontal Cortical Organization

    PubMed Central

    O’Reilly, Randall C.

    2010-01-01

    How is the prefrontal cortex (PFC) organized such that it is capable of making people more flexible and in control of their behavior? Is there any systematic organization across the many diverse areas that comprise the PFC, or is it uniquely adaptive such that no fixed representation structure can develop? Going against the current tide, this paper argues that there is indeed a systematic organization across PFC areas, with an important functional distinction between ventral and dorsal regions characterized as processing What vs. How information, respectively. This distinction has implications for the rostro-caudal and medial-lateral axes of organization as well. The resulting large-scale functional map of PFC may prove useful in integrating diverse data, and generating novel predictions. PMID:20573407

  19. Evaluation of Bias-Variance Trade-Off for Commonly Used Post-Summarizing Normalization Procedures in Large-Scale Gene Expression Studies

    PubMed Central

    Qiu, Xing; Hu, Rui; Wu, Zhixin

    2014-01-01

    Normalization procedures are widely used in high-throughput genomic data analyses to remove various technological noise and variations. They are known to have profound impact to the subsequent gene differential expression analysis. Although there has been some research in evaluating different normalization procedures, few attempts have been made to systematically evaluate the gene detection performances of normalization procedures from the bias-variance trade-off point of view, especially with strong gene differentiation effects and large sample size. In this paper, we conduct a thorough study to evaluate the effects of normalization procedures combined with several commonly used statistical tests and MTPs under different configurations of effect size and sample size. We conduct theoretical evaluation based on a random effect model, as well as simulation and biological data analyses to verify the results. Based on our findings, we provide some practical guidance for selecting a suitable normalization procedure under different scenarios. PMID:24941114

  20. A general path for large-scale solubilization of cellular proteins: From membrane receptors to multiprotein complexes

    PubMed Central

    Pullara, Filippo; Guerrero-Santoro, Jennifer; Calero, Monica; Zhang, Qiangmin; Peng, Ye; Spåhr, Henrik; Kornberg, Guy L.; Cusimano, Antonella; Stevenson, Hilary P.; Santamaria-Suarez, Hugo; Reynolds, Shelley L.; Brown, Ian S.; Monga, Satdarshan P.S.; Van Houten, Bennett; Rapić-Otrin, Vesna; Calero, Guillermo; Levine, Arthur S.

    2014-01-01

    Expression of recombinant proteins in bacterial or eukaryotic systems often results in aggregation rendering them unavailable for biochemical or structural studies. Protein aggregation is a costly problem for biomedical research. It forces research laboratories and the biomedical industry to search for alternative, more soluble, non-human proteins and limits the number of potential “druggable” targets. In this study we present a highly reproducible protocol that introduces the systematic use of an extensive number of detergents to solubilize aggregated proteins expressed in bacterial and eukaryotic systems. We validate the usefulness of this protocol by solubilizing traditionally difficult human protein targets to milligram quantities and confirm their biological activity. We use this method to solubilize monomeric or multimeric components of multi-protein complexes and demonstrate its efficacy to reconstitute large cellular machines. This protocol works equally well on cytosolic, nuclear and membrane proteins and can be easily adapted to a high throughput format. PMID:23137940

  1. ['Walkability' and physical activity - results of empirical studies based on the 'Neighbourhood Environment Walkability Scale (NEWS)'].

    PubMed

    Rottmann, M; Mielck, A

    2014-02-01

    'Walkability' is mainly assessed by the NEWS questionnaire (Neighbourhood Environment Walkability Scale); in Germany this questionnaire is widely unknown. We now try to fill this gap by providing a systematic overview of empirical studies based on the NEWS. A systematic review was conducted concerning original papers including empirical analyses based on the NEWS. The results are summarised and presented in tables. Altogether 31 publications could be identified. Most of them focus on associations with the variable 'physical activity', and they often report significant associations with at least some of the scales included in the NEWS. Due to methodological differences between the studies it is difficult to compare the results. The concept of 'walkability' should also be established in the German public health discussion. A number of methodological challenges remain to be solved, such as the identification of those scales and items in the NEWS that show the strongest associations with individual health behaviours. © Georg Thieme Verlag KG Stuttgart · New York.

  2. Aromatherapy for managing menopausal symptoms

    PubMed Central

    Choi, Jiae; Lee, Hye Won; Lee, Ju Ah; Lim, Hyun-Ja; Lee, Myeong Soo

    2018-01-01

    Abstract Background: Aromatherapy is often used as a complementary therapy for women's health. This systematic review aims to evaluate the therapeutic effects of aromatherapy as a management for menopausal symptoms. Methods: Eleven electronic databases will be searched from inception to February 2018. Randomized controlled trials that evaluated any type of aromatherapy against any type of control in individuals with menopausal symptoms will be eligible. The methodological quality will be assessed using the Cochrane risk of bias tool. Two authors will independently assess each study for eligibility and risk of bias and to extract data. Results: This study will provide a high quality synthesis of current evidence of aromatherapy for menopausal symptoms measured with Menopause Rating Scale, the Kupperman Index, the Greene Climacteric Scale, or other validated questionnaires. Conclusions: The conclusion of our systematic review will provide evidence to judge whether aromatherapy is an effective intervention for patient with menopausal women. Ethics and dissemination: Ethical approval will not be required, given that this protocol is for a systematic review. The systematic review will be published in a peer-reviewed journal. The review will also be disseminated electronically and in print. Systematic review registration: PROSPERO CRD42017079191. PMID:29419673

  3. Rain Characteristics and Large-Scale Environments of Precipitation Objects with Extreme Rain Volumes from TRMM Observations

    NASA Technical Reports Server (NTRS)

    Zhou, Yaping; Lau, William K M.; Liu, Chuntao

    2013-01-01

    This study adopts a "precipitation object" approach by using 14 years of Tropical Rainfall Measuring Mission (TRMM) Precipitation Feature (PF) and National Centers for Environmental Prediction (NCEP) reanalysis data to study rainfall structure and environmental factors associated with extreme heavy rain events. Characteristics of instantaneous extreme volumetric PFs are examined and compared to those of intermediate and small systems. It is found that instantaneous PFs exhibit a much wider scale range compared to the daily gridded precipitation accumulation range. The top 1% of the rainiest PFs contribute over 55% of total rainfall and have 2 orders of rain volume magnitude greater than those of the median PFs. We find a threshold near the top 10% beyond which the PFs grow exponentially into larger, deeper, and colder rain systems. NCEP reanalyses show that midlevel relative humidity and total precipitable water increase steadily with increasingly larger PFs, along with a rapid increase of 500 hPa upward vertical velocity beyond the top 10%. This provides the necessary moisture convergence to amplify and sustain the extreme events. The rapid increase in vertical motion is associated with the release of convective available potential energy (CAPE) in mature systems, as is evident in the increase in CAPE of PFs up to 10% and the subsequent dropoff. The study illustrates distinct stages in the development of an extreme rainfall event including: (1) a systematic buildup in large-scale temperature and moisture, (2) a rapid change in rain structure, (3) explosive growth of the PF size, and (4) a release of CAPE before the demise of the event.

  4. Systematic Phenotyping of a Large-Scale Candida glabrata Deletion Collection Reveals Novel Antifungal Tolerance Genes

    PubMed Central

    Hiller, Ekkehard; Istel, Fabian; Tscherner, Michael; Brunke, Sascha; Ames, Lauren; Firon, Arnaud; Green, Brian; Cabral, Vitor; Marcet-Houben, Marina; Jacobsen, Ilse D.; Quintin, Jessica; Seider, Katja; Frohner, Ingrid; Glaser, Walter; Jungwirth, Helmut; Bachellier-Bassi, Sophie; Chauvel, Murielle; Zeidler, Ute; Ferrandon, Dominique; Gabaldón, Toni; Hube, Bernhard; d'Enfert, Christophe; Rupp, Steffen; Cormack, Brendan; Haynes, Ken; Kuchler, Karl

    2014-01-01

    The opportunistic fungal pathogen Candida glabrata is a frequent cause of candidiasis, causing infections ranging from superficial to life-threatening disseminated disease. The inherent tolerance of C. glabrata to azole drugs makes this pathogen a serious clinical threat. To identify novel genes implicated in antifungal drug tolerance, we have constructed a large-scale C. glabrata deletion library consisting of 619 unique, individually bar-coded mutant strains, each lacking one specific gene, all together representing almost 12% of the genome. Functional analysis of this library in a series of phenotypic and fitness assays identified numerous genes required for growth of C. glabrata under normal or specific stress conditions, as well as a number of novel genes involved in tolerance to clinically important antifungal drugs such as azoles and echinocandins. We identified 38 deletion strains displaying strongly increased susceptibility to caspofungin, 28 of which encoding proteins that have not previously been linked to echinocandin tolerance. Our results demonstrate the potential of the C. glabrata mutant collection as a valuable resource in functional genomics studies of this important fungal pathogen of humans, and to facilitate the identification of putative novel antifungal drug target and virulence genes. PMID:24945925

  5. Present and Future Redshift Surveys: ORS, DOGS and 2dF

    NASA Astrophysics Data System (ADS)

    Lahav, O.

    Three galaxy redshifts surveys and their analyses are discussed. (i) The recently completed Optical Redshift Survey (ORS) includes galaxies larger than 1.9 arcmin and/or brighter than $14.5^m$. It provides redshifts for $\\sim 8300 $ galaxies at Galactic latitude $|b|>20^o$. A new analysis of the survey explores the existence and extent of the Supergalactic Plane (SGP). Its orientation is found to be in good agreement with the standard SGP coordinates, and suggests that the SGP is at least as large as the survey (16000 km/sec in diameter). (ii) The Dwingeloo Obscured Galaxy Survey is aimed at finding galaxies hidden behind the Milky-Way using a blind search in 21 cm. The discovery of Dwingeloo1 illustrates that the survey will allow us to systematically survey the region $30^o < l < 200^o$ out to 4000 km/sec. (iii) The Anglo-Australian 2-degree-Field (2dF) survey will yield 250,000 redshifts for APM-selected galaxies brighter than $19.5^m$ to map the large scale structure on scales larger than $\\sim 30 \\Mpc$. To study morphological segregation and biasing the spectra will be classified using Artificial Neural Networks.

  6. Universal dimer–dimer scattering in lattice effective field theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elhatisari, Serdar; Katterjohn, Kris; Lee, Dean

    We consider two-component fermions with short-range interactions and large scattering length. This system has universal properties that are realized in several different fields of physics. In the limit of large fermion–fermion scattering length a ff and zero-range interaction, all properties of the system scale proportionally with a ff. For the case with shallow bound dimers, we calculate the dimer–dimer scattering phase shifts using lattice effective field theory. We extract the universal dimer–dimer scattering length a dd/a ff=0.618(30) and effective range r dd/a ff=-0.431(48). This result for the effective range is the first calculation with quantified and controlled systematic errors. Wemore » also benchmark our methods by computing the fermion–dimer scattering parameters and testing some predictions of conformal scaling of irrelevant operators near the unitarity limit.« less

  7. Are large-scale flow experiments informing the science and management of freshwater ecosystems?

    USGS Publications Warehouse

    Olden, Julian D.; Konrad, Christopher P.; Melis, Theodore S.; Kennard, Mark J.; Freeman, Mary C.; Mims, Meryl C.; Bray, Erin N.; Gido, Keith B.; Hemphill, Nina P.; Lytle, David A.; McMullen, Laura E.; Pyron, Mark; Robinson, Christopher T.; Schmidt, John C.; Williams, John G.

    2013-01-01

    Greater scientific knowledge, changing societal values, and legislative mandates have emphasized the importance of implementing large-scale flow experiments (FEs) downstream of dams. We provide the first global assessment of FEs to evaluate their success in advancing science and informing management decisions. Systematic review of 113 FEs across 20 countries revealed that clear articulation of experimental objectives, while not universally practiced, was crucial for achieving management outcomes and changing dam-operating policies. Furthermore, changes to dam operations were three times less likely when FEs were conducted primarily for scientific purposes. Despite the recognized importance of riverine flow regimes, four-fifths of FEs involved only discrete flow events. Over three-quarters of FEs documented both abiotic and biotic outcomes, but only one-third examined multiple taxonomic responses, thus limiting how FE results can inform holistic dam management. Future FEs will present new opportunities to advance scientifically credible water policies.

  8. Higgs-boson production at small transverse momentum

    NASA Astrophysics Data System (ADS)

    Becher, Thomas; Neubert, Matthias; Wilhelm, Daniel

    2013-05-01

    Using methods from effective field theory, we have recently developed a novel, systematic framework for the calculation of the cross sections for electroweak gauge-boson production at small and very small transverse momentum q T , in which large logarithms of the scale ratio m V / q T are resummed to all orders. This formalism is applied to the production of Higgs bosons in gluon fusion at the LHC. The production cross section receives logarithmically enhanced corrections from two sources: the running of the hard matching coefficient and the collinear factorization anomaly. The anomaly leads to the dynamical generation of a non-perturbative scale {q_{*}}tilde{mkern6mu} {m_H}{e^{{{{{-const}} / {{{α_s}( {{m_H}} )}} .}}}}≈ 8 GeV, which protects the process from receiving large long-distance hadronic contributions. We present numerical predictions for the transverse-momentum spectrum of Higgs bosons produced at the LHC, finding that it is quite insensitive to hadronic effects.

  9. Satellite measurements of large-scale air pollution - Methods

    NASA Technical Reports Server (NTRS)

    Kaufman, Yoram J.; Ferrare, Richard A.; Fraser, Robert S.

    1990-01-01

    A technique for deriving large-scale pollution parameters from NIR and visible satellite remote-sensing images obtained over land or water is described and demonstrated on AVHRR images. The method is based on comparison of the upward radiances on clear and hazy days and permits simultaneous determination of aerosol optical thickness with error Delta tau(a) = 0.08-0.15, particle size with error + or - 100-200 nm, and single-scattering albedo with error + or - 0.03 (for albedos near 1), all assuming accurate and stable satellite calibration and stable surface reflectance between the clear and hazy days. In the analysis of AVHRR images of smoke from a forest fire, good agreement was obtained between satellite and ground-based (sun-photometer) measurements of aerosol optical thickness, but the satellite particle sizes were systematically greater than those measured from the ground. The AVHRR single-scattering albedo agreed well with a Landsat albedo for the same smoke.

  10. Universal dimer–dimer scattering in lattice effective field theory

    DOE PAGES

    Elhatisari, Serdar; Katterjohn, Kris; Lee, Dean; ...

    2017-03-14

    We consider two-component fermions with short-range interactions and large scattering length. This system has universal properties that are realized in several different fields of physics. In the limit of large fermion–fermion scattering length a ff and zero-range interaction, all properties of the system scale proportionally with a ff. For the case with shallow bound dimers, we calculate the dimer–dimer scattering phase shifts using lattice effective field theory. We extract the universal dimer–dimer scattering length a dd/a ff=0.618(30) and effective range r dd/a ff=-0.431(48). This result for the effective range is the first calculation with quantified and controlled systematic errors. Wemore » also benchmark our methods by computing the fermion–dimer scattering parameters and testing some predictions of conformal scaling of irrelevant operators near the unitarity limit.« less

  11. Exploring the Large Scale Anisotropy in the Cosmic Microwave Background Radiation at 170 GHz

    NASA Astrophysics Data System (ADS)

    Ganga, Kenneth Matthew

    1994-01-01

    In this thesis, data from the Far Infra-Red Survey (FIRS), a balloon-borne experiment designed to measure the large scale anisotropy in the cosmic microwave background radiation, are analyzed. The FIRS operates in four frequency bands at 170, 280, 480, and 670 GHz, using an approximately Gaussian beam with a 3.8 deg full-width-at-half-maximum. A cross-correlation with the COBE/DMR first-year maps yields significant results, confirming the DMR detection of anisotropy in the cosmic microwave background radiation. Analysis of the FIRS data alone sets bounds on the amplitude of anisotropy under the assumption that the fluctuations are described by a Harrison-Peebles-Zel'dovich spectrum and further analysis sets limits on the index of the primordial density fluctuations for an Einstein-DeSitter universe. Galactic dust emission is discussed and limits are set on the magnitude of possible systematic errors in the measurement.

  12. Superwind Outflows in Seyfert Galaxies? : Large-Scale Radio Maps of an Edge-On Sample

    NASA Astrophysics Data System (ADS)

    Colbert, E.; Gallimore, J.; Baum, S.; O'Dea, C.

    1995-03-01

    Large-scale galactic winds (superwinds) are commonly found flowing out of the nuclear region of ultraluminous infrared and powerful starburst galaxies. Stellar winds and supernovae from the nuclear starburst provide the energy to drive these superwinds. The outflowing gas escapes along the rotation axis, sweeping up and shock-heating clouds in the halo, which produces optical line emission, radio synchrotron emission, and X-rays. These features can most easily be studied in edge-on systems, so that the wind emission is not confused by that from the disk. We have begun a systematic search for superwind outflows in Seyfert galaxies. In an earlier optical emission-line survey, we found extended minor axis emission and/or double-peaked emission line profiles in >~30% of the sample objects. We present here large-scale (6cm VLA C-config) radio maps of 11 edge-on Seyfert galaxies, selected (without bias) from a distance-limited sample of 23 edge-on Seyferts. These data have been used to estimate the frequency of occurrence of superwinds. Preliminary results indicate that four (36%) of the 11 objects observed and six (26%) of the 23 objects in the distance-limited sample have extended radio emission oriented perpendicular to the galaxy disk. This emission may be produced by a galactic wind blowing out of the disk. Two (NGC 2992 and NGC 5506) of the nine objects for which we have both radio and optical data show good evidence for a galactic wind in both datasets. We suggest that galactic winds occur in >~30% of all Seyferts. A goal of this work is to find a diagnostic that can be used to distinguish between large-scale outflows that are driven by starbursts and those that are driven by an AGN. The presence of starburst-driven superwinds in Seyferts, if established, would have important implications for the connection between starburst galaxies and AGN.

  13. The Human Blood Metabolome-Transcriptome Interface

    PubMed Central

    Schramm, Katharina; Adamski, Jerzy; Gieger, Christian; Herder, Christian; Carstensen, Maren; Peters, Annette; Rathmann, Wolfgang; Roden, Michael; Strauch, Konstantin; Suhre, Karsten; Kastenmüller, Gabi; Prokisch, Holger; Theis, Fabian J.

    2015-01-01

    Biological systems consist of multiple organizational levels all densely interacting with each other to ensure function and flexibility of the system. Simultaneous analysis of cross-sectional multi-omics data from large population studies is a powerful tool to comprehensively characterize the underlying molecular mechanisms on a physiological scale. In this study, we systematically analyzed the relationship between fasting serum metabolomics and whole blood transcriptomics data from 712 individuals of the German KORA F4 cohort. Correlation-based analysis identified 1,109 significant associations between 522 transcripts and 114 metabolites summarized in an integrated network, the ‘human blood metabolome-transcriptome interface’ (BMTI). Bidirectional causality analysis using Mendelian randomization did not yield any statistically significant causal associations between transcripts and metabolites. A knowledge-based interpretation and integration with a genome-scale human metabolic reconstruction revealed systematic signatures of signaling, transport and metabolic processes, i.e. metabolic reactions mainly belonging to lipid, energy and amino acid metabolism. Moreover, the construction of a network based on functional categories illustrated the cross-talk between the biological layers at a pathway level. Using a transcription factor binding site enrichment analysis, this pathway cross-talk was further confirmed at a regulatory level. Finally, we demonstrated how the constructed networks can be used to gain novel insights into molecular mechanisms associated to intermediate clinical traits. Overall, our results demonstrate the utility of a multi-omics integrative approach to understand the molecular mechanisms underlying both normal physiology and disease. PMID:26086077

  14. Lights All Askew: Systematics in Galaxy Images from Megaparsecs to Microns

    NASA Astrophysics Data System (ADS)

    Bradshaw, Andrew Kenneth

    The stars and galaxies are not where they seem. In the process of imaging and measurement, the light from distant objects is distorted, blurred, and skewed by several physical effects on scales from megaparsecs to microns. Charge-coupled devices (CCDs) provide sensitive detection of this light, but introduce their own problems in the form of systematic biases. Images of these stars and galaxies are formed in CCDs when incoming light generates photoelectrons which are then collected in a pixel's potential well and measured as signal. However, these signal electrons can be diverted from purely parallel paths toward the pixel wells by transverse fields sourced by structural elements of the CCD, accidental imperfections in fabrication, or dynamic electric fields induced by other collected charges. These charge transport anomalies lead to measurable systematic errors in the images which bias cosmological inferences based on them. The physics of imaging therefore deserves thorough investigation, which is performed in the laboratory using a unique optical beam simulator and in computer simulations of charge transport. On top of detector systematics, there are often biases in the mathematical analysis of pixelized images; in particular, the location, shape, and orientation of stars and galaxies. Using elliptical Gaussians as a toy model for galaxies, it is demonstrated how small biases in the computed image moments lead to observable orientation patterns in modern survey data. Also presented are examples of the reduction of data and fitting of optical aberrations of images in the lab and on the sky which are modeled by physically or mathematically-motivated methods. Finally, end-to-end analysis of the weak gravitational lensing signal is presented using deep sky data as well as in N-body simulations. It is demonstrated how measured weak lens shear can be transformed by signal matched filters which aid in the detection of mass overdensities and separate signal from noise. A commonly-used decomposition of shear into two components, E- and B-modes, is thoroughly tested and both modes are shown to be useful in the detection of large scale structure. We find several astrophysical sources of B-mode and explain their apparent origin. The methods presented therefore offer an optimal way to filter weak gravitational shear into maps of large scale structure through the process of cosmic mass cartography.

  15. Performance of granular activated carbon to remove micropollutants from municipal wastewater-A meta-analysis of pilot- and large-scale studies.

    PubMed

    Benstoem, Frank; Nahrstedt, Andreas; Boehler, Marc; Knopp, Gregor; Montag, David; Siegrist, Hansruedi; Pinnekamp, Johannes

    2017-10-01

    For reducing organic micropollutants (MP) in municipal wastewater effluents, granular activated carbon (GAC) has been tested in various studies. We did systematic literature research and found 44 studies dealing with the adsorption of MPs (carbamazepine, diclofenac, sulfamethoxazole) from municipal wastewater on GAC in pilot- and large-scale plants. Within our meta-analysis we plot the bed volumes (BV [m 3 water /m 3 GAC ]) until the breakthrough criterion of MP-BV20% was reached, dependent on potential relevant parameters (empty bed contact time EBCT, influent DOC DOC 0 and manufacturing method). Moreover, we performed statistical tests (ANOVAs) to check the results for significance. Single adsorbers operating time differs i.e. by 2500% until breakthrough of diclofenac-BV20% was reached (800-20,000 BV). There was still elimination of the "very well/well" adsorbable MPs such as carbamazepine and diclofenac even when the equilibrium of DOC had already been reached. No strong statistical significance of EBCT and DOC 0 on MP-BV20% could be found due to lack of data and the high heterogeneity of the studies using GAC of different qualities. In further studies, adsorbers should be operated ≫20,000 BV for exact calculation of breakthrough curves, and the following parameters should be recorded: selected MPs; DOC 0; UVA 254 ; EBCT; product name, manufacturing method and raw material of GAC; suspended solids (TSS); backwash interval; backwash program and pressure drop within adsorber. Based on our investigations we generally recommend using reactivated GAC to reduce the environmental impact and to carry out tests on pilot scale to collect reliable data for process design. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. A Systematic Review and Psychometric Evaluation of Adaptive Behavior Scales and Recommendations for Practice

    ERIC Educational Resources Information Center

    Floyd, Randy G.; Shands, Elizabeth I.; Alfonso, Vincent C.; Phillips, Jessica F.; Autry, Beth K.; Mosteller, Jessica A.; Skinner, Mary; Irby, Sarah

    2015-01-01

    Adaptive behavior scales are vital in assessing children and adolescents who experience a range of disabling conditions in school settings. This article presents the results of an evaluation of the design characteristics, norming, scale characteristics, reliability and validity evidence, and bias identification studies supporting 14…

  17. The Power Spectrum of the Milky Way: Velocity Fluctuations in the Galactic Disk

    NASA Astrophysics Data System (ADS)

    Bovy, Jo; Bird, Jonathan C.; García Pérez, Ana E.; Majewski, Steven R.; Nidever, David L.; Zasowski, Gail

    2015-02-01

    We investigate the kinematics of stars in the mid-plane of the Milky Way (MW) on scales between 25 pc and 10 kpc with data from the Apache Point Observatory Galactic Evolution Experiment (APOGEE), the Radial Velocity Experiment (RAVE), and the Geneva-Copenhagen survey (GCS). Using red-clump (RC) stars in APOGEE, we determine the large-scale line-of-sight velocity field out to 5 kpc from the Sun in (0.75 kpc)2 bins. The solar motion V ⊙ - c with respect to the circular velocity Vc is the largest contribution to the power on large scales after subtracting an axisymmetric rotation field; we determine the solar motion by minimizing the large-scale power to be V ⊙ - c = 24 ± 1 (ran.) ± 2 (syst. [Vc ]) ± 5 (syst.[large-scale]) km s-1, where the systematic uncertainty is due to (1) a conservative 20 km s-1 uncertainty in Vc and (2) the estimated power on unobserved larger scales. Combining the APOGEE peculiar-velocity field with RC stars in RAVE out to 2 kpc from the Sun and with local GCS stars, we determine the power spectrum of residual velocity fluctuations in the MW's disk on scales between 0.2 kpc-1 <= k <= 40 kpc-1. Most of the power is contained in a broad peak between 0.2 kpc-1 < k < 0.9 kpc-1. We investigate the expected power spectrum for various non-axisymmetric perturbations and demonstrate that the central bar with commonly used parameters but of relatively high mass can explain the bulk of velocity fluctuations in the plane of the Galactic disk near the Sun. Streaming motions ≈10 km s-1 on >~ 3 kpc scales in the MW are in good agreement with observations of external galaxies and directly explain why local determinations of the solar motion are inconsistent with global measurements.

  18. Far Sidelobe Effects from Panel Gaps of the Atacama Cosmology Telescope

    NASA Technical Reports Server (NTRS)

    Fluxa, Pedro R.; Duenner, Rolando; Maurin, Loiec; Choi, Steve K.; Devlin, Mark J.; Gallardo, Patricio A.; Shuay-Pwu, P. Ho; Koopman, Brian J.; Louis, Thibaut; Wollack, Edward J.

    2016-01-01

    The Atacama Cosmology Telescope is a 6 meter diameter CMB telescope located at 5200 meters in the Chilean desert. ACT has made arc-minute scale maps of the sky at 90 and 150 GHz which have led to precise measurements of the fine angular power spectrum of the CMB fluctuations in temperature and polarization. One of the goals of ACT is to search for the B-mode polarization signal from primordial gravity waves, and thus extending ACT's data analysis to larger angular scales. This goal introduces new challenges in the control of systematic effects, including better understanding of far sidelobe effects that might enter the power spectrum at degree angular scales. Here we study the effects of the gaps between panels of the ACT primary and secondary reflectors in the worst case scenario in which the gaps remain open. We produced numerical simulations of the optics using GRASP up to 8 degrees away from the main beam and simulated timestreams for observations with this beam using real pointing information from ACT data. Maps from these simulated timestreams showed leakage from the sidelobes, indicating that this effect must be taken into consideration at large angular scales.

  19. Use of Social Desirability Scales in Clinical Psychology: A Systematic Review.

    PubMed

    Perinelli, Enrico; Gremigni, Paola

    2016-06-01

    There is still an open debate about the utility of social desirability indicators. This report systematically reviewed the use of social desirability scales in studies addressing social desirability in clinical psychology. A systematic review (January 2010-March 2015) was conducted, including 35 studies meeting the inclusion criteria of being published in peer-reviewed journals and describing quantitative findings about an association of social desirability with clinical psychology variables using a cross-sectional or longitudinal design. Social desirability was associated with self-reports of various clinical-psychological dimensions. Most of the included studies treated social desirability as a 1-dimensional variable and only 10 of 35 disentangled the impression management and self-deception components. Although theoretical literature does not consider social desirability a mere response bias, only 4 of the reviewed articles controlled for the possible suppressor effect of personality variables on social desirability, while the majority focused upon the stylistic (response bias) rather than the substantive (personality) nature of this construct. The present review highlighted some limitations in the use of social desirability scales in recent clinical psychology research and tried to offer a few suggestions for handling this issue. © 2016 Wiley Periodicals, Inc.

  20. ScaleNet: a literature-based model of scale insect biology and systematics

    PubMed Central

    García Morales, Mayrolin; Denno, Barbara D.; Miller, Douglass R.; Miller, Gary L.; Ben-Dov, Yair; Hardy, Nate B.

    2016-01-01

    Scale insects (Hemiptera: Coccoidea) are small herbivorous insects found on all continents except Antarctica. They are extremely invasive, and many species are serious agricultural pests. They are also emerging models for studies of the evolution of genetic systems, endosymbiosis and plant-insect interactions. ScaleNet was launched in 1995 to provide insect identifiers, pest managers, insect systematists, evolutionary biologists and ecologists efficient access to information about scale insect biological diversity. It provides comprehensive information on scale insects taken directly from the primary literature. Currently, it draws from 23 477 articles and describes the systematics and biology of 8194 valid species. For 20 years, ScaleNet ran on the same software platform. That platform is no longer viable. Here, we present a new, open-source implementation of ScaleNet. We have normalized the data model, begun the process of correcting invalid data, upgraded the user interface, and added online administrative tools. These improvements make ScaleNet easier to use and maintain and make the ScaleNet data more accurate and extendable. Database URL: http://scalenet.info PMID:26861659

  1. Noninvasive prenatal diagnosis of common aneuploidies by semiconductor sequencing

    PubMed Central

    Liao, Can; Yin, Ai-hua; Peng, Chun-fang; Fu, Fang; Yang, Jie-xia; Li, Ru; Chen, Yang-yi; Luo, Dong-hong; Zhang, Yong-ling; Ou, Yan-mei; Li, Jian; Wu, Jing; Mai, Ming-qin; Hou, Rui; Wu, Frances; Luo, Hongrong; Li, Dong-zhi; Liu, Hai-liang; Zhang, Xiao-zhuang; Zhang, Kang

    2014-01-01

    Massively parallel sequencing (MPS) of cell-free fetal DNA from maternal plasma has revolutionized our ability to perform noninvasive prenatal diagnosis. This approach avoids the risk of fetal loss associated with more invasive diagnostic procedures. The present study developed an effective method for noninvasive prenatal diagnosis of common chromosomal aneuploidies using a benchtop semiconductor sequencing platform (SSP), which relies on the MPS platform but offers advantages over existing noninvasive screening techniques. A total of 2,275 pregnant subjects was included in the study; of these, 515 subjects who had full karyotyping results were used in a retrospective analysis, and 1,760 subjects without karyotyping were analyzed in a prospective study. In the retrospective study, all 55 fetal trisomy 21 cases were identified using the SSP with a sensitivity and specificity of 99.94% and 99.46%, respectively. The SSP also detected 16 trisomy 18 cases with 100% sensitivity and 99.24% specificity and 3 trisomy 13 cases with 100% sensitivity and 100% specificity. Furthermore, 15 fetuses with sex chromosome aneuploidies (10 45,X, 2 47,XYY, 2 47,XXX, and 1 47,XXY) were detected. In the prospective study, nine fetuses with trisomy 21, three with trisomy 18, three with trisomy 13, and one with 45,X were detected. To our knowledge, this is the first large-scale clinical study to systematically identify chromosomal aneuploidies based on cell-free fetal DNA using the SSP and provides an effective strategy for large-scale noninvasive screening for chromosomal aneuploidies in a clinical setting. PMID:24799683

  2. The role and benefits of accessing primary care patient records during unscheduled care: a systematic review.

    PubMed

    Bowden, Tom; Coiera, Enrico

    2017-09-22

    The purpose of this study was to assess the impact of accessing primary care records on unscheduled care. Unscheduled care is typically delivered in hospital Emergency Departments. Studies published to December 2014 reporting on primary care record access during unscheduled care were retrieved. Twenty-two articles met inclusion criteria from a pool of 192. Many shared electronic health records (SEHRs) were large in scale, servicing many millions of patients. Reported utilization rates by clinicians was variable, with rates >20% amongst health management organizations but much lower in nation-scale systems. No study reported on clinical outcomes or patient safety, and no economic studies of SEHR access during unscheduled care were available. Design factors that may affect utilization included consent and access models, SEHR content, and system usability and reliability. Despite their size and expense, SEHRs designed to support unscheduled care have been poorly evaluated, and it is not possible to draw conclusions about any likely benefits associated with their use. Heterogeneity across the systems and the populations they serve make generalization about system design or performance difficult. None of the reviewed studies used a theoretical model to guide evaluation. Value of Information models may be a useful theoretical approach to design evaluation metrics, facilitating comparison across systems in future studies. Well-designed SEHRs should in principle be capable of improving the efficiency, quality and safety of unscheduled care, but at present the evidence for such benefits is weak, largely because it has not been sought.

  3. The structure and large-scale organization of extreme cold waves over the conterminous United States

    NASA Astrophysics Data System (ADS)

    Xie, Zuowei; Black, Robert X.; Deng, Yi

    2017-12-01

    Extreme cold waves (ECWs) occurring over the conterminous United States (US) are studied through a systematic identification and documentation of their local synoptic structures, associated large-scale meteorological patterns (LMPs), and forcing mechanisms external to the US. Focusing on the boreal cool season (November-March) for 1950‒2005, a hierarchical cluster analysis identifies three ECW patterns, respectively characterized by cold surface air temperature anomalies over the upper midwest (UM), northwestern (NW), and southeastern (SE) US. Locally, ECWs are synoptically organized by anomalous high pressure and northerly flow. At larger scales, the UM LMP features a zonal dipole in the mid-tropospheric height field over North America, while the NW and SE LMPs each include a zonal wave train extending from the North Pacific across North America into the North Atlantic. The Community Climate System Model version 4 (CCSM4) in general simulates the three ECW patterns quite well and successfully reproduces the observed enhancements in the frequency of their associated LMPs. La Niña and the cool phase of the Pacific Decadal Oscillation (PDO) favor the occurrence of NW ECWs, while the warm PDO phase, low Arctic sea ice extent and high Eurasian snow cover extent (SCE) are associated with elevated SE-ECW frequency. Additionally, high Eurasian SCE is linked to increases in the occurrence likelihood of UM ECWs.

  4. Edge reconstruction in armchair phosphorene nanoribbons revealed by discontinuous Galerkin density functional theory.

    PubMed

    Hu, Wei; Lin, Lin; Yang, Chao

    2015-12-21

    With the help of our recently developed massively parallel DGDFT (Discontinuous Galerkin Density Functional Theory) methodology, we perform large-scale Kohn-Sham density functional theory calculations on phosphorene nanoribbons with armchair edges (ACPNRs) containing a few thousands to ten thousand atoms. The use of DGDFT allows us to systematically achieve a conventional plane wave basis set type of accuracy, but with a much smaller number (about 15) of adaptive local basis (ALB) functions per atom for this system. The relatively small number of degrees of freedom required to represent the Kohn-Sham Hamiltonian, together with the use of the pole expansion the selected inversion (PEXSI) technique that circumvents the need to diagonalize the Hamiltonian, results in a highly efficient and scalable computational scheme for analyzing the electronic structures of ACPNRs as well as their dynamics. The total wall clock time for calculating the electronic structures of large-scale ACPNRs containing 1080-10,800 atoms is only 10-25 s per self-consistent field (SCF) iteration, with accuracy fully comparable to that obtained from conventional planewave DFT calculations. For the ACPNR system, we observe that the DGDFT methodology can scale to 5000-50,000 processors. We use DGDFT based ab initio molecular dynamics (AIMD) calculations to study the thermodynamic stability of ACPNRs. Our calculations reveal that a 2 × 1 edge reconstruction appears in ACPNRs at room temperature.

  5. Scaling and Root Planning is Recommended in the Nonsurgical Treatment of Chronic Periodontitis.

    PubMed

    Herrera, David

    2016-03-01

    Systematic review and meta-analysis on the nonsurgical treatment of chronic periodontitis by means of scaling and root planing with or without adjuncts. Smiley CJ, Tracy SL, Abt E, Michalowicz BS, John MT, Gunsolley J, Cobb CM, Rossmann J, Harrel SK, Forrest JL, Hujoel PP, Noraian KW, Greenwell H, Frantsve-Hawley J, Estrich C, Hanson N. J Am Dent Assoc 2015;146(7):508-524.e5. The study was funded by the American Dental Association Systematic review with meta-analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Numerical and Experimental Study of Wake Redirection Techniques in a Boundary Layer Wind Tunnel

    NASA Astrophysics Data System (ADS)

    Wang, J.; Foley, S.; Nanos, E. M.; Yu, T.; Campagnolo, F.; Bottasso, C. L.; Zanotti, A.; Croce, A.

    2017-05-01

    The aim of the present paper is to validate a wind farm LES framework in the context of two distinct wake redirection techniques: yaw misalignment and individual cyclic pitch control. A test campaign was conducted using scaled wind turbine models in a boundary layer wind tunnel, where both particle image velocimetry and hot-wire thermo anemometers were used to obtain high quality measurements of the downstream flow. A LiDAR system was also employed to determine the non-uniformity of the inflow velocity field. A high-fidelity large-eddy simulation lifting-line model was used to simulate the aerodynamic behavior of the system, including the geometry of the wind turbine nacelle and tower. A tuning-free Lagrangian scale-dependent dynamic approach was adopted to improve the sub-grid scale modeling. Comparisons with experimental measurements are used to systematically validate the simulations. The LES results are in good agreement with the PIV and hot-wire data in terms of time-averaged wake profiles, turbulence intensity and Reynolds shear stresses. Discrepancies are also highlighted, to guide future improvements.

  7. Accurate Modeling of Galaxy Clustering on Small Scales: Testing the Standard ΛCDM + Halo Model

    NASA Astrophysics Data System (ADS)

    Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron; Scoccimarro, Roman

    2015-01-01

    The large-scale distribution of galaxies can be explained fairly simply by assuming (i) a cosmological model, which determines the dark matter halo distribution, and (ii) a simple connection between galaxies and the halos they inhabit. This conceptually simple framework, called the halo model, has been remarkably successful at reproducing the clustering of galaxies on all scales, as observed in various galaxy redshift surveys. However, none of these previous studies have carefully modeled the systematics and thus truly tested the halo model in a statistically rigorous sense. We present a new accurate and fully numerical halo model framework and test it against clustering measurements from two luminosity samples of galaxies drawn from the SDSS DR7. We show that the simple ΛCDM cosmology + halo model is not able to simultaneously reproduce the galaxy projected correlation function and the group multiplicity function. In particular, the more luminous sample shows significant tension with theory. We discuss the implications of our findings and how this work paves the way for constraining galaxy formation by accurate simultaneous modeling of multiple galaxy clustering statistics.

  8. The scaling of human interactions with city size

    PubMed Central

    Schläpfer, Markus; Bettencourt, Luís M. A.; Grauwin, Sébastian; Raschke, Mathias; Claxton, Rob; Smoreda, Zbigniew; West, Geoffrey B.; Ratti, Carlo

    2014-01-01

    The size of cities is known to play a fundamental role in social and economic life. Yet, its relation to the structure of the underlying network of human interactions has not been investigated empirically in detail. In this paper, we map society-wide communication networks to the urban areas of two European countries. We show that both the total number of contacts and the total communication activity grow superlinearly with city population size, according to well-defined scaling relations and resulting from a multiplicative increase that affects most citizens. Perhaps surprisingly, however, the probability that an individual's contacts are also connected with each other remains largely unaffected. These empirical results predict a systematic and scale-invariant acceleration of interaction-based spreading phenomena as cities get bigger, which is numerically confirmed by applying epidemiological models to the studied networks. Our findings should provide a microscopic basis towards understanding the superlinear increase of different socioeconomic quantities with city size, that applies to almost all urban systems and includes, for instance, the creation of new inventions or the prevalence of certain contagious diseases. PMID:24990287

  9. Infusion phlebitis assessment measures: a systematic review

    PubMed Central

    Ray-Barruel, Gillian; Polit, Denise F; Murfield, Jenny E; Rickard, Claire M

    2014-01-01

    Rationale, aims and objectives Phlebitis is a common and painful complication of peripheral intravenous cannulation. The aim of this review was to identify the measures used in infusion phlebitis assessment and evaluate evidence regarding their reliability, validity, responsiveness and feasibility. Method We conducted a systematic literature review of the Cochrane library, Ovid MEDLINE and EBSCO CINAHL until September 2013. All English-language studies (randomized controlled trials, prospective cohort and cross-sectional) that used an infusion phlebitis scale were retrieved and analysed to determine which symptoms were included in each scale and how these were measured. We evaluated studies that reported testing the psychometric properties of phlebitis assessment scales using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) guidelines. Results Infusion phlebitis was the primary outcome measure in 233 studies. Fifty-three (23%) of these provided no actual definition of phlebitis. Of the 180 studies that reported measuring phlebitis incidence and/or severity, 101 (56%) used a scale and 79 (44%) used a definition alone. We identified 71 different phlebitis assessment scales. Three scales had undergone some psychometric analyses, but no scale had been rigorously tested. Conclusion Many phlebitis scales exist, but none has been thoroughly validated for use in clinical practice. A lack of consensus on phlebitis measures has likely contributed to disparities in reported phlebitis incidence, precluding meaningful comparison of phlebitis rates. PMID:24401116

  10. Infusion phlebitis assessment measures: a systematic review.

    PubMed

    Ray-Barruel, Gillian; Polit, Denise F; Murfield, Jenny E; Rickard, Claire M

    2014-04-01

    Phlebitis is a common and painful complication of peripheral intravenous cannulation. The aim of this review was to identify the measures used in infusion phlebitis assessment and evaluate evidence regarding their reliability, validity, responsiveness and feasibility. We conducted a systematic literature review of the Cochrane library, Ovid MEDLINE and EBSCO CINAHL until September 2013. All English-language studies (randomized controlled trials, prospective cohort and cross-sectional) that used an infusion phlebitis scale were retrieved and analysed to determine which symptoms were included in each scale and how these were measured. We evaluated studies that reported testing the psychometric properties of phlebitis assessment scales using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) guidelines. Infusion phlebitis was the primary outcome measure in 233 studies. Fifty-three (23%) of these provided no actual definition of phlebitis. Of the 180 studies that reported measuring phlebitis incidence and/or severity, 101 (56%) used a scale and 79 (44%) used a definition alone. We identified 71 different phlebitis assessment scales. Three scales had undergone some psychometric analyses, but no scale had been rigorously tested. Many phlebitis scales exist, but none has been thoroughly validated for use in clinical practice. A lack of consensus on phlebitis measures has likely contributed to disparities in reported phlebitis incidence, precluding meaningful comparison of phlebitis rates. © 2014 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.

  11. The use of observational scales to monitor symptom control and depth of sedation in patients requiring palliative sedation: a systematic review.

    PubMed

    Brinkkemper, Tijn; van Norel, Arjanne M; Szadek, Karolina M; Loer, Stephan A; Zuurmond, Wouter W A; Perez, Roberto S G M

    2013-01-01

    Palliative sedation is the intentional lowering of consciousness of a patient in the last phase of life to relieve suffering from refractory symptoms such as pain, delirium and dyspnoea. In this systematic review, we evaluated the use of monitoring scales to assess the degree of control of refractory symptoms and/or the depth of the sedation. A database search of PubMed and Embase was performed up to January 2010 using the search terms 'palliative sedation' OR 'terminal sedation'. Retro- and prospective studies as well as reviews and guidelines containing information about monitoring of palliative sedation, written in the English, German or Dutch language were included. The search yielded 264 articles of which 30 were considered relevant. Most studies focused on monitoring refractory symptoms (pain, fatigue or delirium) or the level of awareness to control the level of sedation. Four prospective and one retrospective study used scales validated in other settings: the Numeric Pain Rating Scale, the Visual Analogue Scale, the Memorial Delirium Assessment Scale, the Communication Capacity Scale and Agitation Distress Scale. Only the Community Capacity Scale was partially validated for use in a palliative sedation setting. One guideline described the use of a scale validated in another setting. A minority of studies reported the use of observational scales to monitor the effect of palliative sedation. Future studies should be focused on establishing proper instruments, most adequate frequency and timing of assessment, and interdisciplinary evaluation of sedation depth and symptom control for palliative sedation.

  12. A Bayesian Estimate of the CMB-Large-scale Structure Cross-correlation

    NASA Astrophysics Data System (ADS)

    Moura-Santos, E.; Carvalho, F. C.; Penna-Lima, M.; Novaes, C. P.; Wuensche, C. A.

    2016-08-01

    Evidences for late-time acceleration of the universe are provided by multiple probes, such as Type Ia supernovae, the cosmic microwave background (CMB), and large-scale structure (LSS). In this work, we focus on the integrated Sachs-Wolfe (ISW) effect, I.e., secondary CMB fluctuations generated by evolving gravitational potentials due to the transition between, e.g., the matter and dark energy (DE) dominated phases. Therefore, assuming a flat universe, DE properties can be inferred from ISW detections. We present a Bayesian approach to compute the CMB-LSS cross-correlation signal. The method is based on the estimate of the likelihood for measuring a combined set consisting of a CMB temperature and galaxy contrast maps, provided that we have some information on the statistical properties of the fluctuations affecting these maps. The likelihood is estimated by a sampling algorithm, therefore avoiding the computationally demanding techniques of direct evaluation in either pixel or harmonic space. As local tracers of the matter distribution at large scales, we used the Two Micron All Sky Survey galaxy catalog and, for the CMB temperature fluctuations, the ninth-year data release of the Wilkinson Microwave Anisotropy Probe (WMAP9). The results show a dominance of cosmic variance over the weak recovered signal, due mainly to the shallowness of the catalog used, with systematics associated with the sampling algorithm playing a secondary role as sources of uncertainty. When combined with other complementary probes, the method presented in this paper is expected to be a useful tool to late-time acceleration studies in cosmology.

  13. Cryptosporidium within-host genetic diversity: systematic bibliographical search and narrative overview.

    PubMed

    Grinberg, Alex; Widmer, Giovanni

    2016-07-01

    Knowledge of the within-host genetic diversity of a pathogen often has broad implications for disease management. Cryptosporidium protozoan parasites are among the most common causative agents of infectious diarrhoea. Current limitations of in vitro culture impose the use of uncultured isolates obtained directly from the hosts as operational units of Cryptosporidium genotyping. The validity of this practice is centred on the assumption of genetic homogeneity of the parasite within the host, and genetic studies often take little account of the within-host genetic diversity of Cryptosporidium. Yet, theory and experimental evidence contemplate genetic diversity of Cryptosporidium at the within-host scale, but this diversity is not easily identified by genotyping methods ill-suited for the resolution of DNA mixtures. We performed a systematic bibliographical search of the occurrence of within-host genetic diversity of Cryptosporidium parasites in epidemiological samples, between 2005 and 2015. Our results indicate that genetic diversity at the within-host scale, in the form of mixed species or intra-species diversity, has been identified in a large number (n=55) of epidemiological surveys of cryptosporidiosis in variable proportions, but has often been treated as a secondary finding and not analysed. As in malaria, there are indications that the scale of this diversity varies between geographical regions, perhaps depending on the prevailing transmission pathways. These results provide a significant knowledge base from which to draw alternative population genetic structure models, some of which are discussed in this paper. Copyright © 2016 Australian Society for Parasitology. Published by Elsevier Ltd. All rights reserved.

  14. Contribution to the molecular systematics of the genus Capoeta from the south Caspian Sea basin using mitochondrial cytochrome b sequences (Teleostei: Cyprinidae)

    PubMed Central

    Zareian, Halimeh; Esmaeili, Hamid Reza; Heidari, Adeleh; Khoshkholgh, Majid Reza; Mousavi-Sabet, Hamed

    2016-01-01

    Traditionally, Capoeta populations from the southern Caspian Sea basin have been considered as Capoeta capoeta gracilis. Study on the phylogenetic relationship of Capoeta species using mitochondrial cytochrome b gene sequences show that Capoeta population from the southern Caspian Sea basin is distinct species and receive well support (posterior probability of 100%). Based on the tree topologies obtained from Bayesian and Maximum Likelihood methods, three main groups for the studied Capoeta were detected: Clade I) Capoeta trutta group (the Mesopotamian Capoeta group) including closely related taxa (e.g. trutta, turani, barroisi) characterized by having numerous irregular black spots on the dorsal half of the body. This clade was the sister group to all other Capoeta species and its separation occurred very early in evolution possess, so we considered it as O ld Evolutionary Group. Clade II) comprises highly diversified and widespread group, Capoeta damascina complex group (small scale capoeta group), the Anatolian-Iranian group (e.g. banarescui, buhsei, damascina, saadii), characterized by small scales and plain body (absence of irregular black spots on the dorsal half of the body, except in some juveniles) with significantly later speciation event so called Young Evolutionary Group. Clade III) Capoeta capoeta complex group (large scale capoeta group, the Aralo-Caspian group) comprises very closely related taxa characterized by large scales and plain body (absence of irregular black spots on the dorsal half of the body) distributed in Aralo-Caspian water bodies (capoeta, ekmekciae, heratensis, gracilis, sevangi) that has been recently diverged and could be considered as Very Young Evolutionary Group. PMID:28097160

  15. Systematic Studies of Cosmic-Ray Anisotropy and Energy Spectrum with IceCube and IceTop

    NASA Astrophysics Data System (ADS)

    McNally, Frank

    Anisotropy in the cosmic-ray arrival direction distribution has been well documented over a large energy range, but its origin remains largely a mystery. In the TeV to PeV energy range, the galactic magnetic field thoroughly scatters cosmic rays, but anisotropy at the part-per-mille level and smaller persists, potentially carrying information about nearby cosmic-ray accelerators and the galactic magnetic field. The IceCube Neutrino Observatory was the first detector to observe anisotropy at these energies in the Southern sky. This work uses 318 billion cosmic-ray induced muon events, collected between May 2009 and May 2015 from both the in-ice component of IceCube as well as the surface component, IceTop. The observed global anisotropy features large regions of relative excess and deficit, with amplitudes on the order of 10-3. While a decomposition of the arrival direction distribution into spherical harmonics shows that most of the power is contained in the low-multipole (ℓ ≤ 4) moments, higher-multipole components are found to be statistically significant down to an angular scale of less than 10°, approaching the angular resolution of the detector. Above 100TeV, a change in the topology of the arrival direction distribution is observed, and the anisotropy is characterized by a wide relative deficit whose amplitude increases with primary energy up to at least 5PeV, the highest energies currently accessible to IceCube with sufficient event statistics. No time dependence of the large- and small-scale structures is observed in the six-year period covered by this analysis within statistical and systematic uncertainties. Analysis of the energy spectrum and composition in the PeV energy range as a function of sky position is performed with IceTop data over a five-year period using a likelihood-based reconstruction. Both the energy spectrum and the composition distribution are found to be consistent with a single source population over declination bands. This work represents an early attempt at understanding the anisotropy through the study of the spectrum and composition. The high-statistics data set reveals more details on the properties of the anisotropy, potentially able to shed light on the various physical processes responsible for the complex angular structure and energy evolution.

  16. Quality of Child Care Using the Environment Rating Scales: A Meta-Analysis of International Studies

    ERIC Educational Resources Information Center

    Vermeer, Harriet J.; van IJzendoorn, Marinus H.; Cárcamo, Rodrigo A.; Harrison, Linda J.

    2016-01-01

    The current study provides a systematic examination of child care quality around the globe, using the Environment Rating Scales (ERS). Additional goals of this study are to examine associations between ERS process quality and structural features (group size, caregiver-child ratio) that underpin quality and between ERS and more proximal aspects of…

  17. Oral health and orofacial pain in older people with dementia: a systematic review with focus on dental hard tissues.

    PubMed

    Delwel, Suzanne; Binnekade, Tarik T; Perez, Roberto S G M; Hertogh, Cees M P M; Scherder, Erik J A; Lobbezoo, Frank

    2017-01-01

    The aim of this review was to provide a systematic overview including a quality assessment of studies about oral health and orofacial pain in older people with dementia, compared to older people without dementia. A systematic literature search was performed in PubMed, CINAHL, and the Cochrane Library. The following search terms were used: dementia and oral health or stomatognathic disease. The quality assessment of the included articles was performed using the Newcastle-Ottawa Scale (NOS). The search yielded 527 articles, of which 37 were included for the quality assessment and quantitative overview. The median NOS score of the included studies was 5, and the mean was 4.9 (SD 2.2). The heterogeneity between the studies was considered too large to perform a meta-analysis. An equivalent prevalence of orofacial pain, number of teeth present, decayed missing filled teeth index, edentulousness percentage, and denture use was found for both groups. However, the presence of caries and retained roots was higher in older people with dementia than in those without. Older people with dementia have worse oral health, with more retained roots and coronal and root caries, when compared to older people without dementia. Little research focused on orofacial pain in older people with dementia. The current state of oral health in older people with dementia could be improved with oral care education of caretakers and regular professional dental care.

  18. Real-time evolution of a large-scale relativistic jet

    NASA Astrophysics Data System (ADS)

    Martí, Josep; Luque-Escamilla, Pedro L.; Romero, Gustavo E.; Sánchez-Sutil, Juan R.; Muñoz-Arjonilla, Álvaro J.

    2015-06-01

    Context. Astrophysical jets are ubiquitous in the Universe on all scales, but their large-scale dynamics and evolution in time are hard to observe since they usually develop at a very slow pace. Aims: We aim to obtain the first observational proof of the expected large-scale evolution and interaction with the environment in an astrophysical jet. Only jets from microquasars offer a chance to witness the real-time, full-jet evolution within a human lifetime, since they combine a "short", few parsec length with relativistic velocities. Methods: The methodology of this work is based on a systematic recalibraton of interferometric radio observations of microquasars available in public archives. In particular, radio observations of the microquasar GRS 1758-258 over less than two decades have provided the most striking results. Results: Significant morphological variations in the extended jet structure of GRS 1758-258 are reported here that were previously missed. Its northern radio lobe underwent a major morphological variation that rendered the hotspot undetectable in 2001 and reappeared again in the following years. The reported changes confirm the Galactic nature of the source. We tentatively interpret them in terms of the growth of instabilities in the jet flow. There is also evidence of surrounding cocoon. These results can provide a testbed for models accounting for the evolution of jets and their interaction with the environment.

  19. Field-scale effective matrix diffusion coefficient for fractured rock: results from literature survey.

    PubMed

    Zhou, Quanlin; Liu, Hui-Hai; Molz, Fred J; Zhang, Yingqi; Bodvarsson, Gudmundur S

    2007-08-15

    Matrix diffusion is an important mechanism for solute transport in fractured rock. We recently conducted a literature survey on the effective matrix diffusion coefficient, D(m)(e), a key parameter for describing matrix diffusion processes at the field scale. Forty field tracer tests at 15 fractured geologic sites were surveyed and selected for the study, based on data availability and quality. Field-scale D(m)(e) values were calculated, either directly using data reported in the literature, or by reanalyzing the corresponding field tracer tests. The reanalysis was conducted for the selected tracer tests using analytic or semi-analytic solutions for tracer transport in linear, radial, or interwell flow fields. Surveyed data show that the scale factor of the effective matrix diffusion coefficient (defined as the ratio of D(m)(e) to the lab-scale matrix diffusion coefficient, D(m), of the same tracer) is generally larger than one, indicating that the effective matrix diffusion coefficient in the field is comparatively larger than the matrix diffusion coefficient at the rock-core scale. This larger value can be attributed to the many mass-transfer processes at different scales in naturally heterogeneous, fractured rock systems. Furthermore, we observed a moderate, on average trend toward systematic increase in the scale factor with observation scale. This trend suggests that the effective matrix diffusion coefficient is likely to be statistically scale-dependent. The scale-factor value ranges from 0.5 to 884 for observation scales from 5 to 2000 m. At a given scale, the scale factor varies by two orders of magnitude, reflecting the influence of differing degrees of fractured rock heterogeneity at different geologic sites. In addition, the surveyed data indicate that field-scale longitudinal dispersivity generally increases with observation scale, which is consistent with previous studies. The scale-dependent field-scale matrix diffusion coefficient (and dispersivity) may have significant implications for assessing long-term, large-scale radionuclide and contaminant transport events in fractured rock, both for nuclear waste disposal and contaminant remediation.

  20. Geometagenomics illuminates the impact of agriculture on the distribution and prevalence of plant viruses at the ecosystem scale.

    PubMed

    Bernardo, Pauline; Charles-Dominique, Tristan; Barakat, Mohamed; Ortet, Philippe; Fernandez, Emmanuel; Filloux, Denis; Hartnady, Penelope; Rebelo, Tony A; Cousins, Stephen R; Mesleard, François; Cohez, Damien; Yavercovski, Nicole; Varsani, Arvind; Harkins, Gordon W; Peterschmitt, Michel; Malmstrom, Carolyn M; Martin, Darren P; Roumagnac, Philippe

    2018-01-01

    Disease emergence events regularly result from human activities such as agriculture, which frequently brings large populations of genetically uniform hosts into contact with potential pathogens. Although viruses cause nearly 50% of emerging plant diseases, there is little systematic information about virus distribution across agro-ecological interfaces and large gaps in understanding of virus diversity in nature. Here we applied a novel landscape-scale geometagenomics approach to examine relationships between agricultural land use and distributions of plant-associated viruses in two Mediterranean-climate biodiversity hotspots (Western Cape region of South Africa and Rhône river delta region of France). In total, we analysed 1725 geo-referenced plant samples collected over two years from 4.5 × 4.5 km 2 grids spanning farmlands and adjacent uncultivated vegetation. We found substantial virus prevalence (25.8-35.7%) in all ecosystems, but prevalence and identified family-level virus diversity were greatest in cultivated areas, with some virus families displaying strong agricultural associations. Our survey revealed 94 previously unknown virus species, primarily from uncultivated plants. This is the first effort to systematically evaluate plant-associated viromes across broad agro-ecological interfaces. Our findings indicate that agriculture substantially influences plant virus distributions and highlight the extent of current ignorance about the diversity and roles of viruses in nature.

Top