Sample records for large-scale systematic evaluation

  1. Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis

    NASA Astrophysics Data System (ADS)

    Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi

    2017-03-01

    Digital histopathology images, often exceeding 1 gigapixel, are drawing increasing attention in the clinical, biomedical research, and computer vision fields. Among the many observable features spanning multiple scales in pathology images, nuclear morphology is one of the central criteria for diagnosis and grading, and consequently the most studied target in image computing. A large body of research has been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report a human-verified segmentation with thousands of nuclei, whereas a single whole-slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large-scale image synthesis. This could facilitate more quantitatively validated studies in the current and future histopathology image analysis field.
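An editorial note on the evaluation mechanics (an illustration, not the authors' code): once synthesized images come with exact ground-truth masks, agreement between a segmenter's output and the synthetic truth is typically scored with overlap measures such as the Dice and Jaccard coefficients. The masks and the 10x10 grid below are hypothetical.

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary masks (1 = nucleus pixel)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * inter / denom if denom else 1.0

def jaccard(a, b):
    """Jaccard index (intersection over union) between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 1.0

# Synthetic "ground truth" nucleus and a slightly shifted prediction.
gt = np.zeros((10, 10)); gt[2:8, 2:8] = 1      # 6x6 square nucleus
pred = np.zeros((10, 10)); pred[3:9, 3:9] = 1  # same size, shifted by (1, 1)

d, j = dice(gt, pred), jaccard(gt, pred)       # d ~ 0.694, j ~ 0.532
```

Since Jaccard and Dice are monotonically related (J = D / (2 - D)), either suffices for ranking segmenters; reporting both mainly eases comparison across papers.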

  2. Evaluating large-scale health programmes at a district level in resource-limited countries.

    PubMed

    Svoronos, Theodore; Mate, Kedar S

    2011-11-01

    Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool called a driver diagram, traditionally used in implementation work, that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts.

  3. CONSORT to community: translation of an RCT to a large-scale community intervention and learnings from evaluation of the upscaled program.

    PubMed

    Moores, Carly Jane; Miller, Jacqueline; Perry, Rebecca Anne; Chan, Lily Lai Hang; Daniels, Lynne Allison; Vidgen, Helen Anna; Magarey, Anthea Margaret

    2017-11-29

    Translation encompasses the continuum from clinical efficacy to widespread adoption within the healthcare service and ultimately routine clinical practice. The Parenting, Eating and Activity for Child Health (PEACH™) program has previously demonstrated clinical effectiveness in the management of child obesity, and has been recently implemented as a large-scale community intervention in Queensland, Australia. This paper aims to describe the translation of the evaluation framework from a randomised controlled trial (RCT) to a large-scale community intervention (PEACH™ QLD). Tensions between the RCT paradigm and implementation research will be discussed, along with lived evaluation challenges, responses to overcome these, and key learnings for future evaluation conducted at scale. The translation of evaluation from the PEACH™ RCT to the large-scale community intervention PEACH™ QLD is described. While the CONSORT Statement was used to report findings from two previous RCTs, the RE-AIM framework was more suitable for the evaluation of upscaled delivery of the PEACH™ program. Evaluation of PEACH™ QLD was undertaken during the project delivery period from 2013 to 2016. Experiential learnings from conducting the evaluation of PEACH™ QLD under the described evaluation framework are presented for the purposes of informing the future evaluation of upscaled programs. Evaluation changes in response to real-time changes in the delivery of the PEACH™ QLD Project were necessary at stages during the project term. Key evaluation challenges encountered included the collection of complete evaluation data from a diverse and geographically dispersed workforce and the systematic collection of process evaluation data in real time to support program changes during the project. Evaluation of large-scale community interventions in the real world is challenging and divergent from RCTs, which are rigorously evaluated within a more tightly controlled clinical research setting. 
Constructs explored in an RCT are inadequate in describing the enablers and barriers of upscaled community program implementation. Methods for data collection, analysis and reporting also require consideration. We present a number of experiential reflections and suggestions for the successful evaluation of future upscaled community programs which are scarcely reported in the literature. PEACH™ QLD was retrospectively registered with the Australian New Zealand Clinical Trials Registry on 28 February 2017 (ACTRN12617000315314).

  4. Enablers and Barriers to Large-Scale Uptake of Improved Solid Fuel Stoves: A Systematic Review

    PubMed Central

    Puzzolo, Elisa; Stanistreet, Debbi; Pope, Daniel; Bruce, Nigel G.

    2013-01-01

    Background: Globally, 2.8 billion people rely on household solid fuels. Reducing the resulting adverse health, environmental, and development consequences will involve transitioning through a mix of clean fuels and improved solid fuel stoves (IS) of demonstrable effectiveness. To date, achieving uptake of IS has presented significant challenges. Objectives: We performed a systematic review of factors that enable or limit large-scale uptake of IS in low- and middle-income countries. Methods: We conducted systematic searches through multidisciplinary databases, specialist websites, and consulting experts. The review drew on qualitative, quantitative, and case studies and used standardized methods for screening, data extraction, critical appraisal, and synthesis. We summarized our findings as “factors” relating to one of seven domains—fuel and technology characteristics; household and setting characteristics; knowledge and perceptions; finance, tax, and subsidy aspects; market development; regulation, legislation, and standards; programmatic and policy mechanisms—and also recorded issues that impacted equity. Results: We identified 31 factors influencing uptake from 57 studies conducted in Asia, Africa, and Latin America. All domains matter. Although factors such as offering technologies that meet household needs and save fuel, user training and support, effective financing, and facilitative government action appear to be critical, none guarantee success: All factors can be influential, depending on context. The nature of available evidence did not permit further prioritization. Conclusions: Achieving adoption and sustained use of IS at a large scale requires that all factors, spanning household/community and program/societal levels, be assessed and supported by policy. We propose a planning tool that would aid this process and suggest further research to incorporate an evaluation of effectiveness. Citation: Rehfuess EA, Puzzolo E, Stanistreet D, Pope D, Bruce NG. 
2014. Enablers and barriers to large-scale uptake of improved solid fuel stoves: a systematic review. Environ Health Perspect 122:120–130; http://dx.doi.org/10.1289/ehp.1306639 PMID:24300100

  5. Transport Coefficients from Large Deviation Functions

    NASA Astrophysics Data System (ADS)

    Gao, Chloe; Limmer, David

    2017-10-01

    We describe a method for computing transport coefficients from the direct evaluation of large deviation functions. The method is general, relying only on equilibrium fluctuations, and is statistically efficient, employing trajectory-based importance sampling. Equilibrium fluctuations of molecular currents are characterized by their large deviation functions, which are scaled cumulant generating functions analogous to free energies. A diffusion Monte Carlo algorithm is used to evaluate the large deviation functions, from which arbitrary transport coefficients are derivable. We find significant statistical improvement over traditional Green-Kubo based calculations. The systematic and statistical errors of this method are analyzed in the context of specific transport coefficient calculations, including the shear viscosity, interfacial friction coefficient, and thermal conductivity.
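To make the link between large deviation functions and transport coefficients concrete, here is a minimal numerical sketch (not the paper's diffusion Monte Carlo scheme): for a time-integrated current J_t, the scaled cumulant generating function is psi(k) = (1/t) ln <exp(k J_t)>, and a Green-Kubo-like transport coefficient is its curvature at k = 0. The Gaussian toy current and all parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 0.8               # per-step std of the microscopic current (toy model)
n_steps, n_traj = 200, 20000

# Time-integrated current J_t for many independent equilibrium trajectories.
J = rng.normal(0.0, sigma, size=(n_traj, n_steps)).sum(axis=1)

def scgf(k, J=J, t=n_steps):
    """Scaled cumulant generating function psi(k) = (1/t) ln <exp(k J_t)>."""
    m = (k * J).max()     # log-sum-exp shift for numerical stability
    return (m + np.log(np.mean(np.exp(k * J - m)))) / t

# Transport coefficient from the curvature at k = 0, via central differences:
# D = psi''(0) / 2, a Green-Kubo-like relation.
h = 1e-3
D_est = (scgf(h) - 2.0 * scgf(0.0) + scgf(-h)) / h**2 / 2.0
D_exact = sigma**2 / 2.0  # for Gaussian increments, psi(k) = k^2 sigma^2 / 2
```

The central difference cancels the linear (mean-current) term, so the estimate reduces to the sampled variance rate of J_t, i.e. exactly the equilibrium-fluctuation content the abstract describes.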

  6. Recovery of Large Angular Scale CMB Polarization for Instruments Employing Variable-Delay Polarization Modulators

    NASA Technical Reports Server (NTRS)

    Miller, N. J.; Chuss, D. T.; Marriage, T. A.; Wollack, E. J.; Appel, J. W.; Bennett, C. L.; Eimer, J.; Essinger-Hileman, T.; Fixsen, D. J.; Harrington, K.; et al.

    2016-01-01

    Variable-delay Polarization Modulators (VPMs) are currently being implemented in experiments designed to measure the polarization of the cosmic microwave background on large angular scales because of their capability for providing rapid, front-end polarization modulation and control over systematic errors. Despite the advantages provided by the VPM, it is important to identify and mitigate any time-varying effects that leak into the synchronously modulated component of the signal. In this paper, the effect of emission from a 300 K VPM on the system performance is considered and addressed. Though instrument design can greatly reduce the influence of modulated VPM emission, some residual modulated signal is expected. VPM emission is treated in the presence of rotational misalignments and temperature variation. Simulations of time-ordered data are used to evaluate the effect of these residual errors on the power spectrum. The analysis and modeling in this paper guide experimentalists on the critical aspects of observations using VPMs as front-end modulators. By implementing the characterizations and controls as described, front-end VPM modulation can be very powerful for mitigating 1/f noise in large angular scale polarimetric surveys. None of the systematic errors studied fundamentally limit the detection and characterization of B-modes on large scales for a tensor-to-scalar ratio of r = 0.01. Indeed, r < 0.01 is achievable with commensurately improved characterizations and controls.
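Why rapid front-end modulation mitigates 1/f noise can be illustrated generically (an editorial sketch, not the VPM instrument model of the paper): modulating the signal above the 1/f knee and demodulating synchronously rejects low-frequency drifts. All rates and amplitudes below are made up.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, T = 1000.0, 10.0                     # sample rate [Hz], duration [s]
n = int(fs * T)
t = np.arange(n) / fs

# Square-wave "modulation" at 25 Hz, a stand-in for rapid front-end switching.
f_mod = 25.0
ref = np.sign(np.sin(2.0 * np.pi * f_mod * t))

# 1/f ("pink") noise: shape white noise by 1/sqrt(f) in the frequency domain.
white = rng.standard_normal(n)
spec = np.fft.rfft(white)
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
spec[1:] /= np.sqrt(freqs[1:])
spec[0] = 0.0
pink = np.fft.irfft(spec, n)
pink *= 0.5 / pink.std()                 # rms noise 0.5 in detector units

s_true = 1.0                             # slowly varying sky signal (constant here)
data = s_true * ref + pink               # detector timestream

# Synchronous demodulation: multiply by the reference and average.
s_est = np.mean(data * ref)
```

Averaging the raw timestream instead would inherit the full low-frequency drift of the pink noise; the synchronous product moves the estimate to a quiet part of the spectrum near the modulation frequency and its harmonics.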

  7. The use of data from national and other large-scale user experience surveys in local quality work: a systematic review.

    PubMed

    Haugum, Mona; Danielsen, Kirsten; Iversen, Hilde Hestad; Bjertnaes, Oyvind

    2014-12-01

    An important goal for national and large-scale surveys of user experiences is quality improvement. However, large-scale surveys are normally conducted by a professional external surveyor, creating an institutionalized division between the measurement of user experiences and the quality work that is performed locally. The aim of this study was to identify and describe scientific studies related to the use of national and large-scale surveys of user experiences in local quality work. Data sources were Ovid EMBASE, Ovid MEDLINE, Ovid PsycINFO and the Cochrane Database of Systematic Reviews. We included scientific publications about user experiences and satisfaction that addressed the extent to which data from national and other large-scale user experience surveys are used for local quality work in the health services. Themes of interest were identified and a narrative analysis was undertaken. Thirteen publications were included, all differing substantially in several characteristics. The results show that large-scale surveys of user experiences are used in local quality work. The types of follow-up activity varied considerably, from conducting a follow-up analysis of user experience survey data to information sharing and more systematic efforts to use the data as a basis for improving the quality of care. This review shows that large-scale surveys of user experiences are used in local quality work. However, there is a need for more, better and standardized research in this field. The considerable variation in follow-up activities points to the need for systematic guidance on how to use data in local quality work. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  8. The impact of new forms of large-scale general practice provider collaborations on England's NHS: a systematic review.

    PubMed

    Pettigrew, Luisa M; Kumpunen, Stephanie; Mays, Nicholas; Rosen, Rebecca; Posaner, Rachel

    2018-03-01

    Over the past decade, collaboration between general practices in England to form new provider networks and large-scale organisations has been driven largely by grassroots action among GPs. However, it is now being increasingly advocated for by national policymakers. Expectations of what scaling up general practice in England will achieve are significant. To review the evidence of the impact of new forms of large-scale general practice provider collaborations in England. Systematic review. Embase, MEDLINE, Health Management Information Consortium, and Social Sciences Citation Index were searched for studies reporting the impact on clinical processes and outcomes, patient experience, workforce satisfaction, or costs of new forms of provider collaborations between general practices in England. A total of 1782 publications were screened. Five studies met the inclusion criteria and four examined the same general practice networks, limiting generalisability. Substantial financial investment was required to establish the networks and the associated interventions that were targeted at four clinical areas. Quality improvements were achieved through standardised processes, incentives at network level, information technology-enabled performance dashboards, and local network management. The fifth study of a large-scale multisite general practice organisation showed that it may be better placed to implement safety and quality processes than conventional practices. However, unintended consequences may arise, such as perceptions of disenfranchisement among staff and reductions in continuity of care. Good-quality evidence of the impacts of scaling up general practice provider organisations in England is scarce. As more general practice collaborations emerge, evaluation of their impacts will be important to understand which work, in which settings, how, and why. © British Journal of General Practice 2018.

  9. An Evaluation of the Conditions, Processes, and Consequences of Laptop Computing in K-12 Classrooms

    ERIC Educational Resources Information Center

    Cavanaugh, Cathy; Dawson, Kara; Ritzhaupt, Albert

    2011-01-01

    This article examines how laptop computing technology, teacher professional development, and systematic support resulted in changed teaching practices and increased student achievement in 47 K-12 schools in 11 Florida school districts. The overview of a large-scale study documents the type and magnitude of change in student-centered teaching,…

  10. An ensemble constrained variational analysis of atmospheric forcing data and its application to evaluate clouds in CAM5: Ensemble 3DCVA and Its Application

    DOE PAGES

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2016-01-05

    Large-scale atmospheric forcing data can greatly impact the simulations of atmospheric process models including Large Eddy Simulations (LES), Cloud Resolving Models (CRMs) and Single-Column Models (SCMs), and impact the development of physical parameterizations in global climate models. This study describes the development of an ensemble variationally constrained objective analysis of atmospheric large-scale forcing data and its application to evaluate the cloud biases in the Community Atmospheric Model (CAM5). Sensitivities of the variational objective analysis to background data, error covariance matrix and constraint variables are described and used to quantify the uncertainties in the large-scale forcing data. Application of the ensemble forcing in the CAM5 SCM during the March 2000 intensive operational period (IOP) at the Southern Great Plains (SGP) of the Atmospheric Radiation Measurement (ARM) program shows systematic biases in the model simulations that cannot be explained by the uncertainty of large-scale forcing data, which points to the deficiencies of physical parameterizations. The SCM is shown to overestimate high clouds and underestimate low clouds. These biases are found to also exist in the global simulation of CAM5 when it is compared with satellite data.
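The core step of a constrained variational objective analysis can be sketched in a few lines (an editorial toy, not the 3DCVA system itself): adjust observed values as little as possible, in a background-error metric S, so that linear budget constraints A x = b (e.g., column mass or moisture budgets) hold exactly; an ensemble then comes from repeating the analysis with perturbed inputs. All matrices and numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def constrained_analysis(x_obs, S, A, b):
    """Minimize (x - x_obs)^T S^{-1} (x - x_obs) subject to A x = b.
    Closed-form solution via Lagrange multipliers."""
    K = S @ A.T @ np.linalg.inv(A @ S @ A.T)   # gain matrix
    return x_obs + K @ (b - A @ x_obs)

# Toy "column budget": three layer values must sum to a known total of 6.0.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([6.0])
S = np.diag([1.0, 4.0, 1.0])        # layer 2 is least trusted, so it moves most
x_obs = np.array([1.0, 2.0, 2.4])   # raw observations violate the budget

x_an = constrained_analysis(x_obs, S, A, b)   # [1.1, 2.4, 2.5], sums to 6.0

# Ensemble flavour: perturb the observations and collect the analysis spread.
ens = np.array([constrained_analysis(x_obs + 0.1 * rng.standard_normal(3), S, A, b)
                for _ in range(200)])
spread = ens.std(axis=0)
```

Every ensemble member satisfies the constraint exactly; the member-to-member spread is what quantifies the forcing-data uncertainty that the abstract feeds into the SCM runs.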

  11. An ensemble constrained variational analysis of atmospheric forcing data and its application to evaluate clouds in CAM5: Ensemble 3DCVA and Its Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    Large-scale atmospheric forcing data can greatly impact the simulations of atmospheric process models including Large Eddy Simulations (LES), Cloud Resolving Models (CRMs) and Single-Column Models (SCMs), and impact the development of physical parameterizations in global climate models. This study describes the development of an ensemble variationally constrained objective analysis of atmospheric large-scale forcing data and its application to evaluate the cloud biases in the Community Atmospheric Model (CAM5). Sensitivities of the variational objective analysis to background data, error covariance matrix and constraint variables are described and used to quantify the uncertainties in the large-scale forcing data. Application of the ensemble forcing in the CAM5 SCM during the March 2000 intensive operational period (IOP) at the Southern Great Plains (SGP) of the Atmospheric Radiation Measurement (ARM) program shows systematic biases in the model simulations that cannot be explained by the uncertainty of large-scale forcing data, which points to the deficiencies of physical parameterizations. The SCM is shown to overestimate high clouds and underestimate low clouds. These biases are found to also exist in the global simulation of CAM5 when it is compared with satellite data.

  12. Spectral Quadrature method for accurate O(N) electronic structure calculations of metals and insulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pratapa, Phanisri P.; Suryanarayana, Phanish; Pask, John E.

    We present the Clenshaw–Curtis Spectral Quadrature (SQ) method for real-space O(N) Density Functional Theory (DFT) calculations. In this approach, all quantities of interest are expressed as bilinear forms or sums over bilinear forms, which are then approximated by spatially localized Clenshaw–Curtis quadrature rules. This technique is identically applicable to both insulating and metallic systems, and in conjunction with local reformulation of the electrostatics, enables the O(N) evaluation of the electronic density, energy, and atomic forces. The SQ approach also permits infinite-cell calculations without recourse to Brillouin zone integration or large supercells. We employ a finite difference representation in order to exploit the locality of electronic interactions in real space, enable systematic convergence, and facilitate large-scale parallel implementation. In particular, we derive expressions for the electronic density, total energy, and atomic forces that can be evaluated in O(N) operations. We demonstrate the systematic convergence of energies and forces with respect to quadrature order as well as truncation radius to the exact diagonalization result. In addition, we show convergence with respect to mesh size to established O(N^3) planewave results. In conclusion, we establish the efficiency of the proposed approach for high temperature calculations and discuss its particular suitability for large-scale parallel computation.

  13. Spectral Quadrature method for accurate O(N) electronic structure calculations of metals and insulators

    DOE PAGES

    Pratapa, Phanisri P.; Suryanarayana, Phanish; Pask, John E.

    2015-12-02

    We present the Clenshaw–Curtis Spectral Quadrature (SQ) method for real-space O(N) Density Functional Theory (DFT) calculations. In this approach, all quantities of interest are expressed as bilinear forms or sums over bilinear forms, which are then approximated by spatially localized Clenshaw–Curtis quadrature rules. This technique is identically applicable to both insulating and metallic systems, and in conjunction with local reformulation of the electrostatics, enables the O(N) evaluation of the electronic density, energy, and atomic forces. The SQ approach also permits infinite-cell calculations without recourse to Brillouin zone integration or large supercells. We employ a finite difference representation in order to exploit the locality of electronic interactions in real space, enable systematic convergence, and facilitate large-scale parallel implementation. In particular, we derive expressions for the electronic density, total energy, and atomic forces that can be evaluated in O(N) operations. We demonstrate the systematic convergence of energies and forces with respect to quadrature order as well as truncation radius to the exact diagonalization result. In addition, we show convergence with respect to mesh size to established O(N^3) planewave results. In conclusion, we establish the efficiency of the proposed approach for high temperature calculations and discuss its particular suitability for large-scale parallel computation.
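For readers unfamiliar with the quadrature at the heart of the method, a self-contained Clenshaw–Curtis rule on [-1, 1] is easy to build from cosine sums. The sketch below follows the classical construction popularized by Trefethen's clencurt; it is the plain rule only, not the spatially localized O(N) machinery of the paper.

```python
import numpy as np

def clencurt(n):
    """Clenshaw-Curtis nodes x and weights w on [-1, 1] with n + 1 points."""
    theta = np.pi * np.arange(n + 1) / n
    x = np.cos(theta)                  # Chebyshev (cosine-spaced) nodes
    w = np.zeros(n + 1)
    v = np.ones(n - 1)
    if n % 2 == 0:
        w[0] = w[n] = 1.0 / (n**2 - 1)
        for k in range(1, n // 2):
            v -= 2.0 * np.cos(2 * k * theta[1:n]) / (4 * k**2 - 1)
        v -= np.cos(n * theta[1:n]) / (n**2 - 1)
    else:
        w[0] = w[n] = 1.0 / n**2
        for k in range(1, (n - 1) // 2 + 1):
            v -= 2.0 * np.cos(2 * k * theta[1:n]) / (4 * k**2 - 1)
    w[1:n] = 2.0 * v / n
    return x, w

# Smooth integrands converge essentially geometrically: 17 points already
# integrate exp(x) over [-1, 1] to near machine precision.
x, w = clencurt(16)
approx = float(np.dot(w, np.exp(x)))
exact = np.e - 1.0 / np.e
```

The weights sum to 2 (the rule is exact for constants), and because the Chebyshev nodes are nested under doubling of n, the rule supports the kind of systematic convergence checks the abstract emphasizes.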

  14. Improving patient safety through the systematic evaluation of patient outcomes

    PubMed Central

    Forster, Alan J.; Dervin, Geoff; Martin, Claude; Papp, Steven

    2012-01-01

    Despite increased advocacy for patient safety and several large-scale programs designed to reduce preventable harm, most notably surgical checklists, recent data evaluating entire health systems suggests that we are no further ahead in improving patient safety and that hospital complications are no less frequent now than in the 1990s. We suggest that the failure to systematically measure patient safety is the reason for our limited progress. In addition to defining patient safety outcomes and describing their financial and clinical impact, we argue why the failure to implement patient safety measurement systems has compromised the ability to move the agenda forward. We also present an overview of how patient safety can be assessed and the strengths and weaknesses of each method and comment on some of the consequences created by the absence of a systematic measurement system. PMID:23177520

  15. United States Temperature and Precipitation Extremes: Phenomenology, Large-Scale Organization, Physical Mechanisms and Model Representation

    NASA Astrophysics Data System (ADS)

    Black, R. X.

    2017-12-01

    We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes, emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote excitation and scale interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Niño events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.

  16. A Comprehensive Critique and Review of Published Measures of Acne Severity

    PubMed Central

    Furber, Gareth; Leach, Matthew; Segal, Leonie

    2016-01-01

    Objective: Acne vulgaris is a dynamic, complex condition that is notoriously difficult to evaluate. The authors set out to critically evaluate currently available measures of acne severity, particularly in terms of suitability for use in clinical trials. Design: A systematic review was conducted to identify methods used to measure acne severity, using MEDLINE, CINAHL, Scopus, and Wiley Online. Each method was critically reviewed and given a score out of 13 based on eight quality criteria under two broad groupings of psychometric testing and suitability for research and evaluation. Results: Twenty-four methods for assessing acne severity were identified. Four scales received a quality score of zero, and 11 scored ≤3. The highest rated scales achieved a total score of 6. Six scales reported strong inter-rater reliability (ICC>0.75), and four reported strong intra-rater reliability (ICC>0.75). The poor overall performance of most scales, largely characterized by the absence of reliability testing or evidence for independent assessment and validation indicates that generally, their application in clinical trials is not supported. Conclusion: This review and appraisal of instruments for measuring acne severity supports previously identified concerns regarding the quality of published measures. It highlights the need for a valid and reliable acne severity scale, especially for use in research and evaluation. The ideal scale would demonstrate adequate validation and reliability and be easily implemented for third-party analysis. The development of such a scale is critical to interpreting results of trials and facilitating the pooling of results for systematic reviews and meta-analyses. PMID:27672410
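For context on the reliability thresholds quoted above (ICC > 0.75 read as strong agreement), a two-way random-effects intraclass correlation for single raters, ICC(2,1), can be computed directly from ANOVA mean squares. The sketch below is editorial; the rater scores are hypothetical, not data from the review.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    ratings: (n_subjects, k_raters) matrix of scores."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means
    # Mean squares from the two-way ANOVA decomposition.
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # raters
    sse = np.sum((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical acne-severity scores (0-10) from three raters on six patients.
scores = np.array([
    [2, 3, 2],
    [5, 5, 6],
    [8, 7, 8],
    [1, 1, 2],
    [6, 7, 6],
    [9, 9, 9],
], dtype=float)
icc = icc_2_1(scores)   # close agreement, so icc lands near 1
```

Perfectly identical raters give ICC = 1 exactly; values above the review's 0.75 cutoff indicate that most of the observed variance comes from real subject-to-subject differences rather than rater disagreement.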

  17. Key principles to improve programmes and interventions in complementary feeding.

    PubMed

    Lutter, Chessa K; Iannotti, Lora; Creed-Kanashiro, Hilary; Guyon, Agnes; Daelmans, Bernadette; Robert, Rebecca; Haider, Rukhsana

    2013-09-01

    Although there are some examples of successful complementary feeding programmes to promote healthy growth and prevent stunting at the community level, to date there are few, if any, examples of successful programmes at scale. A lack of systematic process and impact evaluations on pilot projects to generate lessons learned has precluded scaling up of effective programmes. Programmes to effect positive change in nutrition rarely follow systematic planning, implementation, and evaluation (PIE) processes to enhance effectiveness over the long term. As a result, a set of programme-oriented key principles to promote healthy growth remains elusive. The purpose of this paper is to fill this gap by proposing a set of principles to improve programmes and interventions to promote healthy growth and development. Identifying such principles for programme success has three requirements: rethinking traditional paradigms used to promote improved infant and young child feeding; ensuring better linkages to delivery platforms; and improving programming. Following the PIE model for programmes and learning from experiences from four relatively large-scale programmes described in this paper, 10 key principles are identified in the areas of programme planning, programme implementation, programme evaluation, and dissemination, replication, and scaling up. Nonetheless, numerous operational research questions remain, some of which are highlighted in this paper. © 2013 John Wiley & Sons Ltd.

  18. Particle Acceleration in Mildly Relativistic Shearing Flows: The Interplay of Systematic and Stochastic Effects, and the Origin of the Extended High-energy Emission in AGN Jets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ruo-Yu; Rieger, F. M.; Aharonian, F. A., E-mail: ruoyu@mpi-hd.mpg.de, E-mail: frank.rieger@mpi-hd.mpg.de, E-mail: aharon@mpi-hd.mpg.de

    The origin of the extended X-ray emission in the large-scale jets of active galactic nuclei (AGNs) poses challenges to conventional models of acceleration and emission. Although electron synchrotron radiation is considered the most feasible radiation mechanism, the formation of the continuous large-scale X-ray structure remains an open issue. As astrophysical jets are expected to exhibit some turbulence and shearing motion, we here investigate the potential of shearing flows to facilitate an extended acceleration of particles and evaluate its impact on the resultant particle distribution. Our treatment incorporates systematic shear and stochastic second-order Fermi effects. We show that for typical parameters applicable to large-scale AGN jets, stochastic second-order Fermi acceleration, which always accompanies shear particle acceleration, can play an important role in facilitating the whole process of particle energization. We study the time-dependent evolution of the resultant particle distribution in the presence of second-order Fermi acceleration, shear acceleration, and synchrotron losses using a simple Fokker–Planck approach and provide illustrations for the possible emergence of a complex (multicomponent) particle energy distribution with different spectral branches. We present examples for typical parameters applicable to large-scale AGN jets, indicating the relevance of the underlying processes for understanding the extended X-ray emission and the origin of ultrahigh-energy cosmic rays.

  19. RECOVERY OF LARGE ANGULAR SCALE CMB POLARIZATION FOR INSTRUMENTS EMPLOYING VARIABLE-DELAY POLARIZATION MODULATORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, N. J.; Marriage, T. A.; Appel, J. W.

    2016-02-20

    Variable-delay Polarization Modulators (VPMs) are currently being implemented in experiments designed to measure the polarization of the cosmic microwave background on large angular scales because of their capability for providing rapid, front-end polarization modulation and control over systematic errors. Despite the advantages provided by the VPM, it is important to identify and mitigate any time-varying effects that leak into the synchronously modulated component of the signal. In this paper, the effect of emission from a 300 K VPM on the system performance is considered and addressed. Though instrument design can greatly reduce the influence of modulated VPM emission, some residual modulated signal is expected. VPM emission is treated in the presence of rotational misalignments and temperature variation. Simulations of time-ordered data are used to evaluate the effect of these residual errors on the power spectrum. The analysis and modeling in this paper guide experimentalists on the critical aspects of observations using VPMs as front-end modulators. By implementing the characterizations and controls as described, front-end VPM modulation can be very powerful for mitigating 1/f noise in large angular scale polarimetric surveys. None of the systematic errors studied fundamentally limit the detection and characterization of B-modes on large scales for a tensor-to-scalar ratio of r = 0.01. Indeed, r < 0.01 is achievable with commensurately improved characterizations and controls.

  20. Atmospheric gravity waves with small vertical-to-horizontal wavelength ratios

    NASA Astrophysics Data System (ADS)

    Song, I. S.; Jee, G.; Kim, Y. H.; Chun, H. Y.

    2017-12-01

    Gravity wave modes with small vertical-to-horizontal wavelength ratios of order 10⁻³ are investigated through the systematic scale analysis of governing equations for gravity wave perturbations embedded in the quasi-geostrophic large-scale flow. These waves can be categorized as acoustic gravity wave modes because their total energy is given by the sum of kinetic, potential, and elastic parts. It is found that these waves can be forced by density fluctuations multiplied by the horizontal gradients of the large-scale pressure (geopotential) fields. These theoretical findings are evaluated using the results of a high-resolution global model (Specified Chemistry WACCM with horizontal resolution of 25 km and vertical resolution of 600 m) by computing the density-related gravity-wave forcing terms from the modeling results.

  1. Enablers and barriers to large-scale uptake of improved solid fuel stoves: a systematic review.

    PubMed

    Rehfuess, Eva A; Puzzolo, Elisa; Stanistreet, Debbi; Pope, Daniel; Bruce, Nigel G

    2014-02-01

    Globally, 2.8 billion people rely on household solid fuels. Reducing the resulting adverse health, environmental, and development consequences will involve transitioning through a mix of clean fuels and improved solid fuel stoves (IS) of demonstrable effectiveness. To date, achieving uptake of IS has presented significant challenges. We performed a systematic review of factors that enable or limit large-scale uptake of IS in low- and middle-income countries. We conducted systematic searches of multidisciplinary databases and specialist websites and consulted experts. The review drew on qualitative, quantitative, and case studies and used standardized methods for screening, data extraction, critical appraisal, and synthesis. We summarized our findings as "factors" relating to one of seven domains-fuel and technology characteristics; household and setting characteristics; knowledge and perceptions; finance, tax, and subsidy aspects; market development; regulation, legislation, and standards; programmatic and policy mechanisms-and also recorded issues that impacted equity. We identified 31 factors influencing uptake from 57 studies conducted in Asia, Africa, and Latin America. All domains matter. Although factors such as offering technologies that meet household needs and save fuel, user training and support, effective financing, and facilitative government action appear to be critical, none guarantee success: All factors can be influential, depending on context. The nature of available evidence did not permit further prioritization. Achieving adoption and sustained use of IS at a large scale requires that all factors, spanning household/community and program/societal levels, be assessed and supported by policy. We propose a planning tool that would aid this process and suggest further research to incorporate an evaluation of effectiveness.

  2. A systematic approach for the development of a kindergarten-based intervention for the prevention of obesity in preschool age children: the ToyBox-study.

    PubMed

    Manios, Y; Grammatikaki, E; Androutsos, O; Chinapaw, M J M; Gibson, E L; Buijs, G; Iotova, V; Socha, P; Annemans, L; Wildgruber, A; Mouratidou, T; Yngve, A; Duvinage, K; de Bourdeaudhuij, I

    2012-03-01

    The increasing childhood obesity epidemic calls for appropriate measures and effective policies to be applied early in life. Large-scale socioecological frameworks providing a holistic multifactorial and cost-effective approach necessary to support obesity prevention initiatives in this age are however currently missing. To address this missing link, ToyBox-study aims to build and evaluate a cost-effective kindergarten-based, family-involved intervention scheme to prevent obesity in early childhood, which could potentially be expanded on a pan-European scale. A multidisciplinary team of researchers from 10 countries have joined forces and will work to realize this according to a systematic stepwise approach that combines the use of the PRECEDE-PROCEED model and intervention mapping protocol. ToyBox-study will conduct systematic and narrative reviews, secondary data analyses, focus group research and societal assessment to design, implement and evaluate outcome, impact, process and cost effectiveness of the intervention. This is the first time that such a holistic approach has been used on a pan-European scale to promote healthy weight and healthy energy balance-related behaviours for the prevention of early childhood obesity. The results of ToyBox-study will be disseminated among key stakeholders including researchers, policy makers, practitioners and the general population. © 2012 The Authors. obesity reviews © 2012 International Association for the Study of Obesity.

  3. Evaluation of the synoptic and mesoscale predictive capabilities of a mesoscale atmospheric simulation system

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K.; Keyser, D. A.; Mccumber, M. C.

    1983-01-01

    The overall performance characteristics of a limited area, hydrostatic, fine (52 km) mesh, primitive equation, numerical weather prediction model are determined in anticipation of satellite data assimilations with the model. The synoptic and mesoscale predictive capabilities of version 2.0 of this model, the Mesoscale Atmospheric Simulation System (MASS 2.0), were evaluated. The two-part study is based on a sample of approximately thirty 12h and 24h forecasts of atmospheric flow patterns during spring and early summer. The synoptic scale evaluation results benchmark the performance of MASS 2.0 against that of an operational, synoptic scale weather prediction model, the Limited area Fine Mesh (LFM). The large sample allows for the calculation of statistically significant measures of forecast accuracy and the determination of systematic model errors. The synoptic scale benchmark is required before unsmoothed mesoscale forecast fields can be seriously considered.

  4. An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science.

    PubMed

    2012-11-01

    Reproducibility is a defining feature of science. However, because of strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published. The Reproducibility Project is an open, large-scale, collaborative effort to systematically examine the rate and predictors of reproducibility in psychological science. So far, 72 volunteer researchers from 41 institutions have organized to openly and transparently replicate studies published in three prominent psychological journals in 2008. Multiple methods will be used to evaluate the findings, calculate an empirical rate of replication, and investigate factors that predict reproducibility. Whatever the result, a better understanding of reproducibility will ultimately improve confidence in scientific methodology and findings. © The Author(s) 2012.

  5. Wind power for the electric-utility industry: Policy incentives for fuel conservation

    NASA Astrophysics Data System (ADS)

    March, F.; Dlott, E. H.; Korn, D. H.; Madio, F. R.; McArthur, R. C.; Vachon, W. A.

    1982-06-01

    A systematic method for evaluating the economics of solar-electric/conservation technologies as fuel-savings investments for electric utilities in the presence of changing federal incentive policies is presented. The focus is on wind energy conversion systems (WECS) as the solar technology closest to near-term large scale implementation. Commercially available large WECS are described, along with computer models to calculate the economic impact of the inclusion of WECS as 10% of the base-load generating capacity on a grid. A guide to legal structures and relationships which impinge on large-scale WECS utilization is developed, together with a quantitative examination of the installation of 1000 MWe of WECS capacity by a utility in the northeast states. Engineering and financial analyses were performed, with results indicating government policy changes necessary to encourage the entrance of utilities into the field of windpower utilization.

  6. Developing a "Semi-Systematic" Approach to Using Large-Scale Data-Sets for Small-Scale Interventions: The "Baby Matterz" Initiative as a Case Study

    ERIC Educational Resources Information Center

    O'Brien, Mark

    2011-01-01

    The appropriateness of using statistical data to inform the design of any given service development or initiative often depends upon judgements regarding scale. Large-scale data sets, perhaps national in scope, whilst potentially important in informing the design, implementation and roll-out of experimental initiatives, will often remain unused…

  7. Constraining the baryon-dark matter relative velocity with the large-scale 3-point correlation function of the SDSS BOSS DR12 CMASS galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slepian, Zachary; Slosar, Anze; Eisenstein, Daniel J.

    We search for a galaxy clustering bias due to a modulation of galaxy number with the baryon-dark matter relative velocity resulting from recombination-era physics. We find no detected signal and place the constraint bv <0.01 on the relative velocity bias for the CMASS galaxies. This bias is an important potential systematic of Baryon Acoustic Oscillation (BAO) method measurements of the cosmic distance scale using the 2-point clustering. Our limit on the relative velocity bias indicates a systematic shift of no more than 0.3% rms in the distance scale inferred from the BAO feature in the BOSS 2-point clustering, well below the 1% statistical error of this measurement. In conclusion, this constraint is the most stringent currently available and has important implications for the ability of upcoming large-scale structure surveys such as DESI to self-protect against the relative velocity as a possible systematic.

  8. Constraining the baryon-dark matter relative velocity with the large-scale 3-point correlation function of the SDSS BOSS DR12 CMASS galaxies

    DOE PAGES

    Slepian, Zachary; Slosar, Anze; Eisenstein, Daniel J.; ...

    2017-10-24

    We search for a galaxy clustering bias due to a modulation of galaxy number with the baryon-dark matter relative velocity resulting from recombination-era physics. We find no detected signal and place the constraint bv <0.01 on the relative velocity bias for the CMASS galaxies. This bias is an important potential systematic of Baryon Acoustic Oscillation (BAO) method measurements of the cosmic distance scale using the 2-point clustering. Our limit on the relative velocity bias indicates a systematic shift of no more than 0.3% rms in the distance scale inferred from the BAO feature in the BOSS 2-point clustering, well below the 1% statistical error of this measurement. In conclusion, this constraint is the most stringent currently available and has important implications for the ability of upcoming large-scale structure surveys such as DESI to self-protect against the relative velocity as a possible systematic.

  9. Constraining the baryon-dark matter relative velocity with the large-scale three-point correlation function of the SDSS BOSS DR12 CMASS galaxies

    NASA Astrophysics Data System (ADS)

    Slepian, Zachary; Eisenstein, Daniel J.; Blazek, Jonathan A.; Brownstein, Joel R.; Chuang, Chia-Hsun; Gil-Marín, Héctor; Ho, Shirley; Kitaura, Francisco-Shu; McEwen, Joseph E.; Percival, Will J.; Ross, Ashley J.; Rossi, Graziano; Seo, Hee-Jong; Slosar, Anže; Vargas-Magaña, Mariana

    2018-02-01

    We search for a galaxy clustering bias due to a modulation of galaxy number with the baryon-dark matter relative velocity resulting from recombination-era physics. We find no detected signal and place the constraint bv < 0.01 on the relative velocity bias for the CMASS galaxies. This bias is an important potential systematic of baryon acoustic oscillation (BAO) method measurements of the cosmic distance scale using the two-point clustering. Our limit on the relative velocity bias indicates a systematic shift of no more than 0.3 per cent rms in the distance scale inferred from the BAO feature in the BOSS two-point clustering, well below the 1 per cent statistical error of this measurement. This constraint is the most stringent currently available and has important implications for the ability of upcoming large-scale structure surveys such as the Dark Energy Spectroscopic Instrument (DESI) to self-protect against the relative velocity as a possible systematic.

  10. The Diversity of School Organizational Configurations

    ERIC Educational Resources Information Center

    Lee, Linda C.

    2013-01-01

    School reform on a large scale has largely been unsuccessful. Approaches designed to document and understand the variety of organizational conditions that comprise our school systems are needed so that reforms can be tailored and results scaled. Therefore, this article develops a configurational framework that allows a systematic analysis of many…

  11. Evaluating the Impact of Conceptual Knowledge Engineering on the Design and Usability of a Clinical and Translational Science Collaboration Portal

    PubMed Central

    Payne, Philip R.O.; Borlawsky, Tara B.; Rice, Robert; Embi, Peter J.

    2010-01-01

    With the growing prevalence of large-scale, team science endeavors in the biomedical and life science domains, the impetus to implement platforms capable of supporting asynchronous interaction among multidisciplinary groups of collaborators has increased commensurately. However, there is a paucity of literature describing systematic approaches to identifying the information needs of targeted end-users for such platforms, and the translation of such requirements into practicable software component design criteria. In previous studies, we have reported upon the efficacy of employing conceptual knowledge engineering (CKE) techniques to systematically address both of the preceding challenges in the context of complex biomedical applications. In this manuscript we evaluate the impact of CKE approaches relative to the design of a clinical and translational science collaboration portal, and report preliminary qualitative user satisfaction with the resulting system. PMID:21347146

  12. Characterizing unknown systematics in large scale structure surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Nishant; Ho, Shirley; Myers, Adam D.

    Photometric large scale structure (LSS) surveys probe the largest volumes in the Universe, but are inevitably limited by systematic uncertainties. Imperfect photometric calibration leads to biases in our measurements of the density fields of LSS tracers such as galaxies and quasars, and as a result in cosmological parameter estimation. Earlier studies have proposed using cross-correlations between different redshift slices or cross-correlations between different surveys to reduce the effects of such systematics. In this paper we develop a method to characterize unknown systematics. We demonstrate that while we do not have sufficient information to correct for unknown systematics in the data, we can obtain an estimate of their magnitude. We define a parameter to estimate contamination from unknown systematics using cross-correlations between different redshift slices and propose discarding bins in the angular power spectrum that lie outside a certain contamination tolerance level. We show that this method improves estimates of the bias using simulated data and further apply it to photometric luminous red galaxies in the Sloan Digital Sky Survey as a case study.
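    The cross-correlation idea above can be sketched numerically: two widely separated redshift slices should share no cosmological signal, so any residual cross-power estimates the common contaminant. A minimal illustration on simulated maps (amplitudes and the normalization choice are hypothetical, not the paper's actual estimator):

```python
import numpy as np

rng = np.random.default_rng(0)
npix = 4096

# Simulated overdensity "maps" for two widely separated redshift slices:
# independent cosmological signal plus a shared additive systematic
# (e.g. an imperfect photometric calibration pattern).
systematic = 0.3 * rng.standard_normal(npix)
slice_a = rng.standard_normal(npix) + systematic
slice_b = rng.standard_normal(npix) + systematic

# Distant slices share no cosmological signal, so the residual
# cross-correlation is attributed to the common contaminant.
cross = np.mean(slice_a * slice_b)
auto_a = np.mean(slice_a ** 2)
auto_b = np.mean(slice_b ** 2)

# Dimensionless contamination estimate: cross-power normalized by the
# geometric mean of the auto-powers.
contamination = cross / np.sqrt(auto_a * auto_b)
print(f"estimated contamination fraction: {contamination:.3f}")
```

    In the paper's setting the analogous quantity is computed per angular power spectrum bin, and bins exceeding a chosen tolerance are discarded.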

  13. The role of the airline transportation network in the prediction and predictability of global epidemics.

    PubMed

    Colizza, Vittoria; Barrat, Alain; Barthélemy, Marc; Vespignani, Alessandro

    2006-02-14

    The systematic study of large-scale networks has unveiled the ubiquitous presence of connectivity patterns characterized by large-scale heterogeneities and unbounded statistical fluctuations. These features dramatically affect the behavior of diffusion processes occurring on networks, determining the ensuing statistical properties of their evolution pattern and dynamics. In this article, we present a stochastic computational framework for the forecast of global epidemics that considers the complete worldwide air travel infrastructure complemented with census population data. We address two basic issues in global epidemic modeling: (i) we study the role of the large scale properties of the airline transportation network in determining the global diffusion pattern of emerging diseases; and (ii) we evaluate the reliability of forecasts and outbreak scenarios with respect to the intrinsic stochasticity of disease transmission and traffic flows. To address these issues we define a set of quantitative measures able to characterize the level of heterogeneity and predictability of the epidemic pattern. These measures may be used for the analysis of containment policies and epidemic risk assessment.
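    The metapopulation idea — local epidemic dynamics coupled by passenger flows — can be sketched in a few lines. This is a toy deterministic SIR caricature with invented populations and travel volumes, not the stochastic worldwide framework of the paper:

```python
import numpy as np

# Three "cities" with resident populations and a symmetric matrix of
# daily passenger flows (all numbers hypothetical).
N = np.array([1.0e6, 5.0e5, 2.0e5])
travel = np.array([[0.0, 2000.0, 500.0],
                   [2000.0, 0.0, 300.0],
                   [500.0, 300.0, 0.0]])

beta, gamma = 0.3, 0.1                 # transmission and recovery rates
S, I, R = N.copy(), np.zeros(3), np.zeros(3)
I[0], S[0] = 10.0, N[0] - 10.0         # seed the outbreak in city 0

for day in range(200):
    # Local SIR step (Euler, dt = 1 day).
    new_inf = beta * S * I / N
    new_rec = gamma * I
    S -= new_inf
    I += new_inf - new_rec
    R += new_rec
    # Travel coupling: infectives move in proportion to passenger flows
    # (susceptible/recovered travel is ignored for brevity).
    frac = I / N
    I += travel.T @ frac - travel.sum(axis=1) * frac
```

    Running this shows the outbreak seeded in city 0 invading the other cities through the travel terms; the paper's framework additionally draws all transitions stochastically to quantify predictability.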

  14. Experimental quiet engine program

    NASA Technical Reports Server (NTRS)

    Cornell, W. G.

    1975-01-01

    Full-scale low-tip-speed fans, a full-scale high-tip-speed fan, scale model versions of fans, and two full-scale high-bypass-ratio turbofan engines, were designed, fabricated, tested, and evaluated. Turbine noise suppression was investigated. Preliminary design studies of flight propulsion system concepts were used in application studies to determine acoustic-economic tradeoffs. Salient results are as follows: tradeoff evaluation of fan tip speed and blade loading; systematic data on source noise characteristics and suppression effectiveness; documentation of high- and low-fan-speed aerodynamic and acoustic technology; aerodynamic and acoustic evaluation of acoustic treatment configurations, casing tip bleed, serrated and variable pitch rotor blades, leaned outlet guide vanes, slotted tip casings, rotor blade shape modifications, and inlet noise suppression; systematic evaluation of aerodynamic and acoustic effects; flyover noise projections of engine test data; turbine noise suppression technology development; and tradeoff evaluation of preliminary design high-fan-speed and low-fan-speed flight engines.

  15. A systematic review of systematic reviews on interventions for caregivers of people with chronic conditions.

    PubMed

    Corry, Margarita; While, Alison; Neenan, Kathleen; Smith, Valerie

    2015-04-01

    To evaluate the effectiveness of interventions to support caregivers of people with selected chronic conditions. Informal caregivers provide millions of care hours each week contributing to significant healthcare savings. Despite much research evaluating a range of interventions for caregivers, their impact remains unclear. A systematic review of systematic reviews of interventions to support caregivers of people with selected chronic conditions. The electronic databases of PubMed, CINAHL, British Nursing Index, PsycINFO, Social Science Index (January 1990-May 2014) and The Cochrane Library (Issue 6, June 2014), were searched using Medical Subject Heading and index term combinations of the keywords caregiver, systematic review, intervention and named chronic conditions. Papers were included if they reported a systematic review of interventions for caregivers of people with chronic conditions. The methodological quality of the included reviews was independently assessed by two reviewers using R-AMSTAR. Data were independently extracted by two reviewers using a pre-designed data extraction form. Narrative synthesis of review findings was used to present the results. Eight systematic reviews were included. There was evidence that education and support programme interventions improved caregiver quality of life. Information-giving interventions improved caregiver knowledge for stroke caregivers. Education, support and information-giving interventions warrant further investigation across caregiver groups. A large-scale funded programme for caregiver research is required to ensure that studies are of high quality to inform service development across settings. © 2014 John Wiley & Sons Ltd.

  16. Large-Scale Assessments of Students' Learning and Education Policy: Synthesising Evidence across World Regions

    ERIC Educational Resources Information Center

    Tobin, Mollie; Nugroho, Dita; Lietz, Petra

    2016-01-01

    This article synthesises findings from two systematic reviews that examined evidence of the link between large-scale assessments (LSAs) and education policy in economically developing countries and in countries of the Asia-Pacific. Analyses summarise evidence of assessment characteristics and policy goals of LSAs that influence education policy,…

  17. Towards national-scale greenhouse gas emissions evaluation with robust uncertainty estimates

    NASA Astrophysics Data System (ADS)

    Rigby, Matthew; Swallow, Ben; Lunt, Mark; Manning, Alistair; Ganesan, Anita; Stavert, Ann; Stanley, Kieran; O'Doherty, Simon

    2016-04-01

    Through the Deriving Emissions related to Climate Change (DECC) network and the Greenhouse gAs Uk and Global Emissions (GAUGE) programme, the UK's greenhouse gases are now monitored by instruments mounted on telecommunications towers and churches, on a ferry that performs regular transects of the North Sea, on-board a research aircraft and from space. When combined with information from high-resolution chemical transport models such as the Met Office Numerical Atmospheric dispersion Modelling Environment (NAME), these measurements are allowing us to evaluate emissions more accurately than has previously been possible. However, it has long been appreciated that current methods for quantifying fluxes using atmospheric data suffer from uncertainties, primarily relating to the chemical transport model, that have been largely ignored to date. Here, we use novel model reduction techniques for quantifying the influence of a set of potential systematic model errors on the outcome of a national-scale inversion. This new technique has been incorporated into a hierarchical Bayesian framework, which can be shown to reduce the influence of subjective choices on the outcome of inverse modelling studies. Using estimates of the UK's methane emissions derived from DECC and GAUGE tall-tower measurements as a case study, we will show that such model systematic errors have the potential to significantly increase the uncertainty on national-scale emissions estimates. Therefore, we conclude that these factors must be incorporated in national emissions evaluation efforts, if they are to be credible.
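    At its core, such flux estimation is a linear-Gaussian Bayesian update of prior regional fluxes against atmospheric observations. A minimal non-hierarchical sketch, with a random sensitivity matrix standing in for NAME footprints and all numbers hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear model y = H x + e: mole-fraction enhancements y respond to
# regional fluxes x through a transport sensitivity matrix H (here random;
# in practice it would come from a dispersion model such as NAME).
n_obs, n_flux = 50, 5
H = rng.uniform(0.0, 1.0, size=(n_obs, n_flux))
x_true = np.array([1.0, 2.0, 0.5, 1.5, 1.0])
sigma = 0.2                                # assumed model-observation error
y = H @ x_true + sigma * rng.standard_normal(n_obs)

# Gaussian prior and likelihood give the posterior in closed form.
x_prior = np.ones(n_flux)
P_inv = np.eye(n_flux)                     # prior precision (unit variances)
R_inv = np.eye(n_obs) / sigma ** 2         # observation precision

S_post = np.linalg.inv(H.T @ R_inv @ H + P_inv)         # posterior covariance
x_post = S_post @ (H.T @ R_inv @ y + P_inv @ x_prior)   # posterior mean
```

    The hierarchical scheme described in the abstract additionally places distributions over quantities this sketch fixes by hand, such as the error variances, and propagates model systematic error into the posterior.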

  18. Moored offshore structures - evaluation of forces in elastic mooring lines

    NASA Astrophysics Data System (ADS)

    Crudu, L.; Obreja, D. C.; Marcu, O.

    2016-08-01

    In most situations, the high-frequency motions of a floating structure induce important effects in the mooring lines, which in turn affect the motions of the structure. Experience accumulated during systematic experimental tests and calculations, carried out for different moored floating structures, has shown a complex influence of various parameters on the dynamic effects. Therefore, a systematic investigation was considered necessary. Due to the complexity of the hydrodynamic aspects of offshore structure behaviour, experimental tests are practically compulsory in order to properly evaluate and then validate their behaviour in a real sea. Moreover, the necessity to carry out hydrodynamic tests is often required by customers, classification societies and other regulatory bodies. Consequently, the correct simulation of the physical properties of complex scaled models becomes a very important issue. The paper investigates such problems, identifying possible simplifications and generating different approaches. One basis of the evaluation is the results of systematic experimental tests on the dynamic behaviour of a mooring chain reproduced at five different scales. Dynamic effects as well as the influence of elasticity simulation at the five scales are evaluated together. The paper presents systematic diagrams and practical results for a typical moored floating structure operating as a pipe layer, based on evaluations of motions and accelerations in waves.

  19. Evaluation of Bias-Variance Trade-Off for Commonly Used Post-Summarizing Normalization Procedures in Large-Scale Gene Expression Studies

    PubMed Central

    Qiu, Xing; Hu, Rui; Wu, Zhixin

    2014-01-01

    Normalization procedures are widely used in high-throughput genomic data analyses to remove various sources of technological noise and variation. They are known to have a profound impact on the subsequent differential gene expression analysis. Although there has been some research evaluating different normalization procedures, few attempts have been made to systematically evaluate the gene detection performance of normalization procedures from the bias-variance trade-off point of view, especially with strong gene differentiation effects and large sample sizes. In this paper, we conduct a thorough study to evaluate the effects of normalization procedures combined with several commonly used statistical tests and multiple testing procedures (MTPs) under different configurations of effect size and sample size. We conduct a theoretical evaluation based on a random effect model, as well as simulation and biological data analyses, to verify the results. Based on our findings, we provide practical guidance for selecting a suitable normalization procedure under different scenarios. PMID:24941114
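    As a concrete instance of the kind of procedure being compared, quantile normalization forces every array to share one empirical distribution — removing array-level technical effects at the risk of flattening genuine signal, which is exactly the bias side of the trade-off. A sketch on synthetic data (the array effects are invented):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy expression matrix: 200 genes x 4 arrays, with multiplicative
# array-specific technical effects that normalization should remove.
expr = rng.lognormal(mean=5.0, sigma=1.0, size=(200, 4))
expr *= np.array([1.0, 1.6, 0.7, 1.3])

def quantile_normalize(x):
    """Force every column to share the same empirical distribution."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)   # within-column ranks
    mean_sorted = np.sort(x, axis=0).mean(axis=1)       # reference quantiles
    return mean_sorted[ranks]

norm = quantile_normalize(expr)
```

    After normalization every column holds the same multiset of values, so column means agree exactly; the study's point is to quantify what this aggressive equalization costs when differential expression is strong.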

  20. Continuous Flow Polymer Synthesis toward Reproducible Large-Scale Production for Efficient Bulk Heterojunction Organic Solar Cells.

    PubMed

    Pirotte, Geert; Kesters, Jurgen; Verstappen, Pieter; Govaerts, Sanne; Manca, Jean; Lutsen, Laurence; Vanderzande, Dirk; Maes, Wouter

    2015-10-12

    Organic photovoltaics (OPV) have attracted great interest as a solar cell technology with appealing mechanical, aesthetical, and economies-of-scale features. To drive OPV toward economic viability, low-cost, large-scale module production has to be realized in combination with increased top-quality material availability and minimal batch-to-batch variation. To this extent, continuous flow chemistry can serve as a powerful tool. In this contribution, a flow protocol is optimized for the high performance benzodithiophene-thienopyrroledione copolymer PBDTTPD and the material quality is probed through systematic solar-cell evaluation. A stepwise approach is adopted to turn the batch process into a reproducible and scalable continuous flow procedure. Solar cell devices fabricated using the obtained polymer batches deliver an average power conversion efficiency of 7.2 %. Upon incorporation of an ionic polythiophene-based cathodic interlayer, the photovoltaic performance could be enhanced to a maximum efficiency of 9.1 %. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. A Scale for Evaluating Practicum Students in Counseling and Supervision

    ERIC Educational Resources Information Center

    Myrick, Robert D.; Kelly, F. Donald, Jr.

    1971-01-01

    This article presents an instrument, the Counselor Evaluation Rating Scale, which can be used as an aid in the systematic evaluation of a student counselor in a supervised counseling experience. Development of the CERS and its reliability are discussed. (Author)

  2. Spatially distributed potential evapotranspiration modeling and climate projections.

    PubMed

    Gharbia, Salem S; Smullen, Trevor; Gill, Laurence; Johnston, Paul; Pilla, Francesco

    2018-08-15

    Evapotranspiration integrates energy and mass transfer between the Earth's surface and atmosphere and is the most active mechanism linking the atmosphere, hydrosphere, lithosphere and biosphere. This study focuses on fine-resolution modeling and projection of spatially distributed potential evapotranspiration at the large catchment scale in response to climate change. Six potential evapotranspiration algorithms, systematically selected on the basis of structured criteria and data availability, were applied and then validated against long-term mean monthly data for the Shannon River catchment with a 50 m² cell size. The best-validated algorithm was then applied to evaluate the possible effect of future climate change on potential evapotranspiration rates. Spatially distributed potential evapotranspiration projections were modeled from climate change projections from multi-GCM ensembles for three future time intervals (2020, 2050 and 2080) using a range of Representative Concentration Pathways, producing four scenarios for each time interval. Finally, seasonal results were compared with baseline results to evaluate the impact of climate change on potential evapotranspiration and therefore on the catchment's dynamic water balance. The results present evidence that the modeled climate change scenarios would have a significant impact on future potential evapotranspiration rates. All the simulated scenarios predicted an increase in potential evapotranspiration for each modeled future time interval, which would significantly affect the dynamic catchment water balance. This study addresses a gap in the literature by using GIS-based algorithms to model fine-scale spatially distributed potential evapotranspiration over large catchment systems based on climatological observations and simulations in different climatological zones. Providing fine-scale potential evapotranspiration data is crucial for assessing the dynamic catchment water balance and setting up management scenarios for water abstractions. This study illustrates a transferable systematic method for designing GIS-based algorithms to simulate spatially distributed potential evapotranspiration over large catchment systems. Copyright © 2018 Elsevier B.V. All rights reserved.
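    One simple member of the temperature-based PET family such studies screen is Thornthwaite (1948), which applies cell by cell to a temperature raster. A sketch on a synthetic monthly grid (unadjusted for day length; the grid values are made up, not Shannon catchment data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Monthly mean temperatures (degrees C) on a small raster: 12 months x 20 x 20
# cells, with a sinusoidal seasonal cycle plus spatial noise.
months = np.arange(12)
seasonal = 9.5 + 5.0 * np.sin(2.0 * np.pi * (months - 3) / 12.0)
temp = seasonal[:, None, None] + rng.normal(0.0, 0.5, size=(12, 20, 20))
temp = np.clip(temp, 0.0, None)          # Thornthwaite is defined for T >= 0

# Thornthwaite (1948): annual heat index I per cell, empirical exponent a,
# then monthly PET in mm/month for every cell.
heat_index = np.sum((temp / 5.0) ** 1.514, axis=0)
a = (6.75e-7 * heat_index ** 3 - 7.71e-5 * heat_index ** 2
     + 1.792e-2 * heat_index + 0.49239)
pet = 16.0 * (10.0 * temp / heat_index) ** a
```

    The same per-cell pattern extends to the other algorithm families compared in the study; only the per-cell formula and its input rasters change.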

  3. Forecasting eruption size: what we know, what we don't know

    NASA Astrophysics Data System (ADS)

    Papale, Paolo

    2017-04-01

    Any eruption forecast includes an evaluation of the expected size of the forthcoming eruption, usually expressed as the probability associated with given size classes. Such an evaluation is mostly based on the previous volcanic history of the specific volcano, or it refers to a broader class of volcanoes constituting "analogues" of the one under consideration. In any case, using knowledge from past eruptions requires considering the completeness of the reference catalogue and, most importantly, the existence of systematic biases in the catalogue that may affect probability estimates and translate into biased volcanic hazard forecasts. An analysis of existing catalogues, with particular reference to the catalogue of the Smithsonian Global Volcanism Program, suggests that systematic biases largely dominate at global, regional and local scales: volcanic histories reconstructed at individual volcanoes, often used as a reference for volcanic hazard forecasts, are the result of systematic loss of information with time and poor sample representativeness. This situation requires the use of techniques to complete existing catalogues, as well as careful consideration of the uncertainties deriving from inadequate knowledge and model-dependent data elaboration. A reconstructed global eruption size distribution, obtained by merging information from different existing catalogues, shows a mode in the VEI 1-2 range, <0.1% incidence of eruptions of VEI 7 or larger, and substantial uncertainties associated with individual VEI frequencies. Even larger uncertainties are expected when such distributions are applied to individual volcanoes or classes of analogue volcanoes, suggesting large to very large uncertainties in volcanic hazard forecasts at virtually any individual volcano worldwide.
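    The completeness effect described above is easy to demonstrate: if the chance that an eruption survives in the record decays with age, and decays faster for small eruptions, then naive frequencies from the full catalogue overweight large events relative to a recent, nearly complete window. A synthetic illustration (all probabilities and decay rates invented):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20000

# Synthetic "true" eruption history over two millennia: VEI distribution
# peaks at VEI 1-2, with rare VEI 7 events (probabilities invented).
year = rng.integers(0, 2000, size=n)
true_p = np.array([0.15, 0.30, 0.30, 0.15, 0.07, 0.02, 0.008, 0.002])
vei = rng.choice(8, size=n, p=true_p)

# Recording probability decays with age, more slowly for big eruptions,
# so small events are systematically lost from the older record.
memory = 150.0 * (1 + vei)                  # e-folding "memory", years
recorded = rng.random(n) < np.exp(-(2000 - year) / memory)

# Naive frequencies from the whole catalogue vs. frequencies restricted
# to the recent, nearly complete window.
naive = np.bincount(vei[recorded], minlength=8) / recorded.sum()
recent = vei[recorded & (year >= 1900)]
complete = np.bincount(recent, minlength=8) / len(recent)
```

    The naive estimate inflates the apparent share of large eruptions, while the recent window stays close to the true distribution, which is why catalogue-completion techniques are needed before size probabilities are quoted.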

  4. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    NASA Astrophysics Data System (ADS)

    Steiakakis, Chrysanthos; Agioutantis, Zacharias; Apostolou, Evangelia; Papavgeri, Georgia; Tripolitsiotis, Achilles

    2016-01-01

    The geotechnical challenges of safe slope design in large-scale surface mining operations are enormous. Sometimes a single degree of slope inclination can significantly reduce the overburden-to-ore ratio and therefore dramatically improve the economics of the operation, while large-scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to operate such pits safely. Appropriate data management, processing and storage are critical to ensure timely and informed decisions. This paper presents an integrated data management system which was developed over a number of years, along with the advantages it demonstrated in a specific application. The presented case study illustrates how the high production slopes of a mine exceeding depths of 100-120 m were successfully mined with an average displacement rate of 10-20 mm/day, approaching a slow-to-moderate landslide velocity. Monitoring data of the past four years are included in the database and can be analyzed to produce valuable results. Time-series correlations of movements, precipitation records, etc. are evaluated and presented in this case study. The results can be used to successfully manage mine operations and ensure the safety of the mine and the workforce.
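
    The time-series correlations mentioned above can be sketched with a plain Pearson coefficient. The ten-day precipitation and displacement series below are hypothetical, not the mine's monitoring records:

```python
import math

# Illustrative daily series (invented values, not the mine's data):
precip_mm = [0, 5, 12, 30, 22, 8, 0, 2, 15, 25]        # daily precipitation
displ_mm = [10, 11, 14, 19, 18, 13, 11, 11, 15, 18]    # slope displacement rate

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(precip_mm, displ_mm)
```

In practice, displacement usually lags rainfall, so a production system would also scan lagged correlations (shifting one series by k days) rather than rely on the same-day coefficient alone.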

  5. Are large-scale flow experiments informing the science and management of freshwater ecosystems?

    USGS Publications Warehouse

    Olden, Julian D.; Konrad, Christopher P.; Melis, Theodore S.; Kennard, Mark J.; Freeman, Mary C.; Mims, Meryl C.; Bray, Erin N.; Gido, Keith B.; Hemphill, Nina P.; Lytle, David A.; McMullen, Laura E.; Pyron, Mark; Robinson, Christopher T.; Schmidt, John C.; Williams, John G.

    2013-01-01

    Greater scientific knowledge, changing societal values, and legislative mandates have emphasized the importance of implementing large-scale flow experiments (FEs) downstream of dams. We provide the first global assessment of FEs to evaluate their success in advancing science and informing management decisions. Systematic review of 113 FEs across 20 countries revealed that clear articulation of experimental objectives, while not universally practiced, was crucial for achieving management outcomes and changing dam-operating policies. Furthermore, changes to dam operations were three times less likely when FEs were conducted primarily for scientific purposes. Despite the recognized importance of riverine flow regimes, four-fifths of FEs involved only discrete flow events. Over three-quarters of FEs documented both abiotic and biotic outcomes, but only one-third examined multiple taxonomic responses, thus limiting how FE results can inform holistic dam management. Future FEs will present new opportunities to advance scientifically credible water policies.

  6. Progressive Mid-latitude Afforestation: Local and Remote Climate Impacts in the Framework of Two Coupled Earth System Models

    NASA Astrophysics Data System (ADS)

    Lague, Marysa

    Vegetation influences the atmosphere in complex and non-linear ways, such that large-scale changes in vegetation cover can drive changes in climate on both local and global scales. Large-scale land surface changes have been shown to introduce excess energy into one hemisphere, causing a shift in atmospheric circulation on a global scale. However, past work has not quantified how the climate response scales with the area of vegetation change. Here, we systematically evaluate the response of climate to linearly increasing the area of forest cover over the northern mid-latitudes. We show that the magnitude of afforestation of the northern mid-latitudes determines the climate response in a non-linear fashion, and identify a threshold in vegetation-induced cloud feedbacks, a concept not previously addressed by large-scale vegetation manipulation experiments. Small increases in tree cover drive compensating cloud feedbacks, while latent heat fluxes reach a threshold after sufficiently large increases in tree cover, causing the troposphere to warm and dry and subsequently reducing cloud cover. Increased absorption of solar radiation at the surface is driven by both surface albedo changes and cloud feedbacks. We identify how vegetation-induced changes in cloud cover feed back on the global energy balance. We also show how atmospheric cross-equatorial energy transport changes as the area of afforestation is incrementally increased, a relationship which has not previously been demonstrated. This work demonstrates that while some climate effects of large-scale mid-latitude afforestation (such as energy transport) scale roughly linearly across a wide range of afforestation areas, others (such as the local partitioning of the surface energy budget) are non-linear and sensitive to the particular magnitude of mid-latitude forcing. Our results highlight the importance of considering both local and remote climate responses to large-scale vegetation change, and explore the scaling relationship between changes in vegetation cover and the resulting climate impacts.

  7. Measurement tools for assessment of older age bipolar disorder: A systematic review of the recent global literature.

    PubMed

    Rej, Soham; Quayle, William; Forester, Brent P; Dols, Annemiek; Gatchel, Jennifer; Chen, Peijun; Gough, Sarah; Fox, Rebecca; Sajatovic, Martha; Strejilevich, Sergio A; Eyler, Lisa T

    2018-06-01

    More than 50% of people with bipolar disorder will be age 60 years or older by 2030. There is a need for more data to guide assessment and treatment in older age bipolar disorder (OABD); however, interpretation of findings from small, single-site studies may not be generalizable and there are few large trials. As a step in the direction of coordinated large-scale OABD data collection, it is critical to identify which measurements are currently used and identify potential gaps in domains typically assessed. An international group of OABD experts performed a systematic literature review to identify studies examining OABD in the past 6 years. Relevant articles were assessed to categorize the types of clinical, cognitive, biomarker, and neuroimaging OABD tools routinely used in OABD studies. A total of 53 papers were identified, with a broad range of assessments. Most studies evaluated demographic and clinical domains, with fewer studies assessing cognition. There are relatively few biomarker and neuroimaging data, and data collection methods were less comprehensively covered. Assessment tools used in the recent OABD literature may help to identify both a minimum and a comprehensive dataset that should be evaluated in OABD. Our review also highlights gaps where key clinical outcomes have not been routinely assessed. Biomarker and neuroimaging assessment could be further developed and standardized. Clinical data could be combined with neuroimaging, genetic, and other biomarkers in large-scale coordinated data collection to further improve our understanding of OABD phenomenology and biology, thereby contributing to research that advances care. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. The Job Satisfaction of Finnish Nursing Staff: The Development of a Job Satisfaction Scale and Survey Results

    PubMed Central

    Kvist, Tarja; Mäntynen, Raija; Partanen, Pirjo; Turunen, Hannele; Miettinen, Merja; Vehviläinen-Julkunen, Katri

    2012-01-01

    This paper describes the development of the Kuopio University Hospital Job Satisfaction Scale (KUHJSS) and the results of the survey. The scale was developed through a systematic literature review, and its validity and reliability were assessed using several psychometric properties, including expert evaluation (n = 5), a pilot survey (n = 172), and exploratory factor analysis. The final version of the KUHJSS included 37 items. A large-sample psychometric evaluation was then conducted with nursing staff (n = 2708). The exploratory factor analysis revealed seven factors with modest internal consistency (0.64–0.92). The staff reported relatively high job satisfaction. The greatest satisfaction was derived from motivating factors associated with the work; the least, from the job's demands. Respondents who considered their working units to provide an excellent quality of care reported the highest job satisfaction in every subarea (P < .0001). The KUHJSS proved to be a reliable and valid tool for measuring job satisfaction in hospital care. PMID:23133750
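
    Internal-consistency figures of the kind quoted above (0.64–0.92) are commonly computed as Cronbach's alpha. A minimal sketch with invented Likert responses, not KUHJSS data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha. items: one list of scores per item, all over the
    same respondents, in the same order."""
    k = len(items)                     # number of items
    n = len(items[0])                  # number of respondents
    def var(xs):                       # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical responses of 5 respondents to 3 Likert items (1-5):
items = [[4, 5, 3, 4, 2],
         [4, 4, 3, 5, 2],
         [3, 5, 4, 4, 1]]
alpha = cronbach_alpha(items)
```

Alpha near 1 indicates that the items move together (consistent subscale); values below roughly 0.7 are usually read as weak internal consistency.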

  9. Multiscale solvers and systematic upscaling in computational physics

    NASA Astrophysics Data System (ADS)

    Brandt, A.

    2005-07-01

    Multiscale algorithms can overcome the scale-born bottlenecks that plague most computations in physics. These algorithms employ separate processing at each scale of the physical space, combined with interscale iterative interactions, in ways which use finer scales very sparingly. Having been developed first and well known as multigrid solvers for partial differential equations, highly efficient multiscale techniques have more recently been developed for many other types of computational tasks, including: inverse PDE problems; highly indefinite (e.g., standing wave) equations; Dirac equations in disordered gauge fields; fast computation and updating of large determinants (as needed in QCD); fast integral transforms; integral equations; astrophysics; molecular dynamics of macromolecules and fluids; many-atom electronic structures; global and discrete-state optimization; practical graph problems; image segmentation and recognition; tomography (medical imaging); fast Monte-Carlo sampling in statistical physics; and general, systematic methods of upscaling (accurate numerical derivation of large-scale equations from microscopic laws).
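
    The core multigrid idea the abstract summarizes, smoothing high-frequency error on the fine grid and correcting the smooth remainder on a coarser grid, can be sketched for the simplest case, a 1-D Poisson equation. This is an illustrative two-grid cycle, not code from the paper:

```python
# Two-grid cycle for -u'' = f on (0, 1), zero Dirichlet boundaries.

def apply_A(u, h):
    """3-point Laplacian applied to interior values (boundaries are 0)."""
    n = len(u)
    return [(2.0 * u[i]
             - (u[i - 1] if i > 0 else 0.0)
             - (u[i + 1] if i < n - 1 else 0.0)) / h**2 for i in range(n)]

def weighted_jacobi(u, f, h, sweeps=3, omega=2.0 / 3.0):
    """Smoother: damps high-frequency error components cheaply."""
    for _ in range(sweeps):
        Au = apply_A(u, h)
        u = [u[i] + omega * (h**2 / 2.0) * (f[i] - Au[i]) for i in range(len(u))]
    return u

def restrict(r):
    """Full weighting: fine-grid residual -> coarse grid (fine n = 2*nc + 1)."""
    return [0.25 * r[2*i] + 0.5 * r[2*i + 1] + 0.25 * r[2*i + 2]
            for i in range((len(r) - 1) // 2)]

def prolong(e, n_fine):
    """Linear interpolation: coarse correction -> fine grid."""
    out = [0.0] * n_fine
    for i, val in enumerate(e):
        out[2*i + 1] = val
    for i in range(0, n_fine, 2):
        left = out[i - 1] if i > 0 else 0.0
        right = out[i + 1] if i < n_fine - 1 else 0.0
        out[i] = 0.5 * (left + right)
    return out

def solve_tridiag(f, h):
    """Thomas algorithm: exact solve of the small coarse-grid system."""
    n = len(f)
    a, b, c = -1.0 / h**2, 2.0 / h**2, -1.0 / h**2
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c / b, f[0] / b
    for i in range(1, n):
        m = b - a * cp[i - 1]
        cp[i], dp[i] = c / m, (f[i] - a * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def two_grid_cycle(u, f, h):
    u = weighted_jacobi(u, f, h)                  # pre-smooth fine-scale error
    r = [f[i] - v for i, v in enumerate(apply_A(u, h))]
    e_coarse = solve_tridiag(restrict(r), 2 * h)  # correct smooth error coarsely
    corr = prolong(e_coarse, len(u))
    u = [u[i] + corr[i] for i in range(len(u))]
    return weighted_jacobi(u, f, h)               # post-smooth

N = 63                     # fine interior points; h = 1/64
h = 1.0 / (N + 1)
f = [1.0] * N
u = [0.0] * N
for _ in range(5):
    u = two_grid_cycle(u, f, h)
```

A full multigrid solver replaces the exact coarse solve with a recursive cycle over a hierarchy of grids, which is what makes the cost scale linearly with the number of unknowns.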

  10. Cost effects of hospital mergers in Portugal.

    PubMed

    Azevedo, Helda; Mateus, Céu

    2014-12-01

    The Portuguese hospital sector has been restructured by wide-ranging hospital mergers, following a conviction among policy makers that bigger hospitals lead to lower average costs. Since the effects of mergers have not been systematically evaluated, the purpose of this article is to contribute to this area of knowledge by assessing potential economies of scale and comparing them with the cost savings actually realized after mergers. Considering the period 2003-2009, we estimate a translog cost function to examine economies of scale in the years preceding restructuring. Additionally, we use the difference-in-differences approach to evaluate the mergers into hospital centres (HC) that occurred between 2004 and 2007, comparing the years after and before the mergers. Our findings suggest that economies of scale were present in the pre-merger configuration, with an optimum hospital size of around 230 beds. However, the mergers of two or more hospitals led to statistically significant post-merger cost increases of about 8%. This result indicates that some HC became too large to exploit economies of scale, and suggests the difficulty of achieving efficiencies through combining operations and service specialization.
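
    The difference-in-differences comparison used here can be illustrated in a few lines. The cost figures below are invented for illustration (chosen so the merged group rises 8 units more than the controls) and are not the Portuguese hospital data:

```python
# Hypothetical average cost per admission, index values:
merged_pre, merged_post = [100, 102, 98], [112, 110, 114]    # merged hospitals
control_pre, control_post = [100, 99, 101], [104, 103, 105]  # non-merged hospitals

def mean(xs):
    return sum(xs) / len(xs)

# DiD: change in the treated group minus change in the control group.
# The control-group change absorbs the common time trend (e.g. sector-wide
# cost inflation), isolating the merger effect under the parallel-trends
# assumption.
did = (mean(merged_post) - mean(merged_pre)) - (mean(control_post) - mean(control_pre))
```

In the paper's setting this estimate would come from a regression with hospital and year controls, but the arithmetic skeleton is the same.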

  11. Numerical Large Deviation Analysis of the Eigenstate Thermalization Hypothesis

    NASA Astrophysics Data System (ADS)

    Yoshizawa, Toru; Iyoda, Eiki; Sagawa, Takahiro

    2018-05-01

    A plausible mechanism of thermalization in isolated quantum systems is based on the strong version of the eigenstate thermalization hypothesis (ETH), which states that all the energy eigenstates in the microcanonical energy shell have thermal properties. We numerically investigate the ETH by focusing on the large deviation property, which directly evaluates the ratio of athermal energy eigenstates in the energy shell. As a consequence, we have systematically confirmed that the strong ETH is indeed true even for near-integrable systems. Furthermore, we found that the finite-size scaling of the ratio of athermal eigenstates is a double exponential for nonintegrable systems. Our result illuminates the universal behavior of quantum chaos, and suggests that a large deviation analysis would serve as a powerful method to investigate thermalization in the presence of the large finite-size effect.

  12. A Systematic Review and Psychometric Evaluation of Adaptive Behavior Scales and Recommendations for Practice

    ERIC Educational Resources Information Center

    Floyd, Randy G.; Shands, Elizabeth I.; Alfonso, Vincent C.; Phillips, Jessica F.; Autry, Beth K.; Mosteller, Jessica A.; Skinner, Mary; Irby, Sarah

    2015-01-01

    Adaptive behavior scales are vital in assessing children and adolescents who experience a range of disabling conditions in school settings. This article presents the results of an evaluation of the design characteristics, norming, scale characteristics, reliability and validity evidence, and bias identification studies supporting 14…

  13. Monitoring Forest Condition in Europe: Impacts of Nitrogen and Sulfur Depositions on Forest Ecosystems

    Treesearch

    M. Lorenz; G. Becher; V. Mues; E. Ulrich

    2006-01-01

    Forest condition in Europe has been monitored over 19 years jointly by the United Nations Economic Commission for Europe (UNECE) and the European Union (EU). Large-scale variations of forest condition over space and time in relation to natural and anthropogenic factors are assessed on about 6,000 plots systematically spread across Europe. This large-scale monitoring...

  14. Assessment of Somatization and Medically Unexplained Symptoms in Later Life

    PubMed Central

    van Driel, T. J. W.; Hilderink, P. H.; Hanssen, D. J. C.; de Boer, P.; Rosmalen, J. G. M.; Oude Voshaar, R. C.

    2017-01-01

    The assessment of medically unexplained symptoms and “somatic symptom disorders” in older adults is challenging due to somatic multimorbidity, which threatens the validity of somatization questionnaires. In a systematic review study, the Patient Health Questionnaire–15 (PHQ-15) and the somatization subscale of the Symptom Checklist 90-item version (SCL-90 SOM) were recommended, out of 40 questionnaires, for usage in large-scale studies. While both scales measure physical symptoms which in younger persons often reflect unexplained symptoms, in older persons these symptoms may originate from somatic diseases. Using empirical data, we show that among older patients the PHQ-15 and SCL-90 SOM correlate with proxies of somatization as well as with somatic disease burden. Updating the previous systematic review revealed six additional questionnaires. Cross-validation studies are needed, as none of the 46 identified scales met the criteria of suitability for an older population. Nonetheless, specific recommendations can be made for studying older persons, namely the SCL-90 SOM and PHQ-15 for population-based studies, the Freiburg Complaint List and the somatization subscale of the Brief Symptom Inventory 53-item version for studies in primary care, and finally the Schedule for Evaluating Persistent Symptoms and the Somatic Symptom Experiences Questionnaire for monitoring treatment studies. PMID:28745072

  15. Obtaining systematic teacher reports of disruptive behavior disorders utilizing DSM-IV.

    PubMed

    Wolraich, M L; Feurer, I D; Hannah, J N; Baumgaertel, A; Pinnock, T Y

    1998-04-01

    This study examines the psychometric properties of the Vanderbilt AD/HD Diagnostic Teacher Rating Scale (VADTRS) and provides preliminary normative data from a large, geographically defined population. The VADTRS consists of the complete list of DSM-IV AD/HD symptoms, a screen for other disruptive behavior disorders, anxiety and depression, and ratings of academic and classroom behavior performance. Teachers in one suburban county completed the scale for their students during 2 consecutive years. Statistical methods included (a) exploratory and confirmatory latent variable analyses of item data, (b) evaluation of the internal consistency of the latent dimensions, (c) evaluation of latent structure concordance between school year samples, and (d) preliminary evaluation of criterion-related validity. The instrument comprises four behavioral dimensions and two performance dimensions. The behavioral dimensions were concordant between school years and were consistent with a priori DSM-IV diagnostic criteria. Correlations between latent dimensions and relevant, known disorders or problems varied from .25 to .66.

  16. Framework for rapid assessment and adoption of new vector control tools.

    PubMed

    Vontas, John; Moore, Sarah; Kleinschmidt, Immo; Ranson, Hilary; Lindsay, Steve; Lengeler, Christian; Hamon, Nicholas; McLean, Tom; Hemingway, Janet

    2014-04-01

    Evidence-informed health policy making is reliant on systematic access to, and appraisal of, the best available research evidence. This review suggests a strategy to improve the speed at which evidence is gathered on new vector control tools (VCTs) using a framework based on measurements of the vectorial capacity of an insect population to transmit disease. We explore links between indicators of VCT efficacy measurable in small-scale experiments that are relevant to entomological and epidemiological parameters measurable only in large-scale proof-of-concept randomised control trials (RCTs). We hypothesise that once RCTs establish links between entomological and epidemiological indicators then rapid evaluation of new products within the same product category may be conducted through smaller scale experiments without repetition of lengthy and expensive RCTs. Copyright © 2014 Elsevier Ltd. All rights reserved.
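
    The vectorial-capacity framework this review builds on has a classical Ross-Macdonald form, C = m a² b pⁿ / (−ln p). A small sketch with hypothetical parameter values shows how a vector control tool that reduces daily mosquito survival p propagates to a large reduction in capacity:

```python
import math

def vectorial_capacity(m, a, p, n, b=1.0):
    """Classical Ross-Macdonald vectorial capacity.
    m: mosquitoes per human; a: daily human-biting rate;
    p: daily survival probability; n: extrinsic incubation period (days);
    b: vector competence. 1/(-ln p) is the mean mosquito lifespan in days."""
    return m * a**2 * b * p**n / (-math.log(p))

# Illustrative values (hypothetical, not drawn from the review):
baseline = vectorial_capacity(m=10, a=0.3, p=0.9, n=10)
with_tool = vectorial_capacity(m=10, a=0.3, p=0.8, n=10)  # tool lowers survival
reduction = 1 - with_tool / baseline
```

Because p enters both as pⁿ and through the lifespan term, even a modest drop in daily survival produces a disproportionately large drop in capacity, which is why survival-reducing tools are attractive endpoints for the small-scale experiments discussed above.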

  17. Assessing Hydrological and Energy Budgets in Amazonia through Regional Downscaling, and Comparisons with Global Reanalysis Products

    NASA Astrophysics Data System (ADS)

    Nunes, A.; Ivanov, V. Y.

    2014-12-01

    Although current global reanalyses provide reasonably accurate large-scale features of the atmosphere, systematic errors are still found in the hydrological and energy budgets of such products. In the tropics, precipitation is particularly challenging to model, a difficulty compounded by the scarcity of hydrometeorological datasets in the region. With the goal of producing downscaled analyses that are appropriate for climate assessment at regional scales, a regional spectral model was used with a combination of precipitation assimilation and scale-selective bias correction. The latter is similar to the spectral nudging technique, which prevents the departure of the regional model's internal states from the large-scale forcing. The target area in this study is the Amazon region, where large errors are detected in reanalysis precipitation. To generate the downscaled analysis, the regional climate model used the NCEP/DOE R2 global reanalysis as the initial and lateral boundary conditions, and assimilated NOAA's Climate Prediction Center (CPC) MORPHed precipitation (CMORPH), available at 0.25-degree resolution every 3 hours. The regional model's precipitation was successfully brought closer to the observations, in comparison to the NCEP global reanalysis products, as a result of the impact of the precipitation assimilation scheme on the cumulus-convection parameterization and of improved boundary forcing achieved through a new version of scale-selective bias correction. Water and energy budget terms were also evaluated against global reanalyses and other datasets.

  18. Intercomparison of methods of coupling between convection and large-scale circulation: 2. Comparison over nonuniform surface conditions

    DOE PAGES

    Daleu, C. L.; Plant, R. S.; Woolnough, S. J.; ...

    2016-03-18

    As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. We investigated the sensitivity of each model to changes in SST, given a fixed reference state. We performed a systematic comparison of the WTG and DGW methods in different models, and a systematic comparison of the behavior of those models using the WTG method and the DGW method. The sensitivity to the SST depends on both the large-scale parameterization method and the choice of the cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or DGW method show an increase of precipitation with SST, while SCMs show sensitivities which are not always monotonic. CRMs using either the WTG or DGW method show a similar relationship between mean precipitation rate and column-relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles which are smoother and less top-heavy compared to those produced by the WTG simulations. Lastly, these large-scale parameterization methods provide a useful tool to identify the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.

  19. A potential role of anti-poverty programs in health promotion

    PubMed Central

    Silverman, Kenneth; Holtyn, August F.; Jarvis, Brantley

    2016-01-01

    Poverty is one of the most pervasive risk factors underlying poor health, but is rarely targeted to improve health. Research on the effects of anti-poverty interventions on health has been limited, at least in part because funding for that research has been limited. Anti-poverty programs have been applied on a large scale, frequently by governments, but without systematic development and cumulative programmatic experimental studies. Anti-poverty programs that produce lasting effects on poverty have not been developed. Before evaluating the effect of anti-poverty programs on health, programs must be developed that can reduce poverty consistently. Anti-poverty programs require systematic development and cumulative programmatic scientific evaluation. Research on the therapeutic workplace could provide a model for that research and an adaptation of the therapeutic workplace could serve as a foundation of a comprehensive anti-poverty program. Once effective anti-poverty programs are developed, future research could determine if those programs improve health in addition to increasing income. The potential personal, health and economic benefits of effective anti-poverty programs could be substantial, and could justify the major efforts and expenses that would be required to support systematic research to develop such programs. PMID:27235603

  20. A potential role of anti-poverty programs in health promotion.

    PubMed

    Silverman, Kenneth; Holtyn, August F; Jarvis, Brantley P

    2016-11-01

    Poverty is one of the most pervasive risk factors underlying poor health, but is rarely targeted to improve health. Research on the effects of anti-poverty interventions on health has been limited, at least in part because funding for that research has been limited. Anti-poverty programs have been applied on a large scale, frequently by governments, but without systematic development and cumulative programmatic experimental studies. Anti-poverty programs that produce lasting effects on poverty have not been developed. Before evaluating the effect of anti-poverty programs on health, programs must be developed that can reduce poverty consistently. Anti-poverty programs require systematic development and cumulative programmatic scientific evaluation. Research on the therapeutic workplace could provide a model for that research and an adaptation of the therapeutic workplace could serve as a foundation of a comprehensive anti-poverty program. Once effective anti-poverty programs are developed, future research could determine if those programs improve health in addition to increasing income. The potential personal, health and economic benefits of effective anti-poverty programs could be substantial, and could justify the major efforts and expenses that would be required to support systematic research to develop such programs. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. A systematic review of wheelchair skills tests for manual wheelchair users with a spinal cord injury: towards a standardized outcome measure.

    PubMed

    Fliess-Douer, Osnat; Vanlandewijck, Yves C; Lubel Manor, Galia; Van Der Woude, Lucas H V

    2010-10-01

    To review, analyse, evaluate and critically appraise the wheelchair skill tests available in the international literature, and to determine the need for a standardized measurement tool of manual wheeled mobility in those with spinal cord injury. A systematic review of the literature (databases: PubMed, Web of Science and Cochrane Library, 1970 to December 2009). Hand rim wheelchair users, mainly those with spinal cord injury. The studies' content and methodology were analysed qualitatively. Study quality was assessed using the scale of Gardner and Altman. Thirteen studies met the inclusion criteria and were critically reviewed. The 13 studies covered 11 tests, which involved 14 different skills. These 14 skills were categorized into: wheelchair manoeuvring and basic daily living skills; obstacle-negotiating skills; wheelie tasks; and transfers. The Wheelchair Skills Test version 2.4 (WST-2.4) and Wheelchair Circuit tests scored best on the Gardner and Altman scale, while the Obstacle Course Assessment of Wheelchair User Performances (OCAWUP) was found to be the most relevant to daily wheelchair needs. The different tests used different measurement scales, varying from binary to ordinal and continuous. Comparison of outcomes between tests was not possible because of differences in the skills assessed, measurement scales, environment and equipment selected for each test. A lack of information regarding protocols, as well as differences in terminology, was also detected. This systematic review revealed large inconsistencies among the currently available wheelchair skill tests. This makes it difficult to compare study results and to create norms and standards for wheelchair skill performance.

  2. Conceptualization and Assessment of Hypersexual Disorder: A Systematic Review of the Literature.

    PubMed

    Montgomery-Graham, Stephanie

    2017-04-01

    Despite the rejection of hypersexual disorder (HD) as a new diagnosis in the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), clinical and research interest in HD continues. To systematically review the existing scientific literature on the conceptualization and assessment of HD and out-of-control sexual behavior. Studies were identified from PsychInfo, PubMed, JSTOR, Google Scholar, and Scholar's Portal using an exhaustive list of key terms. Of 299 total articles identified and screened, 252 were excluded, and 47 are included in this review. To review two categories of articles: HD conceptualization and HD psychometric assessment. First, results of the review of theoretical conceptualizations of HD reflected a large proportion of the peer-reviewed literature devoted to discussing conceptualizations of HD without reaching consensus. Second, results of the review of HD psychometric assessments were analyzed using Hunsley and Mash's (2008) criteria to evaluate psychometric adequacy of evidence-based assessment measurements. The six most researched measurements of HD were evaluated, including the Hypersexual Disorder Screening Inventory, the Hypersexual Behavior Inventory, the Sexual Compulsivity Scale, the Sexual Addiction Screening Test, the Sexual Addiction Screening Test-Revised, and the Compulsive Sexual Behavior Inventory. Psychometric properties of the scales are reviewed, evaluated, and discussed. The Hypersexual Disorder Screening Inventory, the measurement proposed for the clinical screening of HD by the DSM-5 workgroup, currently has the strongest psychometric support. Future research and clinical directions are discussed in light of findings after the literature review and synthesis. Montgomery-Graham S. Conceptualization and Assessment of Hypersexual Disorder: A Systematic Review of the Literature. Sex Med Rev 2017;5:146-162. Copyright © 2016 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  3. Based on Real Time Remote Health Monitoring Systems: A New Approach for Prioritization "Large Scales Data" Patients with Chronic Heart Diseases Using Body Sensors and Communication Technology.

    PubMed

    Kalid, Naser; Zaidan, A A; Zaidan, B B; Salman, Omar H; Hashim, M; Albahri, O S; Albahri, A S

    2018-03-02

    This paper presents a new approach to prioritize "Large-scale Data" of patients with chronic heart diseases by using body sensors and communication technology during disasters and peak seasons. An evaluation matrix is used for emergency evaluation and large-scale data scoring of patients with chronic heart diseases in a telemedicine environment. However, one major problem in the emergency evaluation of these patients is establishing a reasonable threshold for patients with the most and least critical conditions. This threshold can be used to detect the highest and lowest priority levels when all the scores of patients are identical during disasters and peak seasons. A practical study was performed on 500 patients with chronic heart diseases and different symptoms, and their emergency levels were evaluated based on four main measurements: electrocardiogram, oxygen saturation sensor, blood pressure monitoring, and a non-sensory measurement tool, namely a text frame. Data alignment was conducted for the raw data and the decision-making matrix by converting each extracted feature into an integer representing the patient's triage level, based on medical guidelines, so that features from different sources could be combined in one platform. The patients were then scored based on a decision matrix by using multi-criteria decision-making techniques, namely the integrated multi-layer analytic hierarchy process (MLAHP) and the technique for order performance by similarity to ideal solution (TOPSIS). For subjective validation, cardiologists were consulted to confirm the ranking results. For objective validation, mean ± standard deviation was computed to check the accuracy of the systematic ranking. This study provides scenarios and checklist benchmarking to evaluate the proposed and existing prioritization methods. Experimental results revealed the following. (1) The integration of TOPSIS and MLAHP effectively and systematically solved the patient triage and prioritization problems. (2) In subjective validation, the first five patients assigned to the doctors were the most urgent cases that required the highest priority, whereas the last five patients were the least urgent cases and were given the lowest priority. In objective validation, scores significantly differed between the groups, indicating that the ranking results were consistent. (3) For the first, second, and third scenarios, the proposed method exhibited an advantage over the benchmark method with percentages of 40%, 60%, and 100%, respectively. In conclusion, patients with the most and least urgent cases received the highest and lowest priority levels, respectively.
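
    The TOPSIS step of such a pipeline, ranking alternatives by closeness to an ideal solution, can be sketched as follows. The decision matrix and weights below are invented for illustration and do not reproduce the paper's MLAHP-derived weights:

```python
import math

# Rows = patients, columns = urgency criteria coded so that higher means
# more urgent (e.g. ECG code, SpO2 deviation, blood pressure, text frame).
matrix = [
    [3, 2, 3, 1],   # patient A
    [1, 1, 2, 1],   # patient B
    [3, 3, 3, 2],   # patient C
]
weights = [0.4, 0.3, 0.2, 0.1]   # hypothetical criterion weights

# 1. Vector-normalize each criterion column, then apply the weights.
ncols = len(matrix[0])
norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
V = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]

# 2. Ideal best and worst per criterion (all benefit-type criteria here).
best = [max(col) for col in zip(*V)]
worst = [min(col) for col in zip(*V)]

# 3. Relative closeness to the ideal; higher score = higher triage priority.
def dist(row, ref):
    return math.sqrt(sum((x - r) ** 2 for x, r in zip(row, ref)))

scores = [dist(v, worst) / (dist(v, worst) + dist(v, best)) for v in V]
ranking = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
```

In the paper's approach the weights come from the AHP layer rather than being fixed by hand, but the ranking mechanics are as above.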

  4. Educational Interventions for Children with ASD: A Systematic Literature Review 2008-2013

    ERIC Educational Resources Information Center

    Bond, Caroline; Symes, Wendy; Hebron, Judith; Humphrey, Neil; Morewood, Gareth; Woods, Kevin

    2016-01-01

    Systematic literature reviews can play a key role in underpinning evidence-based practice. To date, large-scale reviews of interventions for individuals with Autism Spectrum Disorder (ASD) have focused primarily on research quality. To assist practitioners, the current review adopted a broader framework which allowed for greater consideration of…

  5. Functional Independent Scaling Relation for ORR/OER Catalysts

    DOE PAGES

    Christensen, Rune; Hansen, Heine A.; Dickens, Colin F.; ...

    2016-10-11

    A widely used adsorption energy scaling relation between OH* and OOH* intermediates in the oxygen reduction reaction (ORR) and oxygen evolution reaction (OER) has previously been determined using density functional theory and shown to dictate a minimum thermodynamic overpotential for both reactions. Here, we show that the oxygen-oxygen bond in the OOH* intermediate is, however, not well described with the previously used class of exchange-correlation functionals. By quantifying and correcting the systematic error, an improved description of gaseous peroxide species versus experimental data and a reduction in calculational uncertainty is obtained. For adsorbates, we find that the systematic error largely cancels the vdW interaction missing in the original determination of the scaling relation. An improved scaling relation, which is fully independent of the applied exchange-correlation functional, is obtained and found to differ by 0.1 eV from the original. Lastly, this largely confirms that, although obtained with a method suffering from systematic errors, the previously obtained scaling relation is applicable for predictions of catalytic activity.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daleu, C. L.; Plant, R. S.; Woolnough, S. J.

    As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. We investigated the sensitivity of each model to changes in SST, given a fixed reference state. We performed a systematic comparison of the WTG and DGW methods in different models, and a systematic comparison of the behavior of those models using the WTG method and the DGW method. The sensitivity to the SST depends on both the large-scale parameterization method and the choice of the cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or DGW method show an increase of precipitation with SST, while SCMs show sensitivities which are not always monotonic. CRMs using either the WTG or DGW method show a similar relationship between mean precipitation rate and column-relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles which are smoother and less top-heavy compared to those produced by the WTG simulations. Lastly, these large-scale parameterization methods provide a useful tool to identify the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.

  7. An Open-Source Galaxy Redshift Survey Simulator for next-generation Large Scale Structure Surveys

    NASA Astrophysics Data System (ADS)

    Seljak, Uros

    Galaxy redshift surveys produce three-dimensional maps of the galaxy distribution. On large scales these maps trace the underlying matter fluctuations in a relatively simple manner, so that the properties of the primordial fluctuations along with the overall expansion history and growth of perturbations can be extracted. The BAO standard ruler method to measure the expansion history of the universe using galaxy redshift surveys is thought to be robust to observational artifacts and understood theoretically with high precision. These same surveys can offer a host of additional information, including a measurement of the growth rate of large scale structure through redshift space distortions, the possibility of measuring the sum of neutrino masses, tighter constraints on the expansion history through the Alcock-Paczynski effect, and constraints on the scale-dependence and non-Gaussianity of the primordial fluctuations. Extracting this broadband clustering information hinges on both our ability to minimize and subtract observational systematics from the observed galaxy power spectrum, and our ability to model the broadband behavior of the observed galaxy power spectrum with exquisite precision. Rapid development on both fronts is required to capitalize on WFIRST's data set. We propose to develop an open-source computational toolbox that will propel development in both areas by connecting large scale structure modeling and instrument and survey modeling with the statistical inference process. We will use the proposed simulator to both tailor perturbation theory and fully non-linear models of the broadband clustering of WFIRST galaxies and discover novel observables in the non-linear regime that are robust to observational systematics and able to distinguish between a wide range of spatial and dynamic biasing models for the WFIRST galaxy redshift survey sources.
We have demonstrated the utility of this approach in a pilot study of the SDSS-III BOSS galaxies, in which we improved the redshift space distortion growth rate measurement precision by a factor of 2.5 using customized clustering statistics in the non-linear regime that were immunized against observational systematics. We look forward to addressing the unique challenges of modeling and empirically characterizing the WFIRST galaxies and observational systematics.

  8. Multi-scale comparison of source parameter estimation using empirical Green's function approach

    NASA Astrophysics Data System (ADS)

    Chen, X.; Cheng, Y.

    2015-12-01

    Analysis of earthquake source parameters requires correction for path effects, site responses, and instrument responses. The empirical Green's function (EGF) method is one of the most effective ways to remove path effects and station responses, by taking the spectral ratio between a larger and a smaller event. The traditional EGF method requires identifying suitable event pairs and analyzing each event individually. This allows high-quality estimates for strictly selected events; however, the quantity of resolvable source parameters is limited, which complicates the interpretation of spatio-temporal coherency. On the other hand, methods that exploit the redundancy of event-station pairs have been proposed, which use stacking to obtain systematic source parameter estimates for a large number of events at once. This allows large quantities of events to be examined systematically, facilitating analysis of spatio-temporal patterns and scaling relationships. However, it is unclear how much resolution is sacrificed in the process. In addition to the empirical Green's function calculation, the choice of model parameters and fitting methods can also introduce biases. Here, using two regional focused arrays, the OBS array in the Mendocino region and the borehole array in the Salton Sea geothermal field, we compare the results from large-scale stacking analysis, small-scale cluster analysis, and single event-pair analysis with different fitting methods, across these completely different tectonic environments, in order to quantify the consistency and inconsistency in source parameter estimates and the associated problems.

  9. Building work engagement: A systematic review and meta-analysis investigating the effectiveness of work engagement interventions.

    PubMed

    Knight, Caroline; Patterson, Malcolm; Dawson, Jeremy

    2017-07-01

    Low work engagement may contribute towards decreased well-being and work performance. Evaluating, boosting and sustaining work engagement are therefore of interest to many organisations. However, the evidence on which to base interventions has not yet been synthesised. A systematic review with meta-analysis was conducted to assess the evidence for the effectiveness of work engagement interventions. A systematic literature search identified controlled workplace interventions employing a validated measure of work engagement. Most used the Utrecht Work Engagement Scale (UWES). Studies containing the relevant quantitative data underwent random-effects meta-analyses. Results were assessed for homogeneity, systematic sampling error, publication bias and quality. Twenty studies met the inclusion criteria and were categorised into four types of interventions: (i) personal resource building; (ii) job resource building; (iii) leadership training; and (iv) health promotion. The overall effect on work engagement was small but positive (k = 14, Hedges' g = 0.29, 95% CI [0.12, 0.46]). Moderator analyses revealed a significant result for intervention style, with a medium to large effect for group interventions. Heterogeneity between the studies was high, and the success of implementation varied. More studies are needed, and researchers are encouraged to collaborate closely with organisations to design interventions appropriate to individual contexts and settings, and to include evaluations of intervention implementation. © 2016 The Authors. Journal of Organizational Behavior published by John Wiley & Sons, Ltd.
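The random-effects pooling of Hedges' g reported above can be sketched with the standard DerSimonian-Laird estimator (the abstract does not name its estimator, so this is one common choice); the effect sizes and standard errors below are invented for illustration, not the reviewed studies:

```python
import numpy as np

def dersimonian_laird(g, se):
    """Pool Hedges' g effect sizes with a DerSimonian-Laird
    random-effects model; returns pooled g, 95% CI, and tau^2."""
    g, se = np.asarray(g, float), np.asarray(se, float)
    w = 1.0 / se**2                      # fixed-effect (inverse-variance) weights
    g_fixed = np.sum(w * g) / np.sum(w)
    q = np.sum(w * (g - g_fixed) ** 2)   # Cochran's Q (heterogeneity statistic)
    df = g.size - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)        # between-study variance
    w_re = 1.0 / (se**2 + tau2)          # random-effects weights
    g_re = np.sum(w_re * g) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return g_re, (g_re - 1.96 * se_re, g_re + 1.96 * se_re), tau2

# Illustrative effect sizes, not the data of the twenty reviewed studies.
g = [0.10, 0.35, 0.55, 0.20, 0.30]
se = [0.12, 0.10, 0.15, 0.08, 0.11]
pooled, (lo, hi), tau2 = dersimonian_laird(g, se)
print(f"g = {pooled:.2f}, 95% CI [{lo:.2f}, {hi:.2f}], tau^2 = {tau2:.3f}")
```

A nonzero tau² widens the confidence interval relative to a fixed-effect analysis, which is how the high between-study heterogeneity noted in the abstract propagates into the pooled estimate.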

  10. Economic evaluation of vaccines in Canada: A systematic review.

    PubMed

    Chit, Ayman; Lee, Jason K H; Shim, Minsup; Nguyen, Van Hai; Grootendorst, Paul; Wu, Jianhong; Van Exan, Robert; Langley, Joanne M

    2016-05-03

    Economic evaluations should form part of the basis for public health decision making on new vaccine programs. While Canada's national immunization advisory committee does not systematically include economic evaluations in immunization decision making, there is increasing interest in adopting them. We therefore sought to examine the extent and quality of economic evaluations of vaccines in Canada. We conducted a systematic review of economic evaluations of vaccines in Canada to determine and summarize: comprehensiveness across jurisdictions, studied vaccines, funding sources, study designs, research quality, and changes over time. Searches in multiple databases were conducted using the terms "vaccine," "economics" and "Canada." Descriptive data from eligible manuscripts were abstracted and three authors independently evaluated manuscript quality using a 7-point Likert-type scale scoring tool based on criteria from the International Society for Pharmacoeconomics and Outcomes Research (ISPOR). 42/175 articles met the search criteria. Of these, Canada-wide studies were most common (25/42), while provincial studies largely focused on the three populous provinces of Ontario, Quebec and British Columbia. The most common funding source was industry (17/42), followed by government (7/42). 38 studies used mathematical models estimating expected economic benefit while 4 studies examined post-hoc data on established programs. Studies covered 10 diseases, with 28/42 addressing pediatric vaccines. Many studies considered cost-utility (22/42) and the majority of these studies reported favorable economic results (16/22). The mean quality score was 5.9/7 and was consistent over publication date, funding sources, and disease areas. We observed diverse approaches to evaluate vaccine economics in Canada.
Given the increased complexity of economic studies evaluating vaccines and the impact of results on public health practice, Canada needs improved, transparent and consistent processes to review and assess the findings of the economic evaluations of vaccines.
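The cost-utility analyses counted above typically reduce to an incremental cost-effectiveness ratio (ICER): the extra cost of the program divided by the extra quality-adjusted life years (QALYs) it produces. A minimal sketch with invented numbers, not figures from any reviewed study:

```python
# Hypothetical vaccine program vs. no program; all values are illustrative.
cost_program, cost_none = 4_200_000.0, 1_500_000.0   # total costs (CAD)
qaly_program, qaly_none = 12_400.0, 12_150.0         # total QALYs

# ICER = incremental cost per incremental QALY gained.
icer = (cost_program - cost_none) / (qaly_program - qaly_none)
print(f"ICER = {icer:,.0f} CAD per QALY gained")     # 10,800 CAD/QALY

# A commonly cited (but debated) willingness-to-pay benchmark.
threshold = 50_000.0
print("cost-effective at threshold:", icer < threshold)
```

Study quality criteria such as ISPOR's largely concern what feeds this ratio: perspective, discounting, time horizon, and uncertainty analysis around both the cost and QALY inputs.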

  11. Wall Modeled Large Eddy Simulation of Airfoil Trailing Edge Noise

    NASA Astrophysics Data System (ADS)

    Kocheemoolayil, Joseph; Lele, Sanjiva

    2014-11-01

    Large eddy simulation (LES) of airfoil trailing edge noise has largely been restricted to low Reynolds numbers due to prohibitive computational cost. Wall modeled LES (WMLES) is a computationally cheaper alternative that makes full-scale Reynolds numbers relevant to large wind turbines accessible. A systematic investigation of trailing edge noise prediction using WMLES is conducted. Detailed comparisons are made with experimental data. The stress boundary condition from a wall model does not constrain the fluctuating velocity to vanish at the wall. This limitation has profound implications for trailing edge noise prediction. The simulation over-predicts the intensity of fluctuating wall pressure and far-field noise. An improved wall model formulation that minimizes the over-prediction of fluctuating wall pressure is proposed and carefully validated. The flow configurations chosen for the study are from the workshop on benchmark problems for airframe noise computations. The large eddy simulation database is used to examine the adequacy of scaling laws that quantify the dependence of trailing edge noise on Mach number, Reynolds number and angle of attack. Simplifying assumptions invoked in engineering approaches towards predicting trailing edge noise are critically evaluated. We gratefully acknowledge financial support from GE Global Research and thank Cascade Technologies Inc. for providing access to their massively-parallel large eddy simulation framework.

  12. Search for subgrid scale parameterization by projection pursuit regression

    NASA Technical Reports Server (NTRS)

    Meneveau, C.; Lund, T. S.; Moin, Parviz

    1992-01-01

    The dependence of subgrid-scale stresses on variables of the resolved field is studied using direct numerical simulations of isotropic turbulence, homogeneous shear flow, and channel flow. The projection pursuit algorithm, a promising new regression tool for high-dimensional data, is used to systematically search through a large collection of resolved variables, such as components of the strain rate, vorticity, velocity gradients at neighboring grid points, etc. For the case of isotropic turbulence, the search algorithm recovers the linear dependence on the rate of strain (which is necessary to transfer energy to subgrid scales) but is unable to determine any other more complex relationship. For shear flows, however, new systematic relations beyond eddy viscosity are found. For the homogeneous shear flow, the results suggest that products of the mean rotation rate tensor with both the fluctuating strain rate and fluctuating rotation rate tensors are important quantities in parameterizing the subgrid-scale stresses. A model incorporating these terms is proposed. When evaluated with direct numerical simulation data, this model significantly increases the correlation between the modeled and exact stresses, as compared with the Smagorinsky model. In the case of channel flow, the stresses are found to correlate with products of the fluctuating strain and rotation rate tensors. The mean rates of rotation or strain do not appear to be important in this case, and the model determined for homogeneous shear flow does not perform well when tested with channel flow data. Many questions remain about the physical mechanisms underlying these findings, about possible Reynolds number dependence, and, given the low level of correlations, about their impact on modeling. 
Nevertheless, the demonstration of the existence of causal relations between SGS stresses and large-scale characteristics of turbulent shear flows, in addition to those necessary for energy transfer, provides important insight into the relation between scales in turbulent flows.
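The core of projection pursuit regression, searching over projection directions and fitting a ridge function along the most predictive one, can be sketched in a toy single-term form. The random-direction search and the cubic ridge function below are simplifying assumptions for illustration; they are not the algorithm or data of the study:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_single_ridge(X, y, n_dirs=500, deg=3):
    """One term of projection pursuit regression: try random unit
    directions w, fit a polynomial ridge function g to y ~ g(X @ w),
    and keep the direction with the smallest residual."""
    best_mse, best_w, best_coef = np.inf, None, None
    for _ in range(n_dirs):
        w = rng.standard_normal(X.shape[1])
        w /= np.linalg.norm(w)
        t = X @ w                          # projected coordinate
        coef = np.polyfit(t, y, deg)       # ridge function g
        mse = np.mean((y - np.polyval(coef, t)) ** 2)
        if mse < best_mse:
            best_mse, best_w, best_coef = mse, w, coef
    return best_mse, best_w, best_coef

# Toy stand-in for "subgrid stress vs. resolved variables": the response
# depends on a single linear combination of four inputs.
X = rng.standard_normal((400, 4))
w_true = np.array([0.6, -0.8, 0.0, 0.0])
y = np.tanh(X @ w_true) + 0.05 * rng.standard_normal(400)

mse, w_hat, _ = fit_single_ridge(X, y)
print("recovered direction:", np.round(w_hat, 2))   # roughly +/- w_true
```

The full algorithm iterates this step on the residuals, adding one ridge term at a time; the point of the sketch is only that the search recovers the relevant combination of resolved variables without it being specified in advance.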

  13. Invertebrate iridoviruses: A glance over the last decade

    USDA-ARS?s Scientific Manuscript database

    Iridovirus is a genus of large dsDNA viruses that predominantly infects both invertebrate and vertebrate ectotherms and whose symptoms range in severity from minor reductions in fitness to systemic disease and large-scale mortality. Several characteristics have been useful for taxonomically classi...

  14. Translation of SNOMED CT - strategies and description of a pilot project.

    PubMed

    Klein, Gunnar O; Chen, Rong

    2009-01-01

    The translation and localization of SNOMED CT (Systematized Nomenclature of Medicine - Clinical Terms) have been initiated in a few countries. In Sweden, we conducted the first evaluation of this terminology in a project called REFTERM in which we also developed a software tool which could handle a large scale translation with a number of translators and reviewers in a web-based environment. The system makes use of existing authorized English-Swedish translations of medical terminologies such as ICD-10. The paper discusses possible strategies for a national project to translate and adapt this terminology.

  15. A comparison of working in small-scale and large-scale nursing homes: A systematic review of quantitative and qualitative evidence.

    PubMed

    Vermeerbergen, Lander; Van Hootegem, Geert; Benders, Jos

    2017-02-01

    Ongoing shortages of care workers, together with an ageing population, make it of utmost importance to increase the quality of working life in nursing homes. Since the 1970s, normalised and small-scale nursing homes have been increasingly introduced to provide care in a family and homelike environment, potentially providing a richer work life for care workers as well as improved living conditions for residents. 'Normalised' refers to the opportunities given to residents to live in a manner as close as possible to the everyday life of persons not needing care. The study purpose is to provide a synthesis and overview of empirical research comparing the quality of working life - together with related work and health outcomes - of professional care workers in normalised small-scale nursing homes as compared to conventional large-scale ones. A systematic review of qualitative and quantitative studies. A systematic literature search (April 2015) was performed using the electronic databases Pubmed, Embase, PsycInfo, CINAHL and Web of Science. References and citations were tracked to identify additional, relevant studies. We identified 825 studies in the selected databases. After checking the inclusion and exclusion criteria, nine studies were selected for review. Two additional studies were selected after reference and citation tracking. Three studies were excluded after requesting more information on the research setting. The findings from the individual studies suggest that levels of job control and job demands (all but "time pressure") are higher in normalised small-scale homes than in conventional large-scale nursing homes. Additionally, some studies suggested that social support and work motivation are higher, while risks of burnout and mental strain are lower, in normalised small-scale nursing homes. Other studies found no differences or even opposing findings. 
The studies reviewed showed that these inconclusive findings can be attributed to care workers in some normalised small-scale homes experiencing isolation and too high job demands in their work roles. This systematic review suggests that normalised small-scale homes are a good starting point for creating a higher quality of working life in the nursing home sector. Higher job control enables care workers to manage higher job demands in normalised small-scale homes. However, some jobs would benefit from interventions to address care workers' perceptions of too low social support and of too high job demands. More research is needed to examine strategies to enhance these working life issues in normalised small-scale settings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Large-scale block adjustment without use of ground control points based on the compensation of geometric calibration for ZY-3 images

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Wang, Mi; Xu, Wen; Li, Deren; Gong, Jianya; Pi, Yingdong

    2017-12-01

    The potential of large-scale block adjustment (BA) without ground control points (GCPs) has long been a concern among photogrammetric researchers, as the capability would be of great value for global mapping. However, significant problems with the accuracy and efficiency of this method remain to be solved. In this study, we analyzed the effects of geometric errors on BA, and then developed a step-wise BA method to conduct integrated processing of large-scale ZY-3 satellite images without GCPs. We first pre-processed the BA data, by adopting a geometric calibration (GC) method based on the viewing-angle model to compensate for systematic errors, such that the BA input images were of good initial geometric quality. The second step was integrated BA without GCPs, in which a series of technical methods were used to solve bottleneck problems and ensure accuracy and efficiency. The BA model, based on virtual control points (VCPs), was constructed to address the rank deficiency problem caused by lack of absolute constraints. We then developed a parallel matching strategy to improve the efficiency of tie points (TPs) matching, and adopted a three-array data structure based on sparsity to relieve the storage and calculation burden of the high-order modified equation. Finally, we used the conjugate gradient method to improve the speed of solving the high-order equations. To evaluate the feasibility of the presented large-scale BA method, we conducted three experiments on real data collected by the ZY-3 satellite. The experimental results indicate that the presented method can effectively improve the geometric accuracies of ZY-3 satellite images. This study demonstrates the feasibility of large-scale mapping without GCPs.
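The final solver step, conjugate gradients on a large sparse symmetric positive-definite system, can be sketched on synthetic data. The matrix below is a stand-in, not the ZY-3 block structure: a diagonal shift plays the role of the virtual-control-point constraints that remove the rank deficiency, and the size and density are arbitrary assumptions:

```python
import numpy as np
from scipy.sparse import eye, random as sprand
from scipy.sparse.linalg import cg

# Stand-in for a bundle-adjustment normal matrix: sparse and symmetric
# positive definite. The diagonal shift mimics the effect of the virtual
# control point (VCP) constraints that make the system non-singular.
rng = np.random.default_rng(0)
n = 2000
A = sprand(n, n, density=0.002, random_state=0)
N = A @ A.T + 10.0 * eye(n)          # SPD stand-in for the normal matrix
x_true = rng.standard_normal(n)      # "true" correction parameters
b = N @ x_true                       # right-hand side of N x = b

x, info = cg(N, b, atol=1e-12)       # iterative conjugate-gradient solve
print("converged:", info == 0, "max abs error:", np.abs(x - x_true).max())
```

Because CG only needs matrix-vector products, the sparse normal matrix never has to be factored or densified, which is the property that makes it attractive for the high-order equations described above.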

  17. Dynamics of oxygen supply and consumption during mainstream large-scale composting in China.

    PubMed

    Zeng, Jianfei; Shen, Xiuli; Han, Lujia; Huang, Guangqun

    2016-11-01

    This study characterized several physicochemical and biological parameters to systematically evaluate the dynamics of oxygen supply and consumption during large-scale trough composting in China. The results showed that long active phases, low maximum temperatures, low organic matter losses and high pore methane concentrations were observed in the different composting layers. Pore oxygen concentrations in the top, middle and bottom layers remained below 5 vol.% for 40, 42 and 45 days, respectively, which accounted for more than 89% of the whole period. After each mechanical turning, oxygen was consumed at a stable respiration rate down to a concentration of 5 vol.% in no more than 99 min, and conditions remained anaerobic during the subsequent static period. The daily percentage of time under aerobic conditions was no more than 14% of a single day. Therefore, improving free air space (FAS), adjusting the aeration interval or combining turning with forced aeration is suggested to provide sufficient oxygen during composting. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. State of the Art Methodology for the Design and Analysis of Future Large Scale Evaluations: A Selective Examination.

    ERIC Educational Resources Information Center

    Burstein, Leigh

    Two specific methods of analysis in large-scale evaluations are considered: structural equation modeling and selection modeling/analysis of non-equivalent control group designs. Their utility in large-scale educational program evaluation is discussed. The examination of these methodological developments indicates how people (evaluators,…

  19. Geometagenomics illuminates the impact of agriculture on the distribution and prevalence of plant viruses at the ecosystem scale.

    PubMed

    Bernardo, Pauline; Charles-Dominique, Tristan; Barakat, Mohamed; Ortet, Philippe; Fernandez, Emmanuel; Filloux, Denis; Hartnady, Penelope; Rebelo, Tony A; Cousins, Stephen R; Mesleard, François; Cohez, Damien; Yavercovski, Nicole; Varsani, Arvind; Harkins, Gordon W; Peterschmitt, Michel; Malmstrom, Carolyn M; Martin, Darren P; Roumagnac, Philippe

    2018-01-01

    Disease emergence events regularly result from human activities such as agriculture, which frequently brings large populations of genetically uniform hosts into contact with potential pathogens. Although viruses cause nearly 50% of emerging plant diseases, there is little systematic information about virus distribution across agro-ecological interfaces and large gaps in understanding of virus diversity in nature. Here we applied a novel landscape-scale geometagenomics approach to examine relationships between agricultural land use and distributions of plant-associated viruses in two Mediterranean-climate biodiversity hotspots (Western Cape region of South Africa and Rhône river delta region of France). In total, we analysed 1725 geo-referenced plant samples collected over two years from 4.5 × 4.5 km² grids spanning farmlands and adjacent uncultivated vegetation. We found substantial virus prevalence (25.8-35.7%) in all ecosystems, but prevalence and identified family-level virus diversity were greatest in cultivated areas, with some virus families displaying strong agricultural associations. Our survey revealed 94 previously unknown virus species, primarily from uncultivated plants. This is the first effort to systematically evaluate plant-associated viromes across broad agro-ecological interfaces. Our findings indicate that agriculture substantially influences plant virus distributions and highlight the extent of current ignorance about the diversity and roles of viruses in nature.

  20. A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales

    PubMed Central

    Ayton, Gary S.; Voth, Gregory A.

    2009-01-01

    A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic as one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center of mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well known Gay-Berne ellipsoid of revolution liquid crystal model, and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an “aggressive” CG methodology designed to model multi-component biological membranes at very large length and timescales. PMID:19281167

  1. Characterization of complex networks by higher order neighborhood properties

    NASA Astrophysics Data System (ADS)

    Andrade, R. F. S.; Miranda, J. G. V.; Pinho, S. T. R.; Lobão, T. P.

    2008-01-01

    A concept of higher order neighborhood in complex networks, introduced previously [Phys. Rev. E 73, 046101 (2006)], is systematically explored to investigate larger scale structures in complex networks. The basic idea is to consider each higher order neighborhood as a network in itself, represented by a corresponding adjacency matrix, and to define a number of new parameters in order to better characterize the whole network. Usual network indices are then used to evaluate the properties of each neighborhood. The identification of higher order neighborhoods is also regarded as an intermediary step towards the evaluation of global network properties, like the diameter, the average shortest path between nodes, and the network fractal dimension. Results for a large number of typical networks are presented and discussed.
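The basic construction, labelling each node pair by the order k of the neighborhood that first connects them and treating each order as a network in its own right, can be sketched as follows. The helper below is a hypothetical illustration of the idea, not the authors' code:

```python
import numpy as np

def neighborhood_matrices(adj, max_order):
    """Adjacency matrix of each higher order neighborhood:
    the k-th matrix has a 1 at (i, j) iff the shortest path
    between nodes i and j has length exactly k."""
    n = adj.shape[0]
    dist = np.full((n, n), -1)
    np.fill_diagonal(dist, 0)
    reach = np.eye(n, dtype=bool)        # pairs already assigned a distance
    power = np.eye(n, dtype=int)
    for k in range(1, max_order + 1):
        power = power @ adj              # pairs joined by a walk of length k
        new = (power > 0) & ~reach       # first walk that joins them => dist k
        dist[new] = k
        reach |= new
    return [(dist == k).astype(int) for k in range(1, max_order + 1)]

# 5-node path graph 0-1-2-3-4.
A = np.zeros((5, 5), dtype=int)
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1

A1, A2, A3 = neighborhood_matrices(A, 3)
print(A2[0, 2], A3[0, 3])   # 1 1: second- and third-order neighbors
```

Each returned matrix can then be fed to the usual network indices (degree, clustering, and so on), which is the sense in which every neighborhood order is "a network in itself."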

  2. Identification and evaluation of software measures

    NASA Technical Reports Server (NTRS)

    Card, D. N.

    1981-01-01

    A large scale, systematic procedure for identifying and evaluating measures that meaningfully characterize one or more elements of software development is described. The background of this research, the nature of the data involved, and the steps of the analytic procedure are discussed. An example of the application of this procedure to data from real software development projects is presented. As the term is used here, a measure is a count or numerical rating of the occurrence of some property. Examples of measures include lines of code, number of computer runs, person hours expended, and degree of use of top down design methodology. Measures appeal to the researcher and the manager as a potential means of defining, explaining, and predicting software development qualities, especially productivity and reliability.

  3. Structural similitude and design of scaled down laminated models

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Rezaeepazhand, J.

    1993-01-01

    The excellent mechanical properties of laminated composite structures make them prime candidates for a wide variety of applications in aerospace, mechanical and other branches of engineering. The enormous design flexibility of advanced composites is obtained at the cost of a large number of design parameters. Due to the complexity of these systems and the lack of complete design-based information, designers tend to be conservative in their designs. Furthermore, any new design is extensively evaluated experimentally until it achieves the necessary reliability, performance and safety. However, the experimental evaluation of composite structures is costly and time consuming. Consequently, it is extremely useful if a full-scale structure can be replaced by a similar scaled-down model which is much easier to work with. Furthermore, a dramatic reduction in cost and time can be achieved if available experimental data for a specific structure can be used to predict the behavior of a group of similar systems. This study investigates problems associated with the design of scaled models. Such a study is important since it provides the necessary scaling laws, and the factors which affect the accuracy of the scale models. Similitude theory is employed to develop the necessary similarity conditions (scaling laws). Scaling laws provide a relationship between a full-scale structure and its scale model, and can be used to extrapolate the experimental data of a small, inexpensive, and testable model into design information for a large prototype. Due to the large number of design parameters, the identification of the principal scaling laws by the conventional method (dimensional analysis) is tedious. Similitude theory based on the governing equations of the structural system is more direct and simpler in execution. The difficulty of making completely similar scale models often leads to accepting a certain type of distortion from exact duplication of the prototype (partial similarity).
Both complete and partial similarity are discussed. The procedure consists of systematically observing the effect of each parameter and the corresponding scaling laws. Acceptable intervals and limitations for these parameters and scaling laws are then discussed. In each case, a set of valid scaling factors and corresponding response scaling laws that accurately predict the response of prototypes from experimental models is introduced. The examples used include rectangular laminated plates under destabilizing loads, applied individually, the vibrational characteristics of the same plates, as well as the cylindrical bending of beam-plates.
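As a concrete illustration of how a response scaling law is used (taking the classical isotropic thin-plate frequency relation as a simple stand-in, not the laminated-composite laws developed in the study): for a geometrically scaled model of the same material, natural frequencies scale inversely with the length scale factor, so model test data directly predict the prototype response.

```python
import math

def plate_frequency(h, a, E, rho, nu, coeff=1.0):
    """Fundamental frequency of a thin isotropic plate, up to a
    mode-shape coefficient that cancels between model and prototype:
    omega ~ (coeff / a^2) * sqrt(D / (rho * h)), D = E h^3 / (12 (1 - nu^2))."""
    D = E * h**3 / (12 * (1 - nu**2))      # bending stiffness
    return coeff / a**2 * math.sqrt(D / (rho * h))

E, rho, nu = 70e9, 2700.0, 0.33            # aluminium-like values, illustrative
lam = 5.0                                  # prototype is 5x the model
f_model = plate_frequency(h=0.002, a=0.3, E=E, rho=rho, nu=nu)
f_proto = plate_frequency(h=0.002 * lam, a=0.3 * lam, E=E, rho=rho, nu=nu)
print(f_proto / f_model)   # 1/lam = 0.2: frequencies scale as the inverse length
```

Partial similarity arises exactly when such clean cancellation fails, e.g. when ply thicknesses or material properties cannot all be scaled by the same factor, and the study's contribution is identifying which distortions still permit accurate prediction.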

  4. A Systematic Multi-Time Scale Solution for Regional Power Grid Operation

    NASA Astrophysics Data System (ADS)

    Zhu, W. J.; Liu, Z. G.; Cheng, T.; Hu, B. Q.; Liu, X. Z.; Zhou, Y. F.

    2017-10-01

    Many aspects must be taken into consideration when making scheduling plans for a regional power grid. In this paper, a systematic multi-time-scale solution for regional power grid operation is proposed that considers large-scale renewable energy integration and Ultra High Voltage (UHV) power transmission. On the time-scale axis, we discuss the problem from month, week, day-ahead, and within-day scheduling to day-behind evaluation, and the system also covers multiple generator types, including thermal units, hydro plants, wind turbines, and pumped-storage stations. The nine subsystems of the scheduling system are described, and their functions and relationships are elaborated. The proposed system has been deployed in a provincial power grid in Central China, and the operational results further verify its effectiveness.

  5. Spatiotemporal patterns of plant water isotope values from a continental-scale sample network in Europe as a tool to improve hydroclimate proxies

    NASA Astrophysics Data System (ADS)

    Nelson, D. B.; Kahmen, A.

    2016-12-01

    The hydrogen and oxygen isotopic composition of water available for biosynthetic processes in vascular plants plays an important role in shaping the isotopic composition of organic compounds that these organisms produce, including leaf waxes and cellulose in leaves and tree rings. Characterizing changes in large scale spatial patterns of precipitation, soil water, stem water, and leaf water isotope values over time is therefore useful for evaluating how plants reflect changes in the isotopic composition of these source waters in different environments. This information can, in turn, provide improved calibration targets for understanding the environmental signals that plants preserve. The pathway of water through this continuum can include several isotopic fractionations, but the extent to which the isotopic composition of each of these water pools varies under normal field conditions and over space and time has not been systematically and concurrently evaluated at large spatial scales. Two season-long sampling campaigns were conducted at nineteen sites throughout Europe over the 2014 and 2015 growing seasons to track changes in the isotopic composition of plant-relevant waters. Samples of precipitation, soil water, stem water, and leaf water were collected over more than 200 field days and include more than 500 samples from each water pool. Measurements were used to validate continent-wide gridded estimates of leaf water isotope values derived from a combination of mechanistic and statistical modeling conducted with temperature, precipitation, and relative humidity data. Data-model comparison shows good agreement for summer leaf waters, and substantiates the incorporation of modeled leaf waters in evaluating how plants respond to hydroclimate changes at large spatial scales. These results also suggest that modeled leaf water isotope values might be used in future studies in similar ecosystems to improve the coverage density of spatial or temporal data.

  6. Aromatherapy for managing menopausal symptoms: A protocol for systematic review and meta-analysis.

    PubMed

    Choi, Jiae; Lee, Hye Won; Lee, Ju Ah; Lim, Hyun-Ja; Lee, Myeong Soo

    2018-02-01

    Aromatherapy is often used as a complementary therapy for women's health. This systematic review aims to evaluate the therapeutic effects of aromatherapy for managing menopausal symptoms. Eleven electronic databases will be searched from inception to February 2018. Randomized controlled trials that evaluated any type of aromatherapy against any type of control in individuals with menopausal symptoms will be eligible. Methodological quality will be assessed using the Cochrane risk-of-bias tool. Two authors will independently assess each study for eligibility and risk of bias and extract data. This study will provide a high-quality synthesis of the current evidence on aromatherapy for menopausal symptoms measured with the Menopause Rating Scale, the Kupperman Index, the Greene Climacteric Scale, or other validated questionnaires. The conclusion of our systematic review will provide evidence to judge whether aromatherapy is an effective intervention for women with menopausal symptoms. Ethical approval will not be required, given that this protocol is for a systematic review. The systematic review will be published in a peer-reviewed journal and will also be disseminated electronically and in print. PROSPERO CRD42017079191.

  7. A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions

    PubMed Central

    Taylor, Richard L.; Bentley, Christopher D. B.; Pedernales, Julen S.; Lamata, Lucas; Solano, Enrique; Carvalho, André R. R.; Hope, Joseph J.

    2017-01-01

    Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyse how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyse the performance of a large-scale digital simulator, and find that a fidelity of around 70% is realizable for π-pulse infidelities below 10⁻⁵ in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period. PMID:28401945

  8. A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions.

    PubMed

    Taylor, Richard L; Bentley, Christopher D B; Pedernales, Julen S; Lamata, Lucas; Solano, Enrique; Carvalho, André R R; Hope, Joseph J

    2017-04-12

    Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyse how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyse the performance of a large-scale digital simulator, and find that a fidelity of around 70% is realizable for π-pulse infidelities below 10⁻⁵ in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period.

  9. Association of Gestational Hypertensive Disorders with Retinopathy of prematurity: A Systematic Review and Meta-analysis.

    PubMed

    Chan, Priscilla Y L; Tang, Shu-Min; Au, Sunny C L; Rong, Shi-Song; Lau, Henry H W; Ko, Simon T C; Ng, Danny S C; Chen, Li Jia; Yam, Jason C S

    2016-08-05

    The role of gestational hypertensive disorders, which include both pre-eclampsia and gestational hypertension, in the development of retinopathy of prematurity (ROP) has been controversial. This systematic review and meta-analysis therefore evaluates the association between gestational hypertensive disorders and ROP. Eligible studies published up to June 5, 2016 that evaluated the association between the two conditions were identified from MEDLINE and EMBASE. In total, 1142 published records were retrieved for screening, 925 of which were eligible for detailed evaluation. Finally, 19 studies involving 45,281 infants with 5388 cases of ROP met our criteria for meta-analysis. Gestational hypertensive disorders were not associated with ROP (unadjusted OR: 0.89; P = 0.38; adjusted OR: 1.35; P = 0.18). Subgroup analyses also revealed no significant association between ROP and either pre-eclampsia (unadjusted OR: 0.85; P = 0.29; adjusted OR: 1.29; P = 0.28) or gestational hypertension (unadjusted OR: 1.10; P = 0.39; adjusted OR: 1.25; P = 0.60) separately. Sensitivity analysis indicated our results were robust. We conclude that there is no significant association between gestational hypertensive disorders and ROP. More large-scale, well-conducted prospective cohort studies on this topic are needed.
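    The pooled odds ratios above come from standard inverse-variance meta-analysis of log odds ratios. A minimal fixed-effect sketch of that calculation is below; the study numbers are invented for illustration and are not the review's data.

```python
import math

def pool_odds_ratios(ors, cis):
    """Fixed-effect inverse-variance pooling of odds ratios.

    Each study contributes log(OR) weighted by 1/SE^2, where SE is recovered
    from the reported 95% CI as (ln(hi) - ln(lo)) / (2 * 1.96).
    Returns (pooled OR, lower 95% bound, upper 95% bound).
    """
    logs, weights = [], []
    for or_, (lo, hi) in zip(ors, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        logs.append(math.log(or_))
        weights.append(1.0 / se**2)
    pooled_log = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * se_pooled),
            math.exp(pooled_log + 1.96 * se_pooled))

# Three hypothetical studies, each given as (OR, 95% CI):
pooled, lo, hi = pool_odds_ratios([0.8, 1.1, 0.9],
                                  [(0.6, 1.07), (0.85, 1.42), (0.65, 1.25)])
```

    A 95% CI straddling 1.0, as here, corresponds to the review's "no significant association" conclusion.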

  10. Systematic evaluation of common lubricants for optimal use in tablet formulation.

    PubMed

    Paul, Shubhajit; Sun, Changquan Calvin

    2018-05-30

    As an essential formulation component for large-scale tablet manufacturing, the lubricant preserves tooling by reducing die-wall friction. Unfortunately, lubrication also often has adverse effects on tablet characteristics, such as prolonged disintegration, slowed dissolution, and reduced mechanical strength. Therefore, the choice of lubricant and its optimal concentration in a tablet formulation is a critical decision in tablet formulation development to attain low die-wall friction while minimizing the negative impact on other tablet properties. Three commercially available tablet lubricants, i.e., magnesium stearate, sodium stearyl fumarate, and stearic acid, were systematically investigated in both plastic and brittle matrices to elucidate their effects on die-wall friction, tablet strength, tablet hardness, tablet friability, and tablet disintegration kinetics. A clear understanding of the lubrication efficiency of commonly used lubricants, as well as their impact on tablet characteristics, will help future tablet formulation efforts. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Assessing the performance of community-available global MHD models using key system parameters and empirical relationships

    NASA Astrophysics Data System (ADS)

    Gordeev, E.; Sergeev, V.; Honkonen, I.; Kuznetsova, M.; Rastätter, L.; Palmroth, M.; Janhunen, P.; Tóth, G.; Lyon, J.; Wiltberger, M.

    2015-12-01

    Global magnetohydrodynamic (MHD) modeling is a powerful tool in space weather research and prediction. Several advanced and still-developing global MHD (GMHD) models are publicly available via the Community Coordinated Modeling Center's (CCMC) Run on Request system, which allows users to simulate the magnetospheric response to different solar wind conditions, including extraordinary events such as geomagnetic storms. Systematic validation of GMHD models against observations remains a challenge, as does comparative benchmarking of different models against each other. In this paper we describe and test a new approach in which (i) a set of critical large-scale system parameters is explored, (ii) produced by a specially designed set of computer runs that simulate realistic statistical distributions of critical solar wind parameters, and (iii) compared to observation-based empirical relationships for these parameters. Tested under approximately similar conditions (similar inputs, comparable grid resolution, etc.), the four models publicly available at the CCMC predict rather well the absolute values and variations of those key parameters (magnetospheric size, magnetic field, and pressure) that are directly related to the large-scale magnetospheric equilibrium in the outer magnetosphere, for which MHD is expected to be a valid approach. At the same time, the models show systematic differences in other parameters, differing especially in their predictions of the global convection rate, total field-aligned current, and magnetic flux loading into the magnetotail after a north-south turning of the interplanetary magnetic field. According to the validation results, none of the models emerges as an absolute leader. 
The approach suggested here for evaluating model performance against reality may be used by model users when planning their investigations, as well as by model developers and anyone interested in quantitatively evaluating progress in magnetospheric modeling.
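    As an example of the kind of observation-based empirical relationship such a validation can target, the sketch below evaluates the widely used Shue et al. (1998) formula for the subsolar magnetopause standoff distance as a function of IMF Bz and solar wind dynamic pressure. The formula is a published empirical relation, but its use here as a validation baseline is only illustrative of the approach described above.

```python
import math

def magnetopause_standoff(bz_nT, pdyn_nPa):
    """Empirical subsolar magnetopause standoff distance in Earth radii.

    Shue et al. (1998) relation:
        r0 = (10.22 + 1.29 * tanh(0.184 * (Bz + 8.14))) * Pdyn**(-1/6.6)
    with Bz in nT and dynamic pressure Pdyn in nPa.
    """
    return (10.22 + 1.29 * math.tanh(0.184 * (bz_nT + 8.14))) \
        * pdyn_nPa ** (-1.0 / 6.6)

# Nominal solar wind (Bz = 0 nT, Pdyn = 2 nPa) gives a standoff distance
# near 10 Earth radii; a simulated magnetopause position far from this
# empirical value would flag a model bias.
r0 = magnetopause_standoff(0.0, 2.0)
```

    In a benchmarking run, the same quantity extracted from each GMHD simulation over a realistic distribution of solar wind inputs would be compared against this curve.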

  12. Large-Scale SRM Screen of Urothelial Bladder Cancer Candidate Biomarkers in Urine.

    PubMed

    Duriez, Elodie; Masselon, Christophe D; Mesmin, Cédric; Court, Magali; Demeure, Kevin; Allory, Yves; Malats, Núria; Matondo, Mariette; Radvanyi, François; Garin, Jérôme; Domon, Bruno

    2017-04-07

    Urothelial bladder cancer is a condition associated with high recurrence and substantial morbidity and mortality. Noninvasive urinary tests that detect bladder cancer and tumor recurrence are required to significantly improve patient care. Over the past decade, numerous bladder cancer candidate biomarkers have been identified in the context of extensive proteomics or transcriptomics studies. To translate these findings into clinically useful biomarkers, the systematic evaluation of these candidates remains the bottleneck. Such evaluation involves large-scale quantitative LC-SRM (liquid chromatography-selected reaction monitoring) measurements, targeting hundreds of signature peptides by monitoring thousands of transitions in a single analysis. The design of highly multiplexed SRM analyses is driven by several factors: throughput, robustness, selectivity, and sensitivity. Because of the complexity of the samples analyzed, some measurements (transitions) can suffer interference from coeluting isobaric species, resulting in biased or inconsistent estimated peptide/protein levels. Assessing the quality of SRM data is therefore critical for flagging these inconsistent measurements. We describe an efficient and robust method to process large SRM data sets, including processing of the raw data, detection of low-quality measurements, normalization of the signals for each protein, and estimation of protein levels. Using this methodology, a variety of proteins previously associated with bladder cancer were assessed through the analysis of urine samples from a large cohort of cancer patients and corresponding controls, in an effort to establish a priority list of the most promising candidates to guide subsequent clinical validation studies.
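    One simple way to detect interfered transitions of the kind described above is to check that each transition contributes a consistent fraction of its peptide's signal across runs. The sketch below is a minimal illustration of that idea, not the paper's actual algorithm; the intensity matrix and tolerance are hypothetical.

```python
def flag_interfered_transitions(intensities, tol=0.2):
    """Flag transitions whose across-run profile departs from the consensus.

    `intensities[t][r]` is transition t's intensity in run r.  Each transition
    is normalised to a unit-sum profile across runs; a transition is flagged
    when its profile deviates from the element-wise median profile by more
    than `tol` in any run, suggesting a coeluting isobaric interference.
    """
    n_trans, n_runs = len(intensities), len(intensities[0])
    profiles = []
    for t in range(n_trans):
        total = sum(intensities[t])
        profiles.append([x / total for x in intensities[t]])
    # element-wise median profile across transitions
    consensus = [sorted(profiles[t][r] for t in range(n_trans))[n_trans // 2]
                 for r in range(n_runs)]
    return {t for t in range(n_trans)
            if max(abs(profiles[t][r] - consensus[r])
                   for r in range(n_runs)) > tol}

runs = [[100, 110, 105],   # transition 0: consistent across runs
        [ 50,  55,  52],   # transition 1: consistent across runs
        [ 20,  22, 400]]   # transition 2: spikes in run 2 (interference)
bad = flag_interfered_transitions(runs)
```

    Flagged transitions would be excluded before per-protein normalization and protein-level estimation.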

  13. Synergy of Stochastic and Systematic Energization of Plasmas during Turbulent Reconnection

    NASA Astrophysics Data System (ADS)

    Pisokas, Theophilos; Vlahos, Loukas; Isliker, Heinz

    2018-01-01

    The important characteristic of turbulent reconnection is that it combines large-scale magnetic disturbances (δ B/B∼ 1) with randomly distributed unstable current sheets (UCSs). Many well-known nonlinear MHD structures (strong turbulence, current sheet(s), shock(s)) lead asymptotically to the state of turbulent reconnection. We analyze in this article, for the first time, the energization of electrons and ions in a large-scale environment that combines large-amplitude disturbances propagating with sub-Alfvénic speed with UCSs. The magnetic disturbances interact stochastically (second-order Fermi) with the charged particles and play a crucial role in the heating of the particles, while the UCSs interact systematically (first-order Fermi) and play a crucial role in the formation of the high-energy tail. The synergy of stochastic and systematic acceleration provided by the mixture of magnetic disturbances and UCSs influences the energetics of the thermal and nonthermal particles, the power-law index, and the length of time the particles remain inside the energy release volume. We show that this synergy can explain the observed very fast and impulsive particle acceleration and the slightly delayed formation of a superhot particle population.
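    The contrast between stochastic (second-order Fermi) heating and systematic (first-order Fermi) tail formation can be caricatured with a toy Monte Carlo, shown below. This is a heavily simplified illustration under invented parameters, not the article's transport model: each step multiplies a particle's energy by a random-sign factor (stochastic kicks from magnetic disturbances) or by a fixed gain (systematic kicks at current sheets).

```python
import random

def energize(n_particles=2000, n_steps=200, a=0.02, seed=1):
    """Toy comparison of stochastic vs. systematic energization.

    Each step, a particle's energy is multiplied by (1 + a*u) with u drawn
    uniformly from [-1, 1] (second-order, stochastic) or by (1 + a) outright
    (first-order, systematic).  Real turbulent-reconnection transport is far
    richer; this only illustrates why repeated systematic kicks build up
    energy much faster than random-sign kicks of the same amplitude.
    """
    rng = random.Random(seed)
    stochastic = [1.0] * n_particles
    systematic = [1.0] * n_particles
    for _ in range(n_steps):
        for i in range(n_particles):
            stochastic[i] *= 1.0 + a * rng.uniform(-1.0, 1.0)
            systematic[i] *= 1.0 + a
    return stochastic, systematic

stoch, syst = energize()
mean_stoch = sum(stoch) / len(stoch)
mean_syst = sum(syst) / len(syst)
```

    In this caricature the stochastic population mostly spreads (heats) around its initial energy, while the systematic channel drives the runaway gain associated with the high-energy tail.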

  14. A Variational Assimilation Method for Satellite and Conventional Data: a Revised Basic Model 2B

    NASA Technical Reports Server (NTRS)

    Achtemeier, Gary L.; Scott, Robert W.; Chen, J.

    1991-01-01

    A variational objective analysis technique that modifies observations of temperature, height, and wind on the cyclone scale to satisfy the five 'primitive' model forecast equations is presented. This analysis method overcomes all of the problems that hindered previous versions, such as over-determination, time consistency, solution method, and constraint decoupling. A preliminary evaluation of the method shows that it converges rapidly, the divergent part of the wind is strongly coupled in the solution, fields of height and temperature are well-preserved, and derivative quantities such as vorticity and divergence are improved. Problem areas are systematic increases in the horizontal velocity components, and large magnitudes of the local tendencies of the horizontal velocity components. The preliminary evaluation makes note of these problems but detailed evaluations required to determine the origin of these problems await future research.

  15. Evaluation of Normalization Methods to Pave the Way Towards Large-Scale LC-MS-Based Metabolomics Profiling Experiments

    PubMed Central

    Valkenborg, Dirk; Baggerman, Geert; Vanaerschot, Manu; Witters, Erwin; Dujardin, Jean-Claude; Burzykowski, Tomasz; Berg, Maya

    2013-01-01

    Combining liquid chromatography-mass spectrometry (LC-MS)-based metabolomics experiments that were collected over a long period of time remains problematic due to systematic variability between LC-MS measurements. Until now, most normalization methods for LC-MS data have been model-driven, based on internal standards or intermediate quality-control runs, where an external model is extrapolated to the dataset of interest. In the first part of this article, we evaluate several existing data-driven normalization approaches on LC-MS metabolomics experiments; these approaches do not require internal standards. According to variability measures, each normalization method performs relatively well, showing that the use of any normalization method will greatly improve data analysis originating from multiple experimental runs. In the second part, we apply cyclic-Loess normalization to a Leishmania sample. This normalization method removes systematic variability between two measurement blocks over time while maintaining the differential metabolites. In conclusion, normalization allows for pooling datasets from different measurement blocks over time and increases the statistical power of the analysis, hence paving the way to increase the scale of LC-MS metabolomics experiments. From our investigation, we recommend data-driven normalization methods over model-driven normalization methods if only a few internal standards were used. Moreover, data-driven normalization methods are the best option for normalizing datasets from untargeted LC-MS experiments. PMID:23808607
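    The simplest data-driven normalization of the family discussed above is median scaling, sketched below: each run is rescaled so that its median feature intensity matches the global median, with no internal standards required. This is a deliberately simplified stand-in for cyclic-Loess (which additionally removes intensity-dependent bias); the two-run data are invented.

```python
def median_normalize(runs):
    """Data-driven (median-scaling) normalization across LC-MS runs.

    Each run's feature intensities are divided by that run's median and
    rescaled to the global median, removing constant run-to-run systematic
    offsets without internal standards.
    """
    def median(xs):
        s = sorted(xs)
        n = len(s)
        return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

    run_medians = [median(r) for r in runs]
    target = median(run_medians)
    return [[x * target / m for x in r] for r, m in zip(runs, run_medians)]

# Run 2 measured with a 2x systematic intensity inflation:
runs = [[100.0, 200.0, 300.0],
        [200.0, 400.0, 600.0]]
normed = median_normalize(runs)
```

    After scaling, the two measurement blocks are directly comparable, which is what makes pooling runs from different blocks feasible.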

  16. Performance evaluation and bias correction of DBS measurements for a 1290-MHz boundary layer profiler.

    PubMed

    Liu, Zhao; Zheng, Chaorong; Wu, Yue

    2018-02-01

    Recently, the government installed a boundary layer profiler (BLP), operated in the Doppler beam swinging (DBS) mode, in a coastal area of China to acquire useful wind field information in the atmospheric boundary layer. The performance of the BLP is evaluated under strong wind conditions. Even though the quality-controlled BLP data show good agreement with the balloon observations, a systematic bias is consistently present: at low wind velocities the BLP data tend to overestimate the atmospheric wind, whereas with increasing wind velocity they tend toward underestimation. To remove the effect of poor-quality data on the bias correction, the probability distribution of the differences between the two instruments is examined, and the t location-scale distribution is found to be the most suitable of the probability models compared. After outliers with large discrepancies, lying outside the 95% confidence interval of the t location-scale distribution, are discarded, the systematic bias can be successfully corrected using a first-order polynomial correction function. The bias-correction methodology used in this study not only serves as a reference for correcting other wind profiling radars but also lays a solid basis for further analysis of wind profiles.
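    The two-step procedure above (discard outliers in the radar-minus-balloon differences, then fit a first-order polynomial correction) can be sketched as follows. For brevity this sketch uses a Gaussian 95% band where the paper fits a t location-scale distribution, and the wind-speed pairs are synthetic.

```python
def bias_correct(radar, balloon):
    """Sketch of a two-step DBS bias correction.

    1) Model the radar-minus-balloon differences and discard pairs outside
       the 95% band (Gaussian here; the paper uses a t location-scale fit).
    2) Least-squares fit balloon ~ a*radar + b on the retained pairs; the
       returned function removes the velocity-dependent systematic bias.
    """
    diffs = [r - b for r, b in zip(radar, balloon)]
    n = len(diffs)
    mu = sum(diffs) / n
    sd = (sum((d - mu) ** 2 for d in diffs) / (n - 1)) ** 0.5
    kept = [(r, b) for r, b, d in zip(radar, balloon, diffs)
            if abs(d - mu) <= 1.96 * sd]
    xs = [r for r, _ in kept]
    ys = [b for _, b in kept]
    m = len(kept)
    x_mean, y_mean = sum(xs) / m, sum(ys) / m
    a = (sum((x - x_mean) * (y - y_mean) for x, y in kept)
         / sum((x - x_mean) ** 2 for x in xs))
    b0 = y_mean - a * x_mean
    return lambda v: a * v + b0

# Synthetic example: the radar reads high at low speeds, low at high speeds.
balloon = [4.0, 8.0, 12.0, 16.0, 20.0]
radar = [5.0, 8.5, 12.0, 15.5, 19.0]
correct = bias_correct(radar, balloon)
```

    Applying `correct` to new radar readings maps them onto the balloon reference, which is the sense in which the first-order polynomial removes the systematic bias.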

  17. Performance evaluation and bias correction of DBS measurements for a 1290-MHz boundary layer profiler

    NASA Astrophysics Data System (ADS)

    Liu, Zhao; Zheng, Chaorong; Wu, Yue

    2018-02-01

    Recently, the government installed a boundary layer profiler (BLP), operated in the Doppler beam swinging (DBS) mode, in a coastal area of China to acquire useful wind field information in the atmospheric boundary layer. The performance of the BLP is evaluated under strong wind conditions. Even though the quality-controlled BLP data show good agreement with the balloon observations, a systematic bias is consistently present: at low wind velocities the BLP data tend to overestimate the atmospheric wind, whereas with increasing wind velocity they tend toward underestimation. To remove the effect of poor-quality data on the bias correction, the probability distribution of the differences between the two instruments is examined, and the t location-scale distribution is found to be the most suitable of the probability models compared. After outliers with large discrepancies, lying outside the 95% confidence interval of the t location-scale distribution, are discarded, the systematic bias can be successfully corrected using a first-order polynomial correction function. The bias-correction methodology used in this study not only serves as a reference for correcting other wind profiling radars but also lays a solid basis for further analysis of wind profiles.

  18. Breakdowns in coordinated decision making at and above the incident management team level: an analysis of three large scale Australian wildfires.

    PubMed

    Bearman, Chris; Grunwald, Jared A; Brooks, Benjamin P; Owen, Christine

    2015-03-01

    Emergency situations are by their nature difficult to manage, and success in such situations often depends heavily on effective team coordination. Breakdowns in team coordination can significantly disrupt an operational response. Such breakdowns were explored in three large-scale bushfires in Australia: the Kilmore East fire, the Wangary fire, and the Canberra Firestorm. Data from these fires were analysed using a top-down and bottom-up qualitative analysis technique. Forty-four breakdowns in coordinated decision making were identified, which yielded 83 disconnects grouped into three main categories: operational, informational, and evaluative. Disconnects were specific instances where differences in understanding existed between team members. The reasons why disconnects occurred were largely consistent across the three sets of data. In some cases multiple disconnects occurred in a temporal sequence, suggesting that disconnects can create states conducive to the occurrence of further disconnects. In terms of resolution, evaluative disconnects were nearly always resolved; however, operational and informational disconnects were rarely resolved effectively. The exploratory data analysis and discussion presented here represent the first systematic research into the reasons why breakdowns occur in emergency management and present an account of how team processes can act to disrupt coordination and the operational response. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  19. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

    Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large-scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single design methodology rather than on trade-offs, and the incompatibility of large-scale optimization with single-program, single-computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop, so that the full analysis is performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique; the removed modules then embody the definitions of the design variables, objective function, and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problem by an organization of people and machines.
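    The idea of keeping the expensive full analysis outside the optimization loop can be sketched with a toy sizing problem: minimize the thickness of a bar subject to a stress limit, where the stress evaluation stands in for the costly analysis. This is a generic sequential-approximation sketch under invented numbers, not the paper's aircraft procedure.

```python
def full_analysis(t, load=100.0):
    """Stand-in for an expensive 'full analysis': stress in a bar of thickness t."""
    return load / t

def sequential_approximate_design(t0, sigma_max=50.0, iters=6, h=1e-4):
    """Optimization with the full analysis outside the inner loop.

    At each design point the constraint sigma(t) <= sigma_max is linearized
    from one full analysis plus a finite-difference gradient; the cheap
    linear model is then driven to the constraint boundary, where minimum
    weight is attained for this toy problem.  The full analysis thus runs
    only once per outer iteration, not once per trial design.
    """
    t = t0
    for _ in range(iters):
        sigma = full_analysis(t)               # periodic full analysis
        dsig = (full_analysis(t + h) - sigma) / h
        t = t + (sigma_max - sigma) / dsig     # solve the linearized constraint
    return t

t_opt = sequential_approximate_design(t0=1.0)
# Converges toward the true optimum t = load / sigma_max = 2.0
```

    The same pattern scales up: the approximate model is optimized cheaply many times between sparse, expensive full analyses.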

  20. An intercomparison of GCM and RCM dynamical downscaling for characterizing the hydroclimatology of California and Nevada

    NASA Astrophysics Data System (ADS)

    Xu, Z.; Rhoades, A.; Johansen, H.; Ullrich, P. A.; Collins, W. D.

    2017-12-01

    Dynamical downscaling is widely used to properly characterize the regional surface heterogeneities that shape the local hydroclimatology. However, the factors in dynamical downscaling, including the refinement of model horizontal resolution, the large-scale forcing datasets, and the dynamical cores, have not been fully evaluated. Two cutting-edge global-to-regional downscaling methods are used to assess these factors, specifically the variable-resolution Community Earth System Model (VR-CESM) and the Weather Research and Forecasting (WRF) regional climate model, at three horizontal resolutions (28, 14, and 7 km). Two groups of WRF simulations are driven by either the NCEP reanalysis dataset (WRF_NCEP) or VR-CESM outputs (WRF_VRCESM) to evaluate the effects of the large-scale forcing datasets. The impact of the dynamical core is assessed by comparing the VR-CESM simulations to the coupled WRF_VRCESM simulations with the same physical parameterizations and similar grid domains. The simulated hydroclimatology (i.e., total precipitation, snow cover, snow water equivalent (SWE), and surface temperature) is compared with the reference datasets. The large-scale forcing datasets are critical to the WRF simulations for accurately simulating total precipitation, SWE, and snow cover, but not surface temperature. Both the WRF and VR-CESM results highlight that no significant benefit in the simulated hydroclimatology is found from refining the horizontal resolution from 28 to 7 km. Simulated surface temperature is sensitive to the choice of dynamical core: WRF generally simulates higher temperatures than VR-CESM, alleviating the systematic cold bias in DJF temperatures over the California mountain region but overestimating JJA temperatures in California's Central Valley.

  1. Sloan Digital Sky Survey III photometric quasar clustering: probing the initial conditions of the Universe

    NASA Astrophysics Data System (ADS)

    Ho, Shirley; Agarwal, Nishant; Myers, Adam D.; Lyons, Richard; Disbrow, Ashley; Seo, Hee-Jong; Ross, Ashley; Hirata, Christopher; Padmanabhan, Nikhil; O'Connell, Ross; Huff, Eric; Schlegel, David; Slosar, Anže; Weinberg, David; Strauss, Michael; Ross, Nicholas P.; Schneider, Donald P.; Bahcall, Neta; Brinkmann, J.; Palanque-Delabrouille, Nathalie; Yèche, Christophe

    2015-05-01

    The Sloan Digital Sky Survey has surveyed 14,555 square degrees of the sky and delivered over a trillion pixels of imaging data. We present the large-scale clustering of 1.6 million quasars between z = 0.5 and z = 2.5 that have been classified from this imaging, representing the highest density of quasars ever studied for clustering measurements. This data set spans ~11,000 square degrees and probes a volume of 80 h⁻³ Gpc³. In principle, such a large volume and medium density of tracers should facilitate high-precision cosmological constraints. We measure the angular clustering of photometrically classified quasars using an optimal quadratic estimator in four redshift slices, with an accuracy of ~25% over a bin width of Δℓ ~ 10-15, on scales corresponding to matter-radiation equality and larger (ℓ ~ 2-30). Observational systematics can strongly bias clustering measurements on large scales, mimicking cosmologically relevant signals such as deviations from Gaussianity in the spectrum of primordial perturbations. We account for systematics by employing a new method recently proposed by Agarwal et al. (2014) to the clustering of photometrically classified quasars. We carefully apply our methodology to mitigate known observational systematics and further remove angular bins that are contaminated by unknown systematics. Combining quasar data with the photometric luminous red galaxy (LRG) sample of Ross et al. (2011) and Ho et al. (2012), and marginalizing over all bias and shot-noise-like parameters, we obtain a constraint on local primordial non-Gaussianity of fNL = -113 ± 154 (1σ error). We next assume that the bias of quasar and galaxy distributions can be obtained independently from quasar/galaxy-CMB lensing cross-correlation measurements (such as those in Sherwin et al. (2013)). This can be facilitated by spectroscopic observations of the sources, enabling the redshift distribution to be completely determined and allowing precise estimates of the bias parameters. If the bias and shot-noise parameters are fixed to their known values (which we model by fixing them to their best-fit Gaussian values), we find that the error bar reduces to 1σ ≃ 65. We expect this error bar to shrink by at least another factor of five if the data are free of any observational systematics. We therefore emphasize that to make the best use of large-scale structure data we need accurate modeling of known systematics, a method to mitigate unknown systematics, and additional independent theoretical models or observations to probe the bias of dark matter halos.

  2. Building work engagement: A systematic review and meta‐analysis investigating the effectiveness of work engagement interventions

    PubMed Central

    Patterson, Malcolm; Dawson, Jeremy

    2016-01-01

    Low work engagement may contribute towards decreased well‐being and work performance. Evaluating, boosting and sustaining work engagement are therefore of interest to many organisations. However, the evidence on which to base interventions has not yet been synthesised. A systematic review with meta‐analysis was conducted to assess the evidence for the effectiveness of work engagement interventions. A systematic literature search identified controlled workplace interventions employing a validated measure of work engagement. Most used the Utrecht Work Engagement Scale (UWES). Studies containing the relevant quantitative data underwent random‐effects meta‐analyses. Results were assessed for homogeneity, systematic sampling error, publication bias and quality. Twenty studies met the inclusion criteria and were categorised into four types of interventions: (i) personal resource building; (ii) job resource building; (iii) leadership training; and (iv) health promotion. The overall effect on work engagement was small, but positive, k = 14, Hedges g = 0.29, 95%‐CI = 0.12–0.46. Moderator analyses revealed a significant result for intervention style, with a medium to large effect for group interventions. Heterogeneity between the studies was high, and the success of implementation varied. More studies are needed, and researchers are encouraged to collaborate closely with organisations to design interventions appropriate to individual contexts and settings, and include evaluations of intervention implementation. © 2016 The Authors. Journal of Organizational Behavior published by John Wiley & Sons, Ltd. PMID:28781428
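    The random-effects pooling behind a Hedges g estimate like the one above is commonly done with the DerSimonian-Laird estimator, sketched below. The per-study effects and variances are invented for illustration and are not the review's data.

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird) of standardized effects.

    Estimates the between-study variance tau^2 from Cochran's Q, re-weights
    each study by 1/(v_i + tau^2), and returns the pooled effect with its
    95% confidence interval.
    """
    k = len(effects)
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = (1.0 / sum(w_star)) ** 0.5
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Four hypothetical studies: standardized effects and their variances.
g, lo, hi = dersimonian_laird([0.10, 0.25, 0.45, 0.35],
                              [0.02, 0.03, 0.05, 0.04])
```

    High heterogeneity, as reported in the review, inflates tau² and hence widens the pooled confidence interval.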

  3. Systematic methods for defining coarse-grained maps in large biomolecules.

    PubMed

    Zhang, Zhiyong

    2015-01-01

Large biomolecules are involved in many important biological processes. Large-scale atomistic molecular dynamics (MD) simulations of the functional motions of these systems are difficult because of the computational expense. Therefore, various coarse-grained (CG) approaches, which enable simulations of large biomolecules over longer effective timescales than all-atom MD, have attracted rapidly growing interest. The first issue in CG modeling is to construct CG maps from atomic structures. In this chapter, we review the recent development of a novel and systematic method for constructing CG representations of arbitrarily complex biomolecules that preserves large-scale, functionally relevant essential dynamics (ED) at the CG level. In this ED-CG scheme, the essential dynamics can be characterized by principal component analysis (PCA) on a structural ensemble or by an elastic network model (ENM) of a single atomic structure. Validation and applications of the method cover various biological systems, such as multi-domain proteins, protein complexes, and even biomolecular machines. The results demonstrate that the ED-CG method may serve as a very useful tool for identifying the functional dynamics of large biomolecules at the CG level.
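The ED-CG scheme starts from essential dynamics obtained by PCA on a structural ensemble. As a generic sketch of that PCA step only (not the authors' implementation; the flattened-coordinate array shape is an assumption for illustration):

```python
import numpy as np

def essential_modes(ensemble, n_modes=2):
    """PCA on an ensemble of conformations.
    ensemble: (n_frames, n_atoms * 3) array of flattened coordinates.
    Returns the leading eigenvalues and eigenvectors (essential modes)."""
    x = ensemble - ensemble.mean(axis=0)      # remove the mean structure
    cov = np.cov(x, rowvar=False)             # coordinate covariance matrix
    vals, vecs = np.linalg.eigh(cov)          # ascending eigenvalues
    order = np.argsort(vals)[::-1]            # sort descending by variance
    return vals[order][:n_modes], vecs[:, order[:n_modes]]
```

A CG map would then be chosen so that the CG sites' motion reproduces these leading modes; that optimization step is specific to the ED-CG method and is not sketched here.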

  4. The Cosmology Large Angular Scale Surveyor

    NASA Technical Reports Server (NTRS)

Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe

    2016-01-01

The Cosmology Large Angular Scale Surveyor (CLASS) is a four-telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  5. Alexithymia in eating disorders: Systematic review and meta-analyses of studies using the Toronto Alexithymia Scale.

    PubMed

    Westwood, Heather; Kerr-Gaffney, Jess; Stahl, Daniel; Tchanturia, Kate

    2017-08-01

The aim of this review was to synthesise the literature on the use of the Toronto Alexithymia Scale (TAS) in eating disorder populations and Healthy Controls (HCs) and to compare TAS scores in these groups. Electronic databases were searched systematically for studies using the TAS, and meta-analyses were performed to statistically compare scores on the TAS between individuals with eating disorders and HCs. Forty-eight studies using the TAS with both a clinical eating disorder group and HCs were identified. Of these, 44 were included in the meta-analyses, separated into: Anorexia Nervosa; Anorexia Nervosa, Restricting subtype; Anorexia Nervosa, Binge-Purge subtype; Bulimia Nervosa; and Binge Eating Disorder. For all groups, there were significant differences with medium or large effect sizes between the clinical group and HCs, with the clinical group scoring significantly higher on the TAS, indicating greater difficulty with identifying and labelling emotions. Across the spectrum of eating disorders, individuals report having difficulties recognising or describing their emotions. Given the self-report design of the TAS, research to develop and evaluate treatments and clinician-administered assessments of alexithymia is warranted. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  6. Using Markov chains of nucleotide sequences as a possible precursor to predict functional roles of human genome: a case study on inactive chromatin regions.

    PubMed

    Lee, K-E; Lee, E-J; Park, H-S

    2016-08-30

    Recent advances in computational epigenetics have provided new opportunities to evaluate n-gram probabilistic language models. In this paper, we describe a systematic genome-wide approach for predicting functional roles in inactive chromatin regions by using a sequence-based Markovian chromatin map of the human genome. We demonstrate that Markov chains of sequences can be used as a precursor to predict functional roles in heterochromatin regions and provide an example comparing two publicly available chromatin annotations of large-scale epigenomics projects: ENCODE project consortium and Roadmap Epigenomics consortium.
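The sequence-based chromatin map above rests on n-gram (order-k Markov chain) probabilities estimated from nucleotide sequences. A toy sketch of that estimation step, not the authors' genome-wide pipeline, could look like:

```python
from collections import defaultdict

def markov_model(seq, order=2):
    """Estimate transition probabilities of an order-k Markov chain
    over a nucleotide sequence (an n-gram model with n = order + 1)."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(seq) - order):
        ctx, nxt = seq[i:i + order], seq[i + order]
        counts[ctx][nxt] += 1
    # normalize the counts for each context into probabilities
    model = {}
    for ctx, nxts in counts.items():
        total = sum(nxts.values())
        model[ctx] = {base: n / total for base, n in nxts.items()}
    return model
```

Scoring a genomic window under such a model (summing log transition probabilities) then gives a sequence-only signal that can be compared against chromatin annotations, as the paper does with the ENCODE and Roadmap maps.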

  7. A Programming Environment Evaluation Methodology for Object-Oriented Systems. Ph.D Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Moreau, Dennis R.

    1987-01-01

    The object-oriented design strategy as both a problem decomposition and system development paradigm has made impressive inroads into the various areas of the computing sciences. Substantial development productivity improvements have been demonstrated in areas ranging from artificial intelligence to user interface design. However, there has been very little progress in the formal characterization of these productivity improvements and in the identification of the underlying cognitive mechanisms. The development and validation of models and metrics of this sort require large amounts of systematically-gathered structural and productivity data. There has, however, been a notable lack of systematically-gathered information on these development environments. A large part of this problem is attributable to the lack of a systematic programming environment evaluation methodology that is appropriate to the evaluation of object-oriented systems.

  8. NeuroCa: integrated framework for systematic analysis of spatiotemporal neuronal activity patterns from large-scale optical recording data

    PubMed Central

    Jang, Min Jee; Nam, Yoonkey

    2015-01-01

    Abstract. Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of ∼1000 neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements. PMID:26229973
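NeuroCa's own decomposition and detection algorithms are not reproduced here; as a generic illustration of the spike-detection idea, one common approach is to threshold a ΔF/F trace at a robust noise level and keep the upward crossings. All parameters below are illustrative assumptions:

```python
import numpy as np

def detect_spikes(trace, baseline_win=50, k=3.0):
    """Detect candidate calcium spike onsets in a fluorescence trace
    by thresholding Delta-F/F at k robust standard deviations."""
    f0 = np.percentile(trace[:baseline_win], 20)   # baseline fluorescence
    dff = (trace - f0) / f0                        # Delta-F/F
    # robust noise estimate via the median absolute deviation (MAD)
    sigma = 1.4826 * np.median(np.abs(dff - np.median(dff)))
    above = dff > k * sigma
    # keep only upward threshold crossings (spike onsets)
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return onsets
```

Run per cell over the ~1000 extracted traces, such a detector yields the spike trains that downstream synchrony and mapping analyses consume.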

  9. Effect of Home Exercise Program in Patients With Knee Osteoarthritis: A Systematic Review and Meta-analysis.

    PubMed

    Anwer, Shahnawaz; Alghadir, Ahmad; Brismée, Jean-Michel

    2016-01-01

    The Osteoarthritis Research Society International recommended that nonpharmacological methods include patient education programs, weight reduction, coping strategies, and exercise programs for the management of knee osteoarthritis (OA). However, neither a systematic review nor a meta-analysis has been published regarding the effectiveness of home exercise programs for the management of knee OA. The purpose of this systematic review was to examine the evidence regarding the effect of home exercise programs with and without supervised clinic-based exercises in the management of knee OA. We searched PubMed, CINAHL, Embase, Scopus, and PEDro for research articles published prior to September 2014 using key words such as pain, exercise, home exercise program, rehabilitation, supervised exercise program, and physiotherapy in combination with Medical Subject Headings "Osteoarthritis knee." We selected randomized and case-controlled trials published in English language. To verify the quality of the selected studies, we applied the PEDro Scale. Two evaluators individually selected the studies based on titles, excluding those articles that were not related to the objectives of this review. One evaluator extracted data from the included studies. A second evaluator independently verified extracted data for accuracy. A total of 31 studies were found in the search. Of these, 19 studies met the inclusion criteria and were further analyzed. Seventeen of these 19 studies reached high methodological quality on the PEDro scale. Although the methods and home exercise program interventions varied widely in these studies, most found significant improvements in pain and function in individuals with knee OA. The analysis indicated that both home exercise programs with and without supervised clinic-based exercises were beneficial in the management of knee OA. 
A large body of evidence from high-quality trials supports the effectiveness of home exercise programs, with and without supervised clinic-based exercises, in the rehabilitation of knee OA. In addition, a small but growing body of evidence supports the effectiveness of other types of exercise, such as tai chi, balance, and proprioceptive training, for individuals with knee OA.

  10. Measuring the Large-scale Solar Magnetic Field

    NASA Astrophysics Data System (ADS)

    Hoeksema, J. T.; Scherrer, P. H.; Peterson, E.; Svalgaard, L.

    2017-12-01

The Sun's large-scale magnetic field is important for determining the global structure of the corona and for quantifying the evolution of the polar field, which is sometimes used for predicting the strength of the next solar cycle. Having confidence in the determination of the large-scale magnetic field of the Sun is difficult because the field is often near the detection limit, the various observing methods all measure something slightly different, and various systematic effects can be very important. We compare resolved and unresolved observations of the large-scale magnetic field from the Wilcox Solar Observatory (WSO), the Helioseismic and Magnetic Imager (HMI), the Michelson Doppler Imager (MDI), and SOLIS. Cross comparison does not enable us to establish an absolute calibration, but it does allow us to discover and compensate for instrument problems, such as the sensitivity decrease seen in the WSO measurements in late 2016 and early 2017.

  11. Dietary interventions, lifestyle changes, and dietary supplements in preventing gestational diabetes mellitus: a literature review.

    PubMed

    Facchinetti, Fabio; Dante, Giulia; Petrella, Elisabetta; Neri, Isabella

    2014-11-01

Gestational diabetes mellitus (GDM) is associated with increased rates of fetal morbidity and mortality, both during pregnancy and in postnatal life. Current treatment of GDM includes diet with or without medications, but this management is expensive and poorly cost-effective for health care systems. Strategies to prevent the condition would therefore be preferable to treating it. The aim of this literature review was to evaluate studies reporting the efficacy of the most widely used approaches to prevent GDM, as well as evidence on the efficacy and safety of dietary supplementation. Systematic literature searches were performed in electronic databases, covering the period January 1983 to April 2014. Randomized controlled clinical trials were included. The quality of the articles was evaluated with the Jadad scale. We did not re-evaluate articles already included in the most recent systematic reviews, and we completed the search with trials published thereafter. Of 55 articles identified, 15 randomized controlled trials were eligible. The quality and heterogeneity of the studies do not allow firm conclusions. Nevertheless, trials targeting only energy intake or expenditure mostly reported negative results. In contrast, combined lifestyle programs that pair diet control (orienting food intake, restricting energy intake) with moderate but continuous physical activity show better efficacy in reducing GDM prevalence. Results from dietary supplementation with myo-inositol or probiotics are promising. The current evidence provides enough arguments for implementing large-scale, high-quality randomized controlled trials of these new approaches for preventing GDM.

  12. As a Matter of Force—Systematic Biases in Idealized Turbulence Simulations

    NASA Astrophysics Data System (ADS)

    Grete, Philipp; O’Shea, Brian W.; Beckwith, Kris

    2018-05-01

Many astrophysical systems encompass very large dynamic ranges in space and time, which are not accessible by direct numerical simulations. Thus, idealized subvolumes are often used to study small-scale effects, including the dynamics of turbulence. These turbulent boxes require artificial driving in order to mimic energy injection from large-scale processes. In this Letter, we show and quantify how the autocorrelation time of the driving and its normalization systematically change the properties of an isothermal compressible magnetohydrodynamic flow in the sub- and supersonic regimes and affect astrophysical observables such as Faraday rotation. For example, we find that δ-in-time forcing with constant energy injection leads to a steeper slope in the kinetic energy spectrum and less efficient small-scale dynamo action. In general, we show that shorter autocorrelation times require more power in the acceleration field, which results in more power in compressive modes that weaken the anticorrelation between density and magnetic field strength. Thus, derived observables, such as the line-of-sight (LOS) magnetic field from rotation measures, are systematically biased by the driving mechanism. We argue that δ-in-time forcing is unrealistic and numerically unresolved, and conclude that special care needs to be taken in interpreting observational results based on idealized simulations.
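Finite-autocorrelation driving of this kind is often realized as an Ornstein-Uhlenbeck process, with the δ-in-time limit recovered as the correlation time goes to zero. A one-component sketch (illustrative only, not the Letter's actual forcing module):

```python
import numpy as np

def ou_forcing(n_steps, dt, t_corr, sigma, rng):
    """One component of an Ornstein-Uhlenbeck acceleration field with
    autocorrelation time t_corr and stationary std sigma.
    t_corr -> 0 approaches delta-in-time (uncorrelated) driving."""
    damp = np.exp(-dt / t_corr)                 # per-step correlation
    kick = sigma * np.sqrt(1.0 - damp ** 2)     # keeps variance stationary
    a = np.zeros(n_steps)
    for i in range(1, n_steps):
        a[i] = damp * a[i - 1] + kick * rng.normal()
    return a
```

Because the variance is held fixed while the correlation time shrinks, shorter t_corr forces more rapid sign changes in the acceleration, which is the mechanism behind the extra compressive power the Letter quantifies.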

  13. Detection of the pairwise kinematic Sunyaev-Zel'dovich effect with BOSS DR11 and the Atacama Cosmology Telescope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernardis, F. De; Aiola, S.; Vavagiakis, E. M.

Here, we present a new measurement of the kinematic Sunyaev-Zel'dovich effect using data from the Atacama Cosmology Telescope (ACT) and the Baryon Oscillation Spectroscopic Survey (BOSS). Using 600 square degrees of overlapping sky area, we evaluate the mean pairwise baryon momentum associated with the positions of 50,000 bright galaxies in the BOSS DR11 Large Scale Structure catalog. A non-zero signal arises from the large-scale motions of halos containing the sample galaxies. The data fits an analytical signal model well, with the optical depth to microwave photon scattering as a free parameter determining the overall signal amplitude. We estimate the covariance matrix of the mean pairwise momentum as a function of galaxy separation, using microwave sky simulations, jackknife evaluation, and bootstrap estimates. The most conservative simulation-based errors give signal-to-noise estimates between 3.6 and 4.1 for varying galaxy luminosity cuts. We discuss how the other error determinations can lead to higher signal-to-noise values, and consider the impact of several possible systematic errors. Estimates of the optical depth from the average thermal Sunyaev-Zel'dovich signal at the sample galaxy positions are broadly consistent with those obtained from the mean pairwise momentum signal.
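A commonly used form of the mean pairwise momentum estimator weights pair temperature differences by a line-of-sight geometric factor and bins in separation. The brute-force O(N²) sketch below uses assumed inputs and binning and is not the ACT/BOSS analysis pipeline:

```python
import numpy as np

def mean_pairwise_momentum(pos, temp, r_edges):
    """Pairwise kSZ estimator sketch: p_hat(r) = -sum (T_i - T_j) c_ij / sum c_ij^2,
    with the geometric weight c_ij = r_hat_ij . (r_hat_i + r_hat_j) / 2.
    pos: (N, 3) comoving positions; temp: (N,) aperture temperatures."""
    n = len(pos)
    seps, signal, weights = [], [], []
    for i in range(n):
        for j in range(i + 1, n):
            dvec = pos[i] - pos[j]
            r = np.linalg.norm(dvec)
            ri, rj = np.linalg.norm(pos[i]), np.linalg.norm(pos[j])
            c = dvec @ (pos[i] / ri + pos[j] / rj) / (2.0 * r)
            seps.append(r)
            signal.append((temp[i] - temp[j]) * c)
            weights.append(c * c)
    seps, signal, weights = map(np.asarray, (seps, signal, weights))
    p_est = np.full(len(r_edges) - 1, np.nan)
    for k in range(len(r_edges) - 1):
        m = (seps >= r_edges[k]) & (seps < r_edges[k + 1])
        if weights[m].sum() > 0:
            p_est[k] = -signal[m].sum() / weights[m].sum()
    return p_est
```

Infalling pairs give correlated temperature differences, so a non-zero p_hat(r) traces the large-scale motions described above; the covariance across separation bins is what the simulations, jackknife, and bootstrap estimates quantify.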

  14. Detection of the pairwise kinematic Sunyaev-Zel'dovich effect with BOSS DR11 and the Atacama Cosmology Telescope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernardis, F. De; Vavagiakis, E.M.; Niemack, M.D.

We present a new measurement of the kinematic Sunyaev-Zel'dovich effect using data from the Atacama Cosmology Telescope (ACT) and the Baryon Oscillation Spectroscopic Survey (BOSS). Using 600 square degrees of overlapping sky area, we evaluate the mean pairwise baryon momentum associated with the positions of 50,000 bright galaxies in the BOSS DR11 Large Scale Structure catalog. A non-zero signal arises from the large-scale motions of halos containing the sample galaxies. The data fits an analytical signal model well, with the optical depth to microwave photon scattering as a free parameter determining the overall signal amplitude. We estimate the covariance matrix of the mean pairwise momentum as a function of galaxy separation, using microwave sky simulations, jackknife evaluation, and bootstrap estimates. The most conservative simulation-based errors give signal-to-noise estimates between 3.6 and 4.1 for varying galaxy luminosity cuts. We discuss how the other error determinations can lead to higher signal-to-noise values, and consider the impact of several possible systematic errors. Estimates of the optical depth from the average thermal Sunyaev-Zel'dovich signal at the sample galaxy positions are broadly consistent with those obtained from the mean pairwise momentum signal.

  15. Detection of the Pairwise Kinematic Sunyaev-Zel'dovich Effect with BOSS DR11 and the Atacama Cosmology Telescope

    NASA Technical Reports Server (NTRS)

De Bernardis, F.; Aiola, S.; Vavagiakis, E. M.; Battaglia, N.; Niemack, M. D.; Beall, J.; Becker, D. T.; Bond, J. R.; Calabrese, E.; Cho, H.

    2017-01-01

    We present a new measurement of the kinematic Sunyaev-Zel'dovich effect using data from the Atacama Cosmology Telescope (ACT) and the Baryon Oscillation Spectroscopic Survey (BOSS). Using 600 square degrees of overlapping sky area, we evaluate the mean pairwise baryon momentum associated with the positions of 50,000 bright galaxies in the BOSS DR11 Large Scale Structure catalog. A non-zero signal arises from the large-scale motions of halos containing the sample galaxies. The data fits an analytical signal model well, with the optical depth to microwave photon scattering as a free parameter determining the overall signal amplitude. We estimate the covariance matrix of the mean pairwise momentum as a function of galaxy separation, using microwave sky simulations, jackknife evaluation, and bootstrap estimates. The most conservative simulation-based errors give signal-to-noise estimates between 3.6 and 4.1 for varying galaxy luminosity cuts. We discuss how the other error determinations can lead to higher signal-to-noise values, and consider the impact of several possible systematic errors. Estimates of the optical depth from the average thermal Sunyaev-Zel'dovich signal at the sample galaxy positions are broadly consistent with those obtained from the mean pairwise momentum signal.

  16. Detection of the pairwise kinematic Sunyaev-Zel'dovich effect with BOSS DR11 and the Atacama Cosmology Telescope

    NASA Astrophysics Data System (ADS)

    De Bernardis, F.; Aiola, S.; Vavagiakis, E. M.; Battaglia, N.; Niemack, M. D.; Beall, J.; Becker, D. T.; Bond, J. R.; Calabrese, E.; Cho, H.; Coughlin, K.; Datta, R.; Devlin, M.; Dunkley, J.; Dunner, R.; Ferraro, S.; Fox, A.; Gallardo, P. A.; Halpern, M.; Hand, N.; Hasselfield, M.; Henderson, S. W.; Hill, J. C.; Hilton, G. C.; Hilton, M.; Hincks, A. D.; Hlozek, R.; Hubmayr, J.; Huffenberger, K.; Hughes, J. P.; Irwin, K. D.; Koopman, B. J.; Kosowsky, A.; Li, D.; Louis, T.; Lungu, M.; Madhavacheril, M. S.; Maurin, L.; McMahon, J.; Moodley, K.; Naess, S.; Nati, F.; Newburgh, L.; Nibarger, J. P.; Page, L. A.; Partridge, B.; Schaan, E.; Schmitt, B. L.; Sehgal, N.; Sievers, J.; Simon, S. M.; Spergel, D. N.; Staggs, S. T.; Stevens, J. R.; Thornton, R. J.; van Engelen, A.; Van Lanen, J.; Wollack, E. J.

    2017-03-01

    We present a new measurement of the kinematic Sunyaev-Zel'dovich effect using data from the Atacama Cosmology Telescope (ACT) and the Baryon Oscillation Spectroscopic Survey (BOSS). Using 600 square degrees of overlapping sky area, we evaluate the mean pairwise baryon momentum associated with the positions of 50,000 bright galaxies in the BOSS DR11 Large Scale Structure catalog. A non-zero signal arises from the large-scale motions of halos containing the sample galaxies. The data fits an analytical signal model well, with the optical depth to microwave photon scattering as a free parameter determining the overall signal amplitude. We estimate the covariance matrix of the mean pairwise momentum as a function of galaxy separation, using microwave sky simulations, jackknife evaluation, and bootstrap estimates. The most conservative simulation-based errors give signal-to-noise estimates between 3.6 and 4.1 for varying galaxy luminosity cuts. We discuss how the other error determinations can lead to higher signal-to-noise values, and consider the impact of several possible systematic errors. Estimates of the optical depth from the average thermal Sunyaev-Zel'dovich signal at the sample galaxy positions are broadly consistent with those obtained from the mean pairwise momentum signal.

  17. Detection of the pairwise kinematic Sunyaev-Zel'dovich effect with BOSS DR11 and the Atacama Cosmology Telescope

    DOE PAGES

    Bernardis, F. De; Aiola, S.; Vavagiakis, E. M.; ...

    2017-03-07

Here, we present a new measurement of the kinematic Sunyaev-Zel'dovich effect using data from the Atacama Cosmology Telescope (ACT) and the Baryon Oscillation Spectroscopic Survey (BOSS). Using 600 square degrees of overlapping sky area, we evaluate the mean pairwise baryon momentum associated with the positions of 50,000 bright galaxies in the BOSS DR11 Large Scale Structure catalog. A non-zero signal arises from the large-scale motions of halos containing the sample galaxies. The data fits an analytical signal model well, with the optical depth to microwave photon scattering as a free parameter determining the overall signal amplitude. We estimate the covariance matrix of the mean pairwise momentum as a function of galaxy separation, using microwave sky simulations, jackknife evaluation, and bootstrap estimates. The most conservative simulation-based errors give signal-to-noise estimates between 3.6 and 4.1 for varying galaxy luminosity cuts. We discuss how the other error determinations can lead to higher signal-to-noise values, and consider the impact of several possible systematic errors. Estimates of the optical depth from the average thermal Sunyaev-Zel'dovich signal at the sample galaxy positions are broadly consistent with those obtained from the mean pairwise momentum signal.

  18. Analyzing the cosmic variance limit of remote dipole measurements of the cosmic microwave background using the large-scale kinetic Sunyaev Zel'dovich effect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terrana, Alexandra; Johnson, Matthew C.; Harris, Mary-Jean, E-mail: aterrana@perimeterinstitute.ca, E-mail: mharris8@perimeterinstitute.ca, E-mail: mjohnson@perimeterinstitute.ca

Due to cosmic variance we cannot learn any more about large-scale inhomogeneities from the primary cosmic microwave background (CMB) alone. More information on large scales is essential for resolving large angular scale anomalies in the CMB. Here we consider cross correlating the large-scale kinetic Sunyaev Zel'dovich (kSZ) effect and probes of large-scale structure, a technique known as kSZ tomography. The statistically anisotropic component of the cross correlation encodes the CMB dipole as seen by free electrons throughout the observable Universe, providing information about long wavelength inhomogeneities. We compute the large angular scale power asymmetry, constructing the appropriate transfer functions, and estimate the cosmic variance limited signal to noise for a variety of redshift bin configurations. The signal to noise is significant over a large range of power multipoles and numbers of bins. We present a simple mode counting argument indicating that kSZ tomography can be used to estimate more modes than the primary CMB on comparable scales. A basic forecast indicates that a first detection could be made with next-generation CMB experiments and galaxy surveys. This paper motivates a more systematic investigation of how close to the cosmic variance limit it will be possible to get with future observations.

  19. Aromatherapy for managing menopausal symptoms

    PubMed Central

    Choi, Jiae; Lee, Hye Won; Lee, Ju Ah; Lim, Hyun-Ja; Lee, Myeong Soo

    2018-01-01

Abstract Background: Aromatherapy is often used as a complementary therapy for women's health. This systematic review aims to evaluate the therapeutic effects of aromatherapy as a management for menopausal symptoms. Methods: Eleven electronic databases will be searched from inception to February 2018. Randomized controlled trials that evaluated any type of aromatherapy against any type of control in individuals with menopausal symptoms will be eligible. The methodological quality will be assessed using the Cochrane risk of bias tool. Two authors will independently assess each study for eligibility and risk of bias and extract data. Results: This study will provide a high-quality synthesis of the current evidence on aromatherapy for menopausal symptoms measured with the Menopause Rating Scale, the Kupperman Index, the Greene Climacteric Scale, or other validated questionnaires. Conclusions: The conclusion of our systematic review will provide evidence to judge whether aromatherapy is an effective intervention for women with menopausal symptoms. Ethics and dissemination: Ethical approval will not be required, given that this protocol is for a systematic review. The systematic review will be published in a peer-reviewed journal. The review will also be disseminated electronically and in print. Systematic review registration: PROSPERO CRD42017079191. PMID:29419673

  20. Comprehensive evaluation of an image segmentation technique for measuring tumor volume from CT images

    NASA Astrophysics Data System (ADS)

    Deng, Xiang; Huang, Haibin; Zhu, Lei; Du, Guangwei; Xu, Xiaodong; Sun, Yiyong; Xu, Chenyang; Jolly, Marie-Pierre; Chen, Jiuhong; Xiao, Jie; Merges, Reto; Suehling, Michael; Rinck, Daniel; Song, Lan; Jin, Zhengyu; Jiang, Zhaoxia; Wu, Bin; Wang, Xiaohong; Zhang, Shuai; Peng, Weijun

    2008-03-01

Comprehensive quantitative evaluation of tumor segmentation techniques on large-scale clinical data sets is crucial for routine clinical use of CT-based tumor volumetry in cancer diagnosis and treatment response evaluation. In this paper, we present a systematic validation study of a semi-automatic image segmentation technique for measuring tumor volume from CT images. The segmentation algorithm was tested using clinical data of 200 tumors in 107 patients with liver, lung, lymphoma and other types of cancer. The performance was evaluated using both accuracy and reproducibility. The accuracy was assessed using 7 commonly used metrics that can provide complementary information regarding the quality of the segmentation results. The reproducibility was measured by the variation of the volume measurements from 10 independent segmentations. The effect of disease type, lesion size and slice thickness of image data on the accuracy measures was also analyzed. Our results demonstrate that the tumor segmentation algorithm showed good correlation with ground truth for all four lesion types (r = 0.97, 0.99, 0.97, 0.98, p < 0.0001 for liver, lung, lymphoma and other, respectively). The segmentation algorithm can produce relatively reproducible volume measurements on all lesion types (coefficient of variation in the range of 10-20%). Our results show that the algorithm is insensitive to lesion size (coefficient of determination close to 0) and slice thickness of image data (p > 0.90). The validation framework used in this study has the potential to facilitate the development of new tumor segmentation algorithms and assist large scale evaluation of segmentation techniques for other clinical applications.
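The paper does not list its seven accuracy metrics here, but overlap measures such as the Dice coefficient are standard for segmentation accuracy, and reproducibility is commonly summarized by a coefficient of variation across repeated measurements. A generic sketch of both (illustrative, not the study's exact metric set):

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary segmentation masks (1 = identical)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def volume_cv(volumes):
    """Coefficient of variation (%) across repeated volume measurements,
    e.g. the 10 independent segmentations per lesion."""
    v = np.asarray(volumes, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()
```

Reporting both kinds of metric matters because an algorithm can be accurate on average yet irreproducible across repeated runs, or vice versa.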

  1. Accurate evaluation and analysis of functional genomics data and methods

    PubMed Central

    Greene, Casey S.; Troyanskaya, Olga G.

    2016-01-01

    The development of technology capable of inexpensively performing large-scale measurements of biological systems has generated a wealth of data. Integrative analysis of these data holds the promise of uncovering gene function, regulation, and, in the longer run, understanding complex disease. However, their analysis has proved very challenging, as it is difficult to quickly and effectively assess the relevance and accuracy of these data for individual biological questions. Here, we identify biases that present challenges for the assessment of functional genomics data and methods. We then discuss evaluation methods that, taken together, begin to address these issues. We also argue that the funding of systematic data-driven experiments and of high-quality curation efforts will further improve evaluation metrics so that they more-accurately assess functional genomics data and methods. Such metrics will allow researchers in the field of functional genomics to continue to answer important biological questions in a data-driven manner. PMID:22268703

  2. Infusion phlebitis assessment measures: a systematic review.

    PubMed

    Ray-Barruel, Gillian; Polit, Denise F; Murfield, Jenny E; Rickard, Claire M

    2014-04-01

    Phlebitis is a common and painful complication of peripheral intravenous cannulation. The aim of this review was to identify the measures used in infusion phlebitis assessment and evaluate evidence regarding their reliability, validity, responsiveness and feasibility. We conducted a systematic literature review of the Cochrane library, Ovid MEDLINE and EBSCO CINAHL until September 2013. All English-language studies (randomized controlled trials, prospective cohort and cross-sectional) that used an infusion phlebitis scale were retrieved and analysed to determine which symptoms were included in each scale and how these were measured. We evaluated studies that reported testing the psychometric properties of phlebitis assessment scales using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) guidelines. Infusion phlebitis was the primary outcome measure in 233 studies. Fifty-three (23%) of these provided no actual definition of phlebitis. Of the 180 studies that reported measuring phlebitis incidence and/or severity, 101 (56%) used a scale and 79 (44%) used a definition alone. We identified 71 different phlebitis assessment scales. Three scales had undergone some psychometric analyses, but no scale had been rigorously tested. Many phlebitis scales exist, but none has been thoroughly validated for use in clinical practice. A lack of consensus on phlebitis measures has likely contributed to disparities in reported phlebitis incidence, precluding meaningful comparison of phlebitis rates. © 2014 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.

  3. Disentangling dark energy and cosmic tests of gravity from weak lensing systematics

    NASA Astrophysics Data System (ADS)

    Laszlo, Istvan; Bean, Rachel; Kirk, Donnacha; Bridle, Sarah

    2012-06-01

    We consider the impact of key astrophysical and measurement systematics on constraints on dark energy and modifications to gravity on cosmic scales. We focus on upcoming photometric ‘stage III’ and ‘stage IV’ large-scale structure surveys such as the Dark Energy Survey (DES), the Subaru Measurement of Images and Redshifts survey, the Euclid survey, the Large Synoptic Survey Telescope (LSST) and the Wide-Field Infrared Survey Telescope (WFIRST). We illustrate the different redshift dependencies of gravity modifications compared to intrinsic alignments, the main astrophysical systematic. The way in which systematic uncertainties, such as galaxy bias and intrinsic alignments, are modelled can change the dark energy equation-of-state and modified gravity figures of merit by a factor of 4. The inclusion of cross-correlations of cosmic shear and galaxy position measurements helps reduce the loss of constraining power from the lensing shear surveys. When forecasts for Planck cosmic microwave background and stage IV surveys are combined, constraints on the dark energy equation-of-state parameter and modified gravity model are recovered, relative to those from shear data with no systematic uncertainties, provided fewer than 36 free parameters in total are used to describe the galaxy bias and intrinsic alignment models as a function of scale and redshift. While some uncertainty in the intrinsic alignment (IA) model can be tolerated, it will be important to parametrize IAs well in order to realize the full potential of upcoming surveys. To facilitate future investigations, we also provide a fitting function for the matter power spectrum arising from the phenomenological modified gravity model we consider.

  4. Impact of systemic sclerosis oral manifestations on patients' health-related quality of life: a systematic review.

    PubMed

    Smirani, Rawen; Truchetet, Marie-Elise; Poursac, Nicolas; Naveau, Adrien; Schaeverbeke, Thierry; Devillard, Raphaël

    2018-06-01

    Oropharyngeal features are frequent and often understated in clinical guidelines for the treatment of systemic sclerosis, in spite of important consequences for comfort, esthetics, nutrition and daily life. The aim of this systematic review was to assess the correlation between the oropharyngeal manifestations of systemic sclerosis and patients' health-related quality of life. A systematic search was conducted using four databases (PubMed®, Cochrane Database®, Dentistry & Oral Sciences Source®, and SCOPUS®) up to January 2018, according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). Grey literature and hand searches were also included. Study selection, risk of bias assessment (Newcastle-Ottawa scale) and data extraction were performed by two independent reviewers. The review protocol was registered in the PROSPERO database under the code CRD42018085994. From 375 screened studies, 6 cross-sectional studies were included in the systematic review. The total number of patients included per study ranged from 84 to 178. These studies reported a statistically significant association between oropharyngeal manifestations of systemic sclerosis (mainly assessed by maximal mouth opening and the Mouth Handicap in Systemic Sclerosis scale) and an impaired quality of life (measured by different scales). Studies varied in risk of bias, mostly because of low levels of evidence, different recruiting sources of samples, and different scales used to assess quality of life. This systematic review demonstrates a correlation between oropharyngeal manifestations of systemic sclerosis and impaired quality of life, despite the low level of evidence of the included studies. Large-scale studies are needed to provide stronger evidence of this association. This article is protected by copyright. All rights reserved.

  5. Measurement properties of patient-reported outcome measures (PROMS) in Patellofemoral Pain Syndrome: a systematic review.

    PubMed

    Green, Andrew; Liles, Clive; Rushton, Alison; Kyte, Derek G

    2014-12-01

    This systematic review investigated the measurement properties of disease-specific patient-reported outcome measures used in Patellofemoral Pain Syndrome. Two independent reviewers conducted a systematic search of key databases (MEDLINE, EMBASE, AMED, CINAHL Plus and the Cochrane Library from inception to August 2013) to identify relevant studies. A third reviewer mediated in the event of disagreement. Methodological quality was evaluated using the validated COSMIN (Consensus-based Standards for the Selection of Health Measurement Instruments) tool. Data synthesis across studies determined the level of evidence for each patient-reported outcome measure. The search strategy returned 2177 citations. Following the eligibility review phase, seven studies, evaluating twelve different patient-reported outcome measures, met inclusion criteria. A 'moderate' level of evidence supported the structural validity of several measures: the Flandry Questionnaire, Anterior Knee Pain Scale, Functional Index Questionnaire, Eng and Pierrynowski Questionnaire and Visual Analogue Scales for 'usual' and 'worst' pain. In addition, there was a 'limited' level of evidence supporting the test-retest reliability and validity (cross-cultural, hypothesis testing) of the Persian version of the Anterior Knee Pain Scale. Other measurement properties were evaluated with poor methodological quality, and many properties were not evaluated in any of the included papers. Current disease-specific outcome measures for Patellofemoral Pain Syndrome require further investigation. Future studies should evaluate all important measurement properties, utilising an appropriate framework such as COSMIN to guide study design, to facilitate optimal methodological quality. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. The use of concept mapping for scale development and validation in evaluation.

    PubMed

    Rosas, Scott R; Camphausen, Lauren C

    2007-05-01

    Evaluators often make key decisions about what content to include when designing new scales. However, without clear conceptual grounding, there is a risk these decisions may compromise the scale's validity. Techniques such as concept mapping are available to evaluators for the specification of conceptual frameworks, but have not been used as a fully integrated part of scale development. As part of a multi-site evaluation of family support programs, we integrated concept mapping with traditional scale-development processes to strengthen the creation of a scale for inclusion in an evaluation instrument. Using concept mapping, we engaged staff and managers in the development of a framework of intended benefits of program participation and used the information to systematically select the scale's content. The psychometric characteristics of the scale were then formally assessed using a sample of program participants. The implications of the approach for supporting construct validity, inclusion of staff and managers, and theory-driven evaluation are discussed.

  7. The large-scale organization of metabolic networks

    NASA Astrophysics Data System (ADS)

    Jeong, H.; Tombor, B.; Albert, R.; Oltvai, Z. N.; Barabási, A.-L.

    2000-10-01

    In a cell or microorganism, the processes that generate mass, energy, information transfer and cell-fate specification are seamlessly integrated through a complex network of cellular constituents and reactions. However, despite the key role of these networks in sustaining cellular functions, their large-scale structure is essentially unknown. Here we present a systematic comparative mathematical analysis of the metabolic networks of 43 organisms representing all three domains of life. We show that, despite significant variation in their individual constituents and pathways, these metabolic networks have the same topological scaling properties and show striking similarities to the inherent organization of complex non-biological systems. This may indicate that metabolic organization is not only identical for all living organisms, but also complies with the design principles of robust and error-tolerant scale-free networks, and may represent a common blueprint for the large-scale organization of interactions among all cellular constituents.
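    The "topological scaling properties" referred to here are the power-law degree distributions characteristic of scale-free networks, P(k) ∝ k^(−γ). As a rough illustration of the idea (not the authors' actual analysis; note also that a log-log least-squares fit is a cruder estimator than maximum likelihood), a minimal stdlib-only sketch of estimating γ from an edge list:

```python
# Sketch: estimate the exponent gamma of a power-law degree distribution
# P(k) ~ k^(-gamma) from an undirected edge list, via a least-squares fit
# of log P(k) against log k. Function name and approach are illustrative.
import math
from collections import Counter

def degree_exponent(edges):
    """Fit gamma in P(k) ~ k^-gamma from (u, v) edge pairs."""
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    counts = Counter(degree.values())   # how many nodes have each degree k
    n = len(degree)                     # total number of nodes
    xs = [math.log(k) for k in counts]
    ys = [math.log(c / n) for c in counts.values()]
    # ordinary least-squares slope in log-log space; gamma is its negation
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope
```

    For example, a star graph (one hub of degree n, n leaves of degree 1) yields γ = 1 exactly under this fit; real metabolic networks in the paper show γ ≈ 2.2.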

  8. Validation of a common data model for active safety surveillance research

    PubMed Central

    Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E

    2011-01-01

    Objective: Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example: To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion: There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893
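    The core idea being validated, translating idiosyncratic sources into one shared schema so that a single analytic method runs unchanged against every database instance, can be sketched as follows. This is a deliberately toy schema: the table and column names only loosely echo the real OMOP CDM, whose DDL is far richer.

```python
# Illustrative sketch only: a drastically simplified stand-in for a common
# data model, NOT the actual OMOP CDM DDL. It shows the pattern the paper
# validates: load heterogeneous sources into one schema so that a single
# analytic query runs verbatim against every database instance.
import sqlite3

DDL = """
CREATE TABLE drug_exposure (person_id INTEGER, concept_id INTEGER);
CREATE TABLE condition_occurrence (person_id INTEGER, concept_id INTEGER);
"""

def load_instance(rows_drug, rows_cond):
    """Instantiate the toy model and load one source database's records."""
    db = sqlite3.connect(":memory:")
    db.executescript(DDL)
    db.executemany("INSERT INTO drug_exposure VALUES (?, ?)", rows_drug)
    db.executemany("INSERT INTO condition_occurrence VALUES (?, ?)", rows_cond)
    return db

def exposed_with_outcome(db, drug, condition):
    """One analytic method, reusable unchanged across all CDM instances:
    count distinct persons with both a given exposure and a given outcome."""
    return db.execute(
        """SELECT COUNT(DISTINCT d.person_id)
           FROM drug_exposure d
           JOIN condition_occurrence c ON c.person_id = d.person_id
           WHERE d.concept_id = ? AND c.concept_id = ?""",
        (drug, condition)).fetchone()[0]
```

    The payoff measured in the paper is exactly this reuse: one library of methods executed against 10 independently loaded instances.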

  9. A Systematic Evaluation of Blood Serum and Plasma Pre-Analytics for Metabolomics Cohort Studies

    PubMed Central

    Jobard, Elodie; Trédan, Olivier; Postoly, Déborah; André, Fabrice; Martin, Anne-Laure; Elena-Herrmann, Bénédicte; Boyault, Sandrine

    2016-01-01

    The recent thriving development of biobanks and associated high-throughput phenotyping studies requires the elaboration of large-scale approaches for monitoring biological sample quality and compliance with standard protocols. We present a metabolomic investigation of human blood samples that delineates pitfalls and guidelines for the collection, storage and handling of serum and plasma. Eight technical pre-processing parameters are systematically investigated over ranges commonly encountered across clinical studies. While metabolic fingerprints, as assessed by nuclear magnetic resonance, are not significantly affected by altered centrifugation parameters or by delays between sample pre-processing (blood centrifugation) and storage, our metabolomic investigation highlights that the delay and the storage temperature between blood draw and centrifugation are the primary parameters impacting serum and plasma metabolic profiles. Storing drawn blood at 4 °C is shown to be a reliable routine for confining variability associated with idle time prior to sample pre-processing. Based on their fine sensitivity to pre-analytical parameters and protocol variations, metabolic fingerprints could be exploited as a valuable way to determine compliance with standard procedures and to assess the quality of blood samples within large multi-omic clinical and translational cohort studies. PMID:27929400

  10. Patient-reported outcome instruments that evaluate adherence behaviours in adults with asthma: A systematic review of measurement properties.

    PubMed

    Gagné, Myriam; Boulet, Louis-Philippe; Pérez, Norma; Moisan, Jocelyne

    2018-04-30

    To systematically identify the measurement properties of patient-reported outcome instruments (PROs) that evaluate adherence to inhaled maintenance medication in adults with asthma. We conducted a systematic review of six databases. Two reviewers independently included studies on the measurement properties of PROs that evaluated adherence in asthmatic participants aged ≥18 years. Based on the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN), the reviewers (1) extracted data on internal consistency, reliability, measurement error, content validity, structural validity, hypotheses testing, cross-cultural validity, criterion validity, and responsiveness; (2) assessed the methodological quality of the included studies; (3) assessed the quality of the measurement properties (positive or negative); and (4) summarised the level of evidence (limited, moderate, or strong). We screened 6,068 records and included 15 studies (14 PROs). No studies evaluated measurement error or responsiveness. Based on methodological and measurement property quality assessments, we found limited positive evidence of: (a) internal consistency of the Adherence Questionnaire, Refined Medication Adherence Reason Scale (MAR-Scale), Medication Adherence Report Scale for Asthma (MARS-A), and Test of the Adherence to Inhalers (TAI); (b) reliability of the TAI; and (c) structural validity of the Adherence Questionnaire, MAR-Scale, MARS-A, and TAI. We also found limited negative evidence of: (d) hypotheses testing of the Adherence Questionnaire; (e) reliability of the MARS-A; and (f) criterion validity of the MARS-A and TAI. Our results highlight the need for further high-quality studies evaluating the reliability, validity, and responsiveness of the available PROs. This article is protected by copyright. All rights reserved.

  11. Deciphering landslide behavior using large-scale flume experiments

    USGS Publications Warehouse

    Reid, Mark E.; Iverson, Richard M.; Iverson, Neal R.; LaHusen, Richard G.; Brien, Dianne L.; Logan, Matthew

    2008-01-01

    Landslides can be triggered by a variety of hydrologic events, and they can exhibit a wide range of movement dynamics. Effective prediction requires understanding these diverse behaviors. Precise evaluation in the field is difficult; as an alternative, we performed a series of landslide initiation experiments in the large-scale USGS debris-flow flume. We systematically investigated the effects of three different hydrologic triggering mechanisms: groundwater exfiltration from bedrock, prolonged rainfall infiltration, and intense bursts of rain. We also examined the effects of initial soil porosity (loose or dense) relative to the soil’s critical-state porosity. Results show that all three hydrologic mechanisms can instigate landsliding, but water pathways, sensor response patterns, and times to failure differ. Initial soil porosity has a profound influence on landslide movement behavior. Experiments using loose soil show rapid soil contraction during failure, with elevated pore pressures liquefying the sediment and creating fast-moving debris flows. In contrast, dense soil dilated upon shearing, resulting in slow, gradual, and episodic motion. These results have fundamental implications for forecasting landslide behavior and developing effective warning systems.

  12. Not a Copernican observer: biased peculiar velocity statistics in the local Universe

    NASA Astrophysics Data System (ADS)

    Hellwing, Wojciech A.; Nusser, Adi; Feix, Martin; Bilicki, Maciej

    2017-05-01

    We assess the effect of the local large-scale structure on the estimation of two-point statistics of the observed radial peculiar velocities of galaxies. A large N-body simulation is used to examine these statistics from the perspective of random observers as well as 'Local Group-like' observers conditioned to reside in an environment resembling the observed Universe within 20 Mpc. The local environment systematically distorts the shape and amplitude of velocity statistics with respect to ensemble-averaged measurements made by a Copernican (random) observer. The Virgo cluster has the most significant impact, introducing large systematic deviations in all the statistics. For a simple 'top-hat' selection function, an idealized survey extending to ~160 h⁻¹ Mpc or deeper is needed to completely mitigate the effects of the local environment. Using shallower catalogues leads to systematic deviations of the order of 50-200 per cent depending on the scale considered. For a flat redshift distribution similar to that of the CosmicFlows-3 survey, the deviations are even more prominent in both shape and amplitude at all separations considered (≲100 h⁻¹ Mpc). Conclusions based on statistics calculated without taking into account the impact of the local environment should be revisited.

  13. Economic evaluation of vaccines in Canada: A systematic review

    PubMed Central

    Chit, Ayman; Lee, Jason K. H.; Shim, Minsup; Nguyen, Van Hai; Grootendorst, Paul; Wu, Jianhong; Van Exan, Robert; Langley, Joanne M.

    2016-01-01

    Background: Economic evaluations should form part of the basis for public health decision making on new vaccine programs. While Canada's national immunization advisory committee does not systematically include economic evaluations in immunization decision making, there is increasing interest in adopting them. We therefore sought to examine the extent and quality of economic evaluations of vaccines in Canada. Objective: We conducted a systematic review of economic evaluations of vaccines in Canada to determine and summarize: comprehensiveness across jurisdictions, studied vaccines, funding sources, study designs, research quality, and changes over time. Methods: Searches in multiple databases were conducted using the terms “vaccine,” “economics” and “Canada.” Descriptive data from eligible manuscripts were abstracted, and three authors independently evaluated manuscript quality using a 7-point Likert-type scale scoring tool based on criteria from the International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Results: 42/175 articles met the search criteria. Of these, Canada-wide studies were most common (25/42), while provincial studies largely focused on the three populous provinces of Ontario, Quebec and British Columbia. The most common funding source was industry (17/42), followed by government (7/42). Thirty-eight studies used mathematical models estimating expected economic benefit, while 4 studies examined post-hoc data on established programs. Studies covered 10 diseases, with 28/42 addressing pediatric vaccines. Many studies considered cost-utility (22/42), and the majority of these reported favorable economic results (16/22). The mean quality score was 5.9/7 and was consistent across publication dates, funding sources, and disease areas. Conclusions: We observed diverse approaches to evaluating vaccine economics in Canada. Given the increased complexity of economic studies evaluating vaccines and the impact of results on public health practice, Canada needs improved, transparent and consistent processes to review and assess the findings of economic evaluations of vaccines. PMID:26890128

  14. A Comparison of Systematic Screening Tools for Emotional and Behavioral Disorders: A Replication

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Kalberg, Jemma Robertson; Lambert, E. Warren; Crnobori, Mary; Bruhn, Allison Leigh

    2010-01-01

    In this article, the authors examine the psychometric properties of the Student Risk Screening Scale (SRSS), including evaluating the concurrent validity of the SRSS to predict results from the Systematic Screening for Behavior Disorders (SSBD) when used to detect school children with externalizing or internalizing behavior concerns at three…

  15. Eating Disorders in Non-Dance Performing Artists: A Systematic Literature Review.

    PubMed

    Kapsetaki, Marianna E; Easmon, Charlie

    2017-12-01

    Previous literature on dancers and athletes has shown a large impact of eating disorders (EDs) on these individuals, but there is limited research on EDs affecting non-dance performing artists (i.e., musicians, actors, etc.). This systematic review aimed to identify and evaluate the literature on EDs in non-dance performing artists. A systematic review of the literature was performed on 24 databases, using search terms related to EDs and non-dance performing artists. All results from the databases were systematically screened against inclusion and exclusion criteria. The initial search returned 86,383 total articles, which after screening and removal of duplicates and irrelevant papers yielded 129 results. After screening the 129 full-text results for eligibility, 10 studies met criteria for inclusion: 6 papers addressed EDs in musicians, and 4 papers addressed EDs in theatre performers. Most studies used questionnaires and body mass index (BMI) as diagnostic tools for EDs. Most were small-scale studies, and participants were mostly students. Because of the studies' heterogeneity and varying quality, the results obtained were often contradictory and questionable. Although there is a large literature on dancers, we found relatively few studies of EDs in other performing artists, and most reported inconsistent findings.

  16. Uneven flows: On cosmic bulk flows, local observers, and gravity

    NASA Astrophysics Data System (ADS)

    Hellwing, Wojciech A.; Bilicki, Maciej; Libeskind, Noam I.

    2018-05-01

    Using N-body simulations we study the impact of various systematic effects on the low-order moments of the cosmic velocity field: the bulk flow (BF) and the cosmic Mach number (CMN). We consider two types of systematics: those related to survey properties and those induced by the observer's location in the Universe. In the former category we model sparse sampling, velocity errors, and survey incompleteness (radial and geometrical). In the latter, we consider Local Group (LG) analogue observers, placed in a specific location within the cosmic web, satisfying various observational criteria. We differentiate such LG observers from Copernican ones, who are at random locations. We report strong systematic effects on the measured BF and CMN induced by sparse sampling, velocity errors and radial incompleteness. For BF most of these effects exceed 10% for scales R ≲ 100 h⁻¹ Mpc. For CMN some of these systematics can be catastrophically large (i.e., >50%) also on bigger scales. Moreover, we find that the position of the observer in the cosmic web significantly affects the locally measured BF (CMN), with effects as large as ~20% (30%) at R ≲ 50 h⁻¹ Mpc for an LG-like observer as compared to a random one. This effect is comparable to the sample variance at the same scales. Such location-dependent effects have not been considered previously in BF and CMN studies, and here we report their magnitude and scale for the first time. To highlight the importance of these systematics, we additionally study a model of modified gravity with a ~15% enhanced growth rate (compared to general relativity). We found that the systematic effects can mimic the modified gravity signal. The worst-case scenario is realized for an LG-like observer, when the effects induced by local structures are degenerate with the enhanced growth rate fostered by modified gravity. 
Our results indicate that dedicated constrained simulations and realistic mock galaxy catalogs will be absolutely necessary to fully benefit from the statistical power of the forthcoming peculiar velocity data from surveys such as TAIPAN, WALLABY, COSMICFLOWS-4 and SKA.
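    For concreteness, the two low-order moments discussed above can be computed from a sample of 3D peculiar velocities as follows. Conventions differ across the literature (e.g. window-function weighting of tracers), so this sketch uses the simplest unweighted definitions as an assumption, not necessarily the authors' estimator: the bulk flow is the magnitude of the mean velocity vector within a sphere, and the cosmic Mach number is that bulk flow divided by the velocity dispersion about it.

```python
# Hedged sketch (unweighted definitions, illustrative only):
# BF  = |mean of the 3D peculiar-velocity vectors|
# CMN = BF / sigma, with sigma the dispersion of residuals about the bulk motion
import math

def bulk_flow_and_mach(velocities):
    """velocities: sequence of (vx, vy, vz) peculiar velocities in km/s."""
    n = len(velocities)
    mean = [sum(v[i] for v in velocities) / n for i in range(3)]
    bf = math.sqrt(sum(m * m for m in mean))
    # 3D dispersion of the residual velocities about the bulk motion
    var = sum(sum((v[i] - mean[i]) ** 2 for i in range(3))
              for v in velocities) / n
    sigma = math.sqrt(var)
    return bf, (bf / sigma if sigma > 0 else float("inf"))
```

    Sparse sampling and velocity errors bias both quantities because the mean and the dispersion respond differently to noise, which is the sensitivity the paper quantifies.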

  17. [Measurement of shoulder disability in the athlete: a systematic review].

    PubMed

    Fayad, F; Mace, Y; Lefevre-Colau, M M; Poiraudeau, S; Rannou, F; Revel, M

    2004-08-01

    To identify all available shoulder disability questionnaires and to examine those that could be used for athletes. We systematically reviewed the literature in Medline using the keywords shoulder, function, scale, index, score, questionnaire, disability, quality of life, assessment, and evaluation. We searched for scales used for athletes with the keywords scale name AND (sport OR athlete). Data were completed by using the "Guide des Outils de Mesure et d'Evaluation en Médecine Physique et de Réadaptation" textbook. Analysis took into account the clinimetric quality of the instruments and the number of items specifically related to sports. A total of 37 instruments have been developed to measure disease-specific, shoulder-specific or upper-extremity-specific outcomes. Older instruments were developed before the advent of modern measurement methods; they usually combined objective and subjective measures. Recent instruments were designed with more advanced methods, and most are self-administered questionnaires. Fourteen scales included items assessing sport activity. Four of these scales have been used to assess shoulder disability in athletes. Six scales have been used to assess such disability but do not have specific items related to sports. There is no gold standard for assessing shoulder outcome in the general population and no validated outcome instrument specifically for athletes. We suggest the use of the ASES, WOSI and WORC scales for evaluating shoulder function in recreational athletes. The DASH scale should be evaluated in this population. The principal criterion in evaluating shoulder function in the high-level athlete is a return to the same level of sport performance. Further studies are required to identify measurement tools for shoulder disability that have high predictive value for return to sport.

  18. Methodologic quality of meta-analyses and systematic reviews on the Mediterranean diet and cardiovascular disease outcomes: a review.

    PubMed

    Huedo-Medina, Tania B; Garcia, Marissa; Bihuniak, Jessica D; Kenny, Anne; Kerstetter, Jane

    2016-03-01

    Several systematic reviews/meta-analyses published within the past 10 years have examined the associations of Mediterranean-style diets (MedSDs) with cardiovascular disease (CVD) risk. However, these reviews have not been evaluated for compliance with contemporary methodologic quality standards. This study evaluated the quality of recent systematic reviews/meta-analyses on MedSD and CVD risk outcomes by using an established methodologic quality scale. The relation between review quality and the impact-per-publication value of the journal in which the article had been published was also evaluated. To assess compliance with current standards, we applied a modified version of the Assessment of Multiple Systematic Reviews (AMSTARMedSD) quality scale to systematic reviews/meta-analyses retrieved from electronic databases that met our selection criteria: 1) used systematic or meta-analytic procedures to review the literature, 2) examined MedSD trials, and 3) had MedSD interventions independently or combined with other interventions. Reviews completely satisfied from 8% to 75% of the AMSTARMedSD items (mean ± SD: 31.2% ± 19.4%), with those published in higher-impact journals having greater quality scores. At a minimum, 60% of the 24 reviews did not disclose full search details or apply appropriate statistical methods to combine study findings. Only 5 of the reviews included participant or study characteristics in their analyses, and none evaluated MedSD diet characteristics. These data suggest that current meta-analyses/systematic reviews evaluating the effect of MedSD on CVD risk do not fully comply with contemporary methodologic quality standards. As a result, more research questions remain to be answered to enhance our understanding of how MedSD affects CVD risk and how these effects may be modified by participant or MedSD characteristics. To clarify the associations between MedSD and CVD risk, future meta-analyses and systematic reviews should not only follow methodologic quality standards but also include more statistical modeling results when data allow. © 2016 American Society for Nutrition.

  19. A national evaluation of a dissemination and implementation initiative to enhance primary care practice capacity and improve cardiovascular disease care: the ESCALATES study protocol.

    PubMed

    Cohen, Deborah J; Balasubramanian, Bijal A; Gordon, Leah; Marino, Miguel; Ono, Sarah; Solberg, Leif I; Crabtree, Benjamin F; Stange, Kurt C; Davis, Melinda; Miller, William L; Damschroder, Laura J; McConnell, K John; Creswell, John

    2016-06-29

    The Agency for Healthcare Research and Quality (AHRQ) launched the EvidenceNOW Initiative to rapidly disseminate and implement evidence-based cardiovascular disease (CVD) preventive care in smaller primary care practices. AHRQ funded eight grantees (seven regional Cooperatives and one independent national evaluation) to participate in EvidenceNOW. The national evaluation examines quality improvement efforts and outcomes for more than 1500 small primary care practices (restricted to those with fewer than ten physicians per clinic). The Cooperatives provide external support to these practices; examples include practice facilitation, expert consultation, performance feedback, and educational materials and activities. This paper describes the study protocol for the EvidenceNOW national evaluation, which is called Evaluating System Change to Advance Learning and Take Evidence to Scale (ESCALATES). This prospective observational study will examine the portfolio of EvidenceNOW Cooperatives using both qualitative and quantitative data. Qualitative data include: online implementation diaries, observation and interviews at Cooperatives and practices, and systematic assessment of context from the perspective of Cooperative team members. Quantitative data include: practice-level performance on clinical quality measures (aspirin prescribing, blood pressure and cholesterol control, and smoking cessation; ABCS) collected by Cooperatives from electronic health records (EHRs); practice and practice member surveys to assess practice capacity and other organizational and structural characteristics; and systematic tracking of intervention delivery. Quantitative, qualitative, and mixed methods analyses will be conducted to examine how Cooperatives organize to provide external support to practices, to compare effectiveness of the dissemination and implementation approaches they implement, and to examine how regional variations and other organization and contextual factors influence implementation and effectiveness. 
ESCALATES is a national evaluation of an ambitious large-scale dissemination and implementation effort focused on transforming smaller primary care practices. Insights will help to inform the design of national health care practice extension systems aimed at supporting practice transformation efforts in the USA. NCT02560428 (09/21/15).

  20. Benefits and Challenges of Scaling Up Expansion of Marine Protected Area Networks in the Verde Island Passage, Central Philippines.

    PubMed

    Horigue, Vera; Pressey, Robert L; Mills, Morena; Brotánková, Jana; Cabral, Reniel; Andréfouët, Serge

    2015-01-01

    Locally-established marine protected areas (MPAs) have been proven to achieve local-scale fisheries and conservation objectives. However, since many of these MPAs were not designed to form ecologically-connected networks, their contributions to broader-scale goals such as complementarity and connectivity can be limited. In contrast, integrated networks of MPAs designed with systematic conservation planning are assumed to be more effective--ecologically, socially, and economically--than collections of locally-established MPAs. There is, however, little empirical evidence that clearly demonstrates the supposed advantages of systematic MPA networks. A key reason is the poor record of implementation of systematic plans attributable to lack of local buy-in. An intermediate scenario for the expansion of MPAs is scaling up of local decisions, whereby locally-driven MPA initiatives are coordinated through collaborative partnerships among local governments and their communities. Coordination has the potential to extend the benefits of individual MPAs and perhaps to approach the potential benefits offered by systematic MPA networks. We evaluated the benefits of scaling up local MPAs to form networks by simulating seven expansion scenarios for MPAs in the Verde Island Passage, central Philippines. The scenarios were: uncoordinated community-based establishment of MPAs; two scenarios reflecting different levels of coordinated MPA expansion through collaborative partnerships; and four scenarios guided by systematic conservation planning with different contexts for governance. For each scenario, we measured benefits through time in terms of achievement of objectives for representation of marine habitats. 
We found that: in any governance context, systematic networks were more efficient than non-systematic ones; systematic networks were more efficient in broader governance contexts; and, contrary to expectations but with caveats, the uncoordinated scenario was slightly more efficient than the coordinated scenarios. Overall, however, coordinated MPA networks have the potential to be more efficient than the uncoordinated ones, especially when coordinated planning uses systematic methods.

  1. Benefits and Challenges of Scaling Up Expansion of Marine Protected Area Networks in the Verde Island Passage, Central Philippines

    PubMed Central

    Horigue, Vera; Pressey, Robert L.; Mills, Morena; Brotánková, Jana; Cabral, Reniel; Andréfouët, Serge

    2015-01-01

    Locally-established marine protected areas (MPAs) have been proven to achieve local-scale fisheries and conservation objectives. However, since many of these MPAs were not designed to form ecologically-connected networks, their contributions to broader-scale goals such as complementarity and connectivity can be limited. In contrast, integrated networks of MPAs designed with systematic conservation planning are assumed to be more effective—ecologically, socially, and economically—than collections of locally-established MPAs. There is, however, little empirical evidence that clearly demonstrates the supposed advantages of systematic MPA networks. A key reason is the poor record of implementation of systematic plans attributable to lack of local buy-in. An intermediate scenario for the expansion of MPAs is scaling up of local decisions, whereby locally-driven MPA initiatives are coordinated through collaborative partnerships among local governments and their communities. Coordination has the potential to extend the benefits of individual MPAs and perhaps to approach the potential benefits offered by systematic MPA networks. We evaluated the benefits of scaling up local MPAs to form networks by simulating seven expansion scenarios for MPAs in the Verde Island Passage, central Philippines. The scenarios were: uncoordinated community-based establishment of MPAs; two scenarios reflecting different levels of coordinated MPA expansion through collaborative partnerships; and four scenarios guided by systematic conservation planning with different contexts for governance. For each scenario, we measured benefits through time in terms of achievement of objectives for representation of marine habitats. 
We found that: in any governance context, systematic networks were more efficient than non-systematic ones; systematic networks were more efficient in broader governance contexts; and, contrary to expectations but with caveats, the uncoordinated scenario was slightly more efficient than the coordinated scenarios. Overall, however, coordinated MPA networks have the potential to be more efficient than the uncoordinated ones, especially when coordinated planning uses systematic methods. PMID:26288089

  2. The Cosmology Large Angular Scale Surveyor

    NASA Astrophysics Data System (ADS)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John W.; Bennett, Charles L.; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; Dahal, Sumit; Denis, Kevin; Dünner, Rolando; Eimer, Joseph; Essinger-Hileman, Thomas; Fluxa, Pedro; Halpern, Mark; Hilton, Gene; Hinshaw, Gary F.; Hubmayr, Johannes; Iuliano, Jeffrey; Karakla, John; McMahon, Jeff; Miller, Nathan T.; Moseley, Samuel H.; Palma, Gonzalo; Parker, Lucas; Petroff, Matthew; Pradenas, Bastián.; Rostem, Karwan; Sagliocca, Marco; Valle, Deniz; Watts, Duncan; Wollack, Edward; Xu, Zhilei; Zeng, Lingzhen

    2016-07-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  3. The Cosmology Large Angular Scale Surveyor (CLASS)

    NASA Technical Reports Server (NTRS)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John W.; Bennett, Charles L.; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; et al.

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  4. Systematic review of empowerment measures in health promotion.

    PubMed

    Cyril, Sheila; Smith, Ben J; Renzaho, Andre M N

    2016-12-01

    Empowerment, a multi-level construct comprising individual, community and organizational domains, is a fundamental value and goal in health promotion. While a range of scales have been developed for the measurement of empowerment, the qualities of these have not been rigorously assessed. The aim of this study was to evaluate the measurement properties of quantitative empowerment scales and their applicability in health promotion programs. A systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines was done to evaluate empowerment scales across three dimensions: item development, reliability and validity. This was followed by assessment of measurement properties using a ratings scale with criteria addressing an a priori explicit theoretical framework, assessment of content validity, internal consistency and factor analysis to test structural validity. Of the 20 studies included in this review, only 8 (40%) used literature reviews, expert panels and empirical studies to develop scale items and 9 (45%) of studies fulfilled ≥5 criteria on the ratings scale. Two studies (10%) measured community empowerment and one study measured organizational empowerment, the rest (85%) measured individual empowerment. This review highlights important gaps in the measurement of community and organizational domains of empowerment using quantitative scales. A priority for future empowerment research is to investigate and explore approaches such as mixed methods to enable adequate measurement of empowerment across all three domains. This would help health promotion practitioners to effectively measure empowerment as a driver of change and an outcome in health promotion programs. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
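    The internal-consistency criterion the review rates scales against is conventionally summarized with Cronbach's alpha. A minimal sketch on synthetic item scores (the data below are illustrative, not from any reviewed study):

    ```python
    from statistics import variance

    def cronbach_alpha(items):
        """Cronbach's alpha for a matrix of respondents (rows) by scale items (columns)."""
        k = len(items[0])
        item_vars = sum(variance(col) for col in zip(*items))  # sum of per-item variances
        total_var = variance([sum(row) for row in items])      # variance of the summed scale
        return k / (k - 1) * (1 - item_vars / total_var)

    # Hypothetical 5-item empowerment scale answered by 6 respondents.
    scores = [
        [4, 5, 4, 4, 5],
        [2, 2, 3, 2, 2],
        [5, 5, 5, 4, 5],
        [3, 3, 2, 3, 3],
        [4, 4, 4, 5, 4],
        [1, 2, 1, 1, 2],
    ]
    alpha = cronbach_alpha(scores)  # high alpha: the items move together
    ```

    Alpha near 1 indicates the items covary strongly; values below roughly 0.7 are usually taken to signal weak internal consistency.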

  5. Sustainable Utilization of Traditional Chinese Medicine Resources: Systematic Evaluation on Different Production Modes

    PubMed Central

    Li, Xiwen; Chen, Yuning; Yang, Qing; Wang, Yitao

    2015-01-01

    The usage of medicinal plants has increased rapidly with the growth of the traditional Chinese medicine industry. Rising market demand and the shortage of wild herbal resources have driven large-scale introduction and cultivation. Cultivation can ease the current imbalance between supply of and demand for medicinal resources, but it brings new problems such as pesticide residues and plant diseases and pests. Researchers have recently placed high hopes on natural fostering, a new method that integrates herbal production with biodiversity protection and can address the problems caused by artificial cultivation. However, no single mode can solve all the problems in current herbal production. This study evaluated different production modes, including cultivation, natural fostering, and wild collection, to guide traditional Chinese medicine production toward sustainable utilization of herbal resources. PMID:26074987

  6. A quasi-experimental feasibility study to determine the effect of a systematic treatment programme on the scores of the Nottingham Adjustment Scale of individuals with visual field deficits following stroke.

    PubMed

    Taylor, Lisa; Poland, Fiona; Harrison, Peter; Stephenson, Richard

    2011-01-01

    To evaluate a systematic treatment programme developed by the researcher that targeted aspects of visual functioning affected by visual field deficits following stroke. The study design was a non-equivalent control (conventional) group pretest-posttest quasi-experimental feasibility design, using multisite data collection methods at specified stages. The study was undertaken within three acute hospital settings as outpatient follow-up sessions. Individuals who had visual field deficits three months post stroke were studied. A treatment group received routine occupational therapy and an experimental group received, in addition, a systematic treatment programme. The treatment phase of both groups lasted six weeks. The Nottingham Adjustment Scale, a measure developed specifically for visual impairment, was used as the primary outcome measure. The change in Nottingham Adjustment Scale score was compared between the experimental (n = 7) and conventional (n = 8) treatment groups using the Wilcoxon signed ranks test. The result of Z = -2.028 (P = 0.043) showed that there was a statistically significant difference between the change in Nottingham Adjustment Scale score between both groups. The introduction of the systematic treatment programme resulted in a statistically significant change in the scores of the Nottingham Adjustment Scale.
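    The comparison reported above is a rank-based test on change scores. With two independent groups (n = 7 vs n = 8) the rank-sum (Mann-Whitney) form of the test applies; a self-contained sketch on synthetic change scores (the study's raw data are not given in the abstract):

    ```python
    import math
    from itertools import product

    def mann_whitney_u(x, y):
        """Two-sided Mann-Whitney U test via the normal approximation
        (a quick sketch; exact tables are preferable at sample sizes this small)."""
        # U counts, over all cross-group pairs, how often x exceeds y (ties count 1/2).
        u = sum(1.0 if a > b else 0.5 if a == b else 0.0 for a, b in product(x, y))
        n1, n2 = len(x), len(y)
        mu = n1 * n2 / 2.0
        sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
        z = (u - mu) / sigma
        p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided tail probability
        return u, p

    # Synthetic Nottingham Adjustment Scale change scores (illustrative only).
    experimental = [8, 5, 9, 6, 7, 10, 4]    # systematic programme, n = 7
    conventional = [2, 3, 1, 4, 0, 2, 3, 1]  # routine therapy, n = 8
    u, p = mann_whitney_u(experimental, conventional)
    ```

    A small p with clearly separated change scores mirrors the kind of result the abstract reports, though the synthetic numbers here carry no clinical meaning.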

  7. Strategies for delivering insecticide-treated nets at scale for malaria control: a systematic review

    PubMed Central

    Paintain, Lucy Smith; Mangham, Lindsay; Car, Josip; Schellenberg, Joanna Armstrong

    2012-01-01

    Abstract Objective To synthesize findings from recent studies of strategies to deliver insecticide-treated nets (ITNs) at scale in malaria-endemic areas. Methods Databases were searched for studies published between January 2000 and December 2010 in which: subjects resided in areas with endemicity for Plasmodium falciparum and Plasmodium vivax malaria; ITN delivery at scale was evaluated; ITN ownership among households, receipt by pregnant women and/or use among children aged < 5 years was evaluated; and the study design was an individual or cluster-randomized controlled design, nonrandomized, quasi-experimental, before-and-after, interrupted time series or cross-sectional without temporal or geographical controls. Papers describing qualitative studies, case studies, process evaluations and cost-effectiveness studies linked to an eligible paper were also included. Study quality was assessed using the Cochrane risk of bias checklist and GRADE criteria. Important influences on scaling up were identified and assessed across delivery strategies. Findings A total of 32 papers describing 20 African studies were reviewed. Many delivery strategies involved health sectors and retail outlets (partial subsidy), antenatal care clinics (full subsidy) and campaigns (full subsidy). Strategies achieving high ownership among households and use among children < 5 delivered ITNs free through campaigns. Costs were largely comparable across strategies; ITNs were the main cost. Cost-effectiveness estimates were most sensitive to the assumed net lifespan and leakage. Common barriers to delivery included cost, stock-outs and poor logistics. Common facilitators were staff training and supervision, cooperation across departments or ministries and stakeholder involvement. Conclusion There is a broad taxonomy of strategies for delivering ITNs at scale. PMID:22984312

  8. Construction and comparison of parallel implicit kinetic solvers in three spatial dimensions

    NASA Astrophysics Data System (ADS)

    Titarev, Vladimir; Dumbser, Michael; Utyuzhnikov, Sergey

    2014-01-01

    The paper is devoted to the further development and systematic performance evaluation of a recent deterministic framework Nesvetay-3D for modelling three-dimensional rarefied gas flows. Firstly, a review of the existing discretization and parallelization strategies for solving numerically the Boltzmann kinetic equation with various model collision integrals is carried out. Secondly, a new parallelization strategy for the implicit time evolution method is implemented which improves scaling on large CPU clusters. Accuracy and scalability of the methods are demonstrated on a pressure-driven rarefied gas flow through a finite-length circular pipe as well as an external supersonic flow over a three-dimensional re-entry geometry of complicated aerodynamic shape.

  9. Getting the Most Bang from Your Volunteer Hour: Easy Assessments in the Dark Skies, Bright Kids Program

    NASA Astrophysics Data System (ADS)

    Beaton, R. L.; Sokal, K. R.; Liss, S. E.; Johnson, K. E.

    2015-11-01

    Dark Skies, Bright Kids! (DSBK) is an outreach organization that seeks to enhance elementary-level science literacy and encourage inquiry through fun, hands-on activities. DSBK was formed by, and is operated through, volunteer efforts from professional scientists at all career stages, e.g., from first-year undergraduate students to tenured professors. Although DSBK has amassed over 14,000 contact hours since 2009, there has been no formal evaluation of the program's impacts. Over the past year, DSBK introduced a large-scale, student-led internal assessment program with the systematic evaluation of student workbooks, volunteer surveys, and observations. While the data indicated broad-scale success for the program for both of its goals, it also revealed the organizational and educational practices that not only maximized student achievement, but also created the largest overall volunteer satisfaction with their time commitment. Here we describe DSBK in detail, summarize the student-led implementation of the assessment program, discuss how the results of the assessments have positively impacted our operations, and generalize these results for other scientist-led outreach efforts.

  10. The Effects of Systematic Training for Effective Parenting on Parental Attitudes.

    ERIC Educational Resources Information Center

    Nystul, Michael S.

    1982-01-01

    The Attitude toward the Freedom of Children Scale and the revised Parent Attitude Research Instrument were administered to 28 Australian mothers. Half of the mothers attended a nine-week course in Systematic Training for Effective Parenting (STEP), while the remaining half acted as the control group. A one-way analysis of variance evaluated the…

  11. The role and benefits of accessing primary care patient records during unscheduled care: a systematic review.

    PubMed

    Bowden, Tom; Coiera, Enrico

    2017-09-22

    The purpose of this study was to assess the impact of accessing primary care records on unscheduled care. Unscheduled care is typically delivered in hospital Emergency Departments. Studies published to December 2014 reporting on primary care record access during unscheduled care were retrieved. Twenty-two articles met inclusion criteria from a pool of 192. Many shared electronic health records (SEHRs) were large in scale, servicing many millions of patients. Reported utilization rates by clinicians were variable, with rates >20% amongst health management organizations but much lower in nation-scale systems. No study reported on clinical outcomes or patient safety, and no economic studies of SEHR access during unscheduled care were available. Design factors that may affect utilization included consent and access models, SEHR content, and system usability and reliability. Despite their size and expense, SEHRs designed to support unscheduled care have been poorly evaluated, and it is not possible to draw conclusions about any likely benefits associated with their use. Heterogeneity across the systems and the populations they serve makes generalization about system design or performance difficult. None of the reviewed studies used a theoretical model to guide evaluation. Value of Information models may be a useful theoretical approach to design evaluation metrics, facilitating comparison across systems in future studies. Well-designed SEHRs should in principle be capable of improving the efficiency, quality and safety of unscheduled care, but at present the evidence for such benefits is weak, largely because it has not been sought.

  12. Systematic Identification of Combinatorial Drivers and Targets in Cancer Cell Lines

    PubMed Central

    Tabchy, Adel; Eltonsy, Nevine; Housman, David E.; Mills, Gordon B.

    2013-01-01

    There is an urgent need to elicit and validate highly efficacious targets for combinatorial intervention from large scale ongoing molecular characterization efforts of tumors. We established an in silico bioinformatic platform in concert with a high throughput screening platform evaluating 37 novel targeted agents in 669 extensively characterized cancer cell lines reflecting the genomic and tissue-type diversity of human cancers, to systematically identify combinatorial biomarkers of response and co-actionable targets in cancer. Genomic biomarkers discovered in a 141 cell line training set were validated in an independent 359 cell line test set. We identified co-occurring and mutually exclusive genomic events that represent potential drivers and combinatorial targets in cancer. We demonstrate multiple cooperating genomic events that predict sensitivity to drug intervention independent of tumor lineage. The coupling of scalable in silico and biologic high throughput cancer cell line platforms for the identification of co-events in cancer delivers rational combinatorial targets for synthetic lethal approaches with a high potential to pre-empt the emergence of resistance. PMID:23577104
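    Screens for co-occurring versus mutually exclusive genomic events of the kind described here are conventionally scored with a Fisher exact test on a 2x2 contingency table. A self-contained sketch with hypothetical mutation counts (not taken from the paper):

    ```python
    from math import comb

    def fisher_right_tail(a, b, c, d):
        """Right-tailed Fisher exact p for the 2x2 table [[a, b], [c, d]]:
        probability of co-occurrence at least as strong as observed.
        (The left tail would instead flag mutual exclusivity.)"""
        n = a + b + c + d
        row1, col1 = a + b, a + c
        denom = comb(n, col1)
        return sum(comb(row1, k) * comb(n - row1, col1 - k)
                   for k in range(a, min(row1, col1) + 1)) / denom

    # Hypothetical counts across 141 training cell lines: mutations A and B
    # co-occur in 20 lines, A alone in 10, B alone in 12, neither in 99.
    p_cooccur = fisher_right_tail(20, 10, 12, 99)  # far above chance expectation (~6.8)
    ```

    Pairs that survive multiple-testing correction in a training set would then be re-tested in the held-out test set, mirroring the 141/359 split the abstract describes.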

  13. Systematic identification of combinatorial drivers and targets in cancer cell lines.

    PubMed

    Tabchy, Adel; Eltonsy, Nevine; Housman, David E; Mills, Gordon B

    2013-01-01

    There is an urgent need to elicit and validate highly efficacious targets for combinatorial intervention from large scale ongoing molecular characterization efforts of tumors. We established an in silico bioinformatic platform in concert with a high throughput screening platform evaluating 37 novel targeted agents in 669 extensively characterized cancer cell lines reflecting the genomic and tissue-type diversity of human cancers, to systematically identify combinatorial biomarkers of response and co-actionable targets in cancer. Genomic biomarkers discovered in a 141 cell line training set were validated in an independent 359 cell line test set. We identified co-occurring and mutually exclusive genomic events that represent potential drivers and combinatorial targets in cancer. We demonstrate multiple cooperating genomic events that predict sensitivity to drug intervention independent of tumor lineage. The coupling of scalable in silico and biologic high throughput cancer cell line platforms for the identification of co-events in cancer delivers rational combinatorial targets for synthetic lethal approaches with a high potential to pre-empt the emergence of resistance.

  14. Accuracy of the Alberta Infant Motor Scale (AIMS) to detect developmental delay of gross motor skills in preterm infants: a systematic review.

    PubMed

    de Albuquerque, Plínio Luna; Lemos, Andrea; Guerra, Miriam Queiroz de Farias; Eickmann, Sophie Helena

    2015-02-01

    To assess, through a systematic review, the ability of the Alberta Infant Motor Scale (AIMS) to diagnose delayed motor development in preterm infants. Systematic searches identified five studies meeting inclusion criteria. These studies were evaluated in terms of participants' characteristics, main results and risk of bias. The risk of bias was assessed with the Quality Assessment of Diagnostic Accuracy Studies, second edition (QUADAS-2). All five studies had a high risk of bias in at least one of the assessed domains; the most frequent biases were in patient selection and loss to follow-up. All studies used the Pearson correlation coefficient to assess the diagnostic capability of the Alberta Infant Motor Scale, and none used psychometric measures to analyze the data. Given this evidence, the research supporting the ability of the Alberta Infant Motor Scale to diagnose delayed motor development in preterm infants has limitations. Further studies that avoid the above-mentioned biases are needed to assess the accuracy of the Alberta Infant Motor Scale in preterm infants.

  15. SHEAR-DRIVEN DYNAMO WAVES IN THE FULLY NONLINEAR REGIME

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pongkitiwanichakul, P.; Nigro, G.; Cattaneo, F.

    2016-07-01

    Large-scale dynamo action is well understood when the magnetic Reynolds number (Rm) is small, but becomes problematic in the astrophysically relevant large Rm limit since the fluctuations may control the operation of the dynamo, obscuring the large-scale behavior. Recent works by Tobias and Cattaneo demonstrated numerically the existence of large-scale dynamo action in the form of dynamo waves driven by strongly helical turbulence and shear. Their calculations were carried out in the kinematic regime in which the back-reaction of the Lorentz force on the flow is neglected. Here, we have undertaken a systematic extension of their work to the fully nonlinear regime. Helical turbulence and large-scale shear are produced self-consistently by prescribing body forces that, in the kinematic regime, drive flows that resemble the original velocity used by Tobias and Cattaneo. We have found four different solution types in the nonlinear regime for various ratios of the fluctuating velocity to the shear and Reynolds numbers. Some of the solutions are in the form of propagating waves. Some solutions show large-scale helical magnetic structure. Both waves and structures are permanent only when the kinetic helicity is non-zero on average.

  16. Hail statistics for Germany derived from single-polarization radar data

    NASA Astrophysics Data System (ADS)

    Puskeiler, Marc; Kunz, Michael; Schmidberger, Manuel

    2016-09-01

    Despite the considerable damage potential related to severe hailstorms, knowledge about the local hail probability in Germany is very limited. Constructing a reliable hail probability map is challenging due largely to the lack of direct hail observations. In our study, we suggest a reasonable method by which to estimate hail signals from 3D radar reflectivity measured by conventional single-polarization radars between 2005 and 2011. Evaluating the radar-derived hail days with loss data from a building and an agricultural insurance company confirmed the reliability of the method and the results as expressed, for example, by a Heidke Skill Score HSS of 0.7. Overall, radar-derived hail days demonstrate very high spatial variability, which reflects the local-scale nature of deep moist convection. Nonetheless, systematic patterns related to climatic conditions and orography can also be observed. On the large scale, the number of hail days substantially increases from north to south, which may plausibly be explained by the higher thermal instability in the south. At regional and local scales, several hot spots with elevated hail frequency can be identified, in most cases downstream of the mountains. Several other characteristics including convective energy related to the events identified, differences in track lengths, and seasonal cycles are discussed.
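    The Heidke Skill Score quoted above verifies radar-derived hail days against insurance loss reports. From a 2x2 contingency table of detections it is computed as follows (the counts below are hypothetical, chosen only to exercise the formula):

    ```python
    def heidke_skill_score(hits, false_alarms, misses, correct_negatives):
        """HSS: fraction of correct categorical forecasts beyond chance agreement.
        1 = perfect detection, 0 = no skill beyond chance, negative = worse than chance."""
        a, b, c, d = hits, false_alarms, misses, correct_negatives
        numerator = 2.0 * (a * d - b * c)
        denominator = (a + c) * (c + d) + (a + b) * (b + d)
        return numerator / denominator

    # Hypothetical verification counts: radar-derived hail days vs loss-data hail days.
    hss = heidke_skill_score(hits=50, false_alarms=10, misses=10, correct_negatives=30)
    ```

    An HSS around 0.7, as reported for the 2005-2011 radar climatology, means detections agree with the loss data well beyond what random agreement would produce.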

  17. Prototype of an Integrated Hurricane Information System for Research: Description and Illustration of its Use in Evaluating WRF Model Simulations

    NASA Astrophysics Data System (ADS)

    Hristova-Veleva, S.; Chao, Y.; Vane, D.; Lambrigtsen, B.; Li, P. P.; Knosp, B.; Vu, Q. A.; Su, H.; Dang, V.; Fovell, R.; Tanelli, S.; Garay, M.; Willis, J.; Poulsen, W.; Fishbein, E.; Ao, C. O.; Vazquez, J.; Park, K. J.; Callahan, P.; Marcus, S.; Haddad, Z.; Fetzer, E.; Kahn, R.

    2007-12-01

    In spite of recent improvements in hurricane track forecast accuracy, currently there are still many unanswered questions about the physical processes that determine hurricane genesis, intensity, track and impact on the large-scale environment. Furthermore, a significant amount of work remains to be done in validating hurricane forecast models, understanding their sensitivities and improving their parameterizations. None of this can be accomplished without a comprehensive set of multiparameter observations that are relevant to both the large-scale and the storm-scale processes in the atmosphere and in the ocean. To address this need, we have developed a prototype of a comprehensive hurricane information system of high-resolution satellite, airborne and in-situ observations and model outputs pertaining to: i) the thermodynamic and microphysical structure of the storms; ii) the air-sea interaction processes; iii) the larger-scale environment as depicted by the SST, ocean heat content and the aerosol loading of the environment. Our goal was to create a one-stop place to provide researchers with an extensive set of observed hurricane data, and their graphical representation, together with large-scale and convection-resolving model output, all organized in an easy way to determine when coincident observations from multiple instruments are available. Analysis tools will be developed in the next step. The analysis tools will be used to determine spatial, temporal and multiparameter covariances that are needed to evaluate model performance, provide information for data assimilation and characterize and compare observations from different platforms. We envision that the developed hurricane information system will help in the validation of the hurricane models, in the systematic understanding of their sensitivities and in the improvement of the physical parameterizations employed by the models. 
Furthermore, it will help in studying the physical processes that affect hurricane development and impact on the large-scale environment. This talk will describe the developed prototype of the hurricane information system. Furthermore, we will use a set of WRF hurricane simulations and compare simulated to observed structures to illustrate how the information system can be used to discriminate between simulations that employ different physical parameterizations. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  18. Wetlands as large-scale nature-based solutions: status and future challenges for research and management

    NASA Astrophysics Data System (ADS)

    Thorslund, Josefin; Jarsjö, Jerker; Destouni, Georgia

    2017-04-01

    Wetlands are often considered as nature-based solutions that can provide a multitude of services of great social, economic and environmental value to humankind. The services may include recreation, greenhouse gas sequestration, contaminant retention, coastal protection, groundwater level and soil moisture regulation, flood regulation and biodiversity support. Changes in land use, water use and climate can all impact wetland functions and occur at scales extending well beyond the local scale of an individual wetland. However, in practical applications, management decisions usually regard and focus on individual wetland sites and local conditions. To understand the potential usefulness and services of wetlands as larger-scale nature-based solutions, e.g. for mitigating negative impacts from large-scale change pressures, one needs to understand the combined function of multiple wetlands at the relevant large scales. We here systematically investigate if and to what extent research so far has addressed the large-scale dynamics of landscape systems with multiple wetlands, which are likely to be relevant for understanding impacts of regional to global change. Our investigation regards key changes and impacts of relevance for nature-based solutions, such as large-scale nutrient and pollution retention, flow regulation and coastal protection. Although such large-scale knowledge is still limited, evidence suggests that the aggregated functions and effects of multiple wetlands in the landscape can differ considerably from those observed at individual wetlands. Such scale differences may have important implications for wetland function-effect predictability and management under large-scale change pressures and impacts, such as those of climate change.

  19. [Evaluation of traditional German undergraduate surgical training. An analysis at Heidelberg University].

    PubMed

    Schürer, S; Schellberg, D; Schmidt, J; Kallinowski, F; Mehrabi, A; Herfarth, Ch; Büchler, M W; Kadmon, M

    2006-04-01

    The medical faculty of Heidelberg University implemented a new problem-based clinical curriculum (Heidelberg Curriculum Medicinale, or Heicumed) in 2001. The present study analyses the evaluation data of two student cohorts prior to the introduction of Heicumed. Its aim was to specify problems of the traditional training and to draw conclusions for the implementation of a new curriculum. The evaluation instrument was the Heidelberg Inventory for the Evaluation of Teaching (HILVE-I). The data were analysed by calculating differences in the means between defined groups, with the 13 primary scales of the HILVE-I instrument as dependent variables. Teaching method and subject had no systematic influence on evaluation results: the didactic lecture in orthopedic surgery achieved better results than small-group tutorials, while the data on vascular and general surgery showed the opposite. Major factors for success were continuity and didactic training of lecturers and tutors. This is convincingly reflected by the results of the lecture course "Differential diagnosis in general surgery". The good evaluation data on small-group tutorials resulted largely from the "participation" and "discussion" scales, which represent interactivity in learning. The results of the present study suggest the importance of two major pedagogic ideas: continuity and didactic training of lecturers and tutors. These principles were widely implemented in Heicumed and have contributed to the success of the new curriculum.

  20. Incorporating principal component analysis into air quality ...

    EPA Pesticide Factsheets

    The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Principal Component Analysis (PCA) with the intent of motivating its use by the evaluation community. One of the main objectives of PCA is to identify, through data reduction, the recurring and independent modes of variation (or signals) within a very large dataset, thereby summarizing the essential information of that dataset so that meaningful and descriptive conclusions can be drawn. In this demonstration, PCA is applied to a simple evaluation metric – the model bias associated with EPA's Community Multi-scale Air Quality (CMAQ) model when compared to weekly observations of sulfate (SO42−) and ammonium (NH4+) ambient air concentrations measured by the Clean Air Status and Trends Network (CASTNet). The advantages of using this technique are demonstrated as it identifies strong and systematic patterns of CMAQ model bias across a myriad of spatial and temporal scales that are neither constrained to geopolitical boundaries nor monthly/seasonal time periods (a limitation of many current studies). The technique also identifies locations (station–grid cell pairs) that are used as indicators for a more thorough diagnostic evaluation, thereby hastening and facilitating understanding of the problem.
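
    The data-reduction step described above can be sketched in a few lines. The example below is a hypothetical illustration, not the paper's actual pipeline: it builds a synthetic (stations x weeks) bias matrix with one shared seasonal signal and extracts the dominant modes via SVD-based PCA; all names and numbers are invented.

```python
import numpy as np

# Illustrative PCA of a model-bias matrix (stations x weeks), in the spirit of
# decomposing CMAQ-vs-CASTNet bias into recurring modes. Synthetic data only.
rng = np.random.default_rng(0)

n_stations, n_weeks = 40, 104
seasonal = np.sin(2 * np.pi * np.arange(n_weeks) / 52)    # shared seasonal bias signal
loadings = rng.normal(1.0, 0.3, size=n_stations)          # station-specific strength
bias = np.outer(loadings, seasonal) + 0.2 * rng.normal(size=(n_stations, n_weeks))

centered = bias - bias.mean(axis=1, keepdims=True)        # remove each station's mean bias
u, s, vt = np.linalg.svd(centered, full_matrices=False)

explained = s**2 / np.sum(s**2)                           # variance fraction per mode
pc1_time = vt[0]                                          # leading temporal mode
pc1_station = u[:, 0] * s[0]                              # station scores on that mode

print(f"mode 1 explains {explained[0]:.0%} of bias variance")
```

    In a real evaluation, `pc1_time` would reveal the temporal pattern of the dominant bias signal and `pc1_station` would flag the station–grid cell pairs that express it most strongly.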

  1. Treatment for insertional Achilles tendinopathy: a systematic review.

    PubMed

    Wiegerinck, J I; Kerkhoffs, G M; van Sterkenburg, M N; Sierevelt, I N; van Dijk, C N

    2013-06-01

    To systematically search and analyse the results of surgical and non-surgical treatments for insertional Achilles tendinopathy. A structured systematic review of the literature was performed to identify surgical and non-surgical therapeutic studies reporting on ten or more adults with insertional Achilles tendinopathy. MEDLINE, CINAHL, EMBASE (Classic) and the Cochrane database of controlled trials (1945-March 2011) were searched. The Coleman methodology score was used to assess the quality of included articles, and these were analysed with an emphasis on change in pain score, patient satisfaction and complication rate. Of 451 reviewed abstracts, 14 trials met our inclusion criteria, evaluating 452 procedures in 433 patients. Five surgical techniques were evaluated; all had good patient satisfaction (avg. 89%). The complication ratio differed substantially between techniques. Two studies analysed injections, showing a significant decrease in visual analogue scale (VAS) scores. Eccentric exercises showed a significant decrease in VAS, but a large group of patients was dissatisfied. Extracorporeal shockwave therapy (ESWT) was superior to both wait-and-see and an eccentric training regime. One study evaluated CO2 laser, TECAR and cryoultrasound, all with significant decreases in VAS. Despite differences in outcome and complication ratio, patient satisfaction is high in all surgical studies. It is not possible to draw conclusions regarding the best surgical treatment for insertional Achilles tendinopathy. ESWT seems effective in patients with non-calcified insertional Achilles tendinopathy. Although both eccentric exercise regimes resulted in a decrease in VAS score, full-range-of-motion eccentric exercises showed lower patient satisfaction than floor-level exercises and other conservative treatment modalities.

  2. The Organic Brain Syndrome (OBS) scale: a systematic review.

    PubMed

    Björkelund, Karin Björkman; Larsson, Sylvia; Gustafson, Lars; Andersson, Edith

    2006-03-01

    The Organic Brain Syndrome (OBS) Scale was developed to assess elderly patients' disturbances of awareness and orientation as to time, place and own identity, and various emotional and behavioural symptoms appearing in delirium, dementia and other organic mental diseases. The aim of the study was to examine the OBS Scale, using the eight criteria and guidelines formulated by the Scientific Advisory Committee of the Medical Outcomes Trust (SAC), and to investigate its relevance and suitability for use in various clinical settings. A systematic search and analysis of 30 papers on the OBS Scale was carried out using the criteria suggested by the SAC. The OBS Scale in many respects satisfies the requirements suggested by the SAC: conceptual and measurement model, reliability, validity, responsiveness, interpretability, respondent and administrative burden, alternative forms of administration, and cultural and language adaptations, but there is a need for additional evaluation, especially with regard to different forms of reliability, and translation and adaptation to other languages. The OBS Scale is a sensitive instrument that is clinically useful for the description and long-term follow-up of patients showing symptoms of acute confusional state and dementia. Although the OBS Scale has been used in several clinical studies, there is a need for further evaluation.

  3. Underestimation of Microearthquake Size by the Magnitude Scale of the Japan Meteorological Agency: Influence on Earthquake Statistics

    NASA Astrophysics Data System (ADS)

    Uchide, Takahiko; Imanishi, Kazutoshi

    2018-01-01

    Magnitude scales based on the amplitude of seismic waves, including the Japan Meteorological Agency magnitude scale (Mj), are commonly used in routine processing. The moment magnitude scale (Mw), however, is more physics based and is able to evaluate any type and size of earthquake. This paper addresses the relation between Mj and Mw for microearthquakes. The relative moment magnitudes among earthquakes are well constrained by multiple spectral ratio analyses. The results for the events in the Fukushima Hamadori and northern Ibaraki prefecture areas of Japan imply that Mj is significantly and systematically smaller than Mw for microearthquakes. The Mj-Mw curve has slopes of 1/2 and 1 for small and large values of Mj, respectively; for example, Mj = 1.0 corresponds to Mw = 2.0. A simple numerical simulation implies that this is due to anelastic attenuation and recording with a finite sampling interval. The underestimation affects earthquake statistics. The completeness magnitude, Mc, below which the magnitude-frequency distribution deviates from the Gutenberg-Richter law, is effectively lower for Mw than for Mj, once the systematic difference between Mj and Mw is taken into account. The b values of the Gutenberg-Richter law are larger for Mw than for Mj. As the b values for Mj and Mw are well correlated, qualitative arguments based on b values are not affected. While the estimated b values for Mj are below 1.5, those for Mw often exceed 1.5. This may affect the physical interpretation of the seismicity.
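
    The effect of a scale difference on b values can be illustrated with the standard Aki (1965) maximum-likelihood estimator, b = log10(e) / (mean(M) - Mc). The sketch below is a toy, not the paper's multiple-spectral-ratio method: it uses an invented linear Mw-to-Mj map matching only the small-magnitude asymptote quoted above (slope dMw/dMj = 1/2, pinned at Mj = 1.0 corresponding to Mw = 2.0).

```python
import numpy as np

# Aki (1965) maximum-likelihood b-value on a synthetic Gutenberg-Richter
# catalogue, plus a toy linear magnitude-scale change. Illustrative only.
rng = np.random.default_rng(1)

def aki_b(mags, mc):
    """Maximum-likelihood b-value for magnitudes at or above completeness mc."""
    m = mags[mags >= mc]
    return np.log10(np.e) / (m.mean() - mc)

b_true, mc_w, n = 1.0, 2.0, 50_000
# Above completeness, G-R magnitudes are exponentially distributed with rate b*ln(10)
mw = mc_w + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=n)

# Toy Mw -> Mj map: Mj = 2*Mw - 3, i.e. dMw/dMj = 1/2 and Mj < Mw below Mw = 3
mj = 2.0 * mw - 3.0
b_mw = aki_b(mw, mc_w)
b_mj = aki_b(mj, 2.0 * mc_w - 3.0)
print(f"b on Mw scale = {b_mw:.2f}, b on Mj scale = {b_mj:.2f}")
```

    A linear rescaling of magnitudes rescales the estimated b inversely, which is why the same seismicity yields a larger b on the Mw scale than on the Mj scale in this regime.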

  4. [Reliability and validity of depression scales of Chinese version: a systematic review].

    PubMed

    Sun, X Y; Li, Y X; Yu, C Q; Li, L M

    2017-01-10

    Objective: To systematically review the reliability and validity of Chinese-language depression scales in adults in China and to evaluate the psychometric properties of depression scales for different groups. Methods: Eligible studies published before 6 May 2016 were retrieved from the following databases: CNKI, Wanfang, PubMed and Embase. The HSROC model for diagnostic test accuracy (DTA) meta-analysis was used to calculate the pooled sensitivity and specificity of the PHQ-9. Results: A total of 44 papers evaluating the performance of depression scales were included. Results showed that the reliability and validity of the common depression scales were acceptable, including the Beck Depression Inventory (BDI), the Hamilton Depression Scale (HAMD), the Center for Epidemiologic Studies Depression scale (CES-D), the Patient Health Questionnaire (PHQ) and the Geriatric Depression Scale (GDS). The Cronbach's α coefficients of most tools were larger than 0.8, while the test-retest and split-half reliabilities were larger than 0.7, indicating good internal consistency and stability. The criterion validity, convergent validity, discriminant validity and screening validity were acceptable, though different cut-off points were recommended by different studies. The pooled sensitivity of the 11 studies evaluating the PHQ-9 was 0.88 (95% CI: 0.85-0.91) and the pooled specificity was 0.89 (95% CI: 0.82-0.94), demonstrating the applicability of the PHQ-9 for depression screening. Conclusion: The reliability and validity of the Chinese versions of different depression scales are acceptable. The characteristics of different tools and the study population should be taken into consideration when choosing a specific scale.
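
    The internal-consistency statistic reported above (Cronbach's α > 0.8) has a short closed form, sketched below on invented PHQ-9-like ratings; the data and noise levels are assumptions for illustration only.

```python
import numpy as np

# Minimal Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item vars)/var(total)).
def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(2)
true_severity = rng.normal(size=(200, 1))                  # latent trait per respondent
items = true_severity + 0.5 * rng.normal(size=(200, 9))    # 9 noisy, correlated items

alpha = cronbach_alpha(items)
print(f"Cronbach's alpha = {alpha:.2f}")
```

    Because all nine synthetic items share the same latent trait, α comes out well above the 0.8 threshold the review uses as a benchmark for good internal consistency.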

  5. Developmental Assessment with Young Children: A Systematic Review of Battelle Studies

    ERIC Educational Resources Information Center

    Cunha, Ana C. B.; Berkovits, Michelle D.; Albuquerque, Karolina A.

    2018-01-01

    Developmental assessment scales are important tools for determining developmental delays and planning preventive interventions. One broad assessment scale used to evaluate child development is the Battelle Developmental Inventories (BDIs). The BDI-2 has a standardized version in English with good psychometric properties and a translated version in…

  6. Bridging the Gap: Direct Behavior Rating-Single Item Scales

    ERIC Educational Resources Information Center

    Miller, Faith G.; Crovello, Nicholas; Swenson, Nicole

    2017-01-01

    Direct Behavior Ratings (DBRs) are behavioral assessment methods that combine the benefits of systematic direct observation and behavior rating scales. That is, DBRs involve the observation of operationally defined target behaviors during a prespecified observation period and the evaluation of those behaviors via brief ratings. In this way, DBR is…

  7. Will COBE challenge the inflationary paradigm - Cosmic microwave background anisotropies versus large-scale streaming motions revisited

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorski, K.M.

    1991-03-01

    The relation between cosmic microwave background (CMB) anisotropies and large-scale galaxy streaming motions is examined within the framework of inflationary cosmology. The minimal Sachs and Wolfe (1967) CMB anisotropies at large angular scales in models with an initial Harrison-Zel'dovich spectrum of inhomogeneity normalized to the local large-scale bulk flow, which are independent of the Hubble constant and the specific nature of dark matter, are found to be within the anticipated ultimate sensitivity limits of COBE's Differential Microwave Radiometer experiment. For example, the most likely value of the quadrupole coefficient is predicted to be a2 not less than 7 x 10 to the -6th, where equality applies to the limiting minimal model. If (1) COBE's DMR instruments perform well throughout the two-year period; (2) the anisotropy data are not marred by systematic errors; (3) the large-scale motions retain their present observational status; (4) there is no statistical conspiracy in the sense of the measured bulk flow being of untypically high and the large-scale anisotropy of untypically low amplitude; and (5) the low-order multipoles in the all-sky primordial fireball temperature map are not detected, the inflationary paradigm will have to be questioned. 19 refs.

  8. Scaling and stochastic cascade properties of NEMO oceanic simulations and their potential value for GCM evaluation and downscaling

    NASA Astrophysics Data System (ADS)

    Verrier, Sébastien; Crépon, Michel; Thiria, Sylvie

    2014-09-01

    Spectral scaling properties have already been evidenced in oceanic numerical simulations and have been subject to several interpretations. They can be used to evaluate classical turbulence theories, which predict scaling with specific exponents, and to evaluate the quality of GCM outputs from a statistical and multiscale point of view. However, a more complete framework based on multifractal cascades generalizes the classical but restrictive second-order spectral framework to other moment orders, providing an accurate description of the probability distributions of the fields at multiple scales. The predictions of this formalism still needed systematic verification in oceanic GCMs, while they have recently been confirmed for their atmospheric counterparts by several papers. The present paper is devoted to a systematic analysis of several oceanic fields produced by the NEMO oceanic GCM. Attention is focused on regional, idealized configurations that make it possible to evaluate the NEMO engine core from a scaling point of view, without the limitations introduced by land masks. Based on classical multifractal analysis tools, multifractal properties were evidenced for several oceanic state variables (sea surface temperature and salinity, velocity components, etc.). While first-order structure functions estimated a different nonconservativity parameter H in two scaling ranges, the multiorder statistics of turbulent fluxes were scaling over almost the whole available scaling range. This multifractal scaling was then parameterized with the help of the universal multifractal framework, providing parameters that are coherent with the existing empirical literature. Finally, we argue that knowledge of these properties may be useful for oceanographers. The framework seems well suited for the statistical evaluation of OGCM outputs. Moreover, it also provides practical solutions for simulating subpixel variability stochastically for GCM downscaling purposes. As an independent perspective, the existence of multifractal properties in oceanic flows also seems relevant for investigating scale dependencies in remote sensing inversion algorithms.
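
    The basic second-order check mentioned above, comparing a simulated field's spectral slope against a theoretical exponent, can be sketched generically. This is not the NEMO diagnostic itself: the example synthesizes a 1-D field with a known power-law spectrum (β = 5/3 assumed for illustration) and recovers β from a log-log fit of the periodogram.

```python
import numpy as np

# Fourier synthesis of a field with E(k) ~ k^(-beta), then slope recovery.
rng = np.random.default_rng(3)

n, beta_true = 4096, 5.0 / 3.0
k = np.fft.rfftfreq(n, d=1.0)[1:]                 # positive wavenumbers
amp = k ** (-beta_true / 2.0)                     # |F(k)| ~ k^(-beta/2) => E(k) ~ k^(-beta)
phases = rng.uniform(0.0, 2.0 * np.pi, size=k.size)
phases[-1] = 0.0                                  # keep the Nyquist bin real
spectrum = np.concatenate(([0.0], amp * np.exp(1j * phases)))
signal = np.fft.irfft(spectrum, n=n)              # real-valued synthetic field

power = np.abs(np.fft.rfft(signal))[1:] ** 2      # periodogram; random phases drop out
slope, _ = np.polyfit(np.log(k), np.log(power), 1)
beta_hat = -slope
print(f"beta_true = {beta_true:.3f}, estimated beta = {beta_hat:.3f}")
```

    The multifractal analyses in the paper go beyond this second-order statistic to all moment orders, but a fit like this is the usual first step when checking GCM output against turbulence predictions.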

  9. Impact of lateral boundary conditions on regional analyses

    NASA Astrophysics Data System (ADS)

    Chikhar, Kamel; Gauthier, Pierre

    2017-04-01

    Regional and global climate models are usually validated by comparison with derived observations or reanalyses. Using a model within a data assimilation system instead allows a direct comparison with observations, as the model produces its own analyses, which may reveal systematic errors. In this study, regional analyses over North America are produced based on the fifth-generation Canadian Regional Climate Model (CRCM5) combined with the variational data assimilation system of the Meteorological Service of Canada (MSC). CRCM5 is driven at its boundaries by global analyses from ERA-Interim or produced with the global configuration of the CRCM5. Assimilation cycles for the months of January and July 2011 revealed systematic errors in winter, reflected in large values of the mean analysis increments. This bias is attributed to the coupling of the lateral boundary conditions of the regional model with the driving data, particularly over the northern boundary, where a rapidly changing large-scale circulation created significant cross-boundary flows. Increasing the time frequency of the lateral driving and applying large-scale spectral nudging significantly improved the circulation through the lateral boundaries, which translated into much better agreement with observations.

  10. Integrated water and renewable energy management: the Acheloos-Peneios region case study

    NASA Astrophysics Data System (ADS)

    Koukouvinos, Antonios; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Tegos, Aristotelis; Rozos, Evangelos; Papalexiou, Simon-Michael; Dimitriadis, Panayiotis; Markonis, Yiannis; Kossieris, Panayiotis; Tyralis, Christos; Karakatsanis, Georgios; Tzouka, Katerina; Christofides, Antonis; Karavokiros, George; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Within the ongoing research project "Combined Renewable Systems for Sustainable Energy Development" (CRESSENDO), we have developed a novel stochastic simulation framework for optimal planning and management of large-scale hybrid renewable energy systems, in which hydropower plays the dominant role. The methodology and associated computer tools are tested in two major adjacent river basins in Greece (Acheloos, Peneios) extending over 15 500 km2 (12% of Greek territory). River Acheloos is characterized by very high runoff and holds ~40% of the installed hydropower capacity of Greece. On the other hand, the Thessaly plain drained by Peneios - a key agricultural region for the national economy - usually suffers from water scarcity and systematic environmental degradation. The two basins are interconnected through diversion projects, existing and planned, thus formulating a unique large-scale hydrosystem whose future has been the subject of a great controversy. The study area is viewed as a hypothetically closed, energy-autonomous, system, in order to evaluate the perspectives for sustainable development of its water and energy resources. In this context we seek an efficient configuration of the necessary hydraulic and renewable energy projects through integrated modelling of the water and energy balance. We investigate several scenarios of energy demand for domestic, industrial and agricultural use, assuming that part of the demand is fulfilled via wind and solar energy, while the excess or deficit of energy is regulated through large hydroelectric works that are equipped with pumping storage facilities. The overall goal is to examine under which conditions a fully renewable energy system can be technically and economically viable for such large spatial scale.

  11. Time to "go large" on biofilm research: advantages of an omics approach.

    PubMed

    Azevedo, Nuno F; Lopes, Susana P; Keevil, Charles W; Pereira, Maria O; Vieira, Maria J

    2009-04-01

    In nature, the biofilm mode of life is of great importance in the cell cycle for many microorganisms. Perhaps because of biofilm complexity and variability, the characterization of a given microbial system, in terms of biofilm formation potential, structure and associated physiological activity, in a large-scale, standardized and systematic manner has been hindered by the absence of high-throughput methods. This outlook is now starting to change as new methods involving the utilization of microtiter-plates and automated spectrophotometry and microscopy systems are being developed to perform large-scale testing of microbial biofilms. Here, we evaluate if the time is ripe to start an integrated omics approach, i.e., the generation and interrogation of large datasets, to biofilms--"biofomics". This omics approach would bring much needed insight into how biofilm formation ability is affected by a number of environmental, physiological and mutational factors and how these factors interplay between themselves in a standardized manner. This could then lead to the creation of a database where biofilm signatures are identified and interrogated. Nevertheless, and before embarking on such an enterprise, the selection of a versatile, robust, high-throughput biofilm growing device and of appropriate methods for biofilm analysis will have to be performed. Whether such device and analytical methods are already available, particularly for complex heterotrophic biofilms is, however, very debatable.

  12. Systematic review and meta-analysis of genetic association studies in idiopathic recurrent spontaneous abortion.

    PubMed

    Pereza, Nina; Ostojić, Saša; Kapović, Miljenko; Peterlin, Borut

    2017-01-01

    1) To perform the first comprehensive systematic review of genetic association studies (GASs) in idiopathic recurrent spontaneous abortion (IRSA); 2) to analyze studies according to recurrent spontaneous abortion (RSA) definition and selection criteria for patients and control subjects; and 3) to perform meta-analyses for the association of candidate genes with IRSA. Systematic review and meta-analysis. Not applicable. Couples with IRSA and their spontaneously aborted embryos. Summary odds ratios (ORs) were calculated by means of fixed- or random-effects models. Association of genetic variants with IRSA. The systematic review included 428 case-control studies (1990-2015), which differed substantially regarding RSA definition, clinical evaluation of patients, and selection of control subjects. In women, 472 variants in 187 genes were investigated. Meta-analyses were performed for 36 variants in 16 genes. Association with IRSA defined as three or more spontaneous abortions (SAs) was detected for 21 variants in genes involved in immune response (IFNG, IL10, KIR2DS2, KIR2DS3, KIR2DS4, MBL, TNF), coagulation (F2, F5, PAI-1, PROZ), metabolism (GSTT1, MTHFR), and angiogenesis (NOS3, VEGFA). However, ORs were modest (0.51-2.37), with moderate or weak epidemiologic credibility. Minor differences in summary ORs were detected between IRSA defined as two or more and as three or more SAs. Male partners were included in 12.1% of studies, and one study included spontaneously aborted embryos. Candidate gene studies show moderate associations with IRSA. Owing to large differences in RSA definition and selection criteria for participants, consensus is needed. Future GASs should include both partners and spontaneously aborted embryos. Genome-wide association studies and large-scale replications of identified associations are recommended.
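
    The fixed-effect arm of the pooling described above is a short inverse-variance calculation on the log-OR scale. The sketch below uses invented study data; a random-effects model would additionally estimate between-study heterogeneity, which is omitted here.

```python
import math

# Inverse-variance (fixed-effect) pooling of odds ratios across studies.
def pooled_or(odds_ratios, standard_errors):
    """Pool ORs on the log scale, weighting each log-OR by 1/SE^2."""
    weights = [1.0 / se**2 for se in standard_errors]
    log_pooled = sum(w * math.log(o) for w, o in zip(weights, odds_ratios)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (math.exp(log_pooled - 1.96 * se_pooled), math.exp(log_pooled + 1.96 * se_pooled))
    return math.exp(log_pooled), ci

# Three hypothetical case-control studies of one candidate variant
or_hat, (lo, hi) = pooled_or([1.8, 1.4, 2.1], [0.30, 0.25, 0.40])
print(f"pooled OR = {or_hat:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    Weighting by inverse variance means larger, more precise studies dominate the summary OR, which is why the modest pooled ORs (0.51-2.37) reported above carry only moderate credibility when the contributing studies are small.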

  13. Similarity spectra analysis of high-performance jet aircraft noise.

    PubMed

    Neilsen, Tracianne B; Gee, Kent L; Wall, Alan T; James, Michael M

    2013-04-01

    Noise measured in the vicinity of an F-22A Raptor has been compared to similarity spectra found previously to represent mixing noise from large-scale and fine-scale turbulent structures in laboratory-scale jet plumes. Comparisons have been made for three engine conditions using ground-based sideline microphones, which covered a large angular aperture. Even though the nozzle geometry is complex and the jet is nonideally expanded, the similarity spectra do agree with large portions of the measured spectra. Toward the sideline, the fine-scale similarity spectrum is used, while the large-scale similarity spectrum provides a good fit to the area of maximum radiation. Combinations of the two similarity spectra are shown to match the data in between those regions. Surprisingly, a combination of the two is also shown to match the data at the farthest aft angle. However, at high frequencies the degree of congruity between the similarity and the measured spectra changes with engine condition and angle. At the higher engine conditions, there is a systematically shallower measured high-frequency slope, with the largest discrepancy occurring in the regions of maximum radiation.

  14. Dark matter, long-range forces, and large-scale structure

    NASA Technical Reports Server (NTRS)

    Gradwohl, Ben-Ami; Frieman, Joshua A.

    1992-01-01

    If the dark matter in galaxies and clusters is nonbaryonic, it can interact with additional long-range fields that are invisible to experimental tests of the equivalence principle. We discuss the astrophysical and cosmological implications of a long-range force coupled only to the dark matter and find rather tight constraints on its strength. If the force is repulsive (attractive), the masses of galaxy groups and clusters (and the mean density of the universe inferred from them) have been systematically underestimated (overestimated). We explore the consequent effects on the two-point correlation function, large-scale velocity flows, and microwave background anisotropies, for models with initial scale-invariant adiabatic perturbations and cold dark matter.

  15. What is the clinical effectiveness and cost-effectiveness of conservative interventions for tendinopathy? An overview of systematic reviews of clinical effectiveness and systematic review of economic evaluations.

    PubMed

    Long, Linda; Briscoe, Simon; Cooper, Chris; Hyde, Chris; Crathorne, Louise

    2015-01-01

    Lateral elbow tendinopathy (LET) is a common complaint causing characteristic pain in the lateral elbow and upper forearm, and tenderness of the forearm extensor muscles. It is thought to be an overuse injury and can have a major impact on the patient's social and professional life. The condition is challenging to treat and prone to recurrent episodes. The average duration of a typical episode ranges from 6 to 24 months, with most (89%) reporting recovery by 1 year. This systematic review aims to summarise the evidence concerning the clinical effectiveness and cost-effectiveness of conservative interventions for LET. A comprehensive search was conducted from database inception to 2012 in a range of databases including MEDLINE, EMBASE and Cochrane Databases. We conducted an overview of systematic reviews to summarise the current evidence concerning the clinical effectiveness and a systematic review for the cost-effectiveness of conservative interventions for LET. We identified additional randomised controlled trials (RCTs) that could contribute further evidence to existing systematic reviews. We searched MEDLINE, EMBASE, Allied and Complementary Medicine Database, Cumulative Index to Nursing and Allied Health Literature, Web of Science, The Cochrane Library and other important databases from inception to January 2013. A total of 29 systematic reviews published since 2003 matched our inclusion criteria. These were quality appraised using the Assessment of Multiple Systematic Reviews (AMSTAR) checklist; five were considered high quality and evaluated using a Grading of Recommendations, Assessment, Development and Evaluation approach. A total of 36 RCTs were identified that were not included in a systematic review and 29 RCTs were identified that had only been evaluated in an included systematic review of intermediate/low quality. These were then mapped to existing systematic reviews where further evidence could provide updates. 
    Two economic evaluations were identified. The summary of findings from the review was based only on high-quality evidence (AMSTAR score > 5). Other limitations were that the identified RCTs were not quality appraised and dichotomous outcomes were not considered. The economic evaluations took effectiveness estimates from trials with small sample sizes, leading to uncertainty surrounding the reported effect sizes. This, in turn, led to uncertainty in the reported cost-effectiveness and, as such, no robust recommendations could be made in this respect. Clinical effectiveness evidence from the high-quality systematic reviews identified in this overview continues to suggest uncertainty as to the effectiveness of many conservative interventions for the treatment of LET. Although new RCT evidence has been identified with either placebo or active controls, there is uncertainty as to the size of the effects reported within them because of the small sample sizes. Conclusions regarding cost-effectiveness are also unclear. We consider that, although updated or new systematic reviews may also be of value, the primary focus of future work should be on conducting large-scale, good-quality clinical trials using a core set of outcome measures (for defined time points) and appropriate follow-up. Subgroup analysis of existing RCT data may be beneficial to ascertain whether or not certain patient groups are more likely to respond to treatments. This study is registered as PROSPERO CRD42013003593. Funding: the National Institute for Health Research Health Technology Assessment programme.

  16. Understanding attrition from international Internet health interventions: a step towards global eHealth.

    PubMed

    Geraghty, Adam W A; Torres, Leandro D; Leykin, Yan; Pérez-Stable, Eliseo J; Muñoz, Ricardo F

    2013-09-01

    Worldwide automated Internet health interventions have the potential to greatly reduce health disparities. High attrition from automated Internet interventions is ubiquitous, and presents a challenge in the evaluation of their effectiveness. Our objective was to evaluate variables hypothesized to be related to attrition, by modeling predictors of attrition in a secondary data analysis of two cohorts of an international, dual language (English and Spanish) Internet smoking cessation intervention. The two cohorts were identical except for the approach to follow-up (FU): one cohort employed only fully automated FU (n = 16 430), while the other cohort also used 'live' contact conditional upon initial non-response (n = 1000). Attrition rates were 48.1 and 10.8% for the automated FU and live FU cohorts, respectively. Significant attrition predictors in the automated FU cohort included higher levels of nicotine dependency, lower education, lower quitting confidence and receiving more contact emails. Participants' younger age was the sole predictor of attrition in the live FU cohort. While research on large-scale deployment of Internet interventions is at an early stage, this study demonstrates that differences in attrition from trials on this scale are (i) systematic and predictable and (ii) can largely be eliminated by live FU efforts. In fully automated trials, targeting the predictors we identify may reduce attrition, a necessary precursor to effective behavioral Internet interventions that can be accessed globally.
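
    Modeling attrition predictors of the kind listed above (dependency, confidence, education) is typically done with logistic regression. The abstract does not specify the study's estimation method, so the sketch below is a generic stand-in: plain gradient descent on invented, synthetic participant data.

```python
import numpy as np

# Toy logistic regression of dropout on invented predictors; illustrative only.
rng = np.random.default_rng(4)

n = 2000
X = np.column_stack([
    rng.normal(size=n),          # nicotine dependency (standardized)
    rng.normal(size=n),          # quitting confidence (standardized)
    rng.integers(0, 2, size=n),  # low-education indicator
])
true_w, true_b = np.array([0.8, -0.6, 0.5]), -0.3
p = 1.0 / (1.0 + np.exp(-(X @ true_w + true_b)))
dropped_out = rng.uniform(size=n) < p            # 1 = lost to follow-up

w, b = np.zeros(3), 0.0
for _ in range(3000):                            # batch gradient descent on log-loss
    pred = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = pred - dropped_out
    w -= 0.1 * (X.T @ grad) / n
    b -= 0.1 * grad.mean()

print("fitted coefficients:", np.round(w, 2), "intercept:", round(b, 2))
```

    The fitted signs mirror the study's findings: higher dependency and lower education predict dropout (positive coefficients), while higher quitting confidence protects against it (negative coefficient).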

  17. Improving health aid for a better planet: The planning, monitoring and evaluation tool (PLANET).

    PubMed

    Sridhar, Devi; Car, Josip; Chopra, Mickey; Campbell, Harry; Woods, Ngaire; Rudan, Igor

    2015-12-01

    International development assistance for health (DAH) increased five-fold between 1990 and 2012, from US$ 5.6 billion to US$ 28.1 billion. This generates an increasing need for transparent and replicable tools that could be used to set investment priorities, monitor the distribution of funding in real time, and evaluate the impact of those investments. In this paper we present a methodology that addresses these three challenges. We call this approach PLANET, which stands for planning, monitoring and evaluation tool. Fundamentally, PLANET is based on a crowdsourcing approach to obtaining information relevant to the deployment of large-scale programs. Information is contributed in real time by a diverse group of participants involved in program delivery. PLANET relies on real-time information from three levels of participants in large-scale programs: funders, managers and recipients. At each level, information is solicited to assess the five key risks most relevant to that level of operations. The risks at the level of funders involve systematic neglect of certain areas, focus on donors' interests over those of program recipients, ineffective co-ordination between donors, questionable mechanisms of delivery, and excessive loss of funding to "middle men". At the level of managers, the risks are corruption, lack of capacity and/or competence, lack of information and/or communication, undue avoidance of governmental structures / preference for non-governmental organizations, and exclusion of local expertise. At the level of primary recipients, the risks are corruption, parallel operations / "verticalization", misalignment with local priorities and lack of community involvement, issues with ethics, equity and/or acceptability, and low likelihood of sustainability beyond the end of the program's implementation. PLANET is intended as an additional tool available to policy-makers to prioritize, monitor and evaluate large-scale development programs.
In this, it should complement tools such as LiST (for health care/interventions), EQUIST (for health care/interventions) and CHNRI (for health research), which also rely on information from local experts and on local context to set priorities in a transparent, user-friendly, replicable, quantifiable and specific, algorithmic-like manner.

  18. Modeling of the response of the POLARBEAR bolometers with a continuously rotating half-wave plate

    NASA Astrophysics Data System (ADS)

    Takakura, Satoru; POLARBEAR Collaboration

    2018-01-01

    The curly pattern, the so-called B-mode, in the polarization anisotropy of the cosmic microwave background (CMB) is a powerful probe of primordial gravitational waves from cosmic inflation, as well as of the weak lensing due to the large-scale structure of the Universe. At present, ground-based CMB experiments with a few arcminutes resolution, such as POLARBEAR, SPTpol, and ACTPol, have successfully measured the angular power spectrum of the B-mode only at sub-degree scales, though these experiments also have the potential to measure the inflationary B-modes at degree scales in the absence of low-frequency noise (1/f noise). Thus, techniques of polarization signal modulation such as a continuously rotating half-wave plate (CRHWP) are widely investigated to suppress the 1/f noise and also to reduce instrumental systematic errors. In this study, we have implemented a CRHWP placed around the prime focus of the POLARBEAR telescope and operated at ambient temperature. We construct a comprehensive model including half-wave plate synchronous signals, detector non-linearities, beam imperfections, and all noise sources. Using this model, we show that, in practice, the 1/f noise and instrumental systematics could remain even with the CRHWP. However, we also evaluate those effects from test observations using a prototype CRHWP on the POLARBEAR telescope and find that the residual 1/f noise is sufficiently small for POLARBEAR to probe multipoles down to about 40. We will also discuss prospects for future CMB experiments with better sensitivities.
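As background to how a CRHWP suppresses 1/f noise: the sky polarization is up-converted to four times the HWP rotation frequency, above the low-frequency noise, and recovered by lock-in demodulation. A minimal, idealized numpy sketch (noise-free, with made-up rotation frequency and Stokes parameters, not the POLARBEAR pipeline):

```python
import numpy as np

# Idealized CRHWP timestream: the polarization signal appears at 4x the HWP
# rotation frequency. Demodulate by lock-in: multiply by cos/sin of the
# modulation phase and average over an integer number of rotations.
f_hwp = 2.0                        # HWP rotation frequency [Hz] (illustrative)
fs = 200.0                         # sampling rate [Hz]
t = np.arange(0, 10.0, 1.0 / fs)   # 10 s = integer number of rotations
Q_true, U_true = 1.2, -0.7         # sky polarization (arbitrary units)

phase = 4 * 2 * np.pi * f_hwp * t
d = Q_true * np.cos(phase) + U_true * np.sin(phase)  # modulated signal

Q_est = 2 * np.mean(d * np.cos(phase))  # lock-in average picks out Q
U_est = 2 * np.mean(d * np.sin(phase))
print(round(Q_est, 3), round(U_est, 3))  # 1.2 -0.7
```

With real data the average is replaced by a low-pass filter, and residual HWP-synchronous signals and detector non-linearity (the subject of the paper's model) contaminate the demodulated band.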

  19. Which are the most useful scales for predicting repeat self-harm? A systematic review evaluating risk scales using measures of diagnostic accuracy

    PubMed Central

    Quinlivan, L; Cooper, J; Davies, L; Hawton, K; Gunnell, D; Kapur, N

    2016-01-01

    Objectives The aims of this review were to calculate the diagnostic accuracy statistics of risk scales following self-harm and consider which might be the most useful scales in clinical practice. Design Systematic review. Methods We based our search terms on those used in the systematic reviews carried out for the National Institute for Health and Care Excellence self-harm guidelines (2012) and evidence update (2013), and updated the searches through to February 2015 (CINAHL, EMBASE, MEDLINE, and PsycINFO). Methodological quality was assessed and three reviewers extracted data independently. We limited our analysis to cohort studies in adults using the outcome of repeat self-harm or attempted suicide. We calculated diagnostic accuracy statistics including measures of global accuracy. Statistical pooling was not possible due to heterogeneity. Results The eight papers included in the final analysis varied widely according to methodological quality and the content of scales employed. Overall, sensitivity of scales ranged from 6% (95% CI 5% to 6%) to 97% (95% CI 94% to 98%). The positive predictive value (PPV) ranged from 5% (95% CI 3% to 9%) to 84% (95% CI 80% to 87%). The diagnostic OR ranged from 1.01 (95% CI 0.434 to 2.5) to 16.3 (95% CI 12.5 to 21.4). Scales with high sensitivity tended to have low PPVs. Conclusions It is difficult to be certain which, if any, are the most useful scales for self-harm risk assessment. No scale performs sufficiently well to be recommended for routine clinical use. Further robust prospective studies are warranted to evaluate risk scales following an episode of self-harm. Diagnostic accuracy statistics should be considered in relation to specific service needs, and scales should only be used as an adjunct to assessment. PMID:26873046
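The diagnostic accuracy statistics reported above (sensitivity, specificity, PPV, diagnostic odds ratio) all derive from a 2x2 table of scale prediction versus observed repetition. A minimal sketch with made-up counts, not figures from the review:

```python
# Diagnostic accuracy statistics for a risk scale from a 2x2 table
# (tp/fp/fn/tn counts are illustrative only).
def diagnostic_stats(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # proportion of repeaters flagged high-risk
    specificity = tn / (tn + fp)   # proportion of non-repeaters flagged low-risk
    ppv = tp / (tp + fp)           # positive predictive value
    dor = (tp * tn) / (fp * fn)    # diagnostic odds ratio
    return sensitivity, specificity, ppv, dor

sens, spec, ppv, dor = diagnostic_stats(tp=40, fp=160, fn=10, tn=290)
print(f"sens={sens:.2f} spec={spec:.2f} ppv={ppv:.2f} DOR={dor:.2f}")
# sens=0.80 spec=0.64 ppv=0.20 DOR=7.25
```

The made-up numbers illustrate the review's key finding: even with high sensitivity, a low base rate of repetition drags the PPV down.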

  20. Scale dependence of halo bispectrum from non-Gaussian initial conditions in cosmological N-body simulations

    NASA Astrophysics Data System (ADS)

    Nishimichi, Takahiro; Taruya, Atsushi; Koyama, Kazuya; Sabiu, Cristiano

    2010-07-01

    We study the halo bispectrum from non-Gaussian initial conditions. Based on a set of large N-body simulations starting from initial density fields with local-type non-Gaussianity, we find that the halo bispectrum exhibits a strong dependence on the shape and scale of Fourier-space triangles near squeezed configurations at large scales. The amplitude of the halo bispectrum roughly scales as fNL^2. The resulting dependence on the triangular shape is consistent with that predicted by Jeong & Komatsu based on perturbation theory. We systematically investigate this dependence with varying redshifts and halo mass thresholds. It is shown that the fNL dependence of the halo bispectrum is stronger for more massive haloes at higher redshifts. This feature can be a useful discriminator of inflation scenarios in future deep and wide galaxy redshift surveys.

  1. How arbitrary is language?

    PubMed Central

    Monaghan, Padraic; Shillcock, Richard C.; Christiansen, Morten H.; Kirby, Simon

    2014-01-01

    It is a long established convention that the relationship between sounds and meanings of words is essentially arbitrary—typically the sound of a word gives no hint of its meaning. However, there are numerous reported instances of systematic sound–meaning mappings in language, and this systematicity has been claimed to be important for early language development. In a large-scale corpus analysis of English, we show that sound–meaning mappings are more systematic than would be expected by chance. Furthermore, this systematicity is more pronounced for words involved in the early stages of language acquisition and reduces in later vocabulary development. We propose that the vocabulary is structured to enable systematicity in early language learning to promote language acquisition, while also incorporating arbitrariness for later language in order to facilitate communicative expressivity and efficiency. PMID:25092667
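The corpus analysis rests on correlating pairwise sound distances with pairwise meaning distances and comparing the observed correlation against a permutation baseline. A toy numpy sketch with fabricated distances (a full Mantel test, as used in such studies, would permute word labels and rebuild the distance vector rather than shuffling it directly):

```python
import random
import numpy as np

# Toy version of the systematicity test: correlate pairwise phonological
# distances with pairwise semantic distances, then build a permutation
# baseline. All distance values are fabricated for illustration.
rng = random.Random(0)

sound = np.array([0.10, 0.40, 0.35, 0.80, 0.75, 0.50])    # pairwise sound distances
meaning = np.array([0.20, 0.50, 0.30, 0.90, 0.70, 0.55])  # pairwise meaning distances

observed = np.corrcoef(sound, meaning)[0, 1]
print(round(observed, 2))  # 0.96

# Permutation baseline: shuffle one distance vector and re-correlate.
perm = []
m = list(meaning)
for _ in range(1000):
    rng.shuffle(m)
    perm.append(np.corrcoef(sound, m)[0, 1])
p = sum(r >= observed for r in perm) / len(perm)  # empirical one-sided p-value
```

With real vocabulary-scale data the distance vectors contain millions of word pairs, which is what makes the reported systematicity detectable despite its small effect size.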

  2. The Eczema Education Programme: intervention development and model feasibility.

    PubMed

    Jackson, K; Ersser, S J; Dennis, H; Farasat, H; More, A

    2014-07-01

    The systematic support of parents of children with eczema is essential to its effective management; however, few models of support exist. This study examines the rationale, evidence base and development of a large-scale, structured, theory-based, nurse-led intervention, the 'Eczema Education Programme' (EEP), for parents of children with eczema. The aims were to outline the development of the EEP and its model of delivery, determine its feasibility and evaluate it based on service access and parental satisfaction data. Parent-child dyads meeting EEP referral criteria were recruited and demographic information recorded. A questionnaire survey of parental satisfaction was conducted 4 weeks post EEP; parental focus groups at 6 weeks provided comparative qualitative data. Descriptive statistics were derived from the questionnaire data using Predictive Analytics Software (PASW); content analysis was applied to focus group data. A total of 356 parents attended the EEP during the evaluation period. Service access was achieved for those in a challenging population. Both survey data (n = 146 parents, 57%) and focus group data (n = 21) revealed a high level of parental satisfaction with the programme. It was feasible to provide the EEP as an adjunct to normal clinical care on a large scale, achieving a high level of patient/parent satisfaction and access within an urban area of multiple deprivation and high mobility. The intervention is transferable and the results are generalizable to other ethnically diverse child eczema populations within metropolitan areas in Britain. A multicentre RCT is required to test the effectiveness of this intervention on a larger scale. © 2013 European Academy of Dermatology and Venereology.

  3. Single pass tangential flow filtration to debottleneck downstream processing for therapeutic antibody production.

    PubMed

    Dizon-Maspat, Jemelle; Bourret, Justin; D'Agostini, Anna; Li, Feng

    2012-04-01

    As the therapeutic monoclonal antibody (mAb) market continues to grow, optimizing production processes is becoming more critical in improving efficiencies and reducing cost-of-goods in large-scale production. With the recent trend of increasing cell culture titers from upstream process improvements, downstream capacity has become the bottleneck in many existing manufacturing facilities. Single Pass Tangential Flow Filtration (SPTFF) is an emerging technology, which is potentially useful in debottlenecking downstream capacity, especially when the pool tank size is a limiting factor. It can be integrated as part of an existing purification process, after a column chromatography step or a filtration step, without introducing a new unit operation. In this study, SPTFF technology was systematically evaluated for reducing process intermediate volumes by factors of 2× to 10× with multiple mAbs, and the impact of SPTFF on product quality and process yield was analyzed. Finally, the potential fit into the typical 3-column industry platform antibody purification process and its implementation in a commercial-scale manufacturing facility were also evaluated. Our data indicate that using SPTFF to concentrate protein pools is a simple, flexible, and robust operation, which can be implemented at various scales to improve antibody purification process capacity. Copyright © 2011 Wiley Periodicals, Inc.
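The volume reduction SPTFF delivers follows from a simple steady-state mass balance: the volumetric concentration factor is the ratio of feed to retentate flow, and, assuming complete protein retention by the membrane, the outlet concentration scales by the same factor. A sketch with illustrative flows and concentrations:

```python
# Steady-state mass balance for single-pass TFF concentration.
# Numbers are illustrative; assumes complete protein retention (no sieving loss).
def sptff_outlet(feed_flow_L_h, permeate_flow_L_h, feed_conc_g_L):
    retentate_flow = feed_flow_L_h - permeate_flow_L_h
    vcf = feed_flow_L_h / retentate_flow      # volumetric concentration factor
    outlet_conc = feed_conc_g_L * vcf         # protein mass is conserved
    return vcf, outlet_conc

vcf, conc = sptff_outlet(feed_flow_L_h=10.0, permeate_flow_L_h=8.0, feed_conc_g_L=5.0)
print(vcf, conc)  # 5.0 25.0
```

This is why SPTFF relieves pool-tank limits: a 5x volumetric concentration factor shrinks a 1000 L intermediate pool to 200 L in a single pass, with no recirculation tank.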

  4. Energising the WEF nexus to enhance sustainable development at local level.

    PubMed

    Terrapon-Pfaff, Julia; Ortiz, Willington; Dienst, Carmen; Gröne, Marie-Christine

    2018-06-23

    The water-energy-food (WEF) nexus is increasingly recognised as a conceptual framework able to support the efficient implementation of the Sustainable Development Goals (SDGs). Despite growing attention paid to the WEF nexus, the role that renewable energies can play in addressing trade-offs and realising synergies has received limited attention. Until now, the focus of WEF nexus discussions and applications has mainly been on national or global levels, macro-level drivers, material flows and large infrastructure developments. This overlooks the fact that major nexus challenges are faced at local level. Aiming to address these knowledge gaps, the authors conduct a systematic analysis of the linkages between small-scale energy projects in developing countries and the food and water aspects of development. The analysis is based on empirical data from continuous process and impact evaluations complemented by secondary data and relevant literature. The study provides initial insights into how to identify interconnections and the potential benefits of integrating the nexus pillars into local level projects in the global south. The study identifies the complex links which exist between sustainable energy projects and the food and water sectors and highlights that these needs are currently not systematically integrated into project design or project evaluation. A more systematic approach, integrating the water and food pillars into energy planning at local level in the global south, is recommended to avoid trade-offs and enhance the development outcomes and impacts of energy projects. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.

  5. Assessing Communication Skills of Medical Students in Objective Structured Clinical Examinations (OSCE) - A Systematic Review of Rating Scales

    PubMed Central

    Cömert, Musa; Zill, Jördis Maria; Christalle, Eva; Dirmaier, Jörg; Härter, Martin; Scholl, Isabelle

    2016-01-01

    Background Teaching and assessment of communication skills have become essential in medical education. The Objective Structured Clinical Examination (OSCE) has been found as an appropriate means to assess communication skills within medical education. Studies have demonstrated the importance of a valid assessment of medical students’ communication skills. Yet, the validity of the performance scores depends fundamentally on the quality of the rating scales used in an OSCE. Thus, this systematic review aimed at providing an overview of existing rating scales, describing their underlying definition of communication skills, determining the methodological quality of psychometric studies and the quality of psychometric properties of the identified rating scales. Methods We conducted a systematic review to identify psychometrically tested rating scales, which have been applied in OSCE settings to assess communication skills of medical students. Our search strategy comprised three databases (EMBASE, PsycINFO, and PubMed), reference tracking and consultation of experts. We included studies that reported psychometric properties of communication skills assessment rating scales used in OSCEs by examiners only. The methodological quality of included studies was assessed using the COnsensus based Standards for the selection of health status Measurement INstruments (COSMIN) checklist. The quality of psychometric properties was evaluated using the quality criteria of Terwee and colleagues. Results Data of twelve studies reporting on eight rating scales on communication skills assessment in OSCEs were included. Five of eight rating scales were explicitly developed based on a specific definition of communication skills. The methodological quality of studies was mainly poor. The psychometric quality of the eight rating scales was mainly intermediate. 
Discussion Our results reveal that future psychometric evaluation studies focusing on improving the methodological quality are needed in order to yield psychometrically sound results of the OSCEs assessing communication skills. This is especially important given that most OSCE rating scales are used for summative assessment, and thus have an impact on medical students’ academic success. PMID:27031506

  6. Assessing Communication Skills of Medical Students in Objective Structured Clinical Examinations (OSCE)--A Systematic Review of Rating Scales.

    PubMed

    Cömert, Musa; Zill, Jördis Maria; Christalle, Eva; Dirmaier, Jörg; Härter, Martin; Scholl, Isabelle

    2016-01-01

    Teaching and assessment of communication skills have become essential in medical education. The Objective Structured Clinical Examination (OSCE) has been found as an appropriate means to assess communication skills within medical education. Studies have demonstrated the importance of a valid assessment of medical students' communication skills. Yet, the validity of the performance scores depends fundamentally on the quality of the rating scales used in an OSCE. Thus, this systematic review aimed at providing an overview of existing rating scales, describing their underlying definition of communication skills, determining the methodological quality of psychometric studies and the quality of psychometric properties of the identified rating scales. We conducted a systematic review to identify psychometrically tested rating scales, which have been applied in OSCE settings to assess communication skills of medical students. Our search strategy comprised three databases (EMBASE, PsycINFO, and PubMed), reference tracking and consultation of experts. We included studies that reported psychometric properties of communication skills assessment rating scales used in OSCEs by examiners only. The methodological quality of included studies was assessed using the COnsensus based Standards for the selection of health status Measurement INstruments (COSMIN) checklist. The quality of psychometric properties was evaluated using the quality criteria of Terwee and colleagues. Data of twelve studies reporting on eight rating scales on communication skills assessment in OSCEs were included. Five of eight rating scales were explicitly developed based on a specific definition of communication skills. The methodological quality of studies was mainly poor. The psychometric quality of the eight rating scales was mainly intermediate. 
Our results reveal that future psychometric evaluation studies focusing on improving the methodological quality are needed in order to yield psychometrically sound results of the OSCEs assessing communication skills. This is especially important given that most OSCE rating scales are used for summative assessment, and thus have an impact on medical students' academic success.

  7. Effectiveness of Virtual Reality in Children With Cerebral Palsy: A Systematic Review and Meta-Analysis of Randomized Controlled Trials.

    PubMed

    Chen, Yuping; Fanchiang, HsinChen D; Howard, Ayanna

    2018-01-01

    Researchers have recently investigated the effectiveness of virtual reality (VR) in helping children with cerebral palsy (CP) improve motor function. A systematic review of randomized controlled trials (RCTs) using a meta-analytic method to examine the effectiveness of VR in children with CP was thus needed. The purpose of this study was to update the current evidence about VR by systematically examining the research literature. A systematic literature search of PubMed, CINAHL, Cochrane Central Register of Controlled Trials, ERIC, PsycINFO, and Web of Science up to December 2016 was conducted. Studies with an RCT design, children with CP, comparisons of VR with other interventions, and movement-related outcomes were included. A template was created to systematically code the demographic, methodological, and miscellaneous variables of each RCT. The Physiotherapy Evidence Database (PEDro) scale was used to evaluate study quality. Effect sizes were computed and combined using meta-analysis software. Moderator analyses were also used to explain the heterogeneity of the effect sizes across RCTs. The literature search yielded 19 RCT studies with fair to good methodological quality. Overall, VR provided a large effect size (d = 0.861) when compared with other interventions. A large effect of VR on arm function (d = 0.835) and postural control (d = 1.003) and a medium effect on ambulation (d = 0.755) were also found. Only the VR type affected the overall VR effect: an engineer-built system was more effective than a commercial system. The RCTs included in this study were of fair to good quality, had a high level of heterogeneity and small sample sizes, and used various intervention protocols. When compared with other interventions, VR seems to be an effective intervention for improving motor function in children with CP. © 2017 American Physical Therapy Association
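Pooled effect sizes like the d values above are typically obtained by inverse-variance weighting of the per-study standardized mean differences. A minimal fixed-effect sketch with made-up study effects and variances (the review itself, given the reported heterogeneity, may well have used a random-effects model):

```python
import math

# Inverse-variance (fixed-effect) pooling of standardized mean differences.
# Effects and variances below are fabricated, not taken from the review.
def pooled_effect(effects, variances):
    weights = [1.0 / v for v in variances]            # weight = 1/variance
    d = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))                # SE of the pooled estimate
    return d, (d - 1.96 * se, d + 1.96 * se)          # point estimate, 95% CI

d, ci = pooled_effect(effects=[0.9, 0.7, 1.1], variances=[0.04, 0.09, 0.16])
print(round(d, 3))  # 0.877
```

A random-effects version would add a between-study variance term (e.g. a DerSimonian-Laird tau-squared) to each study variance before weighting.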

  8. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key to improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways of performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model-data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
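The traditional accuracy-based measures mentioned above reduce to simple comparisons of model output against observations. A toy sketch of two such metrics with illustrative soil-moisture values (not LVT's actual implementation, which operates on gridded, time-stamped datasets):

```python
import math

# Two standard accuracy metrics of the kind a land-surface verification
# toolkit computes when comparing model output against observations.
def bias(model, obs):
    """Mean error: positive means the model overestimates on average."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root-mean-square error: overall magnitude of the mismatch."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

model = [0.30, 0.28, 0.35, 0.40]   # e.g. simulated soil moisture [m3/m3]
obs   = [0.28, 0.30, 0.33, 0.37]   # e.g. in-situ measurements

print(round(bias(model, obs), 4), round(rmse(model, obs), 4))  # 0.0125 0.0229
```

The uncertainty, ensemble, and information-theory diagnostics LVT adds go beyond such pairwise metrics, but all share this model-versus-observation structure.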

  9. Large scale study on the variation of RF energy absorption in the head & brain regions of adults and children and evaluation of the SAM phantom conservativeness.

    PubMed

    Keshvari, J; Kivento, M; Christ, A; Bit-Babik, G

    2016-04-21

    This paper presents the results of two computational large scale studies using highly realistic exposure scenarios, MRI based human head and hand models, and two mobile phone models. The objectives are (i) to study the relevance of age when people are exposed to RF by comparing adult and child heads and (ii) to analyze and discuss the conservativeness of the SAM phantom for all age groups. Representative use conditions were simulated using detailed CAD models of two mobile phones operating between 900 MHz and 1950 MHz including configurations with the hand holding the phone, which were not considered in most previous studies. The peak spatial-average specific absorption rate (psSAR) in the head and the pinna tissues is assessed using anatomically accurate head and hand models. The first of the two mentioned studies involved nine head-, four hand- and two phone-models, the second study included six head-, four hand- and three simplified phone-models (over 400 configurations in total). In addition, both studies also evaluated the exposure using the SAM phantom. Results show no systematic differences between psSAR induced in the adult and child heads. The exposure level and its variation for different age groups may be different for particular phones, but no correlation between psSAR and model age was found. The psSAR from all exposure conditions was compared to the corresponding configurations using SAM, which was found to be conservative in the large majority of cases.

  10. Large scale study on the variation of RF energy absorption in the head & brain regions of adults and children and evaluation of the SAM phantom conservativeness

    NASA Astrophysics Data System (ADS)

    Keshvari, J.; Kivento, M.; Christ, A.; Bit-Babik, G.

    2016-04-01

    This paper presents the results of two computational large scale studies using highly realistic exposure scenarios, MRI based human head and hand models, and two mobile phone models. The objectives are (i) to study the relevance of age when people are exposed to RF by comparing adult and child heads and (ii) to analyze and discuss the conservativeness of the SAM phantom for all age groups. Representative use conditions were simulated using detailed CAD models of two mobile phones operating between 900 MHz and 1950 MHz including configurations with the hand holding the phone, which were not considered in most previous studies. The peak spatial-average specific absorption rate (psSAR) in the head and the pinna tissues is assessed using anatomically accurate head and hand models. The first of the two mentioned studies involved nine head-, four hand- and two phone-models, the second study included six head-, four hand- and three simplified phone-models (over 400 configurations in total). In addition, both studies also evaluated the exposure using the SAM phantom. Results show no systematic differences between psSAR induced in the adult and child heads. The exposure level and its variation for different age groups may be different for particular phones, but no correlation between psSAR and model age was found. The psSAR from all exposure conditions was compared to the corresponding configurations using SAM, which was found to be conservative in the large majority of cases.

  11. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, here we carried out a systematic comparative evaluation of these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
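In screens like E-MAP, GIM, and SGA, a genetic interaction is commonly scored as the deviation of the measured double-mutant fitness from the multiplicative expectation of the two single mutants. A toy sketch with made-up fitness values (the paper's matrix-based framework generalizes this idea across whole data matrices):

```python
# Multiplicative-model interaction score: epsilon = W_ab - f_a * f_b,
# where f_a, f_b are single-mutant fitnesses and W_ab the double-mutant
# fitness (all relative to wild type = 1.0). Values are made up.
# Negative epsilon suggests synthetic sickness; positive suggests alleviation.
def interaction_score(f_a, f_b, w_ab):
    return w_ab - f_a * f_b

eps = interaction_score(f_a=0.9, f_b=0.8, w_ab=0.4)
print(round(eps, 2))  # -0.32 -> candidate negative (synthetic sick) interaction
```

Because each platform estimates f and W with different biases, the same gene pair can receive different epsilon scores across screens, which is the comparability problem the paper addresses.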

  12. What is actually measured in process evaluations for worksite health promotion programs: a systematic review

    PubMed Central

    2013-01-01

    Background Numerous worksite health promotion programs (WHPPs) have been implemented in the past years to improve employees’ health and lifestyle (i.e., physical activity, nutrition, smoking, alcohol use and relaxation). Research has primarily focused on the effectiveness of these WHPPs, whereas process evaluations provide essential information necessary to improve large-scale implementation across other settings. Therefore, this review aims to: (1) further our understanding of the quality of process evaluations alongside effect evaluations for WHPPs, (2) identify barriers/facilitators affecting implementation, and (3) explore the relationship between effectiveness and the implementation process. Methods Pubmed, EMBASE, PsycINFO, and Cochrane (controlled trials) were searched from 2000 to July 2012 for peer-reviewed (randomized) controlled trials published in English reporting on both the effectiveness and the implementation process of a WHPP focusing on physical activity, smoking cessation, alcohol use, healthy diet and/or relaxation at work, targeting employees aged 18-65 years. Results Of the 307 effect evaluations identified, twenty-two (7.2%) published an additional process evaluation and were included in this review. The results showed that eight of those studies based their process evaluation on a theoretical framework. The methodological quality of nine process evaluations was good. The most frequently reported process components were dose delivered and dose received. Over 50 different implementation barriers/facilitators were identified. The most frequently reported facilitator was strong management support. Lack of resources was the most frequently reported barrier. Seven studies examined the link between implementation and effectiveness. In general a positive association was found between fidelity, dose and the primary outcome of the program. Conclusions Process evaluations are not systematically performed alongside effectiveness studies for WHPPs.
The quality of the process evaluations is mostly poor to average, resulting in a lack of systematically measured barriers/facilitators. The narrow focus on implementation makes it difficult to explore the relationship between effectiveness and implementation. Furthermore, the operationalisation of process components varied between studies, indicating a need for consensus about defining and operationalising process components. PMID:24341605

  13. Robust phenotyping strategies for evaluation of stem non-structural carbohydrates (NSC) in rice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Diane R.; Wolfrum, Edward J.; Virk, Parminder

    Rice plants (Oryza sativa) accumulate excess photoassimilates in the form of non-structural carbohydrates (NSCs) in their stems prior to heading that can later be mobilized to supplement photosynthate production during grain-filling. Despite longstanding interest in stem NSC for rice improvement, the dynamics of NSC accumulation, remobilization, and re-accumulation that have genetic potential for optimization have not been systematically investigated. Here we conducted three pilot experiments to lay the groundwork for large-scale diversity studies on rice stem NSC. We assessed the relationship of stem NSC components with 21 agronomic traits in large-scale, tropical yield trials using 33 breeder-nominated lines, established an appropriate experimental design for future genetic studies using a Bayesian framework to sample sub-datasets from highly replicated greenhouse data using 36 genetically diverse genotypes, and used 434 phenotypically divergent rice stem samples to develop two partial least-squares (PLS) models using near-infrared (NIR) spectra for accurate, rapid prediction of rice stem starch, sucrose, and total non-structural carbohydrates. Lastly, we find evidence that stem reserves are most critical for short-duration varieties and suggest that pre-heading stem NSC is worthy of further experimentation for breeding early maturing rice.

  14. Robust phenotyping strategies for evaluation of stem non-structural carbohydrates (NSC) in rice

    PubMed Central

    Wang, Diane R.; Wolfrum, Edward J.; Virk, Parminder; Ismail, Abdelbagi; Greenberg, Anthony J.; McCouch, Susan R.

    2016-01-01

    Rice plants (Oryza sativa) accumulate excess photoassimilates in the form of non-structural carbohydrates (NSCs) in their stems prior to heading that can later be mobilized to supplement photosynthate production during grain-filling. Despite longstanding interest in stem NSC for rice improvement, the dynamics of NSC accumulation, remobilization, and re-accumulation that have genetic potential for optimization have not been systematically investigated. Here we conducted three pilot experiments to lay the groundwork for large-scale diversity studies on rice stem NSC. We assessed the relationship of stem NSC components with 21 agronomic traits in large-scale, tropical yield trials using 33 breeder-nominated lines, established an appropriate experimental design for future genetic studies using a Bayesian framework to sample sub-datasets from highly replicated greenhouse data using 36 genetically diverse genotypes, and used 434 phenotypically divergent rice stem samples to develop two partial least-squares (PLS) models using near-infrared (NIR) spectra for accurate, rapid prediction of rice stem starch, sucrose, and total non-structural carbohydrates. We find evidence that stem reserves are most critical for short-duration varieties and suggest that pre-heading stem NSC is worthy of further experimentation for breeding early maturing rice. PMID:27707775
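PLS calibration of the kind used here regresses a response (e.g. starch content) onto latent directions of the NIR spectra. A self-contained one-component PLS1 sketch on synthetic spectra (real models would use multiple components, spectral preprocessing, and cross-validated component selection; all values below are fabricated):

```python
import numpy as np

# Minimal one-component PLS1 calibration, illustrating how stem starch could
# be predicted from NIR spectra. The "spectra" are synthetic: each sample is
# a flat baseline plus a starch band scaled by concentration.
def pls1_fit(X, y):
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    w = Xc.T @ yc
    w /= np.linalg.norm(w)          # weight vector (first latent direction)
    t = Xc @ w                      # scores
    q = (t @ yc) / (t @ t)          # regression of y on scores
    return x_mean, y_mean, w, q

def pls1_predict(model, x_new):
    x_mean, y_mean, w, q = model
    return y_mean + q * ((x_new - x_mean) @ w)

band = np.array([0.1, 0.5, 1.0, 0.5, 0.1])   # synthetic "starch band" shape
conc = np.array([2.0, 4.0, 6.0, 8.0])        # % starch (made up)
X = 0.3 + np.outer(conc, band)               # spectra = baseline + conc * band
model = pls1_fit(X, conc)
pred = pls1_predict(model, 0.3 + 7.0 * band)
print(np.round(pred, 3))  # 7.0
```

Because the synthetic spectra vary along a single latent direction, one component recovers the concentration exactly; real NIR data would need several.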

  15. Robust phenotyping strategies for evaluation of stem non-structural carbohydrates (NSC) in rice

    DOE PAGES

    Wang, Diane R.; Wolfrum, Edward J.; Virk, Parminder; ...

    2016-10-05

    Rice plants (Oryza sativa) accumulate excess photoassimilates in the form of non-structural carbohydrates (NSCs) in their stems prior to heading that can later be mobilized to supplement photosynthate production during grain-filling. Despite longstanding interest in stem NSC for rice improvement, the dynamics of NSC accumulation, remobilization, and re-accumulation that have genetic potential for optimization have not been systematically investigated. Here we conducted three pilot experiments to lay the groundwork for large-scale diversity studies on rice stem NSC. We assessed the relationship of stem NSC components with 21 agronomic traits in large-scale, tropical yield trials using 33 breeder-nominated lines, established an appropriate experimental design for future genetic studies using a Bayesian framework to sample sub-datasets from highly replicated greenhouse data using 36 genetically diverse genotypes, and used 434 phenotypically divergent rice stem samples to develop two partial least-squares (PLS) models using near-infrared (NIR) spectra for accurate, rapid prediction of rice stem starch, sucrose, and total non-structural carbohydrates. Lastly, we find evidence that stem reserves are most critical for short-duration varieties and suggest that pre-heading stem NSC is worthy of further experimentation for breeding early maturing rice.

  16. Metabolomic Modularity Analysis (MMA) to Quantify Human Liver Perfusion Dynamics.

    PubMed

    Sridharan, Gautham Vivek; Bruinsma, Bote Gosse; Bale, Shyam Sundhar; Swaminathan, Anandh; Saeidi, Nima; Yarmush, Martin L; Uygun, Korkut

    2017-11-13

    Large-scale -omics data are now ubiquitously utilized to capture and interpret global responses to perturbations in biological systems, such as the impact of disease states on cells, tissues, and whole organs. Metabolomics data, in particular, are difficult to interpret for providing physiological insight because predefined biochemical pathways used for analysis are inherently biased and fail to capture more complex network interactions that span multiple canonical pathways. In this study, we introduce a novel approach coined Metabolomic Modularity Analysis (MMA) as a graph-based algorithm to systematically identify metabolic modules of reactions enriched with metabolites flagged to be statistically significant. A defining feature of the algorithm is its ability to determine modularity that highlights interactions between reactions mediated by the production and consumption of cofactors and other hub metabolites. As a case study, we evaluated the metabolic dynamics of discarded human livers using time-course metabolomics data and MMA to identify modules that explain the observed physiological changes leading to liver recovery during subnormothermic machine perfusion (SNMP). MMA was performed on a large-scale liver-specific human metabolic network that was weighted based on metabolomics data and identified cofactor-mediated modules that would not have been discovered by traditional metabolic pathway analyses.
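The modularity score that graph-based module detection of this kind optimizes can be illustrated with the standard Newman weighted-graph formula. The toy reaction network, edge weights ("shared flagged metabolites"), and partitions below are invented for demonstration and are not taken from the paper.

```python
import numpy as np

def modularity(A, labels):
    """Newman modularity Q for a weighted undirected graph (A symmetric)."""
    two_m = A.sum()                     # total edge weight, counted twice
    k = A.sum(axis=1)                   # weighted node degrees
    Q = 0.0
    for c in set(labels):
        idx = [i for i, lab in enumerate(labels) if lab == c]
        Q += A[np.ix_(idx, idx)].sum() / two_m - (k[idx].sum() / two_m) ** 2
    return Q

# Toy reaction-reaction graph: weight = number of shared flagged metabolites
# (all values invented). Reactions 0-2 and 3-5 form two modules bridged weakly.
A = np.array([[0, 3, 2, 0, 0, 0],
              [3, 0, 4, 0, 0, 0],
              [2, 4, 0, 1, 0, 0],
              [0, 0, 1, 0, 3, 2],
              [0, 0, 0, 3, 0, 4],
              [0, 0, 0, 2, 4, 0]], float)
q_modules = modularity(A, [0, 0, 0, 1, 1, 1])   # the "true" module partition
q_shuffled = modularity(A, [0, 1, 0, 1, 0, 1])  # an arbitrary partition
```

A good partition scores well above zero while an arbitrary one scores near or below zero, which is what lets an algorithm rank candidate modules.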

  17. Systematic observations of the slip pulse properties of large earthquake ruptures

    USGS Publications Warehouse

    Melgar, Diego; Hayes, Gavin

    2017-01-01

    In earthquake dynamics there are two end member models of rupture: propagating cracks and self-healing pulses. These arise due to different properties of faults and have implications for seismic hazard; rupture mode controls near-field strong ground motions. Past studies favor the pulse-like mode of rupture; however, due to a variety of limitations, it has proven difficult to systematically establish their kinematic properties. Here we synthesize observations from a database of >150 rupture models of earthquakes spanning M7–M9 processed in a uniform manner and show that the magnitude scaling properties of these slip pulses indicate self-similarity. Further, we find that large and very large events are statistically distinguishable relatively early (at ~15 s) in the rupture process. This suggests that, with dense regional geophysical networks, strong ground motions from a large rupture can be identified before their onset across the source region.
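Self-similarity of source parameters is typically assessed with a log-log regression against seismic moment. The sketch below fits a synthetic duration-moment catalogue built with an assumed cube-root scaling; the amplitudes, scatter, and magnitude range are illustrative only, not the paper's data.

```python
import numpy as np

# Synthetic catalogue: pulse duration assumed to grow as moment^(1/3)
# (self-similar scaling); all numbers are invented for illustration.
rng = np.random.default_rng(1)
M0 = 10 ** rng.uniform(19.5, 22.5, 40)                  # seismic moment, N*m (~M7-M9)
T = 2.0 * (M0 / 1e19) ** (1 / 3) * np.exp(0.05 * rng.normal(size=40))  # duration, s
slope, intercept = np.polyfit(np.log10(M0), np.log10(T), 1)
```

A fitted log-log slope near 1/3 is the classical signature of self-similar scaling; systematic departures from it would indicate a break in self-similarity.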

  18. Enhanced subarctic Pacific stratification and nutrient utilization during glacials over the last 1.2 Myr

    NASA Astrophysics Data System (ADS)

    Knudson, Karla P.; Ravelo, Ana Christina

    2015-11-01

    The relationship between climate, biological productivity, and nutrient flux is of considerable interest in the subarctic Pacific, which represents an important high-nitrate, low-chlorophyll region. While previous studies suggest that changes in iron supply and/or physical ocean stratification could hypothetically explain orbital-scale fluctuations in subarctic Pacific nutrient utilization and productivity, previous records of nutrient utilization are too short to evaluate these relationships over many glacial-interglacial cycles. We present new, high-resolution records of sedimentary δ15N, which offer the first opportunity to evaluate systematic, orbital-scale variations in subarctic Pacific nitrate utilization from 1.2 Ma. Nitrate utilization was enhanced during all glacials, varied with orbital-scale periodicity since the mid-Pleistocene transition, was strongly correlated with enhanced aeolian dust and low atmospheric CO2, but was not correlated with productivity. These results suggest that glacial stratification, rather than iron fertilization, systematically exerted an important regional control on nutrient utilization and air-sea carbon flux.

  19. Estimating uncertainty of Full Waveform Inversion with Ensemble-based methods

    NASA Astrophysics Data System (ADS)

    Thurin, J.; Brossier, R.; Métivier, L.

    2017-12-01

    Uncertainty estimation is a key feature of tomographic applications for robust interpretation. However, this information is often missing in the frame of large-scale linearized inversions, and only the results at convergence are shown, despite the ill-posed nature of the problem. This issue is common in the Full Waveform Inversion community. While a few methodologies have already been proposed in the literature, standard FWI workflows do not yet include any systematic uncertainty quantification method, but often try to assess the result's quality through cross-comparison with other seismic results or with other geophysical data. With the development of large seismic networks/surveys, the increase in computational power and the increasingly systematic application of FWI, it is crucial to tackle this problem and to propose robust and affordable workflows, in order to address the uncertainty quantification problem faced for near-surface targets, crustal exploration, as well as regional and global scales. In this work (Thurin et al., 2017a,b), we propose an approach which takes advantage of the Ensemble Transform Kalman Filter (ETKF) proposed by Bishop et al. (2001), in order to estimate a low-rank approximation of the posterior covariance matrix of the FWI problem, allowing us to evaluate some uncertainty information of the solution. Instead of solving the FWI problem through a Bayesian inversion with the ETKF, we chose to combine a conventional FWI, based on local optimization, with the ETKF strategies. This scheme combines the efficiency of local optimization for solving large-scale inverse problems and makes sampling of the local solution space possible thanks to its embarrassingly parallel property. References: Bishop, C. H., Etherton, B. J. and Majumdar, S. J., 2001. Adaptive sampling with the ensemble transform Kalman filter. Part I: Theoretical aspects. Monthly Weather Review, 129(3), 420-436. Thurin, J., Brossier, R. and Métivier, L., 2017a. Ensemble-Based Uncertainty Estimation in Full Waveform Inversion. 79th EAGE Conference and Exhibition 2017 (12-15 June 2017). Thurin, J., Brossier, R. and Métivier, L., 2017b. An Ensemble-Transform Kalman Filter - Full Waveform Inversion scheme for Uncertainty estimation. SEG Technical Program Expanded Abstracts 2012.
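A single ETKF analysis step of the kind the authors build on (Bishop et al., 2001) can be sketched in a few lines of linear algebra. The toy 3-parameter "model", observation operator, and noise levels below are assumptions for illustration, not the FWI setup; the point is the low-rank ensemble-space posterior covariance.

```python
import numpy as np

def etkf_update(X, y, H, R):
    """One ETKF analysis step (Bishop et al., 2001).
    X: (n, N) prior ensemble; y: observations; H: obs operator; R: obs covariance.
    Returns the analysis ensemble and a low-rank posterior covariance estimate."""
    n, N = X.shape
    xm = X.mean(axis=1)
    Xp = X - xm[:, None]                                     # state anomalies
    Yp = H @ Xp                                              # obs-space anomalies
    Rinv = np.linalg.inv(R)
    evals, evecs = np.linalg.eigh((N - 1) * np.eye(N) + Yp.T @ Rinv @ Yp)
    Pa_t = evecs @ np.diag(1.0 / evals) @ evecs.T            # ensemble-space posterior
    wm = Pa_t @ Yp.T @ Rinv @ (y - H @ xm)                   # mean-update weights
    W = evecs @ np.diag(np.sqrt((N - 1) / evals)) @ evecs.T  # symmetric sqrt transform
    Xa = xm[:, None] + Xp @ (wm[:, None] + W)                # analysis ensemble
    return Xa, Xp @ Pa_t @ Xp.T                              # low-rank posterior cov

rng = np.random.default_rng(4)
X = rng.normal(size=(3, 20))                     # prior ensemble, toy 3-parameter model
H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
y = np.array([1.0, 2.0])                         # two observations
R = 0.01 * np.eye(2)
Xa, Pa = etkf_update(X, y, H, R)
```

The analysis mean moves toward the observations and the returned covariance shrinks relative to the prior; its rank is at most the ensemble size, which is what makes the approach affordable at FWI scale.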

  20. Towards resolving the complete fern tree of life.

    PubMed

    Lehtonen, Samuli

    2011-01-01

    In the past two decades, molecular systematic studies have revolutionized our understanding of the evolutionary history of ferns. The availability of large molecular data sets together with efficient computer algorithms now enables us to reconstruct evolutionary histories with previously unseen completeness. Here, the most comprehensive fern phylogeny to date, representing over one-fifth of the extant global fern diversity, is inferred based on four plastid genes. Parsimony and maximum-likelihood analyses provided mostly congruent results and in general supported the prevailing view on higher-level fern systematics. At a deep phylogenetic level, the position of horsetails depended on the optimality criterion chosen, with horsetails positioned as the sister group either of the Marattiopsida-Polypodiopsida clade or of the Polypodiopsida. The analyses demonstrate the power of using a 'supermatrix' approach to resolve large-scale phylogenies and reveal questionable taxonomies. These results provide a valuable background for future research on fern systematics, ecology, biogeography and other evolutionary studies.
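A 'supermatrix' analysis concatenates per-gene alignments into one matrix, padding taxa that lack data for a gene with missing-data symbols. The sketch below shows only that bookkeeping; the taxon names and sequence fragments are fabricated toy data, not the study's alignment.

```python
def build_supermatrix(gene_alignments):
    """Concatenate per-gene alignments; taxa missing a gene get '?' padding."""
    taxa = sorted({t for aln in gene_alignments for t in aln})
    matrix = {t: "" for t in taxa}
    for aln in gene_alignments:
        length = len(next(iter(aln.values())))   # alignment length for this gene
        for t in taxa:
            matrix[t] += aln.get(t, "?" * length)
    return matrix

# Toy aligned fragments for two plastid genes (sequences are fabricated).
rbcL = {"Equisetum": "ATGTCA", "Pteridium": "ATGTCG", "Osmunda": "ATGACA"}
atpB = {"Equisetum": "GGAT", "Pteridium": "GGTT"}       # Osmunda lacks atpB data
supermatrix = build_supermatrix([rbcL, atpB])
```

Every taxon ends up with a row of equal length, so tree-inference software can use all available genes even when sampling is incomplete.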

  1. Study of muon-induced neutron production using accelerator muon beam at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakajima, Y.; Lin, C. J.; Ochoa-Ricoux, J. P.

    2015-08-17

    Cosmogenic muon-induced neutrons are one of the most problematic backgrounds for various underground experiments for rare event searches. In order to accurately understand such backgrounds, experimental data with high statistics and well-controlled systematics is essential. We performed a test experiment to measure muon-induced neutron production yield and energy spectrum using a high-energy accelerator muon beam at CERN. We successfully observed neutrons from 160 GeV/c muon interactions on lead, and measured kinetic energy distributions for various production angles. Work towards evaluation of the absolute neutron production yield is underway. This work also demonstrates that the setup is feasible for a future large-scale experiment for a more comprehensive study of muon-induced neutron production.

  2. Infusion phlebitis assessment measures: a systematic review

    PubMed Central

    Ray-Barruel, Gillian; Polit, Denise F; Murfield, Jenny E; Rickard, Claire M

    2014-01-01

    Rationale, aims and objectives: Phlebitis is a common and painful complication of peripheral intravenous cannulation. The aim of this review was to identify the measures used in infusion phlebitis assessment and evaluate evidence regarding their reliability, validity, responsiveness and feasibility. Method: We conducted a systematic literature review of the Cochrane Library, Ovid MEDLINE and EBSCO CINAHL until September 2013. All English-language studies (randomized controlled trials, prospective cohort and cross-sectional) that used an infusion phlebitis scale were retrieved and analysed to determine which symptoms were included in each scale and how these were measured. We evaluated studies that reported testing the psychometric properties of phlebitis assessment scales using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) guidelines. Results: Infusion phlebitis was the primary outcome measure in 233 studies. Fifty-three (23%) of these provided no actual definition of phlebitis. Of the 180 studies that reported measuring phlebitis incidence and/or severity, 101 (56%) used a scale and 79 (44%) used a definition alone. We identified 71 different phlebitis assessment scales. Three scales had undergone some psychometric analyses, but no scale had been rigorously tested. Conclusion: Many phlebitis scales exist, but none has been thoroughly validated for use in clinical practice. A lack of consensus on phlebitis measures has likely contributed to disparities in reported phlebitis incidence, precluding meaningful comparison of phlebitis rates. PMID:24401116

  3. Systematic content evaluation and review of measurement properties of questionnaires for measuring self-reported fatigue among older people.

    PubMed

    Egerton, Thorlene; Riphagen, Ingrid I; Nygård, Arnhild J; Thingstad, Pernille; Helbostad, Jorunn L

    2015-09-01

    The assessment of fatigue in older people requires simple and user-friendly questionnaires that capture the phenomenon, yet are free from items indistinguishable from other disorders and experiences. This study aimed to evaluate the content, and systematically review and rate the measurement properties of self-report questionnaires for measuring fatigue, in order to identify the most suitable questionnaires for older people. This study first involved identification of questionnaires that purport to measure self-reported fatigue, and evaluation of the content using a rating scale developed for the purpose from contemporary understanding of the construct. Secondly, for the questionnaires that had acceptable content, we identified studies reporting measurement properties and rated the methodological quality of those studies according to the COSMIN system. Finally, we extracted and synthesised the results of the studies to give an overall rating for each questionnaire for each measurement property. The protocol was registered with PROSPERO (CRD42013005589). Of the 77 identified questionnaires, twelve were selected for review after content evaluation. Methodological quality varied, and there was a lack of information on measurement error and responsiveness. The PROMIS-Fatigue item bank and short forms perform the best. The FACIT-Fatigue scale, Parkinson's Fatigue Scale, Perform Questionnaire, and Uni-dimensional Fatigue Impact Scale also perform well and can be recommended. Minor modifications to improve performance are suggested. Further evaluation of unresolved measurement properties, particularly with samples including older people, is needed for all the recommended questionnaires.

  4. Research on the Construction Management and Sustainable Development of Large-Scale Scientific Facilities in China

    NASA Astrophysics Data System (ADS)

    Guiquan, Xi; Lin, Cong; Xuehui, Jin

    2018-05-01

    As an important platform for scientific and technological development, large-scale scientific facilities are the cornerstone of technological innovation and a guarantee for economic and social development. Research on the management of large-scale scientific facilities can play a key role in scientific research, sociology and key national strategy. This paper reviews the characteristics of large-scale scientific facilities and summarizes the development status of China's large-scale scientific facilities. Finally, the construction, management, operation and evaluation of large-scale scientific facilities are analyzed from the perspective of sustainable development.

  5. Unravelling connections between river flow and large-scale climate: experiences from Europe

    NASA Astrophysics Data System (ADS)

    Hannah, D. M.; Kingston, D. G.; Lavers, D.; Stagge, J. H.; Tallaksen, L. M.

    2016-12-01

    The United Nations has identified better knowledge of large-scale water cycle processes as essential for socio-economic development and global water-food-energy security. In this context, and given the ever-growing concerns about climate change/variability and human impacts on hydrology, there is an urgent research need: (a) to quantify space-time variability in regional river flow, and (b) to improve hydroclimatological understanding of climate-flow connections as a basis for identifying current and future water-related issues. In this paper, we draw together studies undertaken at the pan-European scale: (1) to evaluate current methods for assessing space-time dynamics for different streamflow metrics (annual regimes, low flows and high flows) and for linking flow variability to atmospheric drivers (circulation indices, air-masses, gridded climate fields and vapour flux); and (2) to propose a plan for future research connecting streamflow and the atmospheric conditions in Europe and elsewhere. We believe this research makes a useful, unique contribution to the literature through a systematic inter-comparison of different streamflow metrics and atmospheric descriptors. In our findings, we highlight the need to consider appropriate atmospheric descriptors (dependent on the target flow metric and region of interest) and to develop analytical techniques that best characterise connections in the ocean-atmosphere-land surface process chain. We stress the need to consider not only atmospheric interactions, but also the role of river basin-scale terrestrial hydrological processes in modifying the climate signal response of river flows.
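Linking a streamflow metric to an atmospheric driver usually begins with a simple correlation screen between an annual flow series and a circulation index. The sketch below uses a synthetic 50-year "NAO-like" index and synthetic winter flows, both invented for illustration; real analyses would use observed indices and gauged records.

```python
import numpy as np

# Synthetic 50-year records: an "NAO-like" circulation index and a winter
# streamflow metric driven by it (all numbers invented for illustration).
rng = np.random.default_rng(2)
nao = rng.normal(size=50)                                # circulation index
flow = 120.0 + 35.0 * nao + rng.normal(0.0, 10.0, 50)    # flow metric, assumed response
r = np.corrcoef(nao, flow)[0, 1]
```

A strong correlation flags the index as a candidate driver for that metric and region; the choice of descriptor would then be refined per flow metric, as the paper argues.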

  6. Joint analysis of galaxy-galaxy lensing and galaxy clustering: Methodology and forecasts for Dark Energy Survey

    DOE PAGES

    Park, Y.; Krause, E.; Dodelson, S.; ...

    2016-09-30

    The joint analysis of galaxy-galaxy lensing and galaxy clustering is a promising method for inferring the growth function of large scale structure. Our analysis will be carried out on data from the Dark Energy Survey (DES), with its measurements of both the distribution of galaxies and the tangential shears of background galaxies induced by these foreground lenses. We develop a practical approach to modeling the assumptions and systematic effects affecting small scale lensing, which provides halo masses, and large scale galaxy clustering. Introducing parameters that characterize the halo occupation distribution (HOD), photometric redshift uncertainties, and shear measurement errors, we study how external priors on different subsets of these parameters affect our growth constraints. Degeneracies within the HOD model, as well as between the HOD and the growth function, are identified as the dominant source of complication, with other systematic effects sub-dominant. The impact of HOD parameters and their degeneracies necessitate the detailed joint modeling of the galaxy sample that we employ. Finally, we conclude that DES data will provide powerful constraints on the evolution of structure growth in the universe, conservatively/optimistically constraining the growth function to 7.9%/4.8% with its first-year data that covered over 1000 square degrees, and to 3.9%/2.3% with its full five-year data that will survey 5000 square degrees, including both statistical and systematic uncertainties.

  7. Joint analysis of galaxy-galaxy lensing and galaxy clustering: Methodology and forecasts for Dark Energy Survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.; Krause, E.; Dodelson, S.

    The joint analysis of galaxy-galaxy lensing and galaxy clustering is a promising method for inferring the growth function of large scale structure. Our analysis will be carried out on data from the Dark Energy Survey (DES), with its measurements of both the distribution of galaxies and the tangential shears of background galaxies induced by these foreground lenses. We develop a practical approach to modeling the assumptions and systematic effects affecting small scale lensing, which provides halo masses, and large scale galaxy clustering. Introducing parameters that characterize the halo occupation distribution (HOD), photometric redshift uncertainties, and shear measurement errors, we study how external priors on different subsets of these parameters affect our growth constraints. Degeneracies within the HOD model, as well as between the HOD and the growth function, are identified as the dominant source of complication, with other systematic effects sub-dominant. The impact of HOD parameters and their degeneracies necessitate the detailed joint modeling of the galaxy sample that we employ. Finally, we conclude that DES data will provide powerful constraints on the evolution of structure growth in the universe, conservatively/optimistically constraining the growth function to 7.9%/4.8% with its first-year data that covered over 1000 square degrees, and to 3.9%/2.3% with its full five-year data that will survey 5000 square degrees, including both statistical and systematic uncertainties.

  8. Designing an External Evaluation of a Large-Scale Software Development Project.

    ERIC Educational Resources Information Center

    Collis, Betty; Moonen, Jef

    This paper describes the design and implementation of the evaluation of the POCO Project, a large-scale national software project in the Netherlands which incorporates the perspective of an evaluator throughout the entire span of the project, and uses the experiences gained from it to suggest an evaluation procedure that could be applied to other…

  9. Planck 2015 results. III. LFI systematic uncertainties

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battaglia, P.; Battaner, E.; Benabed, K.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Burigana, C.; Butler, R. C.; Calabrese, E.; Catalano, A.; Christensen, P. R.; Colombo, L. P. L.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Dickinson, C.; Diego, J. M.; Doré, O.; Ducout, A.; Dupac, X.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Frailis, M.; Franceschet, C.; Franceschi, E.; Galeotta, S.; Galli, S.; Ganga, K.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gregorio, A.; Gruppuso, A.; Hansen, F. K.; Harrison, D. L.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Knoche, J.; Krachmalnicoff, N.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Meinhold, P. R.; Mennella, A.; Migliaccio, M.; Mitra, S.; Montier, L.; Morgante, G.; Mortlock, D.; Munshi, D.; Murphy, J. A.; Nati, F.; Natoli, P.; Noviello, F.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Pearson, T. J.; Perdereau, O.; Pettorino, V.; Piacentini, F.; Pointecouteau, E.; Polenta, G.; Pratt, G. W.; Puget, J.-L.; Rachen, J. 
P.; Reinecke, M.; Remazeilles, M.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Scott, D.; Stolyarov, V.; Stompor, R.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vassallo, T.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Watson, R.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.

    2016-09-01

    We present the current accounting of systematic effect uncertainties for the Low Frequency Instrument (LFI) that are relevant to the 2015 release of the Planck cosmological results, showing the robustness and consistency of our data set, especially for polarization analysis. We use two complementary approaches: (I) simulations based on measured data and physical models of the known systematic effects; and (II) analysis of difference maps containing the same sky signal ("null-maps"). The LFI temperature data are limited by instrumental noise. At large angular scales the systematic effects are below the cosmic microwave background (CMB) temperature power spectrum by several orders of magnitude. In polarization the systematic uncertainties are dominated by calibration uncertainties and compete with the CMB E-modes in the multipole range 10-20. Based on our model of all known systematic effects, we show that these effects introduce a slight bias of around 0.2σ on the reionization optical depth derived from the 70GHz EE spectrum using the 30 and 353GHz channels as foreground templates. At 30GHz the systematic effects are smaller than the Galactic foreground at all scales in temperature and polarization, which allows us to consider this channel as a reliable template of synchrotron emission. We assess the residual uncertainties due to LFI effects on CMB maps and power spectra after component separation and show that these effects are smaller than the CMB amplitude at all scales. We also assess the impact on non-Gaussianity studies and find it to be negligible. Some residuals still appear in null maps from particular sky survey pairs, particularly at 30 GHz, suggesting possible straylight contamination due to an imperfect knowledge of the beam far sidelobes.
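The "null-map" approach (II) above rests on a simple idea: differencing two maps of the same sky cancels the common signal and exposes noise plus unmodelled systematics. The sketch below uses synthetic map vectors with assumed amplitudes, purely to illustrate the cancellation; it is not the LFI pipeline.

```python
import numpy as np

# Two synthetic "surveys" of the same sky: common signal + independent noise
# (the amplitudes, in arbitrary muK-like units, are assumed for illustration).
rng = np.random.default_rng(3)
sky = rng.normal(0.0, 100.0, 10000)           # common CMB-like signal
m1 = sky + rng.normal(0.0, 5.0, 10000)        # survey pair, map 1
m2 = sky + rng.normal(0.0, 5.0, 10000)        # survey pair, map 2
null = 0.5 * (m1 - m2)                        # signal cancels; noise/systematics remain
```

The null map's variance is set entirely by the noise (here sigma^2/2 = 12.5), far below the signal variance; any excess over the noise prediction, as in the 30 GHz survey pairs, points to residual systematics such as straylight.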

  10. Planck 2015 results: III. LFI systematic uncertainties

    DOE PAGES

    Ade, P. A. R.; Aumont, J.; Baccigalupi, C.; ...

    2016-09-20

    In this paper, we present the current accounting of systematic effect uncertainties for the Low Frequency Instrument (LFI) that are relevant to the 2015 release of the Planck cosmological results, showing the robustness and consistency of our data set, especially for polarization analysis. We use two complementary approaches: (i) simulations based on measured data and physical models of the known systematic effects; and (ii) analysis of difference maps containing the same sky signal (“null-maps”). The LFI temperature data are limited by instrumental noise. At large angular scales the systematic effects are below the cosmic microwave background (CMB) temperature power spectrum by several orders of magnitude. In polarization the systematic uncertainties are dominated by calibration uncertainties and compete with the CMB E-modes in the multipole range 10–20. Based on our model of all known systematic effects, we show that these effects introduce a slight bias of around 0.2σ on the reionization optical depth derived from the 70GHz EE spectrum using the 30 and 353GHz channels as foreground templates. At 30GHz the systematic effects are smaller than the Galactic foreground at all scales in temperature and polarization, which allows us to consider this channel as a reliable template of synchrotron emission. We assess the residual uncertainties due to LFI effects on CMB maps and power spectra after component separation and show that these effects are smaller than the CMB amplitude at all scales. We also assess the impact on non-Gaussianity studies and find it to be negligible. Finally, some residuals still appear in null maps from particular sky survey pairs, particularly at 30 GHz, suggesting possible straylight contamination due to an imperfect knowledge of the beam far sidelobes.

  11. Planck 2015 results: III. LFI systematic uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ade, P. A. R.; Aumont, J.; Baccigalupi, C.

    In this paper, we present the current accounting of systematic effect uncertainties for the Low Frequency Instrument (LFI) that are relevant to the 2015 release of the Planck cosmological results, showing the robustness and consistency of our data set, especially for polarization analysis. We use two complementary approaches: (i) simulations based on measured data and physical models of the known systematic effects; and (ii) analysis of difference maps containing the same sky signal (“null-maps”). The LFI temperature data are limited by instrumental noise. At large angular scales the systematic effects are below the cosmic microwave background (CMB) temperature power spectrum by several orders of magnitude. In polarization the systematic uncertainties are dominated by calibration uncertainties and compete with the CMB E-modes in the multipole range 10–20. Based on our model of all known systematic effects, we show that these effects introduce a slight bias of around 0.2σ on the reionization optical depth derived from the 70GHz EE spectrum using the 30 and 353GHz channels as foreground templates. At 30GHz the systematic effects are smaller than the Galactic foreground at all scales in temperature and polarization, which allows us to consider this channel as a reliable template of synchrotron emission. We assess the residual uncertainties due to LFI effects on CMB maps and power spectra after component separation and show that these effects are smaller than the CMB amplitude at all scales. We also assess the impact on non-Gaussianity studies and find it to be negligible. Finally, some residuals still appear in null maps from particular sky survey pairs, particularly at 30 GHz, suggesting possible straylight contamination due to an imperfect knowledge of the beam far sidelobes.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abolhasani, Ali Akbar; School of Physics, Institute for Research in Fundamental Sciences; Mirbabayi, Mehrdad

    A perturbative description of Large Scale Structure is a cornerstone of our understanding of the observed distribution of matter in the universe. Renormalization is an essential and defining step to make this description physical and predictive. Here we introduce a systematic renormalization procedure, which neatly associates counterterms to the UV-sensitive diagrams order by order, as it is commonly done in quantum field theory. As a concrete example, we renormalize the one-loop power spectrum and bispectrum of both density and velocity. In addition, we present a series of results that are valid to all orders in perturbation theory. First, we show that while systematic renormalization requires temporally non-local counterterms, in practice one can use an equivalent basis made of local operators. We give an explicit prescription to generate all counterterms allowed by the symmetries. Second, we present a formal proof of the well-known general argument that the contribution of short distance perturbations to large scale density contrast δ and momentum density π(k) scale as k² and k, respectively. Third, we demonstrate that the common practice of introducing counterterms only in the Euler equation when one is interested in correlators of δ is indeed valid to all orders.
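The k² and k scalings quoted above are commonly summarized by counterterms of the following schematic form. This is a standard EFT-of-LSS parameterization given for orientation only; the normalization by k_NL and the sound-speed notation c_s are conventions, not taken from this record, and conventions vary between papers.

```latex
% Leading counterterms allowed by mass and momentum conservation:
\delta_{\mathrm{ct}}(\mathbf{k},\tau)
  = -\,c_s^2(\tau)\,\frac{k^2}{k_{\mathrm{NL}}^2}\,\delta(\mathbf{k},\tau)
    + \mathcal{O}(k^4),
\qquad
\pi_{\mathrm{ct}}(\mathbf{k},\tau) = \mathcal{O}(k),
% which at one loop contributes to the power spectrum as
P_{\mathrm{ct}}(k,\tau)
  = -2\,c_s^2(\tau)\,\frac{k^2}{k_{\mathrm{NL}}^2}\,P_{\mathrm{lin}}(k,\tau).
```

The k² prefactor is exactly the statement proved in the abstract: short modes can only affect the long-wavelength density through terms suppressed by two powers of k.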

  13. A multilevel layout algorithm for visualizing physical and genetic interaction networks, with emphasis on their modular organization.

    PubMed

    Tuikkala, Johannes; Vähämaa, Heidi; Salmela, Pekka; Nevalainen, Olli S; Aittokallio, Tero

    2012-03-26

    Graph drawing is an integral part of many systems biology studies, enabling visual exploration and mining of large-scale biological networks. While a number of layout algorithms are available in popular network analysis platforms, such as Cytoscape, it remains poorly understood how well their solutions reflect the underlying biological processes that give rise to the network connectivity structure. Moreover, visualizations obtained using conventional layout algorithms, such as those based on the force-directed drawing approach, may become uninformative when applied to larger networks with dense or clustered connectivity structure. We implemented a modified layout plug-in, named Multilevel Layout, which applies the conventional layout algorithms within a multilevel optimization framework to better capture the hierarchical modularity of many biological networks. Using a wide variety of real life biological networks, we carried out a systematic evaluation of the method in comparison with other layout algorithms in Cytoscape. The multilevel approach provided both biologically relevant and visually pleasant layout solutions in most network types, hence complementing the layout options available in Cytoscape. In particular, it could improve drawing of large-scale networks of yeast genetic interactions and human physical interactions. In more general terms, the biological evaluation framework developed here enables one to assess the layout solutions from any existing or future graph drawing algorithm as well as to optimize their performance for a given network type or structure. By making use of the multilevel modular organization when visualizing biological networks, together with the biological evaluation of the layout solutions, one can generate convenient visualizations for many network biology applications.
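The two-level idea, lay out a coarsened community graph first and then place members around their community's position, can be sketched without any graph library. The node names, radii, and circular placement below are illustrative choices, not the Multilevel Layout plug-in's actual algorithm.

```python
import math

def multilevel_layout(nodes, community, R=10.0, r=2.0):
    """Coarse level: community centres on a big circle of radius R;
    fine level: members on a small circle of radius r around their centre."""
    comms = sorted(set(community.values()))
    centre = {c: (R * math.cos(2 * math.pi * i / len(comms)),
                  R * math.sin(2 * math.pi * i / len(comms)))
              for i, c in enumerate(comms)}
    pos = {}
    for c in comms:
        members = [n for n in nodes if community[n] == c]
        for j, n in enumerate(members):
            ang = 2 * math.pi * j / len(members)
            pos[n] = (centre[c][0] + r * math.cos(ang),
                      centre[c][1] + r * math.sin(ang))
    return pos

nodes = ["a", "b", "c", "d", "e", "f"]
community = {"a": 0, "b": 0, "c": 0, "d": 1, "e": 1, "f": 1}
pos = multilevel_layout(nodes, community)
```

Because communities are separated at the coarse level, nodes in the same module land close together and modules stay visually distinct, which is the property the evaluation in the paper rewards.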

  14. Fuzzy-based propagation of prior knowledge to improve large-scale image analysis pipelines

    PubMed Central

    Mikut, Ralf

    2017-01-01

    Many automatically analyzable scientific questions are well-posed and a variety of information about expected outcomes is available a priori. Although often neglected, this prior knowledge can be systematically exploited to make automated analysis operations sensitive to a desired phenomenon or to evaluate extracted content with respect to this prior knowledge. For instance, the performance of processing operators can be greatly enhanced by a more focused detection strategy and by direct information about the ambiguity inherent in the extracted data. We present a new concept that increases the result quality awareness of image analysis operators by estimating and distributing the degree of uncertainty involved in their output based on prior knowledge. This allows the use of simple processing operators that are suitable for analyzing large-scale spatiotemporal (3D+t) microscopy images without compromising result quality. On the foundation of fuzzy set theory, we transform available prior knowledge into a mathematical representation and extensively use it to enhance the result quality of various processing operators. These concepts are illustrated on a typical bioimage analysis pipeline comprised of seed point detection, segmentation, multiview fusion and tracking. The functionality of the proposed approach is further validated on a comprehensive simulated 3D+t benchmark data set that mimics embryonic development and on large-scale light-sheet microscopy data of a zebrafish embryo. The general concept introduced in this contribution represents a new approach to efficiently exploit prior knowledge to improve the result quality of image analysis pipelines. The generality of the concept makes it applicable to practically any field with processing strategies that are arranged as linear pipelines. The automated analysis of terabyte-scale microscopy data will especially benefit from sophisticated and efficient algorithms that enable a quantitative and fast readout. 
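
    The core fuzzy mechanism can be illustrated with a trapezoidal membership function that converts prior knowledge (e.g. the expected object radius range) into a graded plausibility score, and a fuzzy AND (minimum t-norm) that combines several such scores. This is a generic sketch of the idea, not code from the cited pipeline; all parameter values are invented.

```python
def trapmf(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, ramps up to 1 on [b, c], 0 above d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def fuzzy_and(*memberships):
    """Minimum t-norm: an object is only as plausible as its weakest cue."""
    return min(memberships)

# assumed prior knowledge: expected radius 4-8 px, expected intensity 80-200
radius_score = trapmf(6.0, 2, 4, 8, 12)           # inside the plateau -> 1.0
intensity_score = trapmf(60.0, 40, 80, 200, 240)  # on the rising ramp -> 0.5
plausibility = fuzzy_and(radius_score, intensity_score)
```

    Propagating `plausibility` alongside each detection is what lets simple downstream operators (segmentation, fusion, tracking) weight or discard uncertain inputs.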
PMID:29095927

  15. A Functional Model for Management of Large Scale Assessments.

    ERIC Educational Resources Information Center

    Banta, Trudy W.; And Others

    This functional model for managing large-scale program evaluations was developed and validated in connection with the assessment of Tennessee's Nutrition Education and Training Program. Management of such a large-scale assessment requires the development of a structure for the organization; distribution and recovery of large quantities of…

  16. Methodological Quality Assessment of Meta-Analyses and Systematic Reviews of Probiotics in Inflammatory Bowel Disease and Pouchitis.

    PubMed

    Dong, Jinpei; Teng, Guigen; Wei, Tiantong; Gao, Wen; Wang, Huahong

    2016-01-01

    Probiotics are widely used for the induction and maintenance of remission in inflammatory bowel disease (IBD) and pouchitis. There are a large number of meta-analyses (MAs)/systematic reviews (SRs) on this subject, the methodological quality of which has not been evaluated. This study aimed to evaluate the methodological quality of, and summarize the evidence obtained from, MAs/SRs of probiotic treatments for IBD and pouchitis patients. The PubMed, EMBASE, Cochrane Library and China National Knowledge Infrastructure (CNKI) databases were searched to identify Chinese- and English-language MAs/SRs of the use of probiotics for IBD and pouchitis. The Assessment of Multiple Systematic Reviews (AMSTAR) scale was used to assess the methodological quality of the studies. A total of 36 MAs/SRs were evaluated. The AMSTAR scores of the included studies ranged from 1 to 10, and the average score was 5.81. According to the Canadian Agency for Drugs and Technologies in Health, 4 articles were classified as high quality, 24 articles were classified as moderate quality, and 8 articles were classified as low quality. Most of the MAs/SRs suggested that probiotics had potential benefits for patients with ulcerative colitis (UC), but failed to show effectiveness in the induction and maintenance of remission in Crohn's disease (CD). The probiotic preparation VSL#3 may play a beneficial role in pouchitis. The overall methodological quality of the current MAs/SRs in the field of probiotics for IBD and pouchitis was found to be low to moderate. More MAs/SRs of high quality are required to support the use of probiotics to treat IBD and pouchitis.
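
    The banding step is simple to reproduce. The sketch below applies commonly cited cut-offs for the 11-item AMSTAR scale (assumed here: 8-11 high, 4-7 moderate, 0-3 low quality); the scores in the example are illustrative, not data from the review above.

```python
def amstar_quality(score):
    """Map an AMSTAR score (0-11) to a quality band (cut-offs assumed)."""
    if not 0 <= score <= 11:
        raise ValueError("AMSTAR scores range from 0 to 11")
    if score >= 8:
        return "high"
    if score >= 4:
        return "moderate"
    return "low"

scores = [10, 6, 5, 3, 7]                 # illustrative scores only
bands = [amstar_quality(s) for s in scores]
mean_score = sum(scores) / len(scores)
```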

  17. Large-Scale High School Reform through School Improvement Networks: Exploring Possibilities for "Developmental Evaluation"

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Lenhoff, Sarah Winchell; Glazer, Joshua L.

    2016-01-01

    Recognizing school improvement networks as a leading strategy for large-scale high school reform, this analysis examines developmental evaluation as an approach to examining school improvement networks as "learning systems" able to produce, use, and refine practical knowledge in large numbers of schools. Through a case study of one…

  18. Gas-Centered Swirl Coaxial Liquid Injector Evaluations

    NASA Technical Reports Server (NTRS)

    Cohn, A. K.; Strakey, P. A.; Talley, D. G.

    2005-01-01

    Development of liquid rocket engines is expensive: extensive testing at large scale is usually required, verifying engine lifetime demands a large number of tests, and the resources available for development are limited. Sub-scale cold-flow and hot-fire testing is extremely cost-effective; it may be a necessary (but not sufficient) condition for demonstrating long engine lifetime, and it reduces the overall cost and risk of large-scale testing. The goals of this work are to determine what knowledge can be gained from sub-scale cold-flow and hot-fire evaluations of LRE injectors and to determine the relationships between cold-flow and hot-fire data.

  19. ESL Student Bias in Instructional Evaluation.

    ERIC Educational Resources Information Center

    Wennerstrom, Ann K.; Heiser, Patty

    1992-01-01

    Reports on a statistical analysis of English-as-a-Second-Language (ESL) student evaluations of teachers in two large programs. Results indicated a systematic bias occurs in ESL student evaluations, raising issues of fairness in the use of student evaluations of ESL teachers for purposes of personnel decisions. (21 references) (GLR)

  20. A roadmap for natural product discovery based on large-scale genomics and metabolomics

    USDA-ARS?s Scientific Manuscript database

    Actinobacteria encode a wealth of natural product biosynthetic gene clusters, whose systematic study is complicated by numerous repetitive motifs. By combining several metrics we developed a method for global classification of these gene clusters into families (GCFs) and analyzed the biosynthetic ca...

  1. Accounting for observational uncertainties in the evaluation of low latitude turbulent air-sea fluxes simulated in a suite of IPSL model versions

    NASA Astrophysics Data System (ADS)

    Servonnat, Jerome; Braconnot, Pascale; Gainusa-Bogdan, Alina

    2015-04-01

    Turbulent momentum and heat (sensible and latent) fluxes at the air-sea interface are key components of the overall energetics of the Earth's climate, and their good representation in climate models is of prime importance. In this work, we use the methodology developed by Braconnot & Frankignoul (1993) to perform a Hotelling T2 test on spatio-temporal fields (annual cycles). This statistic provides a quantitative measure, accounting for an estimate of the observational uncertainty, for the evaluation of low-latitude turbulent air-sea fluxes in a suite of IPSL model versions. The spread within the observational ensemble of turbulent flux data products assembled by Gainusa-Bogdan et al. (submitted) is used as an estimate of the observational uncertainty for the different turbulent fluxes. The methodology relies on the selection of a small number of dominant variability patterns (EOFs) that are common to both the model and the observations. Consequently, it focuses on the large-scale variability patterns and avoids the possibly noisy smaller scales. The results show that different versions of the IPSL coupled model share common large-scale biases, but also that the skill on sea surface temperature is not necessarily directly related to the skill in the representation of the different turbulent fluxes. Despite the large error bars on the observations, the test clearly distinguishes the merits of the different model versions. The analyses of the common EOF patterns and related time series provide guidance on the major differences with the observations. This work is a first attempt to use such a statistic for evaluating the spatio-temporal variability of the turbulent fluxes while accounting for observational uncertainty, and it represents an efficient tool for systematic evaluation of simulated air-sea fluxes, considering both the fluxes and the related atmospheric variables. References: Braconnot, P., and C. Frankignoul (1993), Testing Model Simulations of the Thermocline Depth Variability in the Tropical Atlantic from 1982 through 1984, J. Phys. Oceanogr., 23(4), 626-647. Gainusa-Bogdan, A., Braconnot, P., and Servonnat, J. (submitted), Using an ensemble data set of turbulent air-sea fluxes to evaluate the IPSL climate model in tropical regions, J. Geophys. Res. Atmos., 2014JD022985.
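
    The Hotelling T2 statistic at the heart of this methodology compares a sample mean vector with a reference mean, scaled by the inverse sample covariance. Below is a minimal two-dimensional sketch (standing in for the low-dimensional space spanned by the leading EOFs, whose computation is omitted); the data are invented.

```python
def hotelling_t2(sample, mu):
    """T^2 = n (xbar - mu)^T S^{-1} (xbar - mu), hard-coded for 2 dimensions."""
    n = len(sample)
    xbar = [sum(x[j] for x in sample) / n for j in (0, 1)]
    # unbiased sample covariance matrix
    s00 = sum((x[0] - xbar[0]) ** 2 for x in sample) / (n - 1)
    s11 = sum((x[1] - xbar[1]) ** 2 for x in sample) / (n - 1)
    s01 = sum((x[0] - xbar[0]) * (x[1] - xbar[1]) for x in sample) / (n - 1)
    det = s00 * s11 - s01 * s01
    i00, i01, i11 = s11 / det, -s01 / det, s00 / det   # 2x2 inverse
    d = (xbar[0] - mu[0], xbar[1] - mu[1])
    return n * (i00 * d[0] ** 2 + 2 * i01 * d[0] * d[1] + i11 * d[1] ** 2)

sample = [(1.0, 0.0), (-1.0, 0.0), (0.0, 1.0), (0.0, -1.0)]
t2_same = hotelling_t2(sample, (0.0, 0.0))     # sample mean matches the reference
t2_shifted = hotelling_t2(sample, (2.0, 0.0))  # reference far from the sample mean
```

    Large T2 values flag a significant model-observation difference once compared against the appropriate F distribution (not shown here).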

  2. Cosmic shear as a probe of galaxy formation physics

    DOE PAGES

    Foreman, Simon; Becker, Matthew R.; Wechsler, Risa H.

    2016-09-01

    Here, we evaluate the potential for current and future cosmic shear measurements from large galaxy surveys to constrain the impact of baryonic physics on the matter power spectrum. We do so using a model-independent parametrization that describes deviations of the matter power spectrum from the dark-matter-only case as a set of principal components that are localized in wavenumber and redshift. We perform forecasts for a variety of current and future data sets, and find that at least ~90 per cent of the constraining power of these data sets is contained in no more than nine principal components. The constraining power of different surveys can be quantified using a figure of merit defined relative to currently available surveys. With this metric, we find that the final Dark Energy Survey data set (DES Y5) and the Hyper Suprime-Cam Survey will be roughly an order of magnitude more powerful than existing data in constraining baryonic effects. Upcoming Stage IV surveys (Large Synoptic Survey Telescope, Euclid, and Wide Field Infrared Survey Telescope) will improve upon this by a further factor of a few. We show that this conclusion is robust to marginalization over several key systematics. The ultimate power of cosmic shear to constrain galaxy formation is dependent on understanding systematics in the shear measurements at small (sub-arcminute) scales. Lastly, if these systematics can be sufficiently controlled, cosmic shear measurements from DES Y5 and other future surveys have the potential to provide a very clean probe of galaxy formation and to strongly constrain a wide range of predictions from modern hydrodynamical simulations.
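
    Extracting a small number of dominant principal components from a covariance-like matrix is the mechanism behind the nine-component result. The stdlib-only sketch below uses power iteration to obtain the leading component of a tiny symmetric matrix; the matrix entries are invented, not survey data.

```python
import math

def power_iteration(A, iters=200):
    """Leading eigenpair of a small symmetric matrix by repeated multiplication."""
    n = len(A)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient gives the eigenvalue for the converged direction
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    lam = sum(v[i] * Av[i] for i in range(n))
    return lam, v

# toy 3x3 "covariance" with one dominant coupled mode in the first two entries
A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 0.0],
     [0.0, 0.0, 1.0]]
lam, v = power_iteration(A)
```

    Deflating (subtracting lam * v v^T) and repeating yields the next components; ranking them by eigenvalue is how one decides that a handful of components carries most of the constraining power.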

  3. Evaluation of Large-Scale Public-Sector Reforms: A Comparative Analysis

    ERIC Educational Resources Information Center

    Breidahl, Karen N.; Gjelstrup, Gunnar; Hansen, Hanne Foss; Hansen, Morten Balle

    2017-01-01

    Research on the evaluation of large-scale public-sector reforms is rare. This article sets out to fill that gap in the evaluation literature and argues that it is of vital importance since the impact of such reforms is considerable and they change the context in which evaluations of other and more delimited policy areas take place. In our…

  4. Constraints on the Origin of Cosmic Rays above 10^18 eV from Large-scale Anisotropy Searches in Data of the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Pierre Auger Collaboration; Abreu, P.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Antiči'c, T.; Aramo, C.; Arganda, E.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Badescu, A. M.; Balzer, M.; Barber, K. B.; Barbosa, A. F.; Bardenet, R.; Barroso, S. L. C.; Baughman, B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellétoile, A.; Bellido, J. A.; BenZvi, S.; Berat, C.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, F.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Brogueira, P.; Brown, W. C.; Bruijn, R.; Buchholz, P.; Bueno, A.; Buroker, L.; Burton, R. E.; Caballero-Mora, K. S.; Caccianiga, B.; Caramete, L.; Caruso, R.; Castellina, A.; Catalano, O.; Cataldi, G.; Cazon, L.; Cester, R.; Chauvin, J.; Cheng, S. H.; Chiavassa, A.; Chinellato, J. A.; Chirinos Diaz, J.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cook, H.; Cooper, M. J.; Coppens, J.; Cordier, A.; Coutu, S.; Covault, C. E.; Creusot, A.; Criss, A.; Cronin, J.; Curutiu, A.; Dagoret-Campagne, S.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; De Donato, C.; de Jong, S. J.; De La Vega, G.; de Mello Junior, W. J. M.; de Mello Neto, J. R. T.; De Mitri, I.; de Souza, V.; de Vries, K. D.; del Peral, L.; del Río, M.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Díaz Castro, M. L.; Diep, P. N.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dong, P. N.; Dorofeev, A.; dos Anjos, J. C.; Dova, M. T.; D'Urso, D.; Dutan, I.; Ebr, J.; Engel, R.; Erdmann, M.; Escobar, C. 
O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fliescher, S.; Fracchiolla, C. E.; Fraenkel, E. D.; Fratu, O.; Fröhlich, U.; Fuchs, B.; Gaior, R.; Gamarra, R. F.; Gambetta, S.; García, B.; Garcia Roca, S. T.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gemmeke, H.; Ghia, P. L.; Giller, M.; Gitto, J.; Glass, H.; Gold, M. S.; Golup, G.; Gomez Albarracin, F.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; Gookin, B.; Gorgi, A.; Gouffon, P.; Grashorn, E.; Grebe, S.; Griffith, N.; Grillo, A. F.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Hansen, P.; Harari, D.; Harrison, T. A.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holmes, V. C.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Ionita, F.; Italiano, A.; Jansen, S.; Jarne, C.; Jiraskova, S.; Josebachuili, M.; Kadija, K.; Kampert, K. H.; Karhan, P.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kelley, J. L.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Knapp, J.; Koang, D.-H.; Kotera, K.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kulbartz, J. K.; Kunka, N.; La Rosa, G.; Lachaud, C.; LaHurd, D.; Latronico, L.; Lauer, R.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Lyberis, H.; Maccarone, M. C.; Macolino, C.; Maldera, S.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, J.; Marin, V.; Maris, I. C.; Marquez Falcon, H. R.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Matthews, J.; Matthews, J. 
A. J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mazur, P. O.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Mertsch, P.; Messina, S.; Meurer, C.; Meyhandan, R.; Mi'canovi'c, S.; Micheletti, M. I.; Minaya, I. A.; Miramonti, L.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morales, B.; Morello, C.; Moreno, E.; Moreno, J. C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navarro, J. L.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nhung, P. T.; Niechciol, M.; Niemietz, L.; Nierstenhoefer, N.; Nitz, D.; Nosek, D.; Nožka, L.; Oehlschläger, J.; Olinto, A.; Ortiz, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Parente, G.; Parizot, E.; Parra, A.; Pastor, S.; Paul, T.; Pech, M.; Peķala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Petrera, S.; Petrolini, A.; Petrov, Y.; Pfendner, C.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Ponce, V. H.; Pontz, M.; Porcelli, A.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rivera, H.; Rizi, V.; Roberts, J.; Rodrigues de Carvalho, W.; Rodriguez, G.; Rodriguez Cabo, I.; Rodriguez Martino, J.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Rouillé-d'Orfeuil, B.; Roulet, E.; Rovero, A. C.; Rühle, C.; Saftoiu, A.; Salamida, F.; Salazar, H.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarkar, S.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, A.; Scholten, O.; Schoorlemmer, H.; Schovancova, J.; Schovánek, P.; Schröder, F.; Schuster, D.; Sciutto, S. J.; Scuderi, M.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Silva Lopez, H. 
H.; Sima, O.; 'Smiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Spinka, H.; Squartini, R.; Srivastava, Y. N.; Stanic, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Šuša, T.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Tapia, A.; Tartare, M.; Taşcău, O.; Tcaciuc, R.; Thao, N. T.; Thomas, D.; Tiffenberg, J.; Timmermans, C.; Tkaczyk, W.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Travnicek, P.; Tridapalli, D. B.; Tristram, G.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Wahlberg, H.; Wahrlich, P.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Westerhoff, S.; Whelan, B. J.; Widom, A.; Wieczorek, G.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Wommer, M.; Wundheiler, B.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano Garcia, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.

    2013-01-01

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 10^18 eV at the Pierre Auger Observatory is reported. For the first time, these large-scale anisotropy searches are performed as a function of both the right ascension and the declination and expressed in terms of dipole and quadrupole moments. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Upper limits on dipole and quadrupole amplitudes are derived under the hypothesis that any cosmic ray anisotropy is dominated by such moments in this energy range. These upper limits provide constraints on the production of cosmic rays above 10^18 eV, since they allow us to challenge an origin from stationary galactic sources densely distributed in the galactic disk and emitting predominantly light particles in all directions.
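
    A standard ingredient of such searches is the first-harmonic (Rayleigh) analysis in right ascension, which estimates a dipole-like amplitude and phase from the arrival directions alone. A minimal sketch with toy angles (not Auger data):

```python
import math

def first_harmonic(alphas):
    """Rayleigh analysis: amplitude r and phase phi of the first harmonic in RA."""
    n = len(alphas)
    a = (2.0 / n) * sum(math.cos(x) for x in alphas)
    b = (2.0 / n) * sum(math.sin(x) for x in alphas)
    return math.hypot(a, b), math.atan2(b, a)

# isotropic toy sample: evenly spaced right ascensions -> amplitude ~ 0
alphas = [2.0 * math.pi * k / 1000 for k in range(1000)]
r_iso, _ = first_harmonic(alphas)

# fully clustered sample: all events from one direction -> maximal amplitude
r_clustered, phi = first_harmonic([0.5] * 1000)
```

    In a real analysis the measured amplitude is compared with the distribution expected from isotropy (and exposure corrections) to set the upper limits quoted above.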

  5. Evaluating the Effectiveness of a Large-Scale Professional Development Programme

    ERIC Educational Resources Information Center

    Main, Katherine; Pendergast, Donna

    2017-01-01

    An evaluation of the effectiveness of a large-scale professional development (PD) programme delivered to 258 schools in Queensland, Australia is presented. Formal evaluations were conducted at two stages during the programme using a tool developed from Desimone's five core features of effective PD. Descriptive statistics of 38 questions and…

  6. Using Large-Scale Databases in Evaluation: Advances, Opportunities, and Challenges

    ERIC Educational Resources Information Center

    Penuel, William R.; Means, Barbara

    2011-01-01

    Major advances in the number, capabilities, and quality of state, national, and transnational databases have opened up new opportunities for evaluators. Both large-scale data sets collected for administrative purposes and those collected by other researchers can provide data for a variety of evaluation-related activities. These include (a)…

  7. Impact of compressibility on heat transport characteristics of large terrestrial planets

    NASA Astrophysics Data System (ADS)

    Čížková, Hana; van den Berg, Arie; Jacobs, Michel

    2017-07-01

    We present heat transport characteristics for mantle convection in large terrestrial exoplanets (M ⩽ 8 M⊕). Our thermal convection model is based on a truncated anelastic liquid approximation (TALA) for compressible fluids and takes into account a self-consistent thermodynamic description of material properties derived from mineral physics based on a multi-Einstein vibrational approach. We compare heat transport characteristics in compressible models with those obtained with incompressible models based on the classical and extended Boussinesq approximations (BA and EBA, respectively). Our scaling analysis shows that heat flux scales with the effective dissipation number as Nu ∼ Di_eff^(-0.71) and with the Rayleigh number as Nu ∼ Ra_eff^(0.27). The surface heat flux of the BA models strongly overestimates the values from the corresponding compressible models, whereas the EBA models systematically underestimate the heat flux by ∼10%-15% with respect to a corresponding compressible case. Compressible models are also systematically warmer than the EBA models. Compressibility effects are therefore important for mantle dynamic processes, especially for large rocky exoplanets, and consequently also for the formation of planetary atmospheres, through outgassing, and the existence of a magnetic field, through thermal coupling of the mantle and core dynamic systems.
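
    The two reported scalings can be combined into a single illustrative power law, Nu = C · Di_eff^(-0.71) · Ra_eff^(0.27), where the prefactor C is assumed here (the paper reports the exponents, not this sketch's constant):

```python
def nusselt(di_eff, ra_eff, prefactor=1.0):
    """Combined heat-flux scaling Nu ~ Di_eff^-0.71 * Ra_eff^0.27 (C assumed)."""
    return prefactor * di_eff ** -0.71 * ra_eff ** 0.27

# raising Ra_eff tenfold boosts Nu by a factor 10^0.27; stronger effective
# dissipation (larger Di_eff) lowers the heat flux
boost = nusselt(1.0, 10.0) / nusselt(1.0, 1.0)
damped = nusselt(2.0, 1.0)
```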

  8. General practitioners' continuing education: a review of policies, strategies and effectiveness, and their implications for the future.

    PubMed

    Smith, F; Singleton, A; Hilton, S

    1998-10-01

    The accreditation and provision of continuing education for general practitioners (GPs) is set to change with new proposals from the General Medical Council, the Government, and the Chief Medical Officer. This paper reviews the theories, policies, strategies, and effectiveness of GP continuing education over the past 10 years, based on a systematic review of the literature using computerized and manual searches of relevant journals and books. Educational theory suggests that continuing education (CE) should be work-based and use the learner's experiences. Audit can play an important role in determining performance and needs assessment, but at present it is largely a separate activity. Educational and professional support, such as through mentors or co-tutors, has been successfully piloted but awaits larger-scale evaluation. Most accredited educational events are still postgraduate-centre lectures, and GP tutors have a variable role in CE management and provision. Controlled trials of CE strategies suggest that effectiveness is enhanced by personal feedback and work prompts. Qualitative studies have demonstrated that education plays only a small part in influencing doctors' behaviour. Maintaining good clinical practice is on many stakeholders' agendas. A variety of methods may be effective in CE, and larger-scale trials or evaluations are needed.

  9. Implant Supported Fixed Restorations versus Implant Supported Removable Overdentures: A Systematic Review

    PubMed Central

    Selim, Khaled; Ali, Sherif; Reda, Ahmed

    2016-01-01

    AIM: The aim of this study is to systematically evaluate and compare implant-retained fixed restorations versus implant-retained removable overdentures. MATERIAL AND METHODS: Searches were made in two databases, PubMed and PubMed Central. Titles and abstracts were screened to select studies comparing implant-retained fixed restorations versus implant-retained removable overdentures. Articles that did not meet the inclusion criteria were excluded. Included papers were then read carefully as a second-stage filter; this was followed by manual searching of the bibliographies of selected articles. RESULTS: The search resulted in 5 included papers. One study evaluated masticatory function, while the other 4 evaluated patient satisfaction. Two of them used a Visual Analogue Scale (VAS) as the measurement tool, while the other two used VAS and Categorical Scales (CAT). Stability, ability to chew, ability to clean, ability to speak, and esthetics were the main outcomes of the 4 included papers. CONCLUSION: Conflicting results were observed between the fixed and removable restorations. PMID:28028423

  10. A systematic review of economic evaluations of CHW interventions aimed at improving child health outcomes.

    PubMed

    Nkonki, L; Tugendhaft, A; Hofman, K

    2017-02-28

    Evidence of the cost-effectiveness of community health worker interventions is pertinent for decision-makers and programme planners who are turning to community services in order to strengthen health systems in the context of the momentum generated by strategies to support universal health care and the post-2015 Sustainable Development Goal agenda. We conducted a systematic review of published economic evaluation studies of community health worker interventions aimed at improving child health outcomes. Four public health and economic evaluation databases were searched for studies that met the inclusion criteria: National Health Service Economic Evaluation Database (NHS EED), Cochrane, Paediatric Economic Evaluation Database (PEED), and PubMed. The search strategy was tailored to each database. The 19 studies that met the inclusion criteria were conducted in high-income countries (HIC), low-income countries (LIC) and/or middle-income countries (MIC). The economic evaluations covered a wide range of interventions. Studies were grouped together by the intended outcome or objective of each study. The data varied in quality. We found evidence of cost-effectiveness of community health worker (CHW) interventions in reducing malaria and asthma, decreasing mortality of neonates and children, improving maternal health, increasing exclusive breastfeeding, improving malnutrition, and positively impacting physical health and psychomotor development amongst children. Because of the heterogeneous nature of the included studies, which measured varied outcomes, a meta-analysis was not conducted. Outcomes included disease- or condition-specific outcomes, morbidity, mortality, and generic measures (e.g. disability-adjusted life years (DALYs)).
Nonetheless, all 19 interventions were found to be either cost-effective or highly cost-effective at a threshold specific to their respective countries. There is a growing body of economic evaluation literature on the cost-effectiveness of CHW interventions. However, this is largely for small-scale and vertical programmes. There is a need for economic evaluations of larger and integrated CHW programmes in order to achieve the post-2015 Sustainable Development Goal agenda, so that appropriate resources can be allocated to this subset of human resources for health. This is the first systematic review to assess the cost-effectiveness of community health workers in delivering child health interventions.
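
    The core calculation behind cost-effectiveness judgements of this kind is the incremental cost-effectiveness ratio (ICER), compared against a country-specific willingness-to-pay threshold. A generic sketch with invented numbers (not figures from the reviewed studies):

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost per unit of incremental effect (e.g. per DALY averted)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

def is_cost_effective(icer_value, threshold):
    """A common decision rule: cost-effective if the ICER falls below threshold."""
    return icer_value <= threshold

# hypothetical CHW programme vs. standard care, effects in DALYs averted
ratio = icer(cost_new=120_000, effect_new=400.0, cost_old=40_000, effect_old=150.0)
verdict = is_cost_effective(ratio, threshold=500.0)   # assumed threshold per DALY
```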

  11. Monitoring land at regional and national scales and the role of remote sensing

    NASA Astrophysics Data System (ADS)

    Dymond, John R.; Bégue, Agnes; Loseen, Danny

    There is a worldwide need to monitor land and its ecosystems to ensure their sustainable use. Despite the laudable intentions of Agenda 21 at the Rio Earth Summit, 1992, in which many countries agreed to monitor and report on the status of their land, systematic monitoring of land has yet to begin. The problem is truly difficult, as the earth's surface is vast and the funds available for monitoring are relatively small. This paper describes several methods for cost-effective monitoring of large land areas, including strategic monitoring; statistical sampling; risk-based approaches; integration of land and water monitoring; and remote sensing. The role of remote sensing is given special attention, as it is the only method that can monitor land exhaustively and directly at regional and national scales. It is concluded that strategic monitoring, whereby progress towards environmental goals is assessed, is a vital element in land monitoring, as it provides a means for evaluating the utility of monitoring designs.

  12. Cross-Domain Shoe Retrieval with a Semantic Hierarchy of Attribute Classification Network.

    PubMed

    Zhan, Huijing; Shi, Boxin; Kot, Alex C

    2017-08-04

    Cross-domain shoe image retrieval is a challenging problem, because the query photo from the street domain (daily life scenarios) and the reference photo in the online domain (online shop images) have significant visual differences due to viewpoint and scale variation, self-occlusion, and cluttered backgrounds. This paper proposes the Semantic Hierarchy Of attributE Convolutional Neural Network (SHOE-CNN) with a three-level feature representation for discriminative shoe feature expression and efficient retrieval. The SHOE-CNN, with its newly designed loss function, systematically merges semantic attributes of closer visual appearance to prevent shoe images with obvious visual differences from being confused with each other; the features extracted at the image, region, and part levels effectively match shoe images across different domains. We collect a large-scale shoe dataset composed of 14,341 street-domain and 12,652 corresponding online-domain images with fine-grained attributes to train our network and evaluate our system. The top-20 retrieval accuracy improves significantly over the solution with pre-trained CNN features.
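
    The top-20 retrieval accuracy used for evaluation is simple to state: a query counts as correct if its ground-truth online-shop image appears among the first k ranked results. A metric-only sketch (the ranked lists and ids are invented):

```python
def top_k_accuracy(ranked_lists, ground_truth, k=20):
    """Fraction of queries whose true match appears in the top-k results."""
    hits = sum(1 for query, ranking in ranked_lists.items()
               if ground_truth[query] in ranking[:k])
    return hits / len(ranked_lists)

# three toy queries; items are arbitrary string ids
ranked = {"q1": ["a", "b", "c"], "q2": ["d", "e", "f"], "q3": ["g", "h", "i"]}
truth = {"q1": "b", "q2": "z", "q3": "g"}
acc = top_k_accuracy(ranked, truth, k=2)
```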

  13. Fine-scale patterns of population stratification confound rare variant association tests.

    PubMed

    O'Connor, Timothy D; Kiezun, Adam; Bamshad, Michael; Rich, Stephen S; Smith, Joshua D; Turner, Emily; Leal, Suzanne M; Akey, Joshua M

    2013-01-01

    Advances in next-generation sequencing technology have enabled systematic exploration of the contribution of rare variation to Mendelian and complex diseases. Although it is well known that population stratification can generate spurious associations with common alleles, its impact on rare variant association methods remains poorly understood. Here, we performed exhaustive coalescent simulations with demographic parameters calibrated from exome sequence data to evaluate the performance of nine rare variant association methods in the presence of fine-scale population structure. We find that all methods have an inflated spurious association rate for parameter values that are consistent with levels of differentiation typical of European populations. For example, at a nominal significance level of 5%, some test statistics have a spurious association rate as high as 40%. Finally, we empirically assess the impact of population stratification in a large data set of 4,298 European American exomes. Our results have important implications for the design, analysis, and interpretation of rare variant genome-wide association studies.
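
    A minimal illustration of why stratification inflates rare-variant tests: a burden test collapses rare variants into a per-individual carrier count and compares counts between cases and controls, so if subpopulations differ in both allele frequency and sampling into the case group, the statistic is large even with no causal variant. The sketch below implements a simple CAST-like collapsing statistic with invented toy data, not one of the nine methods evaluated in the paper.

```python
import statistics

def burden_statistic(case_burdens, control_burdens):
    """Difference in mean rare-allele burden, scaled by a pooled standard error."""
    n1, n2 = len(case_burdens), len(control_burdens)
    diff = statistics.mean(case_burdens) - statistics.mean(control_burdens)
    pooled_var = (statistics.variance(case_burdens) / n1
                  + statistics.variance(control_burdens) / n2)
    return diff / pooled_var ** 0.5

# toy confounded sample: cases drawn mostly from a subpopulation carrying
# more rare alleles, so the statistic is large with no causal variant at all
cases = [3, 2, 3, 4, 2, 3, 3, 2]
controls = [1, 0, 1, 1, 0, 2, 1, 0]
z = burden_statistic(cases, controls)
```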

  14. An adaptive response surface method for crashworthiness optimization

    NASA Astrophysics Data System (ADS)

    Shi, Lei; Yang, Ren-Jye; Zhu, Ping

    2013-11-01

    Response surface-based design optimization has been commonly used for optimizing large-scale design problems in the automotive industry. However, most response surface models are built from a limited number of design points without considering data uncertainty. In addition, the selection of a response surface in the literature is often arbitrary. This article uses a Bayesian metric to systematically select the best available response surface among several candidates in a library while considering data uncertainty. An adaptive, efficient response surface strategy, which minimizes the number of computationally intensive simulations, was developed for design optimization of large-scale complex problems. This methodology was demonstrated by a crashworthiness optimization example.
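
    The idea of scoring candidate response surfaces with a Bayesian metric can be sketched with the Bayesian information criterion (BIC); the quadratic test function and candidate library below are assumptions for illustration, not the article's actual metric or crash models.

```python
import numpy as np

rng = np.random.default_rng(1)

def bic(x, y, degree):
    """BIC of a polynomial response surface fit by least squares. The
    likelihood term rewards goodness of fit; the penalty term charges
    for complexity, guarding against overfitting noisy design points."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n, k = len(x), degree + 1
    return n * np.log(np.mean(resid ** 2)) + k * np.log(n)

# Noisy samples of an underlying quadratic "crash response"
x = np.linspace(0.0, 1.0, 40)
y = 1.0 + 2.0 * x - 3.0 * x ** 2 + rng.normal(0.0, 0.05, x.size)

candidates = [1, 2, 3, 5]               # library of candidate surfaces
scores = {deg: bic(x, y, deg) for deg in candidates}
best = min(scores, key=scores.get)
print(best, scores[best])               # lowest BIC picks the surface
```

    The underfit linear surface is penalized through its large residuals and the overfit quintic through its extra parameters, so the selection balances fit against complexity rather than picking a surface arbitrarily.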

  15. Spatial Covariability of Temperature and Hydroclimate as a Function of Timescale During the Common Era

    NASA Astrophysics Data System (ADS)

    McKay, N.

    2017-12-01

    As timescale increases from years to centuries, the spatial scale of covariability in the climate system is hypothesized to increase as well. Covarying spatial scales are larger for temperature than for hydroclimate; however, both aspects of the climate system show systematic changes on large spatial scales on orbital to tectonic timescales. The extent to which this phenomenon is evident in temperature and hydroclimate at centennial timescales is largely unknown. Recent syntheses of multidecadal- to century-scale variability in hydroclimate during the past 2,000 years in the Arctic, North America, and Australasia show little spatial covariability in hydroclimate during the Common Era. To determine 1) the evidence for a systematic relationship between the spatial scale of climate covariability and timescale, and 2) whether century-scale hydroclimate variability deviates from that relationship, we quantify this phenomenon during the Common Era by calculating the e-folding distance in large instrumental and paleoclimate datasets. We calculate this metric of spatial covariability at different timescales (1, 10, and 100 yr) for a large network of temperature and precipitation observations from the Global Historical Climatology Network (n=2447), from v2.0.0 of the PAGES2k temperature database (n=692), and from moisture-sensitive paleoclimate records from North America, the Arctic, and the Iso2k project (n=328). Initial results support the hypothesis that the spatial scale of covariability is larger for temperature than for precipitation or paleoclimate hydroclimate indicators. Spatially, e-folding distances for temperature are largest at low latitudes and over the ocean. Both instrumental and proxy temperature data show clear evidence for increasing spatial extent as a function of timescale, but this phenomenon is very weak in the hydroclimate data analyzed here. In the proxy hydroclimate data, which are predominantly indicators of effective moisture, e-folding distance increases from annual to decadal timescales but does not continue to increase at centennial timescales. Future work includes examining additional instrumental and proxy datasets of moisture variability and extending the analysis to millennial timescales.
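
    The e-folding distance metric used above can be sketched as follows. This is an illustrative toy with synthetic station-pair correlations, not the PAGES2k/Iso2k analysis code, and the simple exponential-decay fit through the origin is an assumption.

```python
import numpy as np

rng = np.random.default_rng(2)

def e_folding_distance(distances, correlations):
    """Distance at which correlation decays to 1/e, from a least-squares
    fit of r(d) = exp(-d / L), carried out in log space."""
    d = np.asarray(distances, dtype=float)
    r = np.clip(np.asarray(correlations, dtype=float), 1e-6, None)
    # log r = -d / L, so the slope of log r against d estimates -1/L
    slope = np.sum(d * np.log(r)) / np.sum(d * d)
    return -1.0 / slope

# Synthetic station-pair correlations decaying with a known length scale
L_true = 1500.0                        # km
d = np.linspace(100.0, 5000.0, 25)
r = np.exp(-d / L_true) + rng.normal(0.0, 0.01, d.size)
print(e_folding_distance(d, r))        # recovers roughly L_true
```

    Applying this at 1-, 10-, and 100-yr smoothing of the underlying series, as in the abstract, turns "spatial scale of covariability as a function of timescale" into a single number per timescale that can be compared across networks.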

  16. SQDFT: Spectral Quadrature method for large-scale parallel O(N) Kohn-Sham calculations at high temperature

    NASA Astrophysics Data System (ADS)

    Suryanarayana, Phanish; Pratapa, Phanisri P.; Sharma, Abhiraj; Pask, John E.

    2018-03-01

    We present SQDFT: a large-scale parallel implementation of the Spectral Quadrature (SQ) method for O(N) Kohn-Sham Density Functional Theory (DFT) calculations at high temperature. Specifically, we develop an efficient and scalable finite-difference implementation of the infinite-cell Clenshaw-Curtis SQ approach, in which results for the infinite crystal are obtained by expressing quantities of interest as bilinear forms or sums of bilinear forms, which are then approximated by spatially localized Clenshaw-Curtis quadrature rules. We demonstrate the accuracy of SQDFT by showing systematic convergence of energies and atomic forces with respect to SQ parameters to reference diagonalization results, and convergence with discretization to established planewave results, for both metallic and insulating systems. We further demonstrate that SQDFT achieves excellent strong and weak parallel scaling on computer systems consisting of tens of thousands of processors, with near perfect O(N) scaling with system size and wall times as low as a few seconds per self-consistent field iteration. Finally, we verify the accuracy of SQDFT in large-scale quantum molecular dynamics simulations of aluminum at high temperature.
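
    The Clenshaw-Curtis quadrature at the heart of the method can be illustrated on a scalar integral. This is a self-contained sketch for even n, not the SQDFT implementation, which applies such rules to spatially localized bilinear forms rather than one-dimensional integrals.

```python
import numpy as np

def clenshaw_curtis(n):
    """Nodes and weights for (n+1)-point Clenshaw-Curtis quadrature on
    [-1, 1] (n even), built from exact integrals of cos(2j*theta)."""
    theta = np.pi * np.arange(n + 1) / n
    x = np.cos(theta)                      # Chebyshev extreme points
    w = np.zeros(n + 1)
    for k in range(n + 1):
        s = 1.0
        for j in range(1, n // 2 + 1):
            b = 1.0 if 2 * j < n else 0.5  # halve the last term
            s -= 2.0 * b * np.cos(2 * j * theta[k]) / (4 * j * j - 1)
        w[k] = 2.0 * s / n
    w[0] *= 0.5                            # endpoint weights are halved
    w[-1] *= 0.5
    return x, w

# Smooth integrand: the rule converges rapidly with the number of nodes
x, w = clenshaw_curtis(16)
approx = np.sum(w * np.exp(x))
exact = np.e - np.exp(-1.0)
print(abs(approx - exact))                 # near machine precision
```

    For smooth integrands such as the smeared spectral functions arising at high temperature, convergence with quadrature order is spectral, which is what makes truncating the rule to a localized region accurate.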

  17. Non-Hookean statistical mechanics of clamped graphene ribbons

    NASA Astrophysics Data System (ADS)

    Bowick, Mark J.; Košmrlj, Andrej; Nelson, David R.; Sknepnek, Rastko

    2017-03-01

    Thermally fluctuating sheets and ribbons provide an intriguing forum in which to investigate strong violations of Hooke's Law: Large distance elastic parameters are in fact not constant but instead depend on the macroscopic dimensions. Inspired by recent experiments on free-standing graphene cantilevers, we combine the statistical mechanics of thin elastic plates and large-scale numerical simulations to investigate the thermal renormalization of the bending rigidity of graphene ribbons clamped at one end. For ribbons of dimensions W ×L (with L ≥W ), the macroscopic bending rigidity κR determined from cantilever deformations is independent of the width when W <ℓth , where ℓth is a thermal length scale, as expected. When W >ℓth , however, this thermally renormalized bending rigidity begins to systematically increase, in agreement with the scaling theory, although in our simulations we were not quite able to reach the system sizes necessary to determine the fully developed power law dependence on W . When the ribbon length L >ℓp , where ℓp is the W -dependent thermally renormalized ribbon persistence length, we observe a scaling collapse and the beginnings of large scale random walk behavior.

  18. Effectiveness of information and communication technologies interventions to increase mental health literacy: A systematic review.

    PubMed

    Tay, Jing Ling; Tay, Yi Fen; Klainin-Yobas, Piyanee

    2018-06-13

    Most mental health conditions first emerge in adolescence and young adulthood, a critical period in which to implement interventions that enhance mental health literacy (MHL) and prevent the occurrence of mental health problems. This systematic review examined the effectiveness of information and communication technology interventions on MHL (recognition of conditions, stigma and help-seeking). The authors searched for both published and unpublished studies. Nineteen studies were included: 9 randomized controlled trials and 10 quasi-experimental studies. Informational interventions were useful for enhancing MHL of less well-known disorders such as anxiety disorder and anorexia, but not depression. Interventions that were effective in enhancing depression MHL comprised active components such as videos or quizzes. Interventions that successfully elevated MHL also reduced stigma. However, elevated MHL levels did not improve help-seeking, and reductions in stigma did not enhance help-seeking behaviours. Future good-quality, large-scale, multi-site randomized controlled trials are necessary to evaluate MHL interventions. © 2018 John Wiley & Sons Australia, Ltd.

  19. An investigation of modelling and design for software service applications.

    PubMed

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  20. An investigation of modelling and design for software service applications

    PubMed Central

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905

  1. Dynamically Consistent Parameterization of Mesoscale Eddies

    NASA Astrophysics Data System (ADS)

    Berloff, P. S.

    2016-12-01

    This work aims at developing a framework for dynamically consistent parameterization of mesoscale eddy effects for use in non-eddy-resolving ocean circulation models. The proposed eddy parameterization framework is successfully tested on the classical, wind-driven double-gyre model, which is solved both with explicitly resolved vigorous eddy field and in the non-eddy-resolving configuration with the eddy parameterization replacing the eddy effects. The parameterization focuses on the effect of the stochastic part of the eddy forcing that backscatters and induces eastward jet extension of the western boundary currents and its adjacent recirculation zones. The parameterization locally approximates transient eddy flux divergence by spatially localized and temporally periodic forcing, referred to as the plunger, and focuses on the linear-dynamics flow solution induced by it. The nonlinear self-interaction of this solution, referred to as the footprint, characterizes and quantifies the induced eddy forcing exerted on the large-scale flow. We find that spatial pattern and amplitude of each footprint strongly depend on the underlying large-scale flow, and the corresponding relationships provide the basis for the eddy parameterization and its closure on the large-scale flow properties. Dependencies of the footprints on other important parameters of the problem are also systematically analyzed. The parameterization utilizes the local large-scale flow information, constructs and scales the corresponding footprints, and then sums them up over the gyres to produce the resulting eddy forcing field, which is interactively added to the model as an extra forcing. Thus, the assumed ensemble of plunger solutions can be viewed as a simple model for the cumulative effect of the stochastic eddy forcing. The parameterization framework is implemented in the simplest way, but it provides a systematic strategy for improving the implementation algorithm.

  2. Cutaneous lichen planus: A systematic review of treatments.

    PubMed

    Fazel, Nasim

    2015-06-01

    Various treatment modalities are available for cutaneous lichen planus. PubMed, EMBASE, the Cochrane Database of Systematic Reviews, the Cochrane Central Register of Controlled Trials, the Database of Abstracts of Reviews of Effects, and the Health Technology Assessment Database were searched for all systematic reviews and randomized controlled trials related to cutaneous lichen planus. Two systematic reviews and nine relevant randomized controlled trials were identified. Acitretin, griseofulvin, hydroxychloroquine and narrow-band ultraviolet B are demonstrated to be effective in the treatment of cutaneous lichen planus. Sulfasalazine is effective, but has an unfavorable safety profile. KH1060, a vitamin D analogue, is not beneficial in the management of cutaneous lichen planus. Evidence from large-scale randomized trials demonstrating the safety and efficacy of many other treatment modalities used to treat cutaneous lichen planus is simply not available.

  3. Revista de Investigacion Educativa, 2000 (Journal of Educational Research, 2000).

    ERIC Educational Resources Information Center

    Revista de Investigacion Educativa, 2000

    2000-01-01

    Articles in this volume focus on the following: teacher evaluation and quality management in education; steps toward a comprehensive and systematic staff evaluation; opinions of university students on teaching methods at science faculties; design of a scale to assess the ability to jump for the use in elementary school physical education; effects…

  4. Measurement properties of instruments evaluating self-care and related concepts in people with chronic obstructive pulmonary disease: A systematic review.

    PubMed

    Clari, Marco; Matarese, Maria; Alvaro, Rosaria; Piredda, Michela; De Marinis, Maria Grazia

    2016-01-01

    The use of valid and reliable instruments for assessing self-care is crucial for the evaluation of chronic obstructive pulmonary disease (COPD) management programs. The aim of this review is to evaluate the measurement properties and theoretical foundations of instruments for assessing self-care and related concepts in people with COPD. A systematic review was conducted of articles describing the development and validation of self-care instruments. The methodological quality of the measurement properties was assessed using the COSMIN checklist. Ten studies were included evaluating five instruments: three for assessing self-care and self-management and two for assessing self-efficacy. The COPD Self-Efficacy Scale was the most studied instrument, but due to poor study methodological quality, evidence about its measurement properties is inconclusive. Evidence from the COPD Self-Management Scale is more promising, but only one study tested its properties. Due to inconclusive evidence of their measurement properties, no instrument can be recommended for clinical use. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Measurements of the pairwise kinematic Sunyaev-Zel'dovich effect with the Atacama Cosmology Telescope and future surveys

    NASA Astrophysics Data System (ADS)

    Vavagiakis, Eve Marie; De Bernardis, Francesco; Aiola, Simone; Battaglia, Nicholas; Niemack, Michael D.; ACTPol Collaboration

    2017-06-01

    We have made improved measurements of the kinematic Sunyaev-Zel’dovich (kSZ) effect using data from the Atacama Cosmology Telescope (ACT) and the Baryon Oscillation Spectroscopic Survey (BOSS). We used a map of the Cosmic Microwave Background (CMB) from two seasons of observations each by ACT and the Atacama Cosmology Telescope Polarimeter (ACTPol) receiver. We evaluated the mean pairwise baryon momentum associated with the positions of 50,000 bright galaxies in the BOSS DR11 Large Scale Structure catalog over 600 square degrees of overlapping sky area. The measurement of the kSZ signal arising from the large-scale motions of clusters was made by fitting the data to an analytical model. The free parameter of the fit determined the optical depth to microwave photon scattering for the cluster sample. We estimated the covariance matrix of the mean pairwise momentum as a function of galaxy separation using CMB simulations, jackknife evaluation, and bootstrap estimates. The most conservative simulation-based uncertainties gave signal-to-noise estimates between 3.6 and 4.1 for various luminosity cuts. Additionally, we explored a novel approach to estimating cluster optical depths from the average thermal Sunyaev-Zel’dovich (tSZ) signal at the BOSS DR11 catalog positions. Our results were broadly consistent with those obtained from the kSZ signal. In the future, the tSZ signal may provide a valuable probe of cluster optical depths, enabling the extraction of velocities from the kSZ sourced mean pairwise momenta. New CMB maps from three seasons of ACTPol observations with multi-frequency coverage overlap with nearly four times as many DR11 sources and promise to improve statistics and systematics for SZ measurements. With these and other upcoming data, the pairwise kSZ signal is poised to become a powerful new cosmological tool, able to probe large physical scales to inform neutrino physics and test models of modified gravity and dark energy.
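
    The pairwise estimator underlying such measurements can be sketched on a toy catalog. The estimator form below follows the standard mean pairwise momentum statistic; the mock geometry, the uniformly contracting velocity field, and the sign conventions are illustrative assumptions, not the ACT/BOSS pipeline.

```python
import numpy as np

rng = np.random.default_rng(3)

def pairwise_momentum(pos, dT, r_max=60.0):
    """Mean pairwise momentum estimator: fits dT_i - dT_j = -p * c_ij over
    all pairs closer than r_max, where the geometric weight c_ij projects
    the pair separation onto the two lines of sight (observer at origin)."""
    num = den = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            sep = pos[i] - pos[j]
            r = np.linalg.norm(sep)
            if r > r_max:
                continue
            ri, rj = np.linalg.norm(pos[i]), np.linalg.norm(pos[j])
            c = sep @ (pos[i] / ri + pos[j] / rj) / (2.0 * r)
            num += (dT[i] - dT[j]) * c
            den += c * c
    return -num / den

# Mock catalog at ~500 Mpc in a uniformly contracting flow v = -H x, so
# every pair is mutually approaching; dT carries the -v.rhat kSZ imprint
pos = rng.uniform(-100.0, 100.0, size=(120, 3)) + np.array([0.0, 0.0, 500.0])
radial_v = -0.5 * np.linalg.norm(pos, axis=1)   # v . rhat for v = -0.5 x
dT = -radial_v                                   # kSZ sign convention
print(pairwise_momentum(pos, dT))                # negative for mutual infall
```

    Because the temperature differences of approaching pairs correlate with the geometric weight, the estimator returns a negative mean pairwise momentum for infall and a positive one if the flow is reversed; the real measurement fits the amplitude of this signal to extract the cluster optical depth.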

  6. The Olympic Regeneration in East London (ORiEL) study: protocol for a prospective controlled quasi-experiment to evaluate the impact of urban regeneration on young people and their families.

    PubMed

    Smith, Neil R; Clark, Charlotte; Fahy, Amanda E; Tharmaratnam, Vanathi; Lewis, Daniel J; Thompson, Claire; Renton, Adrian; Moore, Derek G; Bhui, Kamaldeep S; Taylor, Stephanie J C; Eldridge, Sandra; Petticrew, Mark; Greenhalgh, Tricia; Stansfeld, Stephen A; Cummins, Steven

    2012-01-01

    Recent systematic reviews suggest that there is a dearth of evidence on the effectiveness of large-scale urban regeneration programmes in improving health and well-being and alleviating health inequalities. The development of the Olympic Park in Stratford for the London 2012 Olympic and Paralympic Games provides the opportunity to take advantage of a natural experiment to examine the impact of large-scale urban regeneration on the health and well-being of young people and their families. A prospective school-based survey of adolescents (11-12 years) with parent data collected through face-to-face interviews at home. Adolescents will be recruited from six randomly selected schools in an area receiving large-scale urban regeneration (London Borough of Newham) and compared with adolescents in 18 schools in three comparison areas with no equivalent regeneration (London Boroughs of Tower Hamlets, Hackney and Barking & Dagenham). Baseline data will be completed prior to the start of the London Olympics (July 2012) with follow-up at 6 and 18 months postintervention. Primary outcomes are: pre-post change in adolescent and parent mental health and well-being, physical activity and parental employment status. Secondary outcomes include: pre-post change in social cohesion, smoking, alcohol use, diet and body mass index. The study will account for individual and environmental contextual effects in evaluating changes to identified outcomes. A nested longitudinal qualitative study will explore families' experiences of regeneration in order to unpack the process by which regeneration impacts on health and well-being. The study has approval from Queen Mary University of London Ethics Committee (QMREC2011/40), the Association of Directors of Children's Services (RGE110927) and the London Boroughs Research Governance Framework (CERGF113). Fieldworkers have had advanced Criminal Records Bureau clearance. Findings will be disseminated through peer-reviewed publications, national and international conferences, through participating schools and the study website (http://www.orielproject.co.uk).

  7. Using the Weak-Temperature Gradient Approximation to Evaluate Parameterizations: An Example of the Transition From Suppressed to Active Convection

    NASA Astrophysics Data System (ADS)

    Daleu, C. L.; Plant, R. S.; Woolnough, S. J.

    2017-10-01

    Two single-column models are fully coupled via the weak-temperature gradient approach. The coupled-SCM is used to simulate the transition from suppressed to active convection under the influence of an interactive large-scale circulation. The sensitivity of this transition to the value of mixing entrainment within the convective parameterization is explored. The results from these simulations are compared with those from equivalent simulations using coupled cloud-resolving models. Coupled-column simulations over nonuniform surface forcing are used to initialize the simulations of the transition, in which the column with suppressed convection is forced to undergo a transition to active convection by changing the local and/or remote surface forcings. The direct contributions from the changes in surface forcing are to induce a weakening of the large-scale circulation which systematically modulates the transition. In the SCM, the contributions from the large-scale circulation are dominated by the heating effects, while in the CRM the heating and moistening effects are about equally divided. A transition time is defined as the time when the rain rate in the dry column is halfway to the value at equilibrium after the transition. For the control value of entrainment, the order of the transition times is identical to that obtained in the CRM, but the transition times are markedly faster. The locally forced transition is strongly delayed by a higher entrainment. A consequence is that for a 50% higher entrainment the transition times are reordered. The remotely forced transition remains fast while the locally forced transition becomes slow, compared to the CRM.

  8. A Bayesian Estimate of the CMB-Large-scale Structure Cross-correlation

    NASA Astrophysics Data System (ADS)

    Moura-Santos, E.; Carvalho, F. C.; Penna-Lima, M.; Novaes, C. P.; Wuensche, C. A.

    2016-08-01

    Evidence for late-time acceleration of the universe is provided by multiple probes, such as Type Ia supernovae, the cosmic microwave background (CMB), and large-scale structure (LSS). In this work, we focus on the integrated Sachs-Wolfe (ISW) effect, i.e., secondary CMB fluctuations generated by evolving gravitational potentials due to the transition between, e.g., the matter and dark energy (DE) dominated phases. Therefore, assuming a flat universe, DE properties can be inferred from ISW detections. We present a Bayesian approach to compute the CMB-LSS cross-correlation signal. The method is based on the estimate of the likelihood for measuring a combined set consisting of a CMB temperature and galaxy contrast maps, provided that we have some information on the statistical properties of the fluctuations affecting these maps. The likelihood is estimated by a sampling algorithm, therefore avoiding the computationally demanding techniques of direct evaluation in either pixel or harmonic space. As local tracers of the matter distribution at large scales, we used the Two Micron All Sky Survey galaxy catalog and, for the CMB temperature fluctuations, the ninth-year data release of the Wilkinson Microwave Anisotropy Probe (WMAP9). The results show a dominance of cosmic variance over the weak recovered signal, due mainly to the shallowness of the catalog used, with systematics associated with the sampling algorithm playing a secondary role as sources of uncertainty. When combined with other complementary probes, the method presented in this paper is expected to be a useful tool to late-time acceleration studies in cosmology.
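
    The sampling approach to the likelihood can be illustrated on a drastically simplified pixel-space toy: a single cross-correlation amplitude and white noise stand in for the full pixel/harmonic covariance structure of the actual method, so everything below is an assumption made for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy maps: the "temperature" map shares a faint common component with
# the "galaxy contrast" map on top of dominant independent fluctuations
npix, A_true = 3000, 0.3
g = rng.normal(0.0, 1.0, npix)                  # galaxy contrast map
t = A_true * g + rng.normal(0.0, 1.0, npix)     # CMB temperature map

def log_like(A):
    """Gaussian log-likelihood of the temperature map given amplitude A."""
    return -0.5 * np.sum((t - A * g) ** 2)

# Metropolis sampling of the posterior for the cross-correlation amplitude
samples, A, ll = [], 0.0, log_like(0.0)
for _ in range(4000):
    prop = A + rng.normal(0.0, 0.05)
    ll_prop = log_like(prop)
    if np.log(rng.uniform()) < ll_prop - ll:    # accept/reject step
        A, ll = prop, ll_prop
    samples.append(A)
post = np.array(samples[1000:])
print(post.mean(), post.std())
```

    The sampler only ever evaluates the likelihood at proposed points, which is the feature the abstract exploits: the posterior over the cross-correlation signal is characterized without direct evaluation over all pixel or harmonic configurations.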

  9. Lessons Learned from Large-Scale Randomized Experiments

    ERIC Educational Resources Information Center

    Slavin, Robert E.; Cheung, Alan C. K.

    2017-01-01

    Large-scale randomized studies provide the best means of evaluating practical, replicable approaches to improving educational outcomes. This article discusses the advantages, problems, and pitfalls of these evaluations, focusing on alternative methods of randomization, recruitment, ensuring high-quality implementation, dealing with attrition, and…

  10. The use of observational scales to monitor symptom control and depth of sedation in patients requiring palliative sedation: a systematic review.

    PubMed

    Brinkkemper, Tijn; van Norel, Arjanne M; Szadek, Karolina M; Loer, Stephan A; Zuurmond, Wouter W A; Perez, Roberto S G M

    2013-01-01

    Palliative sedation is the intentional lowering of consciousness of a patient in the last phase of life to relieve suffering from refractory symptoms such as pain, delirium and dyspnoea. In this systematic review, we evaluated the use of monitoring scales to assess the degree of control of refractory symptoms and/or the depth of the sedation. A database search of PubMed and Embase was performed up to January 2010 using the search terms 'palliative sedation' OR 'terminal sedation'. Retro- and prospective studies as well as reviews and guidelines containing information about monitoring of palliative sedation, written in the English, German or Dutch language, were included. The search yielded 264 articles of which 30 were considered relevant. Most studies focused on monitoring refractory symptoms (pain, fatigue or delirium) or the level of awareness to control the level of sedation. Four prospective and one retrospective study used scales validated in other settings: the Numeric Pain Rating Scale, the Visual Analogue Scale, the Memorial Delirium Assessment Scale, the Communication Capacity Scale and the Agitation Distress Scale. Only the Communication Capacity Scale was partially validated for use in a palliative sedation setting. One guideline described the use of a scale validated in another setting. A minority of studies reported the use of observational scales to monitor the effect of palliative sedation. Future studies should be focused on establishing proper instruments, the most adequate frequency and timing of assessment, and interdisciplinary evaluation of sedation depth and symptom control for palliative sedation.

  11. A statewide nurse training program for a hospital based infant abusive head trauma prevention program.

    PubMed

    Nocera, Maryalice; Shanahan, Meghan; Murphy, Robert A; Sullivan, Kelly M; Barr, Marilyn; Price, Julie; Zolotor, Adam

    2016-01-01

    Successful implementation of universal patient education programs requires training large numbers of nursing staff in new content and procedures and maintaining fidelity to program standards. In preparation for statewide adoption of a hospital-based universal education program, nursing staff at 85 hospitals and 1 birthing center in North Carolina received standardized training. This article describes the training program and reports findings from the process, outcome and impact evaluations of this training. Evaluation strategies were designed to assess nurse satisfaction with training and course content, determine whether training conveyed new information, and assess whether nurses applied lessons from the training sessions to deliver the program as designed. Trainings were conducted during April 2008-February 2010. Evaluations were received from 4358 attendees. Information was obtained about training type, participants' perceptions of the newness and usefulness of the information, and how the program compared to other education materials. Program fidelity data were collected using telephone surveys about compliance with delivery of teaching points and teaching behaviors. Results demonstrate high levels of satisfaction and perceptions of program utility as well as adherence to the program model. These findings support the feasibility of implementing universal patient education programs with strong uptake using large-scale, systematic training programs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Planck intermediate results: XLVI. Reduction of large-scale systematic effects in HFI polarization maps and estimation of the reionization optical depth

    DOE PAGES

    Aghanim, N.; Ashdown, M.; Aumont, J.; ...

    2016-12-12

    This study describes the identification, modelling, and removal of previously unexplained systematic effects in the polarization data of the Planck High Frequency Instrument (HFI) on large angular scales, including new mapmaking and calibration procedures, new and more complete end-to-end simulations, and a set of robust internal consistency checks on the resulting maps. These maps, at 100, 143, 217, and 353 GHz, are early versions of those that will be released in final form later in 2016. The improvements allow us to determine the cosmic reionization optical depth τ using, for the first time, the low-multipole EE data from HFI, reducing significantly the central value and uncertainty, and hence the upper limit. Two different likelihood procedures are used to constrain τ from two estimators of the CMB E- and B-mode angular power spectra at 100 and 143 GHz, after debiasing the spectra from a small remaining systematic contamination. These all give fully consistent results. A further consistency test is performed using cross-correlations derived from the Low Frequency Instrument maps of the Planck 2015 data release and the new HFI data. For this purpose, end-to-end analyses of systematic effects from the two instruments are used to demonstrate the near independence of their dominant systematic error residuals. The tightest result comes from the HFI-based τ posterior distribution using the maximum likelihood power spectrum estimator from EE data only, giving a value 0.055 ± 0.009. Finally, in a companion paper these results are discussed in the context of the best-fit Planck ΛCDM cosmological model and recent models of reionization.

  13. Planck intermediate results: XLVI. Reduction of large-scale systematic effects in HFI polarization maps and estimation of the reionization optical depth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aghanim, N.; Ashdown, M.; Aumont, J.

    This study describes the identification, modelling, and removal of previously unexplained systematic effects in the polarization data of the Planck High Frequency Instrument (HFI) on large angular scales, including new mapmaking and calibration procedures, new and more complete end-to-end simulations, and a set of robust internal consistency checks on the resulting maps. These maps, at 100, 143, 217, and 353 GHz, are early versions of those that will be released in final form later in 2016. The improvements allow us to determine the cosmic reionization optical depth τ using, for the first time, the low-multipole EE data from HFI, reducing significantly the central value and uncertainty, and hence the upper limit. Two different likelihood procedures are used to constrain τ from two estimators of the CMB E- and B-mode angular power spectra at 100 and 143 GHz, after debiasing the spectra from a small remaining systematic contamination. These all give fully consistent results. A further consistency test is performed using cross-correlations derived from the Low Frequency Instrument maps of the Planck 2015 data release and the new HFI data. For this purpose, end-to-end analyses of systematic effects from the two instruments are used to demonstrate the near independence of their dominant systematic error residuals. The tightest result comes from the HFI-based τ posterior distribution using the maximum likelihood power spectrum estimator from EE data only, giving a value 0.055 ± 0.009. Finally, in a companion paper these results are discussed in the context of the best-fit Planck ΛCDM cosmological model and recent models of reionization.

  14. Planck intermediate results. XLVI. Reduction of large-scale systematic effects in HFI polarization maps and estimation of the reionization optical depth

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Aghanim, N.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Ballardini, M.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battye, R.; Benabed, K.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Carron, J.; Challinor, A.; Chiang, H. C.; Colombo, L. P. L.; Combet, C.; Comis, B.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Di Valentino, E.; Dickinson, C.; Diego, J. M.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Falgarone, E.; Fantaye, Y.; Finelli, F.; Forastieri, F.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frolov, A.; Galeotta, S.; Galli, S.; Ganga, K.; Génova-Santos, R. T.; Gerbino, M.; Ghosh, T.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Helou, G.; Henrot-Versillé, S.; Herranz, D.; Hivon, E.; Huang, Z.; Ilić, S.; Jaffe, A. H.; Jones, W. C.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Knox, L.; Krachmalnicoff, N.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamarre, J.-M.; Langer, M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Le Jeune, M.; Leahy, J. P.; Levrier, F.; Liguori, M.; Lilje, P. B.; López-Caniego, M.; Ma, Y.-Z.; Macías-Pérez, J. F.; Maggio, G.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Matarrese, S.; Mauri, N.; McEwen, J. D.; Meinhold, P. R.; Melchiorri, A.; Mennella, A.; Migliaccio, M.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Moss, A.; Mottet, S.; Naselsky, P.; Natoli, P.; Oxborrow, C. A.; Pagano, L.; Paoletti, D.; Partridge, B.; Patanchon, G.; Patrizii, L.; Perdereau, O.; Perotto, L.; Pettorino, V.; Piacentini, F.; Plaszczynski, S.; Polastri, L.; Polenta, G.; Puget, J.-L.; Rachen, J. 
P.; Racine, B.; Reinecke, M.; Remazeilles, M.; Renzi, A.; Rocha, G.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Ruiz-Granados, B.; Salvati, L.; Sandri, M.; Savelainen, M.; Scott, D.; Sirri, G.; Sunyaev, R.; Suur-Uski, A.-S.; Tauber, J. A.; Tenti, M.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Valiviita, J.; Van Tent, F.; Vibert, L.; Vielva, P.; Villa, F.; Vittorio, N.; Wandelt, B. D.; Watson, R.; Wehus, I. K.; White, M.; Zacchei, A.; Zonca, A.

    2016-12-01

This paper describes the identification, modelling, and removal of previously unexplained systematic effects in the polarization data of the Planck High Frequency Instrument (HFI) on large angular scales, including new mapmaking and calibration procedures, new and more complete end-to-end simulations, and a set of robust internal consistency checks on the resulting maps. These maps, at 100, 143, 217, and 353 GHz, are early versions of those that will be released in final form later in 2016. The improvements allow us to determine the cosmic reionization optical depth τ using, for the first time, the low-multipole EE data from HFI, reducing significantly the central value and uncertainty, and hence the upper limit. Two different likelihood procedures are used to constrain τ from two estimators of the CMB E- and B-mode angular power spectra at 100 and 143 GHz, after debiasing the spectra from a small remaining systematic contamination. These all give fully consistent results. A further consistency test is performed using cross-correlations derived from the Low Frequency Instrument maps of the Planck 2015 data release and the new HFI data. For this purpose, end-to-end analyses of systematic effects from the two instruments are used to demonstrate the near independence of their dominant systematic error residuals. The tightest result comes from the HFI-based τ posterior distribution using the maximum likelihood power spectrum estimator from EE data only, giving a value 0.055 ± 0.009. In a companion paper these results are discussed in the context of the best-fit Planck ΛCDM cosmological model and recent models of reionization.
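    The final inference step admits a compact numerical sketch: at low multipoles the EE reionization signal grows roughly as τ², so evaluating a Gaussian likelihood on a τ grid yields a posterior mean and uncertainty. The amplitude model, noise level, and data below are invented for illustration and are in no way the Planck likelihood:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    tau_true = 0.055

    def amp(tau):
        # toy EE reionization-bump amplitude, scaling as tau^2 (arbitrary units)
        return (tau / 0.05) ** 2

    sigma = 0.15  # assumed measurement noise on the amplitude
    observed = amp(tau_true) + rng.normal(0.0, sigma)

    tau_grid = np.linspace(0.01, 0.12, 2001)
    loglike = -0.5 * ((observed - amp(tau_grid)) / sigma) ** 2
    posterior = np.exp(loglike - loglike.max())
    posterior /= posterior.sum()  # normalise on the grid

    tau_mean = np.sum(tau_grid * posterior)
    tau_sd = np.sqrt(np.sum((tau_grid - tau_mean) ** 2 * posterior))
    print(f"tau = {tau_mean:.3f} +/- {tau_sd:.3f}")
    ```

    Because the toy amplitude is steep in τ, even a noisy single measurement pins τ down tightly, loosely mimicking how the low-multipole EE data sharpen the posterior.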

  15. BioPreDyn-bench: a suite of benchmark problems for dynamic modelling in systems biology.

    PubMed

    Villaverde, Alejandro F; Henriques, David; Smallbone, Kieran; Bongard, Sophia; Schmid, Joachim; Cicin-Sain, Damjan; Crombach, Anton; Saez-Rodriguez, Julio; Mauch, Klaus; Balsa-Canto, Eva; Mendes, Pedro; Jaeger, Johannes; Banga, Julio R

    2015-02-20

Dynamic modelling is one of the cornerstones of systems biology. Many research efforts are currently being invested in the development and exploitation of large-scale kinetic models. The associated problems of parameter estimation (model calibration) and optimal experimental design are particularly challenging. The community has already developed many methods and software packages which aim to facilitate these tasks. However, there is a lack of suitable benchmark problems which allow a fair and systematic evaluation and comparison of these contributions. Here we present BioPreDyn-bench, a set of challenging parameter estimation problems which aspire to serve as reference test cases in this area. This set comprises six problems including medium and large-scale kinetic models of the bacterium E. coli, baker's yeast S. cerevisiae, the vinegar fly D. melanogaster, Chinese Hamster Ovary cells, and a generic signal transduction network. The level of description includes metabolism, transcription, signal transduction, and development. For each problem we provide (i) a basic description and formulation, (ii) implementations ready-to-run in several formats, (iii) computational results obtained with specific solvers, (iv) a basic analysis and interpretation. This suite of benchmark problems can be readily used to evaluate and compare parameter estimation methods. Further, it can also be used to build test problems for sensitivity and identifiability analysis, model reduction and optimal experimental design methods. The suite, including codes and documentation, can be freely downloaded from the BioPreDyn-bench website, https://sites.google.com/site/biopredynbenchmarks/.
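    The calibration task the suite standardises can be illustrated on a hypothetical one-parameter kinetic model dx/dt = -k·x (deliberately trivial, and not one of the six BioPreDyn-bench problems): simulate noisy observations, then recover k by least squares against an ODE solver.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    # Synthetic "experimental" data from a known decay rate k_true.
    t_obs = np.linspace(0.0, 5.0, 20)
    k_true, x_init = 0.8, 2.0
    rng = np.random.default_rng(1)
    y_obs = x_init * np.exp(-k_true * t_obs) + rng.normal(0.0, 0.01, t_obs.size)

    def residuals(theta):
        # Integrate the candidate model and compare with observations.
        sol = solve_ivp(lambda t, x: -theta[0] * x, (0.0, 5.0), [x_init],
                        t_eval=t_obs)
        return sol.y[0] - y_obs

    # Model calibration: bounded nonlinear least squares from a poor initial guess.
    fit = least_squares(residuals, [0.1], bounds=(0.0, 10.0))
    print(f"estimated k = {fit.x[0]:.3f} (true value {k_true})")
    ```

    The benchmark problems pose exactly this structure, but with dozens to hundreds of parameters and stiff, multi-scale dynamics, which is where solver choice and optimisation strategy start to matter.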

  16. Evidence for the effect of disease management: is $1 billion a year a good investment?

    PubMed

    Mattke, Soeren; Seid, Michael; Ma, Sai

    2007-12-01

    To assess the evidence for the effect of disease management on quality of care, disease control, and cost, with a focus on population-based programs. Literature review. We conducted a literature search for and a structured review of studies on population-based disease management programs, as well as for reviews and meta-analyses of disease management interventions. We identified 3 evaluations of large-scale population-based programs, as well as 10 meta-analyses and 16 systematic reviews, covering 317 unique studies. We found consistent evidence that disease management improves processes of care and disease control but no conclusive support for its effect on health outcomes. Overall, disease management does not seem to affect utilization except for a reduction in hospitalization rates among patients with congestive heart failure and an increase in outpatient care and prescription drug use among patients with depression. When the costs of the intervention were appropriately accounted for and subtracted from any savings, there was no conclusive evidence that disease management leads to a net reduction of direct medical costs. Although disease management seems to improve quality of care, its effect on cost is uncertain. Most of the evidence to date addresses small-scale programs targeting high-risk individuals, while only 3 studies evaluate large population-based interventions, implying that little is known about their effect. Payers and policy makers should remain skeptical about vendor claims and should demand supporting evidence based on transparent and scientifically sound methods.

  17. XLID-Causing Mutations and Associated Genes Challenged in Light of Data From Large-Scale Human Exome Sequencing

    PubMed Central

    Piton, Amélie; Redin, Claire; Mandel, Jean-Louis

    2013-01-01

Because of the unbalanced sex ratio (1.3–1.4 to 1) observed in intellectual disability (ID) and the identification of large ID-affected families showing X-linked segregation, much attention has been focused on the genetics of X-linked ID (XLID). Mutations causing monogenic XLID have now been reported in over 100 genes, most of which are commonly included in XLID diagnostic gene panels. Nonetheless, the boundary between true mutations and rare non-disease-causing variants often remains elusive. The sequencing of a large number of control X chromosomes, required for avoiding false-positive results, was not systematically possible in the past. Such information is now available thanks to large-scale sequencing projects such as the National Heart, Lung, and Blood Institute (NHLBI) Exome Sequencing Project, which provides variation information on 10,563 X chromosomes from the general population. We used this NHLBI cohort to systematically reassess the implication of 106 genes proposed to be involved in monogenic forms of XLID. We particularly question the implication in XLID of ten of them (AGTR2, MAGT1, ZNF674, SRPX2, ATP6AP2, ARHGEF6, NXF5, ZCCHC12, ZNF41, and ZNF81), in which truncating variants or previously published mutations are observed at a relatively high frequency within this cohort. We also highlight 15 other genes (CCDC22, CLIC2, CNKSR2, FRMPD4, HCFC1, IGBP1, KIAA2022, KLF8, MAOA, NAA10, NLGN3, RPL10, SHROOM4, ZDHHC15, and ZNF261) for which replication studies are warranted. We propose that similar reassessment of reported mutations (and genes) with the use of data from large-scale human exome sequencing would be relevant for a wide range of other genetic diseases. PMID:23871722
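    The reassessment logic can be sketched as a frequency filter over a control cohort: genes whose truncating variants are too common in the general population are unlikely to cause a fully penetrant monogenic disorder. The gene names, carrier counts, and threshold below are invented and are not the paper's criteria:

    ```python
    import pandas as pd

    # X chromosomes in the NHLBI ESP control cohort (from the abstract).
    n_chromosomes = 10563

    # Hypothetical truncating-variant carrier counts per candidate gene.
    variants = pd.DataFrame({
        "gene": ["GENE_A", "GENE_B", "GENE_C"],
        "truncating_carriers": [12, 0, 1],
    })
    variants["carrier_freq"] = variants["truncating_carriers"] / n_chromosomes

    # Flag genes whose carrier frequency exceeds an (arbitrary, illustrative)
    # threshold incompatible with a rare monogenic XLID gene.
    variants["questionable"] = variants["carrier_freq"] > 1e-4
    print(variants)
    ```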

  18. Using Climate Regionalization to Understand Climate Forecast System Version 2 (CFSv2) Precipitation Performance for the Conterminous United States (CONUS)

    NASA Technical Reports Server (NTRS)

    Regonda, Satish K.; Zaitchik, Benjamin F.; Badr, Hamada S.; Rodell, Matthew

    2016-01-01

Dynamically based seasonal forecasts are prone to systematic spatial biases due to imperfections in the underlying global climate model (GCM). This can result in low forecast skill when the GCM misplaces teleconnections or fails to resolve geographic barriers, even if the prediction of large-scale dynamics is accurate. To characterize and address this issue, this study applies objective climate regionalization to identify discrepancies between the Climate Forecast System Version 2 (CFSv2) and precipitation observations across the Contiguous United States (CONUS). Regionalization shows that CFSv2 1-month forecasts capture the general spatial character of warm season precipitation variability but that forecast regions systematically differ from observations in some transition zones. CFSv2 predictive skill for these misclassified areas is systematically reduced relative to correctly regionalized areas and CONUS as a whole. In these incorrectly regionalized areas, higher skill can be obtained by using a regional-scale forecast in place of the local grid cell prediction.
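    A toy analogue of the regionalization comparison: cluster grid cells by the similarity of their precipitation time series in both the "observed" and "forecast" fields, then flag cells assigned to different regions. This simplified two-cluster Lloyd iteration on synthetic data stands in for the paper's objective regionalization method:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_months = 48
    wet = rng.normal(5.0, 1.0, n_months)   # regime time series (synthetic)
    dry = rng.normal(1.0, 0.3, n_months)

    # 10 "wet-regime" and 10 "dry-regime" grid cells in the observations.
    obs = np.vstack([wet + rng.normal(0, 0.2, (10, n_months)),
                     dry + rng.normal(0, 0.2, (10, n_months))])
    forecast = obs.copy()
    forecast[9] = dry + rng.normal(0, 0.2, n_months)  # one misplaced cell

    def two_means(cells, iters=10):
        # Minimal 2-cluster Lloyd iteration, seeded from the first/last cells
        # so labels are comparable between the two fields.
        centroids = cells[[0, -1]]
        for _ in range(iters):
            d = np.linalg.norm(cells[:, None, :] - centroids[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            centroids = np.array([cells[labels == j].mean(axis=0) for j in (0, 1)])
        return labels

    misplaced = np.flatnonzero(two_means(obs) != two_means(forecast))
    print("cells regionalized differently:", misplaced)
    ```

    Cells in `misplaced` are the toy counterparts of the transition-zone areas where CFSv2 regions disagree with observations, and where the paper finds reduced skill.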

  19. Exercises in Evaluation of a Large-Scale Educational Program.

    ERIC Educational Resources Information Center

    Glass, Gene V.

    This workbook is designed to serve as training experience for educational evaluators at the preservice (graduate school) or inservice stages. The book comprises a series of exercises in the planning, execution, and reporting of the evaluation of a large-scale educational program in this case Title I of the Elementary and Secondary Education Act of…

  20. Sensitivity simulations of superparameterised convection in a general circulation model

    NASA Astrophysics Data System (ADS)

    Rybka, Harald; Tost, Holger

    2015-04-01

Cloud Resolving Models (CRMs), with horizontal grid spacings from a few hundred meters up to a few kilometers, have been used to explicitly resolve small-scale and mesoscale processes. Special attention has been paid to representing realistically cloud dynamics and cloud microphysics involving cloud droplets, ice crystals, graupel, and aerosols. The entire variety of small-scale physical processes interacts with the larger-scale circulation and has to be parameterised on the coarse grid of a general circulation model (GCM). For more than a decade, an approach has been developed to connect these two types of models, which act on different scales, in order to resolve cloud processes and their interactions with the large-scale flow. The concept is to use an ensemble of CRM grid cells in a 2D or 3D configuration within each grid cell of the GCM to explicitly represent small-scale processes, avoiding the use of convection and large-scale cloud parameterisations, which are a major source of uncertainty regarding clouds. The idea is commonly known as superparameterisation or cloud-resolving convection parameterisation. This study presents different simulations of an adapted Earth System Model (ESM) connected to a CRM which acts as a superparameterisation. Simulations have been performed with the ECHAM/MESSy atmospheric chemistry (EMAC) model, comparing conventional GCM runs (including convection and large-scale cloud parameterisations) with the superparameterised EMAC (SP-EMAC), modelling one year with prescribed sea surface temperatures and sea ice content. The sensitivity of atmospheric temperature, precipitation patterns, and cloud amount and type is examined by changing the embedded CRM representation (orientation, width, number of CRM cells, 2D vs. 3D). Additionally, we evaluate the radiation balance with the new model configuration and systematically analyse the impact of tunable parameters on the radiation budget and hydrological cycle. Furthermore, the subgrid variability (individual CRM cell output) is analysed in order to illustrate the importance of a highly varying atmospheric structure inside a single GCM grid box. Finally, the convective transport of radon is examined by comparing different transport procedures and their influence on the vertical tracer distribution.

  1. Large-scale road safety programmes in low- and middle-income countries: an opportunity to generate evidence.

    PubMed

    Hyder, Adnan A; Allen, Katharine A; Peters, David H; Chandran, Aruna; Bishai, David

    2013-01-01

The growing burden of road traffic injuries, which kill over 1.2 million people yearly, falls mostly on low- and middle-income countries (LMICs). Despite this, evidence generation on the effectiveness of road safety interventions in LMIC settings remains scarce. This paper explores a scientific approach for evaluating road safety programmes in LMICs and introduces one such multi-country road safety initiative, the Road Safety in 10 Countries Project (RS-10). Building on existing evaluation frameworks, we develop a scientific approach for evaluating large-scale road safety programmes in LMIC settings. This draws on 13 lessons of large-scale programme evaluation: defining the evaluation scope; selecting study sites; maintaining objectivity; developing an impact model; utilising multiple data sources; using multiple analytic techniques; maximising external validity; ensuring an appropriate time frame; the importance of flexibility and a stepwise approach; continuous monitoring; providing feedback to implementers and policy-makers; promoting the uptake of evaluation results; and understanding evaluation costs. The use of relatively new approaches for evaluating real-world programmes allows for the production of relevant knowledge. The RS-10 project affords an important opportunity to scientifically test these approaches in a real-world, large-scale road safety evaluation and to generate new knowledge for the field of road safety.

  2. Drought Persistence Errors in Global Climate Models

    NASA Astrophysics Data System (ADS)

    Moon, H.; Gudmundsson, L.; Seneviratne, S. I.

    2018-04-01

The persistence of drought events largely determines the severity of socioeconomic and ecological impacts, but the capability of current global climate models (GCMs) to simulate such events is subject to large uncertainties. In this study, the representation of drought persistence in GCMs is assessed by comparing state-of-the-art GCM simulations to observation-based data sets. To do so, we consider dry-to-dry transition probabilities at monthly and annual scales as estimates of drought persistence, where a dry status is defined as a negative precipitation anomaly. Though there is a substantial spread in the drought persistence bias, most of the simulations show systematic underestimation of drought persistence at the global scale. Subsequently, we analyzed to which degree (i) inaccurate observations, (ii) differences among models, (iii) internal climate variability, and (iv) uncertainty of the employed statistical methods contribute to the spread in drought persistence errors, using an analysis of variance approach. The results show that at the monthly scale, model uncertainty and observational uncertainty dominate, while the contribution from internal variability is small in most cases. At the annual scale, the spread of the drought persistence error is dominated by the statistical estimation error of drought persistence, indicating that the partitioning of the error is impaired by the limited number of considered time steps. These findings reveal systematic errors in the representation of drought persistence in current GCMs and suggest directions for further model improvement.
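    The persistence metric used here, the dry-to-dry transition probability with "dry" defined as a negative precipitation anomaly, reduces to a few lines; the series below is invented:

    ```python
    import numpy as np

    def dry_to_dry_probability(precip):
        # "dry" = negative precipitation anomaly relative to the series mean.
        anomaly = precip - precip.mean()
        dry = anomaly < 0
        prev, nxt = dry[:-1], dry[1:]
        n_dry = prev.sum()
        # P(dry at t | dry at t-1); undefined if there are no dry steps.
        return (prev & nxt).sum() / n_dry if n_dry else np.nan

    # A short synthetic series with long dry spells (high persistence).
    persistent = np.array([1., 1., 1., 1., 9., 9., 1., 1., 1., 9.])
    print(dry_to_dry_probability(persistent))
    ```

    Comparing this statistic between a GCM run and observations, cell by cell, gives exactly the persistence bias the study maps globally.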

  3. Development process of an assessment tool for disruptive behavior problems in cross-cultural settings: the Disruptive Behavior International Scale – Nepal version (DBIS-N)

    PubMed Central

    Burkey, Matthew D.; Ghimire, Lajina; Adhikari, Ramesh P.; Kohrt, Brandon A.; Jordans, Mark J. D.; Haroz, Emily; Wissow, Lawrence

    2017-01-01

    Systematic processes are needed to develop valid measurement instruments for disruptive behavior disorders (DBDs) in cross-cultural settings. We employed a four-step process in Nepal to identify and select items for a culturally valid assessment instrument: 1) We extracted items from validated scales and local free-list interviews. 2) Parents, teachers, and peers (n=30) rated the perceived relevance and importance of behavior problems. 3) Highly rated items were piloted with children (n=60) in Nepal. 4) We evaluated internal consistency of the final scale. We identified 49 symptoms from 11 scales, and 39 behavior problems from free-list interviews (n=72). After dropping items for low ratings of relevance and severity and for poor item-test correlation, low frequency, and/or poor acceptability in pilot testing, 16 items remained for the Disruptive Behavior International Scale—Nepali version (DBIS-N). The final scale had good internal consistency (α=0.86). A 4-step systematic approach to scale development including local participation yielded an internally consistent scale that included culturally relevant behavior problems. PMID:28093575
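    The internal-consistency statistic reported for the final scale (Cronbach's α = 0.86) is computed as α = k/(k−1)·(1 − Σσ²ᵢ/σ²ₜ). A minimal sketch follows; the 4-item response matrix is invented (the real DBIS-N has 16 items):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        # items: respondents x items matrix of scores
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
        return k / (k - 1) * (1 - item_vars / total_var)

    # Hypothetical responses from 4 respondents on a 4-item scale (0-3 ratings).
    responses = np.array([
        [2, 3, 2, 3],
        [1, 1, 1, 2],
        [3, 3, 3, 3],
        [0, 1, 0, 1],
    ])
    print(round(cronbach_alpha(responses), 3))
    ```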

  4. Well-being measurement and the WHO health policy Health 2010: systematic review of measurement scales.

    PubMed

    Lindert, Jutta; Bain, Paul A; Kubzansky, Laura D; Stein, Claudia

    2015-08-01

    Subjective well-being (SWB) contributes to health and mental health. It is a major objective of the new World Health Organization health policy framework, 'Health 2020'. Various approaches to defining and measuring well-being exist. We aimed to identify, map and analyse the contents of self-reported well-being measurement scales for use with individuals more than 15 years of age to help researchers and politicians choose appropriate measurement tools. We conducted a systematic literature search in PubMed for studies published between 2007 and 2012, with additional hand-searching, to identify empirical studies that investigated well-being using a measurement scale. For each eligible study, we identified the measurement tool and reviewed its components, number of items, administration time, validity, reliability, responsiveness and sensitivity. The literature review identified 60 unique measurement scales. Measurement scales were either multidimensional (n = 33) or unidimensional (n = 14) and assessed multiple domains. The most frequently encountered domains were affects (39 scales), social relations (17 scales), life satisfaction (13 scales), physical health (13 scales), meaning/achievement (9 scales) and spirituality (6 scales). The scales included between 1 and 100 items; the administration time varied from 1 to 15 min. Well-being is a higher order construct. Measures seldom reported testing for gender or cultural sensitivity. The content and format of scales varied considerably. Effective monitoring and comparison of SWB over time and across geographic regions will require further work to refine definitions of SWB. We recommend concurrent evaluation of at least three self-reported SWB measurement scales, including evaluation for gender or cultural sensitivity. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  5. The relationship between the Early Childhood Environment Rating Scale and its revised form and child outcomes: A systematic review and meta-analysis.

    PubMed

    Brunsek, Ashley; Perlman, Michal; Falenchuk, Olesya; McMullen, Evelyn; Fletcher, Brooke; Shah, Prakesh S

    2017-01-01

The Early Childhood Environment Rating Scale (ECERS) and its revised version (ECERS-R) were designed as global measures of quality that assess structural and process aspects of Early Childhood Education and Care (ECEC) programs. Despite frequent use of the ECERS/ECERS-R in research and applied settings, associations between it and child outcomes have not been systematically reviewed. The objective of this research was to evaluate the association between the ECERS/ECERS-R and children's wellbeing. Searches of Medline, PsycINFO, ERIC, websites of large datasets and reference sections of all retrieved articles were completed up to July 3, 2015. Eligible studies provided a statistical link between the ECERS/ECERS-R and child outcomes for preschool-aged children in ECEC programs. Of the 823 studies selected for full review, 73 were included in the systematic review and 16 were meta-analyzed. The combined sample across all eligible studies consisted of 33,318 preschool-aged children. Qualitative systematic review results revealed that ECERS/ECERS-R total scores were more generally associated with positive outcomes than subscales or factors. Seventeen separate meta-analyses were conducted to assess the strength of association between the ECERS/ECERS-R and measures that assessed children's language, math and social-emotional outcomes. Meta-analyses revealed a small number of weak effects (in the expected direction) between the ECERS/ECERS-R total score and children's language and positive behavior outcomes. The Language-Reasoning subscale was weakly related to a language outcome. The enormous heterogeneity in how studies operationalized the ECERS/ECERS-R, the outcomes measured and statistics reported limited our ability to meta-analyze many studies. Greater consistency in study methodology is needed in this area of research. Despite these methodological challenges, the ECERS/ECERS-R does appear to capture aspects of quality that are important for children's wellbeing; however, the strength of association is weak.
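    The meta-analytic pooling step can be sketched with generic inverse-variance weighting. This is a fixed-effect model on invented effect sizes and standard errors; the review's seventeen meta-analyses were more involved (heterogeneity assessment, subgroup structure):

    ```python
    import numpy as np

    # Hypothetical per-study effect sizes and their standard errors.
    effects = np.array([0.10, 0.15, 0.08, 0.12])
    se = np.array([0.05, 0.07, 0.04, 0.06])

    # Inverse-variance weights: precise studies count more.
    w = 1.0 / se**2
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    print(f"pooled effect {pooled:.3f}, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}]")
    ```

    A pooled effect of this magnitude with a CI excluding zero is "weak but in the expected direction", the pattern the review reports for language and positive-behavior outcomes.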

  6. The relationship between the Early Childhood Environment Rating Scale and its revised form and child outcomes: A systematic review and meta-analysis

    PubMed Central

    Brunsek, Ashley; Perlman, Michal; Falenchuk, Olesya; McMullen, Evelyn; Fletcher, Brooke; Shah, Prakesh S.

    2017-01-01

The Early Childhood Environment Rating Scale (ECERS) and its revised version (ECERS-R) were designed as global measures of quality that assess structural and process aspects of Early Childhood Education and Care (ECEC) programs. Despite frequent use of the ECERS/ECERS-R in research and applied settings, associations between it and child outcomes have not been systematically reviewed. The objective of this research was to evaluate the association between the ECERS/ECERS-R and children’s wellbeing. Searches of Medline, PsycINFO, ERIC, websites of large datasets and reference sections of all retrieved articles were completed up to July 3, 2015. Eligible studies provided a statistical link between the ECERS/ECERS-R and child outcomes for preschool-aged children in ECEC programs. Of the 823 studies selected for full review, 73 were included in the systematic review and 16 were meta-analyzed. The combined sample across all eligible studies consisted of 33,318 preschool-aged children. Qualitative systematic review results revealed that ECERS/ECERS-R total scores were more generally associated with positive outcomes than subscales or factors. Seventeen separate meta-analyses were conducted to assess the strength of association between the ECERS/ECERS-R and measures that assessed children’s language, math and social-emotional outcomes. Meta-analyses revealed a small number of weak effects (in the expected direction) between the ECERS/ECERS-R total score and children’s language and positive behavior outcomes. The Language-Reasoning subscale was weakly related to a language outcome. The enormous heterogeneity in how studies operationalized the ECERS/ECERS-R, the outcomes measured and statistics reported limited our ability to meta-analyze many studies. Greater consistency in study methodology is needed in this area of research. Despite these methodological challenges, the ECERS/ECERS-R does appear to capture aspects of quality that are important for children’s wellbeing; however, the strength of association is weak. PMID:28586399

  7. Developing a Systematic Corrosion Control Evaluation Approach in Flint

    EPA Science Inventory

    Presentation covers what the projects were that were recommended by the Flint Safe Drinking Water Task Force for corrosion control assessment for Flint, focusing on the sequential sampling project, the pipe rigs, and pipe scale analyses.

  8. AGUACLARA: CLEAN WATER FOR SMALL COMMUNITIES

    EPA Science Inventory

    We will systematically evaluate commercially available solar thermal collectors and thermal storage systems for use in residential scale co-generative heat and electrical power systems. Currently, reliable data is unavailable over the range of conditions and installations thes...

  9. Readability of Online Health Information: A Meta-Narrative Systematic Review.

    PubMed

    Daraz, Lubna; Morrow, Allison S; Ponce, Oscar J; Farah, Wigdan; Katabi, Abdulrahman; Majzoub, Abdul; Seisa, Mohamed O; Benkhadra, Raed; Alsawas, Mouaz; Larry, Prokop; Murad, M Hassan

    2018-01-01

    Online health information should meet the reading level for the general public (set at sixth-grade level). Readability is a key requirement for information to be helpful and improve quality of care. The authors conducted a systematic review to evaluate the readability of online health information in the United States and Canada. Out of 3743 references, the authors included 157 cross-sectional studies evaluating 7891 websites using 13 readability scales. The mean readability grade level across websites ranged from grade 10 to 15 based on the different scales. Stratification by specialty, health condition, and type of organization producing information revealed the same findings. In conclusion, online health information in the United States and Canada has a readability level that is inappropriate for general public use. Poor readability can lead to misinformation and may have a detrimental effect on health. Efforts are needed to improve readability and the content of online health information.
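    The grade-level scales such reviews rely on can be approximated with the Flesch-Kincaid formula, grade = 0.39·(words/sentences) + 11.8·(syllables/words) − 15.59. The vowel-group syllable counter below is a crude stand-in for the validated readability tools the included studies used:

    ```python
    import re

    def syllables(word):
        # Rough heuristic: count maximal vowel groups (at least one per word).
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def fk_grade(text):
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z]+", text)
        syl = sum(syllables(w) for w in words)
        return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

    simple = "The cat sat. The dog ran. We eat food."
    complex_text = ("Comprehensive readability evaluation necessitates "
                    "systematic methodological considerations.")
    print(round(fk_grade(simple), 1), round(fk_grade(complex_text), 1))
    ```

    Short sentences of monosyllables score well below the sixth-grade target, while dense polysyllabic prose of the kind found on many health websites scores far above it.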

  10. Selective Cannabinoids for Chronic Neuropathic Pain: A Systematic Review and Meta-analysis.

    PubMed

    Meng, Howard; Johnston, Bradley; Englesakis, Marina; Moulin, Dwight E; Bhatia, Anuj

    2017-11-01

There is a lack of consensus on the role of selective cannabinoids for the treatment of neuropathic pain (NP). Guidelines from national and international pain societies have provided contradictory recommendations. The primary objective of this systematic review and meta-analysis (SR-MA) was to determine the analgesic efficacy and safety of selective cannabinoids compared to conventional management or placebo for chronic NP. We reviewed randomized controlled trials that compared selective cannabinoids (dronabinol, nabilone, nabiximols) with conventional treatments (eg, pharmacotherapy, physical therapy, or a combination of these) or placebo in patients with chronic NP, because patients with NP may be on any of these therapies or on none if all standard treatments have failed to provide analgesia and/or have been associated with adverse effects. MEDLINE, EMBASE, and other major databases up to March 11, 2016, were searched. Data on scores of numerical rating scale for NP and its subtypes, central and peripheral, were meta-analyzed. The certainty of evidence was classified using the Grade of Recommendations Assessment, Development, and Evaluation approach. Eleven randomized controlled trials including 1219 patients (614 in selective cannabinoid and 605 in comparator groups) were included in this SR-MA. There was variability across the studies in quality of reporting, etiology of NP, and type and dose of selective cannabinoids. Patients who received selective cannabinoids reported a significant, but clinically small, reduction in mean numerical rating scale pain scores (0-10 scale) compared with comparator groups (-0.65 points; 95% confidence interval, -1.06 to -0.23 points; P = .002, I² = 60%; Grade of Recommendations Assessment, Development, and Evaluation: weak recommendation and moderate-quality evidence). Use of selective cannabinoids was also associated with improvements in quality of life and sleep with no major adverse effects. Selective cannabinoids provide a small analgesic benefit in patients with chronic NP. There was a high degree of heterogeneity among the publications included in this SR-MA. Well-designed, large, randomized studies are required to better evaluate specific dosage, duration of intervention, and the effect of this intervention on physical and psychologic function.
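    The reported pooled estimate can be sanity-checked by back-calculating the z statistic and p-value from the mean difference and its 95% confidence interval, a standard arithmetic exercise on the abstract's own numbers:

    ```python
    import math

    # Reported pooled mean difference and 95% CI from the abstract.
    md = -0.65
    lo, hi = -1.06, -0.23

    # For a 95% CI, half-width = 1.96 * SE, so SE = (hi - lo) / (2 * 1.96).
    se = (hi - lo) / (2 * 1.96)
    z = md / se

    # Two-sided p-value from the standard normal CDF.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    print(f"z = {z:.2f}, p = {p:.4f}")
    ```

    The back-calculated p-value is consistent with the reported P = .002.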

  11. Evaluating the Global Precipitation Measurement mission with NOAA/NSSL Multi-Radar Multisensor: current status and future directions.

    NASA Astrophysics Data System (ADS)

    Kirstetter, P. E.; Petersen, W. A.; Gourley, J. J.; Kummerow, C.; Huffman, G. J.; Turk, J.; Tanelli, S.; Maggioni, V.; Anagnostou, E. N.; Hong, Y.; Schwaller, M.

    2017-12-01

Accurate characterization of uncertainties in space-borne precipitation estimates is critical for many applications, including water budget studies and prediction of natural hazards at the global scale. The GPM precipitation Level II (active and passive) and Level III (IMERG) estimates are compared to the high-quality, high-resolution NEXRAD-based precipitation estimates derived from the NOAA/NSSL Multi-Radar Multi-Sensor (MRMS) platform. A surface reference is derived from the MRMS suite of products to be accurate with known uncertainty bounds and measured at a resolution below the pixel sizes of any GPM estimate, providing great flexibility in matching to grid scales or footprints. It provides an independent and consistent reference research framework for directly evaluating GPM precipitation products across a large number of meteorological regimes as a function of resolution, accuracy and sample size. The consistency of the ground- and space-based sensors in terms of precipitation detection, typology and quantification is systematically evaluated. Satellite precipitation retrievals are further investigated in terms of precipitation distributions, systematic biases and random errors, influence of precipitation sub-pixel variability, and comparison between satellite products. Prognostic analysis directly provides feedback to algorithm developers on how to improve the satellite estimates. Specific factors for passive (e.g. surface conditions for GMI) and active (e.g. non-uniform beam filling for DPR) sensors are investigated. This cross-product characterization acts as a bridge to intercalibrate microwave measurements from the GPM constellation satellites and propagate to the combined and global precipitation estimates. Precipitation features previously used to analyze Level II satellite estimates under various precipitation processes are now introduced for Level III to test several assumptions in the IMERG algorithm. Specifically, the contribution of Level II is explicitly characterized, and a rigorous characterization is performed to migrate across scales while fully understanding the propagation of errors from Level II to Level III. Perspectives are presented to advance the use of uncertainty as an integral part of QPE for ground-based and space-borne sensors.

  12. Decoupling local mechanics from large-scale structure in modular metamaterials.

    PubMed

    Yang, Nan; Silverberg, Jesse L

    2017-04-04

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such "inverse design" is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module's design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  13. Decoupling local mechanics from large-scale structure in modular metamaterials

    NASA Astrophysics Data System (ADS)

    Yang, Nan; Silverberg, Jesse L.

    2017-04-01

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such “inverse design” is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module’s design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  14. Bio-stimuli-responsive multi-scale hyaluronic acid nanoparticles for deepened tumor penetration and enhanced therapy.

    PubMed

    Huo, Mengmeng; Li, Wenyan; Chaudhuri, Arka Sen; Fan, Yuchao; Han, Xiu; Yang, Chen; Wu, Zhenghong; Qi, Xiaole

    2017-09-01

In this study, we developed bio-stimuli-responsive multi-scale hyaluronic acid (HA) nanoparticles encapsulating polyamidoamine (PAMAM) dendrimers as subunits. These large-scale HA/PAMAM nanoparticles (197.10±3.00 nm) were stable during systemic circulation and then enriched at the tumor sites; however, they were readily degraded by the highly expressed hyaluronidase (HAase), releasing the inner PAMAM dendrimers, which regained a small scale (5.77±0.25 nm) and a positive charge. In a tumor spheroid penetration assay on A549 3D tumor spheroids for 8 h, the fluorescein isothiocyanate (FITC)-labeled multi-scale HA/PAMAM-FITC nanoparticles penetrated deeply into the spheroids upon degradation by HAase. Moreover, small-animal imaging in male nude mice bearing H22 tumors showed that HA/PAMAM-FITC nanoparticles had more prolonged systemic circulation than both PAMAM-FITC nanoparticles and free FITC. In addition, after intravenous administration in mice bearing H22 tumors, methotrexate (MTX)-loaded multi-scale HA/PAMAM-MTX nanoparticles exhibited 2.68-fold greater antitumor activity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Motivational interviewing: a systematic review and meta-analysis

    PubMed Central

    Rubak, Sune; Sandbæk, Annelli; Lauritzen, Torsten; Christensen, Bo

    2005-01-01

Background Motivational Interviewing is a well-known, scientifically tested method of counselling clients, developed by Miller and Rollnick and viewed as a useful intervention strategy in the treatment of lifestyle problems and disease. Aim To evaluate the effectiveness of motivational interviewing in different areas of disease and to identify factors shaping outcomes. Design of study A systematic review and meta-analysis of randomised controlled trials using motivational interviewing as the intervention. Method After applying the selection criteria, a systematic literature search in 16 databases produced 72 randomised controlled trials, the first of which was published in 1991. A quality assessment was made with a validated scale. A meta-analysis was performed as a generic inverse variance meta-analysis. Results Meta-analysis showed a significant effect (95% confidence interval) for motivational interviewing on combined effect estimates for body mass index, total blood cholesterol, systolic blood pressure, blood alcohol concentration and standard ethanol content, while combined effect estimates for cigarettes per day and for HbA1c were not significant. Motivational interviewing had a significant and clinically relevant effect in approximately three out of four studies, with an equal effect on physiological (72%) and psychological (75%) diseases. Psychologists and physicians obtained an effect in approximately 80% of the studies, while other healthcare providers obtained an effect in 46% of the studies. When motivational interviewing was used in brief encounters of 15 minutes, 64% of the studies showed an effect. More than one encounter with the patient ensures the effectiveness of motivational interviewing. Conclusion Motivational interviewing in a scientific setting outperforms traditional advice giving in the treatment of a broad range of behavioural problems and diseases.
Large-scale studies are now needed to prove that motivational interviewing can be implemented into daily clinical work in primary and secondary health care. PMID:15826439
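The "generic inverse variance" pooling named in the Method above has a compact closed form; here is a minimal fixed-effect sketch, with an illustrative function name and made-up study values (not numbers from the review):

```python
import math

def inverse_variance_pool(effects, ses):
    """Fixed-effect generic inverse-variance meta-analysis:
    weight each study's effect estimate by 1/SE^2 and pool."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    # 95% confidence interval for the pooled effect
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci

# Two hypothetical studies: effect estimates with standard errors
pooled, se, ci = inverse_variance_pool([0.5, 0.3], [0.1, 0.2])
```

The precision weighting is why a large, tightly estimated trial dominates the pooled estimate; random-effects variants add a between-study variance term to each weight.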

  16. Theory of wavelet-based coarse-graining hierarchies for molecular dynamics.

    PubMed

    Rinderspacher, Berend Christopher; Bardhan, Jaydeep P; Ismail, Ahmed E

    2017-07-01

We present a multiresolution approach to compressing the degrees of freedom and potentials associated with molecular dynamics, such as the bond potentials. The approach suggests a systematic way to accelerate large-scale molecular simulations with more than two levels of coarse graining, particularly for applications to polymeric materials. In particular, we derive explicit models for (arbitrarily large) linear (homo)polymers and iterative methods to compute large-scale wavelet decompositions from fragment solutions. This approach does not require explicit preparation of atomistic-to-coarse-grained mappings, but instead uses the theory of diffusion wavelets for graph Laplacians to develop system-specific mappings. Our methodology leads to a hierarchy of system-specific coarse-grained degrees of freedom that provides a conceptually clear and mathematically rigorous framework for modeling chemical systems at relevant model scales. The approach is capable of automatically generating as many coarse-grained model scales as necessary, that is, of going beyond the two scales in conventional coarse-graining strategies; moreover, the wavelet-based coarse-grained models explicitly link time and length scales. Finally, a straightforward method for the reintroduction of omitted degrees of freedom is presented, which plays a major role in maintaining model fidelity in long-time simulations and in capturing emergent behaviors.
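The graph Laplacian that the diffusion-wavelet construction starts from is easy to build for the linear-chain case the abstract singles out; this toy sketch is only the starting point (the wavelet hierarchy itself is far more involved), and the function name and bead count are illustrative:

```python
import numpy as np

def path_graph_laplacian(n):
    """Graph Laplacian L = D - A for a linear chain of n beads,
    a toy stand-in for a linear homopolymer's bond graph."""
    A = np.zeros((n, n))
    for i in range(n - 1):
        # bond between consecutive beads
        A[i, i + 1] = A[i + 1, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

# Eigenvectors of L order the chain's collective modes from smooth
# (coarse) to oscillatory (fine) -- the raw material a multiresolution
# construction organizes into scales.
L = path_graph_laplacian(3)
```

Each row of L sums to zero, and the number of zero eigenvalues counts connected components, which is why powers of a diffusion operator built from L naturally separate coarse from fine structure.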

  17. The multidimensional driving style inventory a decade later: Review of the literature and re-evaluation of the scale.

    PubMed

    Taubman-Ben-Ari, Orit; Skvirsky, Vera

    2016-08-01

The Multidimensional Driving Style Inventory (MDSI; Taubman-Ben-Ari, Mikulincer, & Gillath, 2004a), a self-report questionnaire assessing four broad driving styles, has been in use for the last ten years. During that time, numerous studies have explored the associations between the MDSI factors and sociodemographic and driving-related variables. The current paper employs two large data sets to summarize the accumulated knowledge, examining MDSI factors in samples of young drivers aged 17-21 (Study 1, n=1436) and older drivers aged 22-84 (Study 2, n=3409). Findings indicate that driving-related indicators are coherently and systematically related to the four driving styles in the expected directions, revalidating the structure of the MDSI. The results also help clarify the relationships between the driving styles and variables such as gender, ethnicity, car ownership, age, and experience, and suggest that driving styles are largely unaffected by sociodemographic characteristics, except for gender and ethnicity, and appear to represent a relatively stable and universal trait. The two studies highlight the validity and reliability of the MDSI, attesting to its practical value as a tool for purposes of research, evaluation, and intervention. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Assessing trade-offs in large marine protected areas.

    PubMed

    Davies, Tammy E; Epstein, Graham; Aguilera, Stacy E; Brooks, Cassandra M; Cox, Michael; Evans, Louisa S; Maxwell, Sara M; Nenadovic, Mateja; Ban, Natalie C

    2018-01-01

    Large marine protected areas (LMPAs) are increasingly being established and have a high profile in marine conservation. LMPAs are expected to achieve multiple objectives, and because of their size are postulated to avoid trade-offs that are common in smaller MPAs. However, evaluations across multiple outcomes are lacking. We used a systematic approach to code several social and ecological outcomes of 12 LMPAs. We found evidence of three types of trade-offs: trade-offs between different ecological resources (supply trade-offs); trade-offs between ecological resource conditions and the well-being of resource users (supply-demand trade-offs); and trade-offs between the well-being outcomes of different resource users (demand trade-offs). We also found several divergent outcomes that were attributed to influences beyond the scope of the LMPA. We suggest that despite their size, trade-offs can develop in LMPAs and should be considered in planning and design. LMPAs may improve their performance across multiple social and ecological objectives if integrated with larger-scale conservation efforts.

  19. Assessing trade-offs in large marine protected areas

    PubMed Central

    Aguilera, Stacy E.; Brooks, Cassandra M.; Cox, Michael; Evans, Louisa S.; Maxwell, Sara M.; Nenadovic, Mateja

    2018-01-01

    Large marine protected areas (LMPAs) are increasingly being established and have a high profile in marine conservation. LMPAs are expected to achieve multiple objectives, and because of their size are postulated to avoid trade-offs that are common in smaller MPAs. However, evaluations across multiple outcomes are lacking. We used a systematic approach to code several social and ecological outcomes of 12 LMPAs. We found evidence of three types of trade-offs: trade-offs between different ecological resources (supply trade-offs); trade-offs between ecological resource conditions and the well-being of resource users (supply-demand trade-offs); and trade-offs between the well-being outcomes of different resource users (demand trade-offs). We also found several divergent outcomes that were attributed to influences beyond the scope of the LMPA. We suggest that despite their size, trade-offs can develop in LMPAs and should be considered in planning and design. LMPAs may improve their performance across multiple social and ecological objectives if integrated with larger-scale conservation efforts. PMID:29668750

  20. Surgical versus injection treatment for injection-confirmed chronic sacroiliac joint pain

    PubMed Central

    Spiker, William Ryan; Lawrence, Brandon D.; Raich, Annie L.; Skelly, Andrea C.; Brodke, Darrel S.

    2012-01-01

    Study design: Systematic review. Study rationale: Chronic sacroiliac joint pain (CSJP) is a common clinical entity with highly controversial treatment options. A recent systematic review compared surgery with denervation, but the current systematic review compares outcomes of surgical intervention with therapeutic injection for the treatment of CSJP and serves as the next step for evaluating current evidence on the comparative effectiveness of treatments for non-traumatic sacroiliac joint pain. Objective or clinical question: In adult patients with injection-confirmed CSJP, does surgical treatment lead to better outcomes and fewer complications than injection therapy? Methods: A systematic review of the English-language literature was undertaken for articles published between 1970 and June 2012. Electronic databases and reference lists of key articles were searched to identify studies evaluating surgery or injection treatment for injection-confirmed CSJP. Studies involving traumatic onset or non-injection–confirmed CSJP were excluded. Two independent reviewers assessed the level of evidence quality using the grading of recommendations assessment, development and evaluation (GRADE) system, and disagreements were resolved by consensus. Results: We identified twelve articles (seven surgical and five injection treatment) meeting our inclusion criteria. Regardless of the type of treatment, most studies reported over 40% improvement in pain as measured by Visual Analog Scale or Numeric rating Scale score. Regardless of the type of treatment, most studies reported over 20% improvement in functionality. Most complications were reported in the surgical studies. Conclusion: Surgical fusion and therapeutic injections can likely provide pain relief, improve quality of life, and improve work status. The comparative effectiveness of these interventions cannot be evaluated with the current literature. PMID:23526911

  1. Multiscale modeling of lithium ion batteries: thermal aspects

    PubMed Central

    Zausch, Jochen

    2015-01-01

Summary The thermal behavior of lithium ion batteries has a huge impact on their lifetime and on the initiation of degradation processes. The development of hot spots or large local overpotentials leading, e.g., to lithium metal deposition depends on material properties as well as on the nano- and microstructure of the electrodes. In recent years a theoretical structure has emerged that opens the possibility of establishing a systematic modeling strategy from the atomistic to the continuum scale, capturing and coupling the relevant phenomena on each scale. We outline the building blocks for such a systematic approach and discuss in detail a rigorous approach for the continuum scale based on rational thermodynamics and homogenization theories. Our focus is on the development of a systematic thermodynamically consistent theory for thermal phenomena in batteries at the microstructure scale and at the cell scale. We discuss the importance of carefully defining the continuum fields for being able to compare seemingly different phenomenological theories and for obtaining rules to determine unknown parameters of the theory by experiments or lower-scale theories. The resulting continuum models for the microscopic and the cell scale are numerically solved in full 3D resolution. The complex, highly localized distributions of heat sources in a battery microstructure and the problems of mapping these localized sources onto an averaged porous electrode model are discussed by comparing detailed 3D microstructure-resolved simulations of the heat distribution with the result of the upscaled porous electrode model. It is shown that not all heat sources that exist on the microstructure scale are represented in the averaged theory, due to subtle cancellation effects of interface and bulk heat sources. Nevertheless, we find that in special cases the averaged thermal behavior can be captured very well by porous electrode theory. PMID:25977870

  2. Neurological Soft Signs in Schizophrenia: A Meta-analysis

    PubMed Central

    Chan, Raymond C. K.; Xu, Ting; Heinrichs, R. Walter; Yu, Yue; Wang, Ya

    2010-01-01

Background: Neurological soft signs (NSS) are hypothesized to be candidate endophenotypes for schizophrenia, but their prevalence and relations with clinical and demographic data are unknown. The authors undertook a quantitative synthesis (meta-analysis) of the published literature on NSS in patients with schizophrenia and healthy controls. Method: A systematic search was conducted for published articles reporting data on the prevalence of NSS in schizophrenia using standard clinical rating scales and healthy comparison groups. Meta-analyses were performed using the Comprehensive Meta-analysis software package. Effect sizes (Cohen d) indexing the difference between schizophrenic patients and healthy controls were calculated on the basis of reported statistics. Potential moderator variables evaluated included age of patient samples, level of education, sample sex proportions, medication doses, and negative and positive symptoms. Results: A total of 33 articles met inclusion criteria for the meta-analysis. A large and reliable group difference (Cohen d) indicated that, on average, a majority of patients (73%) perform outside the range of healthy subjects on aggregate NSS measures. Cognitive performance and positive and negative symptoms share 2%–10% of their variance with NSS. Conclusions: NSS occur in a majority of the schizophrenia patient population and are largely distinct from symptomatic and cognitive features of the illness. PMID:19377058
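The Cohen d effect size used in this record has a simple closed form (standardized mean difference with a pooled standard deviation); a minimal sketch, with hypothetical example scores rather than values from the meta-analysis:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d: standardized mean difference using the
    pooled standard deviation of the two groups."""
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

# Hypothetical NSS scale scores: patients vs. healthy controls
d = cohens_d(10.0, 2.0, 50, 8.0, 2.0, 50)  # d = 1.0, conventionally "large"
```

Because d is expressed in pooled-SD units, effects from studies using different NSS rating scales can be aggregated on a common metric, which is what makes the pooled group difference reported above interpretable.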

  3. Computerised cognitive training in acquired brain injury: A systematic review of outcomes using the International Classification of Functioning (ICF).

    PubMed

    Sigmundsdottir, Linda; Longley, Wendy A; Tate, Robyn L

    2016-10-01

Computerised cognitive training (CCT) is an increasingly popular intervention for people experiencing cognitive symptoms. This systematic review evaluated the evidence for CCT in adults with acquired brain injury (ABI), focusing on how the outcome measures used reflect efficacy across components of the International Classification of Functioning, Disability and Health. Database searches were conducted for studies investigating CCT to treat cognitive symptoms in adult ABI. Scientific quality was rated using the PEDro-P and RoBiNT Scales. Ninety-six studies met the inclusion criteria. Most studies examined outcomes using measures of mental functions (93/96, 97%); fewer studies included measures of activities/participation (41/96, 43%) or body structures (8/96, 8%). Only 14 studies (15%) provided Level 1 evidence (randomised controlled trials with a PEDro-P score ≥ 6/10), with these studies suggesting strong evidence for CCT improving processing speed in multiple sclerosis (MS) and moderate evidence for improving memory in MS and brain tumour populations. There is a large body of research examining the efficacy of CCT, but relatively few Level 1 studies, and evidence is largely limited to body function outcomes. The routine use of outcome measures of activities/participation would provide more meaningful evidence for the efficacy of CCT. The use of body structure outcome measures (e.g., neuroimaging) is a newly emerging area, with potential to increase understanding of mechanisms of action for CCT.

  4. Personality in 100,000 Words: A large-scale analysis of personality and word use among bloggers

    PubMed Central

    Yarkoni, Tal

    2010-01-01

    Previous studies have found systematic associations between personality and individual differences in word use. Such studies have typically focused on broad associations between major personality domains and aggregate word categories, potentially masking more specific associations. Here I report the results of a large-scale analysis of personality and word use in a large sample of blogs (N=694). The size of the dataset enabled pervasive correlations with personality to be identified for a broad range of lexical variables, including both aggregate word categories and individual English words. The results replicated category-level findings from previous offline studies, identified numerous novel associations at both a categorical and single-word level, and underscored the value of complementary approaches to the study of personality and word use. PMID:20563301

  5. Engineering large-scale agent-based systems with consensus

    NASA Technical Reports Server (NTRS)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  6. Large-scale shell-model calculation with core excitations for neutron-rich nuclei beyond 132Sn

    NASA Astrophysics Data System (ADS)

    Jin, Hua; Hasegawa, Munetake; Tazaki, Shigeru; Kaneko, Kazunari; Sun, Yang

    2011-10-01

The structure of neutron-rich nuclei with a few nucleons beyond 132Sn is investigated by means of large-scale shell-model calculations. For a considerably large model space, including neutron core excitations, a new effective interaction is determined by employing the extended pairing-plus-quadrupole model with monopole corrections. The model provides a systematic description of energy levels of A=133-135 nuclei up to high spins and reproduces available data on electromagnetic transitions. The structure of these nuclei is analyzed in detail, with emphasis on effects associated with core excitations. The results show evidence of hexadecupole correlation in addition to octupole correlation in this mass region. The suggested feature of magnetic rotation in 135Te occurs in the present shell-model calculation.

  7. Sensitivity of U.S. summer precipitation to model resolution and convective parameterizations across gray zone resolutions

    NASA Astrophysics Data System (ADS)

    Gao, Yang; Leung, L. Ruby; Zhao, Chun; Hagos, Samson

    2017-03-01

Simulating summer precipitation is a significant challenge for climate models that rely on cumulus parameterizations to represent moist convection processes. Motivated by recent advances in computing that support very high-resolution modeling, this study aims to systematically evaluate the effects of model resolution and convective parameterizations across the gray zone resolutions. Simulations using the Weather Research and Forecasting model were conducted at grid spacings of 36 km, 12 km, and 4 km for two summers over the conterminous U.S. The convection-permitting simulations at 4 km grid spacing are most skillful in reproducing the observed precipitation spatial distributions and diurnal variability. Notable differences are found between simulations with the traditional Kain-Fritsch (KF) and the scale-aware Grell-Freitas (GF) convection schemes, with the latter more skillful in capturing the nocturnal timing in the Great Plains and North American monsoon regions. The GF scheme also simulates a smoother transition from convective to large-scale precipitation as resolution increases, resulting in reduced sensitivity to model resolution compared to the KF scheme. Nonhydrostatic dynamics has a positive impact on precipitation over complex terrain even at 12 km and 36 km grid spacings. With nudging of the winds toward observations, we show that the conspicuous warm biases in the Southern Great Plains are related to precipitation biases induced by large-scale circulation biases, which are insensitive to model resolution. Overall, notable improvements in simulating summer rainfall and its diurnal variability through convection-permitting modeling and scale-aware parameterizations suggest promising avenues for improving climate simulations of water cycle processes.

  8. The Untapped Promise of Secondary Data Sets in International and Comparative Education Policy Research

    ERIC Educational Resources Information Center

Chudgar, Amita; Luschei, Thomas F.

    2016-01-01

    The objective of this commentary is to call attention to the feasibility and importance of large-scale, systematic, quantitative analysis in international and comparative education research. We contend that although many existing databases are under- or unutilized in quantitative international-comparative research, these resources present the…

  9. School Mental Health: The Impact of State and Local Capacity-Building Training

    ERIC Educational Resources Information Center

    Stephan, Sharon; Paternite, Carl; Grimm, Lindsey; Hurwitz, Laura

    2014-01-01

    Despite a growing number of collaborative partnerships between schools and community-based organizations to expand school mental health (SMH) service capacity in the United States, there have been relatively few systematic initiatives focused on key strategies for large-scale SMH capacity building with state and local education systems. Based on a…

  10. Development of Systematic Approaches for Calibration of Subsurface Transport Models Using Hard and Soft Data on System Characteristics and Behavior

    DTIC Science & Technology

    2011-02-02

who graduated during this period and will receive scholarships or fellowships for further studies in science, mathematics, engineering or technology...nature or are collected at discrete points or localized areas in the system. The qualitative data includes geology, large-scale stratigraphy and

  11. Psychiatric Illness in a Cohort of Adults with Prader-Willi Syndrome

    ERIC Educational Resources Information Center

    Sinnema, Margje; Boer, Harm; Collin, Philippe; Maaskant, Marian A.; van Roozendaal, Kees E. P.; Schrander-Stumpel, Constance T. R. M.; Curfs, Leopold M. G.

    2011-01-01

    Previous studies have suggested an association between PWS and comorbid psychiatric illness. Data on prevalence rates of psychopathology is still scarce. This paper describes a large-scale, systematic study investigating the prevalence of psychiatric illness in a Dutch adult PWS cohort. One hundred and two individuals were screened for psychiatric…

  12. Evidence of evolutionary history and selective sweeps in the genome of Meishan pig reveals its genetic and phenotypic characterization

    USDA-ARS?s Scientific Manuscript database

    Meishan is a famous Chinese indigenous pig breed known for its extremely high fecundity. To explore if Meishan has unique evolutionary process and genome characteristics differing from other pig breeds, we systematically analyzed its genetic divergence, and demographic history by large-scale reseque...

  13. Large-scale mapping of mutations affecting zebrafish development.

    PubMed

    Geisler, Robert; Rauch, Gerd-Jörg; Geiger-Rudolph, Silke; Albrecht, Andrea; van Bebber, Frauke; Berger, Andrea; Busch-Nentwich, Elisabeth; Dahm, Ralf; Dekens, Marcus P S; Dooley, Christopher; Elli, Alexandra F; Gehring, Ines; Geiger, Horst; Geisler, Maria; Glaser, Stefanie; Holley, Scott; Huber, Matthias; Kerr, Andy; Kirn, Anette; Knirsch, Martina; Konantz, Martina; Küchler, Axel M; Maderspacher, Florian; Neuhauss, Stephan C; Nicolson, Teresa; Ober, Elke A; Praeg, Elke; Ray, Russell; Rentzsch, Brit; Rick, Jens M; Rief, Eva; Schauerte, Heike E; Schepp, Carsten P; Schönberger, Ulrike; Schonthaler, Helia B; Seiler, Christoph; Sidi, Samuel; Söllner, Christian; Wehner, Anja; Weiler, Christian; Nüsslein-Volhard, Christiane

    2007-01-09

Large-scale mutagenesis screens in the zebrafish employing the mutagen ENU have isolated several hundred mutant loci that represent putative developmental control genes. In order to realize the potential of such screens, systematic genetic mapping of the mutations is necessary. Here we report on a large-scale effort to map the mutations generated in mutagenesis screening at the Max Planck Institute for Developmental Biology by genome scanning with microsatellite markers. We have selected a set of microsatellite markers and developed methods and scoring criteria suitable for efficient, high-throughput genome scanning. We have used these methods to successfully obtain a rough map position for 319 mutant loci from the Tübingen I mutagenesis screen and subsequent screening of the mutant collection. For 277 of these the corresponding gene is not yet identified. Mapping was successful for 80% of the tested loci. By comparing 21 mutation and gene positions of cloned mutations we have validated the correctness of our linkage group assignments and estimated the standard error of our map positions to be approximately 6 cM. By obtaining rough map positions for over 300 zebrafish loci with developmental phenotypes, we have generated a dataset that will be useful not only for cloning of the affected genes, but also for suggesting allelism of mutations with similar phenotypes that will be identified in future screens. Furthermore, this work validates the usefulness of our methodology for rapid, systematic and inexpensive microsatellite mapping of zebrafish mutations.

  14. Epidemiological considerations for the use of databases in transfusion research: a Scandinavian perspective.

    PubMed

    Edgren, Gustaf; Hjalgrim, Henrik

    2010-11-01

    At current safety levels, with adverse events from transfusions being relatively rare, further progress in risk reductions will require large-scale investigations. Thus, truly prospective studies may prove unfeasible and other alternatives deserve consideration. In this review, we will try to give an overview of recent and historical developments in the use of blood donation and transfusion databases in research. In addition, we will go over important methodological issues. There are at least three nationwide or near-nationwide donation/transfusion databases with the possibility for long-term follow-up of donors and recipients. During the past few years, a large number of reports have been published utilizing such data sources to investigate transfusion-associated risks. In addition, numerous clinics systematically collect and use such data on a smaller scale. Combining systematically recorded donation and transfusion data with long-term health follow-up opens up exciting opportunities for transfusion medicine research. However, the correct analysis of such data requires close attention to methodological issues, especially including the indication for transfusion and reverse causality.

  15. Large scale systematic proteomic quantification from non-metastatic to metastatic colorectal cancer

    NASA Astrophysics Data System (ADS)

    Yin, Xuefei; Zhang, Yang; Guo, Shaowen; Jin, Hong; Wang, Wenhai; Yang, Pengyuan

    2015-07-01

    A systematic, large-scale proteomic quantification of formalin-fixed, paraffin-embedded (FFPE) colorectal cancer tissues from stage I to stage IIIC was performed. 1017 proteins were identified by the label-free method, of which 338 showed quantitative changes, while 341 of the 6294 proteins quantified by the iTRAQ method showed significant expression changes. According to gene ontology (GO) annotation and ingenuity pathway analysis (IPA), the expression of migration-related proteins increased during colorectal cancer development while that of binding- and adhesion-related proteins decreased. We focused on integrin alpha 5 (ITA5) of the integrin family, which is consistent with the metastasis-related pathway. The expression level of ITA5 decreased in metastatic tissues, a result further verified by Western blotting. Two other cell-migration-related proteins, vitronectin (VTN) and actin-related protein (ARP3), were also shown to be up-regulated both by mass spectrometry (MS)-based quantification and by Western blotting. To date, our result constitutes one of the largest datasets in colorectal cancer proteomics research. Our strategy reveals a disease-driven omics pattern for metastatic colorectal cancer.
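    A minimal sketch of the kind of label-free fold-change screen described above, using hypothetical abundance values for the three proteins named in the abstract (the cutoff and numbers are illustrative, not the paper's):

```python
import math

def quantify_changes(stage_i, stage_iiic, fold_cutoff=2.0):
    """Flag proteins whose abundance changes at least `fold_cutoff`-fold
    between two disease stages (a label-free style comparison)."""
    changed = {}
    for protein, early in stage_i.items():
        late = stage_iiic.get(protein)
        if not late or not early:
            continue  # cannot form a ratio without both measurements
        log2_fc = math.log2(late / early)
        if abs(log2_fc) >= math.log2(fold_cutoff):
            changed[protein] = round(log2_fc, 2)
    return changed

# Hypothetical abundances: ITA5 falls with metastasis, VTN and ARP3 rise
stage_i = {"ITA5": 40.0, "VTN": 10.0, "ARP3": 12.0}
stage_iiic = {"ITA5": 18.0, "VTN": 25.0, "ARP3": 30.0}
print(quantify_changes(stage_i, stage_iiic))
```

    A real pipeline would of course add replicate handling and a significance test on top of the raw fold change.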

  16. Problems and Solutions in Evaluating Child Outcomes of Large-Scale Educational Programs.

    ERIC Educational Resources Information Center

    Abrams, Allan S.; And Others

    1979-01-01

    Evaluation of large-scale programs is problematical because of inherent bias in assignment of treatment and control groups, resulting in serious regression artifacts even with the use of analysis of covariance designs. Nonuniformity of program implementation across sites and classrooms is also a problem. (Author/GSK)

  17. Vertical Accuracy Evaluation of Aster GDEM2 Over a Mountainous Area Based on Uav Photogrammetry

    NASA Astrophysics Data System (ADS)

    Liang, Y.; Qu, Y.; Guo, D.; Cui, T.

    2018-05-01

    Global digital elevation models (GDEMs) provide elementary information on heights of the Earth's surface and objects on the ground, and have become an important data source for a range of applications. The vertical accuracy of a GDEM is critical for its applications. UAVs are now widely used for large-scale surveying and mapping. Compared with traditional surveying techniques, UAV photogrammetry is more convenient and more cost-effective, and produces a DEM of the survey area with high accuracy and high spatial resolution. As a result, DEMs derived from UAV photogrammetry can be used for a more detailed and accurate evaluation of GDEM products. This study investigates the vertical accuracy (in terms of elevation accuracy and systematic errors) of the ASTER GDEM Version 2 dataset over a complex terrain based on UAV photogrammetry. Experimental results show that the elevation errors of ASTER GDEM2 are normally distributed and the systematic error is quite small. The accuracy of the ASTER GDEM2 coincides well with that reported by the ASTER validation team. The accuracy in the research area is negatively correlated with both the slope of the terrain and the number of stereo observations. This study also evaluates the vertical accuracy of the up-sampled ASTER GDEM2. Experimental results show that the accuracy of the up-sampled ASTER GDEM2 data in the research area is not significantly reduced by the complexity of the terrain. The fine-grained accuracy evaluation of the ASTER GDEM2 is informative for GDEM-supported UAV photogrammetric applications.
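    The elevation-error statistics behind an evaluation like this (mean error as the systematic bias, RMSE as overall vertical accuracy) can be sketched as follows; the checkpoint heights are hypothetical:

```python
import math

def vertical_accuracy(gdem_heights, reference_heights):
    """Elevation-error statistics for a GDEM against reference heights
    (e.g. a UAV-photogrammetry DEM): the mean error approximates the
    systematic bias, RMSE captures overall vertical accuracy."""
    errors = [g - r for g, r in zip(gdem_heights, reference_heights)]
    n = len(errors)
    mean_err = sum(errors) / n                        # systematic error
    rmse = math.sqrt(sum(e * e for e in errors) / n)  # overall accuracy
    std = math.sqrt(sum((e - mean_err) ** 2 for e in errors) / n)
    return mean_err, rmse, std

# Hypothetical checkpoint heights (metres)
gdem = [812.0, 655.5, 731.2, 590.8]
ref = [810.5, 654.0, 733.0, 589.5]
print(vertical_accuracy(gdem, ref))
```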

  18. Large-scale weakly supervised object localization via latent category learning.

    PubMed

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Because of image clutter, objects are often highly ambiguous with their backgrounds, and effective algorithms for large-scale weakly supervised localization in cluttered backgrounds are lacking. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and the background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the large-scale ImageNet Large Scale Visual Recognition Challenge 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve a detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
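    A toy illustration of the category selection step: pick the latent category whose mean activation over positive images most exceeds its mean over negative images. The scoring here is a simple stand-in, not necessarily the paper's exact discrimination measure:

```python
def select_object_category(activations, labels):
    """activations: one score per latent category for each image;
    labels: True if the image contains the target class.
    Returns the index of the most discriminative latent category."""
    n_cat = len(activations[0])
    best_k, best_score = 0, float("-inf")
    for k in range(n_cat):
        pos = [a[k] for a, y in zip(activations, labels) if y]
        neg = [a[k] for a, y in zip(activations, labels) if not y]
        # discrimination score: separation of positives from negatives
        score = sum(pos) / len(pos) - sum(neg) / len(neg)
        if score > best_score:
            best_k, best_score = k, score
    return best_k

# Category 1 fires on images with the object, category 0 on background
activations = [[0.1, 0.9, 0.3], [0.2, 0.8, 0.4],
               [0.7, 0.1, 0.3], [0.6, 0.2, 0.2]]
labels = [True, True, False, False]
print(select_object_category(activations, labels))
```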

  19. Meridional flow in the solar convection zone. I. Measurements from gong data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kholikov, S.; Serebryanskiy, A.; Jackiewicz, J., E-mail: kholikov@noao.edu

    2014-04-01

    Large-scale plasma flows in the Sun's convection zone likely play a major role in solar dynamics on decadal timescales. In particular, quantifying meridional motions is a critical ingredient for understanding the solar cycle and the transport of magnetic flux. Because the signal of such features can be quite small in deep solar layers and be buried in systematics or noise, the true meridional velocity profile has remained elusive. We perform time-distance helioseismology measurements on several years' worth of Global Oscillation Network Group Doppler data. A spherical harmonic decomposition technique is applied to a subset of acoustic modes to measure travel-time differences to try to obtain signatures of meridional flows throughout the solar convection zone. Center-to-limb systematics are taken into account in an intuitive yet ad hoc manner. Travel-time differences near the surface that are consistent with a poleward flow in each hemisphere and are similar to previous work are measured. Additionally, measurements in deep layers near the base of the convection zone suggest a possible equatorward flow, as well as partial evidence of a sign change in the travel-time differences at mid-convection zone depths. This analysis on an independent data set using different measurement techniques strengthens recent conclusions that the convection zone may have multiple 'cells' of meridional flow. The results may challenge the common understanding of one large conveyor belt operating in the solar convection zone. Further work with helioseismic inversions and a careful study of systematic effects are needed before firm conclusions of these large-scale flow structures can be made.

  20. Improving International Assessment through Evaluation

    ERIC Educational Resources Information Center

    Rutkowski, David

    2018-01-01

    In this article I advocate for a new discussion in the field of international large-scale assessments; one that calls for a reexamination of international large-scale assessments (ILSAs) and their use. Expanding on the high-quality work in this special issue I focus on three inherent limitations to international large-scale assessments noted by…

  1. A multilevel layout algorithm for visualizing physical and genetic interaction networks, with emphasis on their modular organization

    PubMed Central

    2012-01-01

    Background Graph drawing is an integral part of many systems biology studies, enabling visual exploration and mining of large-scale biological networks. While a number of layout algorithms are available in popular network analysis platforms, such as Cytoscape, it remains poorly understood how well their solutions reflect the underlying biological processes that give rise to the network connectivity structure. Moreover, visualizations obtained using conventional layout algorithms, such as those based on the force-directed drawing approach, may become uninformative when applied to larger networks with dense or clustered connectivity structure. Methods We implemented a modified layout plug-in, named Multilevel Layout, which applies the conventional layout algorithms within a multilevel optimization framework to better capture the hierarchical modularity of many biological networks. Using a wide variety of real-life biological networks, we carried out a systematic evaluation of the method in comparison with other layout algorithms in Cytoscape. Results The multilevel approach provided both biologically relevant and visually pleasant layout solutions in most network types, hence complementing the layout options available in Cytoscape. In particular, it could improve the drawing of large-scale networks of yeast genetic interactions and human physical interactions. In more general terms, the biological evaluation framework developed here enables one to assess the layout solutions from any existing or future graph drawing algorithm as well as to optimize their performance for a given network type or structure. Conclusions By making use of the multilevel modular organization when visualizing biological networks, together with the biological evaluation of the layout solutions, one can generate convenient visualizations for many network biology applications. PMID:22448851
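    Multilevel layout rests on repeatedly coarsening the network before laying it out, then refining the layout level by level. A minimal sketch of one greedy edge-matching coarsening pass (illustrative only, not the plug-in's actual algorithm):

```python
def coarsen(edges):
    """One coarsening pass: greedily match each node with an unmatched
    neighbour and merge matched pairs into super-nodes. Repeating this
    yields the hierarchy a multilevel layout refines level by level."""
    merged, mapping = set(), {}
    for u, v in edges:
        if u not in merged and v not in merged and u != v:
            merged.update((u, v))
            mapping[u] = mapping[v] = (u, v)  # super-node id
    # unmatched nodes survive unchanged into the coarser graph
    nodes = {n for e in edges for n in e}
    for n in nodes - merged:
        mapping[n] = (n,)
    coarse_edges = {tuple(sorted((mapping[u], mapping[v])))
                    for u, v in edges if mapping[u] != mapping[v]}
    return mapping, coarse_edges

mapping, coarse = coarsen([("a", "b"), ("b", "c"), ("c", "d")])
print(mapping, coarse)
```

    A full multilevel scheme would lay out the coarsest graph with a force-directed method, then project positions back down and refine at each level.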

  2. Evaluation of large-scale meteorological patterns associated with temperature extremes in the NARCCAP regional climate model simulations

    NASA Astrophysics Data System (ADS)

    Loikith, Paul C.; Waliser, Duane E.; Lee, Huikyo; Neelin, J. David; Lintner, Benjamin R.; McGinnis, Seth; Mearns, Linda O.; Kim, Jinwon

    2015-12-01

    Large-scale meteorological patterns (LSMPs) associated with temperature extremes are evaluated in a suite of regional climate model (RCM) simulations contributing to the North American Regional Climate Change Assessment Program. LSMPs are characterized through composites of surface air temperature, sea level pressure, and 500 hPa geopotential height anomalies concurrent with extreme temperature days. Six of the seventeen RCM simulations are driven by boundary conditions from reanalysis while the other eleven are driven by one of four global climate models (GCMs). Four illustrative case studies are analyzed in detail. Model fidelity in LSMP spatial representation is high for cold winter extremes near Chicago. Winter warm extremes are captured by most RCMs in northern California, with some notable exceptions. Model fidelity is lower for cool summer days near Houston and extreme summer heat events in the Ohio Valley. Physical interpretation of these patterns and identification of well-simulated cases, such as for Chicago, boosts confidence in the ability of these models to simulate days in the tails of the temperature distribution. Results appear consistent with the expectation that the ability of an RCM to reproduce a realistically shaped frequency distribution for temperature, especially at the tails, is related to its fidelity in simulating LSMPs. Each ensemble member is ranked for its ability to reproduce LSMPs associated with observed warm and cold extremes, identifying systematically high performing RCMs and the GCMs that provide superior boundary forcing. The methodology developed here provides a framework for identifying regions where further process-based evaluation would improve the understanding of simulation error and help guide future model improvement and downscaling efforts.

  3. Robust phenotyping strategies for evaluation of stem non-structural carbohydrates (NSC) in rice.

    PubMed

    Wang, Diane R; Wolfrum, Edward J; Virk, Parminder; Ismail, Abdelbagi; Greenberg, Anthony J; McCouch, Susan R

    2016-11-01

    Rice plants (Oryza sativa) accumulate excess photoassimilates in the form of non-structural carbohydrates (NSCs) in their stems prior to heading that can later be mobilized to supplement photosynthate production during grain-filling. Despite longstanding interest in stem NSC for rice improvement, the dynamics of NSC accumulation, remobilization, and re-accumulation that have genetic potential for optimization have not been systematically investigated. Here we conducted three pilot experiments to lay the groundwork for large-scale diversity studies on rice stem NSC. We assessed the relationship of stem NSC components with 21 agronomic traits in large-scale, tropical yield trials using 33 breeder-nominated lines, established an appropriate experimental design for future genetic studies using a Bayesian framework to sample sub-datasets from highly replicated greenhouse data using 36 genetically diverse genotypes, and used 434 phenotypically divergent rice stem samples to develop two partial least-squares (PLS) models using near-infrared (NIR) spectra for accurate, rapid prediction of rice stem starch, sucrose, and total non-structural carbohydrates. We find evidence that stem reserves are most critical for short-duration varieties and suggest that pre-heading stem NSC is worthy of further experimentation for breeding early maturing rice. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
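    The paper calibrates full NIR spectra to stem NSC with partial least-squares; as a simple stand-in, a single-wavelength least-squares calibration conveys the idea. The absorbance and starch values below are hypothetical:

```python
def fit_calibration(x, y):
    """Least-squares line y = a + b*x: a single-feature stand-in for
    the partial least-squares calibration of NIR spectra to stem NSC."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical NIR absorbance at one wavelength vs. measured starch (%)
absorbance = [0.21, 0.34, 0.47, 0.60]
starch = [5.2, 8.1, 11.0, 13.9]
a, b = fit_calibration(absorbance, starch)
print(round(a, 2), round(b, 2))
```

    Real PLS additionally projects the many correlated wavelengths onto a few latent components before regressing, which is what makes it robust for whole-spectrum calibration.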

  4. Comparison of spatio-temporal resolution of different flow measurement techniques for marine renewable energy applications

    NASA Astrophysics Data System (ADS)

    Lyon, Vincent; Wosnik, Martin

    2013-11-01

    Marine hydrokinetic (MHK) energy conversion devices are subject to a wide range of turbulent scales, either due to upstream bathymetry, obstacles and waves, or from wakes of upstream devices in array configurations. The commonly used, robust Acoustic Doppler Current Profilers (ADCP) are well suited for long term flow measurements in the marine environment, but are limited to low sampling rates due to their operational principle. The resulting temporal and spatial resolution is insufficient to measure all turbulence scales of interest to the device, e.g., ``blade-scale turbulence.'' The present study systematically characterizes the spatial and temporal resolution of ADCP, Acoustic Doppler Velocimetry (ADV), and Particle Image Velocimetry (PIV). Measurements were conducted in a large cross section tow tank (3.7m × 2.4m) for several benchmark cases, including low and high turbulence intensity uniform flow as well as in the wake of a cylinder, to quantitatively investigate the flow scales which each of the instruments can resolve. The purpose of the study is to supply data for mathematical modeling to improve predictions from ADCP measurements, which can help lead to higher-fidelity energy resource assessment and more accurate device evaluation, including wake measurements. Supported by NSF-CBET grant 1150797.

  5. Computational study of 3-D hot-spot initiation in shocked insensitive high-explosive

    NASA Astrophysics Data System (ADS)

    Najjar, F. M.; Howard, W. M.; Fried, L. E.; Manaa, M. R.; Nichols, A., III; Levesque, G.

    2012-03-01

    High-explosive (HE) material consists of large-sized grains with micron-sized embedded impurities and pores. Under various mechanical/thermal insults, these pores collapse, generating high-temperature regions that lead to ignition. A hydrodynamic study has been performed to investigate the mechanisms of pore collapse and hot-spot initiation in TATB crystals, employing a multiphysics code, ALE3D, coupled to the chemistry module Cheetah. This computational study includes reactive dynamics. High-resolution, large-scale two-dimensional meso-scale simulations have been performed. The parameter space is systematically studied by considering various shock strengths, pore diameters and multiple pore configurations. Preliminary 3-D simulations are undertaken to quantify the 3-D dynamics.

  6. Four-center bubbled BPS solutions with a Gibbons-Hawking base

    NASA Astrophysics Data System (ADS)

    Heidmann, Pierre

    2017-10-01

    We construct four-center bubbled BPS solutions with a Gibbons-Hawking base space. We give a systematic procedure to build scaling solutions: starting from three-supertube configurations and using generalized spectral flows and gauge transformations to extend to solutions with four Gibbons-Hawking centers. This allows us to construct very large families of smooth horizonless solutions that have the same charges and angular momentum as supersymmetric black holes with a macroscopically large horizon area. Our construction reveals that all scaling solutions with four Gibbons-Hawking centers have an angular momentum at around 99% of the cosmic censorship bound. We give both an analytical and a numerical explanation for this unexpected feature.

  7. Requirements and principles for the implementation and construction of large-scale geographic information systems

    NASA Technical Reports Server (NTRS)

    Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.

    1987-01-01

    This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.

  8. The effect of large-scale model time step and multiscale coupling frequency on cloud climatology, vertical structure, and rainfall extremes in a superparameterized GCM

    DOE PAGES

    Yu, Sungduk; Pritchard, Michael S.

    2015-12-17

    The effect of global climate model (GCM) time step—which also controls how frequently global and embedded cloud resolving scales are coupled—is examined in the Superparameterized Community Atmosphere Model ver 3.0. Systematic bias reductions of time-mean shortwave cloud forcing (~10 W/m²) and longwave cloud forcing (~5 W/m²) occur as scale coupling frequency increases, but with systematically increasing rainfall variance and extremes throughout the tropics. An overarching change in the vertical structure of deep tropical convection, favoring more bottom-heavy deep convection as the global model time step is reduced, may help orchestrate these responses. The weak temperature gradient approximation is more faithfully satisfied when a high scale coupling frequency (a short global model time step) is used. These findings are distinct from the global model time step sensitivities of conventionally parameterized GCMs and have implications for understanding emergent behaviors of multiscale deep convective organization in superparameterized GCMs. Lastly, the results may also be useful for helping to tune them.

  9. The effect of large-scale model time step and multiscale coupling frequency on cloud climatology, vertical structure, and rainfall extremes in a superparameterized GCM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Sungduk; Pritchard, Michael S.

    The effect of global climate model (GCM) time step—which also controls how frequently global and embedded cloud resolving scales are coupled—is examined in the Superparameterized Community Atmosphere Model ver 3.0. Systematic bias reductions of time-mean shortwave cloud forcing (~10 W/m²) and longwave cloud forcing (~5 W/m²) occur as scale coupling frequency increases, but with systematically increasing rainfall variance and extremes throughout the tropics. An overarching change in the vertical structure of deep tropical convection, favoring more bottom-heavy deep convection as the global model time step is reduced, may help orchestrate these responses. The weak temperature gradient approximation is more faithfully satisfied when a high scale coupling frequency (a short global model time step) is used. These findings are distinct from the global model time step sensitivities of conventionally parameterized GCMs and have implications for understanding emergent behaviors of multiscale deep convective organization in superparameterized GCMs. Lastly, the results may also be useful for helping to tune them.

  10. Evidence-based evaluation of the cumulative effects of ecosystem restoration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diefenderfer, Heida L.; Johnson, Gary E.; Thom, Ronald M.

    Evaluating the cumulative effects of large-scale ecological restoration programs is necessary to inform adaptive ecosystem management and provide society with resilient and sustainable services. However, complex linkages between restorative actions and ecosystem responses make evaluations problematic. Despite long-term federal investments in restoring aquatic ecosystems, no standard evaluation method has been adopted and most programs focus on monitoring and analysis, not synthesis and evaluation. In this paper, we demonstrate a new transdisciplinary approach integrating techniques from evidence-based medicine, critical thinking, and cumulative effects assessment. Tiered hypotheses are identified using an ecosystem conceptual model. The systematic literature review at the core of evidence-based assessment becomes one of many lines of evidence assessed collectively, using critical thinking strategies and causal criteria from a cumulative effects perspective. As a demonstration, we analyzed data from 166 locations on the Columbia River and estuary representing 12 indicators of habitat and fish response to floodplain restoration actions intended to benefit threatened and endangered salmon. Synthesis of seven lines of evidence showed that hydrologic reconnection promoted macrodetritus export, prey availability, and fish access and feeding. The evidence was sufficient to infer cross-boundary, indirect, compounding and delayed cumulative effects, and suggestive of nonlinear, landscape-scale, and spatial density effects. On the basis of causal inferences regarding food web functions, we concluded that the restoration program has a cumulative beneficial effect on juvenile salmon. As a result, this evidence-based approach will enable the evaluation of restoration in complex coastal and riverine ecosystems where data have accumulated without sufficient synthesis.

  11. Evidence-based evaluation of the cumulative effects of ecosystem restoration

    DOE PAGES

    Diefenderfer, Heida L.; Johnson, Gary E.; Thom, Ronald M.; ...

    2016-03-18

    Evaluating the cumulative effects of large-scale ecological restoration programs is necessary to inform adaptive ecosystem management and provide society with resilient and sustainable services. However, complex linkages between restorative actions and ecosystem responses make evaluations problematic. Despite long-term federal investments in restoring aquatic ecosystems, no standard evaluation method has been adopted and most programs focus on monitoring and analysis, not synthesis and evaluation. In this paper, we demonstrate a new transdisciplinary approach integrating techniques from evidence-based medicine, critical thinking, and cumulative effects assessment. Tiered hypotheses are identified using an ecosystem conceptual model. The systematic literature review at the core of evidence-based assessment becomes one of many lines of evidence assessed collectively, using critical thinking strategies and causal criteria from a cumulative effects perspective. As a demonstration, we analyzed data from 166 locations on the Columbia River and estuary representing 12 indicators of habitat and fish response to floodplain restoration actions intended to benefit threatened and endangered salmon. Synthesis of seven lines of evidence showed that hydrologic reconnection promoted macrodetritus export, prey availability, and fish access and feeding. The evidence was sufficient to infer cross-boundary, indirect, compounding and delayed cumulative effects, and suggestive of nonlinear, landscape-scale, and spatial density effects. On the basis of causal inferences regarding food web functions, we concluded that the restoration program has a cumulative beneficial effect on juvenile salmon. As a result, this evidence-based approach will enable the evaluation of restoration in complex coastal and riverine ecosystems where data have accumulated without sufficient synthesis.

  12. Monitoring and evaluation of strategic change programme implementation-Lessons from a case analysis.

    PubMed

    Neumann, Jan; Robson, Andrew; Sloan, Diane

    2018-02-01

    This study considered the monitoring and evaluation of the implementation of a large-scale, domestic and global strategic change programme. It considers the prerequisites necessary to overcome the challenges and barriers that prevent systematic and effective monitoring and evaluation from taking place alongside its operationalisation. The work involves a case study based on a major industrial company from the energy sector. The change programme makes particular reference to changes in business models, business processes, and organisation structures as well as Enterprise Resource Planning infrastructure. The case study focussed on the summative evaluation of the programme post-implementation. This assessment involved 25 semi-structured interviews with employees across a range of managerial strata, capturing more than 65 roles within the change programme at both local and global levels. Data relating to their perception of evaluation effectiveness and shortcomings were analysed by means of template analysis. The study identifies responsibilities for executing an evaluation alongside various methods and tools that are appropriate, thereby focussing on the "Who" (roles, responsibility for particular activities) and "How" (methods and tools) rather than the "What" to monitor and evaluate. The findings are presented generically so that they offer new insights and transferability for practitioners involved in managing strategic change and its associated evaluation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Systematic detection and classification of earthquake clusters in Italy

    NASA Astrophysics Data System (ADS)

    Poli, P.; Ben-Zion, Y.; Zaliapin, I. V.

    2017-12-01

    We perform a systematic analysis of the spatio-temporal clustering of 2007-2017 earthquakes in Italy with magnitudes m>3. The study employs the nearest-neighbor approach of Zaliapin and Ben-Zion [2013a, 2013b] with basic data-driven parameters. The results indicate that seismicity in Italy (an extensional tectonic regime) is dominated by clustered events, with a smaller proportion of background events than in California. Evaluation of internal cluster properties allows separation of swarm-like from burst-like seismicity. This classification highlights a strong geographical coherence of cluster properties. Swarm-like seismicity is dominant in regions characterized by relatively slow deformation with possibly elevated temperature and/or fluids (e.g. Alto Tiberina, Pollino), while burst-like seismicity is observed in crystalline tectonic regions (Alps and Calabrian Arc) and in Central Italy, where moderate to large earthquakes are frequent (e.g. L'Aquila, Amatrice). To better assess the variation of seismicity style across Italy, we also perform a clustering analysis with region-specific parameters. This analysis highlights clear spatial changes in the threshold separating background and clustered seismicity, and permits better resolution of different clusters in specific geological regions. For example, a large proportion of repeaters is found in the Etna region, as expected for volcanic-induced seismicity. A similar behavior is observed in the northern Apennines, with high pore pressure associated with mantle degassing. The observed variations of earthquake properties highlight shortcomings of practices using large-scale average seismic properties, and point to connections between seismicity and local properties of the lithosphere. The observations help to improve the understanding of the physics governing the occurrence of earthquakes in different regions.
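    The nearest-neighbor approach referenced above links each event to the earlier event minimizing a space-time-magnitude proximity of the form eta = dt * r**d_f * 10**(-b*m). A compact sketch with illustrative parameter values (a real analysis derives b and the fractal dimension d_f from the data and uses properly scaled epicentral distances):

```python
import math

def nearest_neighbor_parents(catalog, b=1.0, d_f=1.6):
    """Assign each event the earlier event that minimizes the proximity
    eta = dt * r**d_f * 10**(-b * m_parent). Events are (t, x, y, m);
    b and d_f here are illustrative defaults, not fitted values."""
    parents = {}
    for j, (tj, xj, yj, mj) in enumerate(catalog):
        best, best_eta = None, float("inf")
        for i, (ti, xi, yi, mi) in enumerate(catalog):
            dt = tj - ti
            if dt <= 0:
                continue  # a parent must precede its child in time
            r = math.hypot(xj - xi, yj - yi)
            eta = dt * r ** d_f * 10 ** (-b * mi)
            if eta < best_eta:
                best, best_eta = i, eta
        parents[j] = (best, best_eta)
    return parents

# Toy catalog: (time, x, y, magnitude); event 0 is an m5 "mainshock"
catalog = [(0.0, 0.0, 0.0, 5.0), (1.0, 0.1, 0.0, 3.0), (2.0, 10.0, 10.0, 3.0)]
print(nearest_neighbor_parents(catalog))
```

    Thresholding the resulting proximities then separates clustered events (small eta, linked to a parent) from background events (large eta).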

  14. The Effects of Run-of-River Hydroelectric Power Schemes on Fish Community Composition in Temperate Streams and Rivers

    PubMed Central

    2016-01-01

    The potential environmental impacts of large-scale storage hydroelectric power (HEP) schemes have been well-documented in the literature. In Europe, awareness of these potential impacts and limited opportunities for politically-acceptable medium- to large-scale schemes, have caused attention to focus on smaller-scale HEP schemes, particularly run-of-river (ROR) schemes, to contribute to meeting renewable energy targets. Run-of-river HEP schemes are often presumed to be less environmentally damaging than large-scale storage HEP schemes. However, there is currently a lack of peer-reviewed studies on their physical and ecological impact. The aim of this article was to investigate the effects of ROR HEP schemes on communities of fish in temperate streams and rivers, using a Before-After, Control-Impact (BACI) study design. The study makes use of routine environmental surveillance data collected as part of long-term national and international monitoring programmes at 23 systematically-selected ROR HEP schemes and 23 systematically-selected paired control sites. Six area-normalised metrics of fish community composition were analysed using a linear mixed effects model (number of species, number of fish, number of Atlantic salmon—Salmo salar, number of >1 year old Atlantic salmon, number of brown trout—Salmo trutta, and number of >1 year old brown trout). The analyses showed that there was a statistically significant effect (p<0.05) of ROR HEP construction and operation on the number of species. However, no statistically significant effects were detected on the other five metrics of community composition. The implications of these findings are discussed in this article and recommendations are made for best-practice study design for future fish community impact studies. PMID:27191717

  15. The Effects of Run-of-River Hydroelectric Power Schemes on Fish Community Composition in Temperate Streams and Rivers.

    PubMed

    Bilotta, Gary S; Burnside, Niall G; Gray, Jeremy C; Orr, Harriet G

    2016-01-01

    The potential environmental impacts of large-scale storage hydroelectric power (HEP) schemes have been well-documented in the literature. In Europe, awareness of these potential impacts and limited opportunities for politically-acceptable medium- to large-scale schemes, have caused attention to focus on smaller-scale HEP schemes, particularly run-of-river (ROR) schemes, to contribute to meeting renewable energy targets. Run-of-river HEP schemes are often presumed to be less environmentally damaging than large-scale storage HEP schemes. However, there is currently a lack of peer-reviewed studies on their physical and ecological impact. The aim of this article was to investigate the effects of ROR HEP schemes on communities of fish in temperate streams and rivers, using a Before-After, Control-Impact (BACI) study design. The study makes use of routine environmental surveillance data collected as part of long-term national and international monitoring programmes at 23 systematically-selected ROR HEP schemes and 23 systematically-selected paired control sites. Six area-normalised metrics of fish community composition were analysed using a linear mixed effects model (number of species, number of fish, number of Atlantic salmon-Salmo salar, number of >1 year old Atlantic salmon, number of brown trout-Salmo trutta, and number of >1 year old brown trout). The analyses showed that there was a statistically significant effect (p<0.05) of ROR HEP construction and operation on the number of species. However, no statistically significant effects were detected on the other five metrics of community composition. The implications of these findings are discussed in this article and recommendations are made for best-practice study design for future fish community impact studies.
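    The core BACI contrast behind designs like the one above is a difference-in-differences: the before-to-after change at impact sites minus the change at paired control sites. A minimal sketch with hypothetical area-normalised species counts (the study itself fits a linear mixed effects model rather than this simple contrast):

```python
def baci_effect(control_before, control_after, impact_before, impact_after):
    """Before-After Control-Impact effect estimate: the change at the
    impact sites minus the change at the paired control sites."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(impact_after) - mean(impact_before)) \
        - (mean(control_after) - mean(control_before))

# Hypothetical species counts at paired sites, before and after scheme
# construction: a negative effect suggests fewer species at impact sites
effect = baci_effect(
    control_before=[6, 7, 6], control_after=[6, 7, 7],
    impact_before=[7, 6, 7], impact_after=[5, 5, 6],
)
print(effect)
```

    The mixed-model version adds random effects for site pairs and survey years, which is what lets the published analysis attach a p-value to this contrast.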

  16. Investigation of low-latitude hydrogen emission in terms of a two-component interstellar gas model

    NASA Technical Reports Server (NTRS)

    Baker, P. L.; Burton, W. B.

    1975-01-01

    High-resolution 21-cm hydrogen line observations at low galactic latitude are analyzed to determine the large-scale distribution of galactic hydrogen. Distribution parameters are found by model fitting, optical depth effects are computed using a two-component gas model suggested by the observations, and calculations are made for a one-component uniform spin-temperature gas model to show the systematic departures between this model and data obtained by incorrect treatment of the optical depth effects. Synthetic 21-cm line profiles are computed from the two-component model, and the large-scale trends of the observed emission profiles are reproduced together with the magnitude of the small-scale emission irregularities. Values are determined for the thickness of the galactic hydrogen disk between half density points, the total observed neutral hydrogen mass of the galaxy, and the central number density of the intercloud hydrogen atoms. It is shown that typical hydrogen clouds must be between 1 and 13 pc in diameter and that optical thinness exists on the large scale despite the presence of optically thick gas.

  17. Just enough inflation: power spectrum modifications at large scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cicoli, Michele; Downes, Sean; Dutta, Bhaskar

    2014-12-01

    We show that models of 'just enough' inflation, where the slow-roll evolution lasted only 50-60 e-foldings, feature modifications of the CMB power spectrum at large angular scales. We perform a systematic analytic analysis in the limit of a sudden transition between any possible non-slow-roll background evolution and the final stage of slow-roll inflation. We find a high degree of universality since most common backgrounds like fast-roll evolution, matter or radiation-dominance give rise to a power loss at large angular scales and a peak together with an oscillatory behaviour at scales around the value of the Hubble parameter at the beginning of slow-roll inflation. Depending on the value of the equation of state parameter, different pre-inflationary epochs lead instead to an enhancement of power at low ℓ, and so seem disfavoured by recent observational hints for a lack of CMB power at ℓ ≲ 40. We also comment on the importance of initial conditions and the possibility to have multiple pre-inflationary stages.

  18. Systematic Evaluation and Uncertainty Analysis of the Refuse-Derived Fuel Process in Taiwan.

    PubMed

    Chang, Ying-Hsi; Chang, Ni-Bin; Chen, W C

    1998-06-01

    In the last few years, Taiwan has set a bold agenda in solid waste recycling and incineration programs. Not only were recycling activities and incineration projects promoted by government agencies, but related laws and regulations were continuously promulgated by the Legislative Yuan. The solid waste presorting process to be considered prior to the existing incineration facilities has received wide attention. This paper presents a thorough evaluation of the first refuse-derived fuel pilot process from both quantitative and qualitative aspects. The process is to be installed and integrated with a large-scale municipal incinerator. This pilot process, developed by an engineering firm in Tainan County, consists of standard unit operations of shredding, magnetic separation, trommel screening, and air classification. A series of sampling and analyses was initiated in order to characterize its potential in the solid waste management system. The probabilistic modeling of the various types of waste properties derived in this analysis may provide a basic understanding of system reliability.

  19. Audience reactions and receptivity to HIV prevention message concepts for people living with HIV.

    PubMed

    Uhrig, Jennifer D; Bann, Carla M; Wasserman, Jill; Guenther-Grey, Carolyn; Eroğlu, Doğan

    2010-04-01

    This study measured audience reactions and receptivity to five draft HIV prevention messages developed for people living with HIV (PLWH) to inform future HIV message choice and audience targeting decisions. Our premise was that message concepts that receive wide audience appeal constitute a strong starting point for designing future HIV prevention messages, program activities, and health communication and marketing campaigns for PLWH. The majority of participants indicated agreement with evaluative statements that expressed favorable attitudes toward all five of the message concepts we evaluated. Participants gave the lowest approval to the message promoting sero-sorting. Sociodemographic characteristics played less of a role in predicting differences in message perceptions than attitudes, beliefs and sexual behavior. The general appeal for these messages is encouraging given that messages were expressed in plain text without the support of other creative elements that are commonly used in message execution. These results confirm the utility of systematic efforts to generate and screen message concepts prior to large-scale testing.

  20. Large-scale machine learning and evaluation platform for real-time traffic surveillance

    NASA Astrophysics Data System (ADS)

    Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel

    2016-09-01

    In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale, high-quality datasets is challenging. Typically, these datasets have limited diversity; they do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for the application of automatic traffic measurements and classification. The proposed positive and negative mining process addresses the quality of crowd-sourced ground truth data through machine learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud computing framework to handle the data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. The trained real-time vehicle detector achieves an accuracy of at least 95% for half and about 78% for 19/20 of the time when tested on ~7,500,000 video frames. At the end of 2016, the dataset is expected to have over 1 billion annotated video frames.
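The building block behind such detectors is the Haar-like feature evaluated on an integral image, which turns any rectangle sum into a constant-time lookup. Here is a minimal sketch under that standard construction; the frame values and feature geometry are made up, and the AdaBoost training stage is not shown.

```python
# Sketch of Haar-like feature evaluation via an integral image: any
# rectangle sum becomes an O(1) lookup, which is what keeps millions of
# candidate features cheap to compute. Illustrative values only.

def integral_image(img):
    """ii[y][x] = sum of img over rows < y and cols < x (zero-padded)."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the rectangle with top-left (x, y), size w x h."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def haar_two_rect(ii, x, y, w, h):
    """Left-minus-right two-rectangle feature (vertical edge response)."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)

# Toy 4x4 "frame": bright left half, dark right half -> strong edge response.
frame = [[9, 9, 1, 1]] * 4
ii = integral_image(frame)
print(haar_two_rect(ii, 0, 0, 4, 4))  # 8*9 - 8*1 = 64
```

AdaBoost then selects and weights a small subset of such features as weak classifiers; that selection loop is omitted here.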

  1. Evaluating stream trout habitat on large-scale aerial color photographs

    Treesearch

    Wallace J. Greentree; Robert C. Aldrich

    1976-01-01

    Large-scale aerial color photographs were used to evaluate trout habitat by studying stream and streambank conditions. Ninety-two percent of these conditions could be identified correctly on the color photographs. Color photographs taken 1 year apart showed that rehabilitation efforts resulted in stream vegetation changes. Water depth was correlated with film density:...

  2. Manufacturing Process Developments for Regeneratively-Cooled Channel Wall Rocket Nozzles

    NASA Technical Reports Server (NTRS)

    Gradl, Paul; Brandsmeier, Will

    2016-01-01

    Regeneratively cooled channel wall nozzles incorporate a series of integral coolant channels to contain the coolant, maintain adequate wall temperatures, and expand the hot gas that provides engine thrust and specific impulse. NASA has been evaluating manufacturing techniques targeting large-scale channel wall nozzles to support affordability of current and future liquid rocket engine nozzles and thrust chamber assemblies. The development of these large-scale manufacturing techniques focuses on liner formation, channel slotting with advanced abrasive water-jet milling techniques, and closeout of the coolant channels to replace or augment other cost reduction techniques being evaluated for nozzles. NASA is developing a series of channel closeout techniques, including large-scale additive manufacturing laser deposition and explosively bonded closeouts. A series of subscale nozzles was completed to evaluate these processes. Fabrication of mechanical test and metallography samples, in addition to subscale hardware, has focused on Inconel 625, 300-series stainless steel, and aluminum alloys, as well as other candidate materials. Evaluations of these techniques are demonstrating the potential for significant cost reductions for large-scale nozzles and chambers. Hot-fire testing using these techniques is planned for the future.

  3. Thermodynamic scaling of the shear viscosity of Mie n-6 fluids and their binary mixtures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delage-Santacreu, Stephanie; Galliero, Guillaume, E-mail: guillaume.galliero@univ-pau.fr; Hoang, Hai

    2015-05-07

    In this work, we have evaluated the applicability of the so-called thermodynamic scaling and the isomorph frame to describe the shear viscosity of Mie n-6 fluids of varying repulsive exponents (n = 8, 12, 18, 24, and 36). Furthermore, the effectiveness of the thermodynamic scaling to deal with binary mixtures of Mie n-6 fluids has been explored as well. To generate the viscosity database of these fluids, extensive non-equilibrium molecular dynamics simulations have been performed for various thermodynamic conditions. Then, a systematic approach has been used to determine the gamma exponent value (γ) characteristic of the thermodynamic scaling approach for each system. In addition, the applicability of the isomorph theory with a density dependent gamma has been confirmed in pure fluids. In both pure fluids and mixtures, it has been found that the thermodynamic scaling with a constant gamma is sufficient to correlate the viscosity data on a large range of thermodynamic conditions covering liquid and supercritical states as long as the density is not too high. Interestingly, it has been obtained that, in pure fluids, the value of γ is directly proportional to the repulsive exponent of the Mie potential. Finally, it has been found that the value of γ in mixtures can be deduced from those of the pure component using a simple logarithmic mixing rule.

  4. Evaluating the performance of different predictor strategies in regression-based downscaling with a focus on glacierized mountain environments

    NASA Astrophysics Data System (ADS)

    Hofer, Marlis; Nemec, Johanna

    2016-04-01

    This study presents first steps towards verifying the hypothesis that uncertainty in global and regional glacier mass simulations can be reduced considerably by reducing the uncertainty in the high-resolution atmospheric input data. To this aim, we systematically explore the potential of different predictor strategies for improving the performance of regression-based downscaling approaches. The investigated local-scale target variables are precipitation, air temperature, wind speed, relative humidity and global radiation, all at a daily time scale. Observations of these target variables are assessed from three sites in geo-environmentally and climatologically very distinct settings, all within highly complex topography and in the close proximity to mountain glaciers: (1) the Vernagtbach station in the Northern European Alps (VERNAGT), (2) the Artesonraju measuring site in the tropical South American Andes (ARTESON), and (3) the Brewster measuring site in the Southern Alps of New Zealand (BREWSTER). As the large-scale predictors, ERA interim reanalysis data are used. In the applied downscaling model training and evaluation procedures, particular emphasis is put on appropriately accounting for the pitfalls of limited and/or patchy observation records that are usually the only (if at all) available data from the glacierized mountain sites. Generalized linear models and beta regression are investigated as alternatives to ordinary least squares regression for the non-Gaussian target variables. 
By analyzing results for the three different sites, five predictands and different times of the year, we look for systematic improvements in the downscaling models' skill specifically obtained by (i) using predictor data at the optimum scale rather than the minimum scale of the reanalysis data, (ii) identifying the optimum predictor allocation in the vertical, and (iii) considering multiple (variable, level and/or grid point) predictor options combined with state-of-the-art empirical feature selection tools. First results show that, for air temperature in particular, downscaling models based on direct predictor selection show skill comparable to that of models based on multiple predictors. For all other target variables, however, multiple-predictor approaches can considerably outperform models based on single predictors. Including multiple variable types emerges as the most promising predictor option (in particular for wind speed at all sites), even if the same predictor set is used across the different cases.
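The simplest member of the regression family discussed above can be sketched in a few lines: ordinary least squares linking a single large-scale predictor to a local target variable. All numbers are hypothetical; the study's multi-predictor, GLM and beta-regression variants are not shown.

```python
# Toy sketch of single-predictor, regression-based statistical downscaling:
# fit local daily air temperature as a linear function of a large-scale
# (reanalysis-style) predictor over a training period, then predict.
# Synthetic numbers only.

def ols_fit(x, y):
    """Ordinary least squares slope and intercept for y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical training data: reanalysis grid-cell temperature (x) vs
# station temperature (y); the station sits higher, hence colder.
grid_T = [2.0, 5.0, 8.0, 11.0, 14.0]
station_T = [-3.0, -0.5, 2.0, 4.5, 7.0]
a, b = ols_fit(grid_T, station_T)
print(round(a, 2), round(b, 2))   # intercept and slope of the transfer function
print(round(a + b * 10.0, 2))     # prediction for a new day with grid T = 10
```

Non-Gaussian targets such as precipitation occurrence or relative humidity are where the abstract's GLM and beta-regression alternatives come in; the fitted transfer function above is only valid for roughly Gaussian targets like temperature.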

  5. SQDFT: Spectral Quadrature method for large-scale parallel O(N) Kohn–Sham calculations at high temperature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suryanarayana, Phanish; Pratapa, Phanisri P.; Sharma, Abhiraj

    We present SQDFT: a large-scale parallel implementation of the Spectral Quadrature (SQ) method for O(N) Kohn–Sham Density Functional Theory (DFT) calculations at high temperature. Specifically, we develop an efficient and scalable finite-difference implementation of the infinite-cell Clenshaw–Curtis SQ approach, in which results for the infinite crystal are obtained by expressing quantities of interest as bilinear forms or sums of bilinear forms, that are then approximated by spatially localized Clenshaw–Curtis quadrature rules. We demonstrate the accuracy of SQDFT by showing systematic convergence of energies and atomic forces with respect to SQ parameters to reference diagonalization results, and convergence with discretization to established planewave results, for both metallic and insulating systems. We further demonstrate that SQDFT achieves excellent strong and weak parallel scaling on computer systems consisting of tens of thousands of processors, with near perfect O(N) scaling with system size and wall times as low as a few seconds per self-consistent field iteration. Finally, we verify the accuracy of SQDFT in large-scale quantum molecular dynamics simulations of aluminum at high temperature.
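The basic rule underlying the spectral quadrature approximations mentioned above is Clenshaw–Curtis quadrature on [-1, 1]. A minimal sketch of the standard nodes-and-weights construction follows; the paper's spatially localized application to bilinear forms is not attempted here.

```python
# Minimal Clenshaw-Curtis quadrature on [-1, 1]: nodes at Chebyshev points
# x_k = cos(pi*k/n) with the classical closed-form weights (n even).
import math

def clenshaw_curtis(n):
    """Return nodes and weights for the (n+1)-point rule, n even."""
    nodes = [math.cos(math.pi * k / n) for k in range(n + 1)]
    weights = []
    for k in range(n + 1):
        c = 1.0 if k in (0, n) else 2.0
        s = sum((1.0 if j == n // 2 else 2.0) / (4 * j * j - 1)
                * math.cos(2 * math.pi * j * k / n)
                for j in range(1, n // 2 + 1))
        weights.append(c / n * (1.0 - s))
    return nodes, weights

nodes, weights = clenshaw_curtis(8)
integral = sum(w * math.cos(x) for x, w in zip(nodes, weights))
print(round(integral, 6))  # close to the exact value 2*sin(1) = 1.682942
```

For n = 2 this reduces to Simpson's rule (weights 1/3, 4/3, 1/3), and for smooth integrands the rule converges spectrally fast, which is what makes it attractive for the expansions used in SQ-type methods.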

  6. Time-sliced perturbation theory for large scale structure I: general formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blas, Diego; Garny, Mathias; Sibiryakov, Sergey

    2016-07-01

    We present a new analytic approach to describe large scale structure formation in the mildly non-linear regime. The central object of the method is the time-dependent probability distribution function generating correlators of the cosmological observables at a given moment of time. Expanding the distribution function around the Gaussian weight we formulate a perturbative technique to calculate non-linear corrections to cosmological correlators, similar to the diagrammatic expansion in a three-dimensional Euclidean quantum field theory, with time playing the role of an external parameter. For the physically relevant case of cold dark matter in an Einstein-de Sitter universe, the time evolution of the distribution function can be found exactly and is encapsulated by a time-dependent coupling constant controlling the perturbative expansion. We show that all building blocks of the expansion are free from spurious infrared enhanced contributions that plague the standard cosmological perturbation theory. This paves the way towards the systematic resummation of infrared effects in large scale structure formation. We also argue that the approach proposed here provides a natural framework to account for the influence of short-scale dynamics on larger scales along the lines of effective field theory.

  7. Cosmological measurements with general relativistic galaxy correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raccanelli, Alvise; Montanari, Francesco; Durrer, Ruth

    We investigate the cosmological dependence and the constraining power of large-scale galaxy correlations, including all redshift-distortions, wide-angle, lensing and gravitational potential effects on linear scales. We analyze the cosmological information present in the lensing convergence and in the gravitational potential terms describing the so-called 'relativistic effects', and we find that, while smaller than the information contained in intrinsic galaxy clustering, it is not negligible. We investigate how neglecting them biases cosmological measurements performed by future spectroscopic and photometric large-scale surveys such as SKA and Euclid. We perform a Fisher analysis using the CLASS code, modified to include scale-dependent galaxy bias and redshift-dependent magnification and evolution bias. Our results show that neglecting relativistic terms, especially lensing convergence, introduces an error in the forecasted precision in measuring cosmological parameters of the order of a few tens of percent, in particular when measuring the matter content of the Universe and primordial non-Gaussianity parameters. The analysis suggests a possible substantial systematic error in cosmological parameter constraints. Therefore, we argue that radial correlations and integrated relativistic terms need to be taken into account when forecasting the constraining power of future large-scale number counts of galaxy surveys.

  8. Large-scale optimization-based non-negative computational framework for diffusion equations: Parallel implementation and performance studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.

    It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, used for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on the state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.

  9. SQDFT: Spectral Quadrature method for large-scale parallel O(N) Kohn–Sham calculations at high temperature

    DOE PAGES

    Suryanarayana, Phanish; Pratapa, Phanisri P.; Sharma, Abhiraj; ...

    2017-12-07

    We present SQDFT: a large-scale parallel implementation of the Spectral Quadrature (SQ) method for O(N) Kohn–Sham Density Functional Theory (DFT) calculations at high temperature. Specifically, we develop an efficient and scalable finite-difference implementation of the infinite-cell Clenshaw–Curtis SQ approach, in which results for the infinite crystal are obtained by expressing quantities of interest as bilinear forms or sums of bilinear forms, that are then approximated by spatially localized Clenshaw–Curtis quadrature rules. We demonstrate the accuracy of SQDFT by showing systematic convergence of energies and atomic forces with respect to SQ parameters to reference diagonalization results, and convergence with discretization to established planewave results, for both metallic and insulating systems. We further demonstrate that SQDFT achieves excellent strong and weak parallel scaling on computer systems consisting of tens of thousands of processors, with near perfect O(N) scaling with system size and wall times as low as a few seconds per self-consistent field iteration. Finally, we verify the accuracy of SQDFT in large-scale quantum molecular dynamics simulations of aluminum at high temperature.

  10. Large-scale optimization-based non-negative computational framework for diffusion equations: Parallel implementation and performance studies

    DOE PAGES

    Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.

    2016-07-26

    It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, used for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on the state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.

  11. Large-scale imputation of epigenomic datasets for systematic annotation of diverse human tissues.

    PubMed

    Ernst, Jason; Kellis, Manolis

    2015-04-01

    With hundreds of epigenomic maps, the opportunity arises to exploit the correlated nature of epigenetic signals, across both marks and samples, for large-scale prediction of additional datasets. Here, we undertake epigenome imputation by leveraging such correlations through an ensemble of regression trees. We impute 4,315 high-resolution signal maps, of which 26% are also experimentally observed. Imputed signal tracks show overall similarity to observed signals and surpass experimental datasets in consistency, recovery of gene annotations and enrichment for disease-associated variants. We use the imputed data to detect low-quality experimental datasets, to find genomic sites with unexpected epigenomic signals, to define high-priority marks for new experiments and to delineate chromatin states in 127 reference epigenomes spanning diverse tissues and cell types. Our imputed datasets provide the most comprehensive human regulatory region annotation to date, and our approach and the ChromImpute software constitute a useful complement to large-scale experimental mapping of epigenomic information.

  12. Method for revealing biases in precision mass measurements

    NASA Astrophysics Data System (ADS)

    Vabson, V.; Vendt, R.; Kübarsepp, T.; Noorma, M.

    2013-02-01

    A practical method for the quantification of systematic errors of large-scale automatic comparators is presented. This method is based on a comparison of the performance of two different comparators. First, the differences of 16 equal partial loads of 1 kg are measured with a high-resolution mass comparator featuring insignificant bias and 1 kg maximum load. At the second stage, a large-scale comparator is tested by using combined loads with known mass differences. Comparing the different results, the biases of any comparator can be easily revealed. These large-scale comparator biases are determined over a 16-month period, and for the 1 kg loads, a typical pattern of biases in the range of ±0.4 mg is observed. The temperature differences recorded inside the comparator concurrently with mass measurements are found to remain within a range of ±30 mK, which obviously has a minor effect on the detected biases. Seasonal variations imply that the biases likely arise mainly due to the functioning of the environmental control at the measurement location.
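The arithmetic at the heart of the method above is simple: partial loads whose mutual differences are known from the high-resolution reference comparator are combined into larger test loads with a computable mass difference, and any deviation of the large-scale comparator's reading from that computed value reveals its bias. A small sketch with invented numbers:

```python
# Sketch of the bias-revealing idea from the abstract: combined loads have
# a known mass difference (from reference-comparator calibrations of each
# partial load), so the large-scale comparator's reading can be checked
# against it. All numbers below are made up (deviations in mg).

def known_difference(devs_a, devs_b):
    """Computed difference between two combined loads, from calibrated
    deviations of the individual partial loads."""
    return sum(devs_a) - sum(devs_b)

# Calibrated deviations (mg) of eight partial loads per combined load.
group_a = [0.05, -0.12, 0.20, 0.01, -0.07, 0.11, -0.03, 0.09]
group_b = [0.10, -0.02, 0.04, -0.15, 0.06, 0.02, -0.08, 0.12]

expected = known_difference(group_a, group_b)  # 0.15 mg
reading = 0.43                                 # large-scale comparator (mg)
bias = reading - expected
print(round(expected, 2), round(bias, 2))
```

Repeating this over many load combinations and months, as the paper does, turns single-point checks like this into the bias pattern (here within ±0.4 mg) reported in the abstract.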

  13. A systematic review and meta-analysis of mesh versus suture cruroplasty in laparoscopic large hiatal hernia repair

    PubMed Central

    Tam, Vernissia; Winger, Daniel G.; Nason, Katie S.

    2015-01-01

    Background: Equipoise exists regarding whether mesh cruroplasty during laparoscopic large hiatal hernia repair improves symptomatic outcomes compared to suture repair. Data Source: A systematic literature review (MEDLINE and EMBASE) identified 13 studies (1194 patients; 521 suture and 673 mesh) comparing mesh versus suture cruroplasty during laparoscopic repair of large hiatal hernia. We abstracted data regarding symptom assessment, objective recurrence, and reoperation, and performed meta-analysis. Conclusions: The majority of studies reported significant symptom improvement. Data were insufficient to evaluate symptomatic versus asymptomatic recurrence. Time to evaluation was skewed toward longer follow-up after suture cruroplasty. Odds of recurrence (OR 0.51, 95% CI 0.30–0.87; overall p=0.014) but not need for reoperation (OR 0.42, 95% CI 0.13–1.37; overall p=0.149) were lower after mesh cruroplasty. The quality of evidence supporting routine use of mesh cruroplasty is low. Mesh should be used at surgeon discretion until additional studies evaluating symptomatic outcomes, quality of life and long-term recurrence are available. PMID:26520872
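Pooled estimates like "OR 0.51 (95% CI 0.30-0.87)" typically come from inverse-variance weighting on the log-odds-ratio scale. The following is a hedged sketch of fixed-effect pooling with invented study data; the paper's actual model (including any random-effects weighting) is not reproduced.

```python
# Inverse-variance (fixed-effect) pooling of odds ratios on the log scale.
# Each study's SE of ln(OR) is recovered from its 95% CI width; weights
# are 1/SE^2. Study ORs and CIs below are hypothetical.
import math

def pool_or(studies):
    """Pooled OR and 95% CI from per-study (or, ci_low, ci_high) tuples."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se**2
        num += w * math.log(or_)
        den += w
    log_pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se_pooled),
            math.exp(log_pooled + 1.96 * se_pooled))

# Three hypothetical mesh-vs-suture recurrence studies.
pooled, lo, hi = pool_or([(0.45, 0.20, 1.01), (0.60, 0.25, 1.44),
                          (0.50, 0.22, 1.14)])
print(round(pooled, 2), round(lo, 2), round(hi, 2))
```

Note how three individually non-significant studies (each CI crosses 1) can pool to a significant result, which is the usual motivation for meta-analysis.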

  14. Efficacy and safety of miconazole for oral candidiasis: a systematic review and meta-analysis.

    PubMed

    Zhang, L-W; Fu, J-Y; Hua, H; Yan, Z-M

    2016-04-01

    The objective of this study is to assess the efficacy and safety of miconazole for treating oral candidiasis. Twelve electronic databases were searched for randomized controlled trials evaluating treatments for oral candidiasis and complemented by hand searching. The clinical and mycological outcomes, as well as adverse effects, were set as the primary outcome criteria. Seventeen trials were included in this review. Most studies were considered to have a high or moderate level of bias. Miconazole was more effective than nystatin for thrush. For HIV-infected patients, there was no significant difference in the efficacy between miconazole and other antifungals. For denture wearers, microwave therapy was significantly better than miconazole. No significant difference was found in the safety evaluation between miconazole and other treatments. The relapse rate of miconazole oral gel may be lower than that of other formulations. This systematic review and meta-analysis indicated that miconazole may be an optional choice for thrush. Microwave therapy could be an effective adjunct treatment for denture stomatitis. Miconazole oral gel may be more effective than other formulations with regard to long-term results. However, future studies that are adequately powered, large-scale, and well-designed are needed to provide higher-quality evidence for the management of oral candidiasis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Estimating the coverage of mental health programmes: a systematic review.

    PubMed

    De Silva, Mary J; Lee, Lucy; Fuhr, Daniela C; Rathod, Sujit; Chisholm, Dan; Schellenberg, Joanna; Patel, Vikram

    2014-04-01

    The large treatment gap for people suffering from mental disorders has led to initiatives to scale up mental health services. In order to track progress, estimates of programme coverage, and changes in coverage over time, are needed. We conducted a systematic review of mental health programme evaluations that assess coverage, measured either as the proportion of the target population in contact with services (contact coverage) or as the proportion of the target population who receive appropriate and effective care (effective coverage). We performed a search of electronic databases and grey literature up to March 2013 and contacted experts in the field. Methods to estimate the numerator (service utilization) and the denominator (target population) were reviewed to explore methods which could be used in programme evaluations. We identified 15 735 unique records, of which only seven met the inclusion criteria. All studies reported contact coverage. No study explicitly measured effective coverage, but it was possible to estimate this for one study. In six studies the numerator of coverage, service utilization, was estimated using routine clinical information, whereas one study used a national community survey. The methods for estimating the denominator, the population in need of services, were more varied and included national prevalence surveys, case registers, and estimates from the literature. Very few coverage estimates are available. Coverage could be estimated at low cost by combining routine programme data with population prevalence estimates from national surveys.
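The contact-coverage calculation the review describes reduces to one ratio: service users in contact (numerator, from routine data) over the estimated in-need population (denominator, from a prevalence survey). A minimal sketch with hypothetical figures:

```python
# Minimal sketch of contact coverage as discussed above: numerator from
# routine service-utilization data, denominator from population size times
# a survey-based prevalence estimate. All figures are hypothetical.

def contact_coverage(people_in_contact, population, prevalence):
    """Share of the estimated in-need population in contact with services."""
    target = population * prevalence
    return people_in_contact / target

# District of 500,000 people; assumed depression prevalence of 4%;
# 3,000 unique service users recorded by the programme in the same period.
cov = contact_coverage(3000, 500_000, 0.04)
print(f"{cov:.1%}")  # 15.0%
```

Effective coverage would shrink the numerator further to only those users receiving appropriate and effective care, which is why no reviewed study could measure it directly.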

  16. Altering micro-environments to change population health behaviour: towards an evidence base for choice architecture interventions.

    PubMed

    Hollands, Gareth J; Shemilt, Ian; Marteau, Theresa M; Jebb, Susan A; Kelly, Michael P; Nakamura, Ryota; Suhrcke, Marc; Ogilvie, David

    2013-12-21

    The idea that behaviour can be influenced at population level by altering the environments within which people make choices (choice architecture) has gained traction in policy circles. However, empirical evidence to support this idea is limited, especially its application to changing health behaviour. We propose an evidence-based definition and typology of choice architecture interventions that have been implemented within small-scale micro-environments and evaluated for their effects on four key sets of health behaviours: diet, physical activity, alcohol and tobacco use. We argue that the limitations of the evidence base are due not simply to an absence of evidence, but also to a prior lack of definitional and conceptual clarity concerning applications of choice architecture to public health intervention. This has hampered the potential for systematic assessment of existing evidence. By seeking to address this issue, we demonstrate how our definition and typology have enabled systematic identification and preliminary mapping of a large body of available evidence for the effects of choice architecture interventions. We discuss key implications for further primary research, evidence synthesis and conceptual development to support the design and evaluation of such interventions. This conceptual groundwork provides a foundation for future research to investigate the effectiveness of choice architecture interventions within micro-environments for changing health behaviour. The approach we used may also serve as a template for mapping other under-explored fields of enquiry.

  17. A systematic evaluation of websites offering information on chronic kidney disease.

    PubMed

    Lutz, Erin R; Costello, Kaitlin L; Jo, Minjeong; Gilet, Constance A; Hawley, Jennifer M; Bridgman, Jessica C; Song, Mi-Kyung

    2014-01-01

    In this study, we described the content and characteristics of 40 non-proprietary websites offering information about chronic kidney disease (CKD) and evaluated their information quality using the DISCERN scale and their readability using the Flesch Reading Ease and Flesch-Kincaid grade level measures. The areas in which the websites scored lowest on the DISCERN scale were whether the website discussed knowledge gaps, presented balanced information, and was clear about the information source. Websites rated as higher quality on the DISCERN scale were more difficult to read. The quality and readability of many CKD websites remain inadequate for them to serve as meaningful educational resources for patients who want to learn more about CKD and treatment options.
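
    The two readability measures used in this record are standard published formulas based on word, sentence, and syllable counts. A minimal sketch (the counts passed in any example call are illustrative, not taken from the study):

```python
def flesch_reading_ease(words, sentences, syllables):
    # Standard Flesch Reading Ease formula; higher scores = easier text.
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    # Standard Flesch-Kincaid grade level, expressed as a US school grade.
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
```

    For a passage of 100 words, 5 sentences, and 150 syllables, these give a Reading Ease of about 59.6 and a grade level of about 9.9, consistent with the finding that higher-quality sites tend toward harder (lower-ease, higher-grade) text.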

  18. Risk of malignancy in ankylosing spondylitis: a systematic review and meta-analysis.

    PubMed

    Deng, Chuiwen; Li, Wenli; Fei, Yunyun; Li, Yongzhe; Zhang, Fengchun

    2016-08-18

    Current knowledge about the overall and site-specific risk of malignancy associated with ankylosing spondylitis (AS) is inconsistent. We conducted a systematic review and meta-analysis to address this knowledge gap. Five databases (PubMed, EMBASE, Web of Science, the Cochrane library and the virtual health library) were systematically searched. A manual search of publications within the last 2 years in key journals in the field (Annals of the Rheumatic Diseases, Rheumatology and Arthritis & Rheumatology) was also performed. STATA 11.2 software was used to conduct the meta-analysis. After screening, twenty-three studies, of different designs, were eligible for meta-analysis. AS is associated with a 14% (pooled RR 1.14; 95% CI 1.03-1.25) increase in the overall risk for malignancy. Compared to controls, patients with AS are at a specific increased risk for malignancy of the digestive system (pooled RR 1.20; 95% CI 1.01 to 1.42), multiple myeloma (pooled RR 1.92; 95% CI 1.37 to 3.69) and lymphoma (pooled RR 1.32; 95% CI 1.11 to 1.57). On subgroup analysis, evidence from high quality cohort studies indicated that AS patients from Asia are at highest risk for malignancy overall. Confirmation of findings from large-scale longitudinal studies is needed to identify specific risk factors and to evaluate treatment effects.
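
    The pooled RRs here were produced in STATA; as a rough illustration of the underlying inverse-variance method, the sketch below pools relative risks on the log scale, recovering each study's standard error from the width of its reported 95% CI. This is a fixed-effect sketch under simplifying assumptions, not a reproduction of the study's analysis, and the inputs are illustrative:

```python
import math

def pool_rr(studies):
    """Fixed-effect inverse-variance pooling of relative risks.

    Each input is (RR, CI_lower, CI_upper); since a 95% CI on the log
    scale is log(RR) +/- 1.96*SE, the SE is (log(hi) - log(lo)) / 3.92.
    """
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2            # inverse-variance weight
        num += w * math.log(rr)
        den += w
    log_pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    ci = (math.exp(log_pooled - 1.96 * se_pooled),
          math.exp(log_pooled + 1.96 * se_pooled))
    return math.exp(log_pooled), ci
```

    With a single study as input, the function simply returns that study's RR and CI, which is a convenient sanity check.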

  19. A process for creating multimetric indices for large-scale aquatic surveys

    EPA Science Inventory

    Differences in sampling and laboratory protocols, differences in techniques used to evaluate metrics, and differing scales of calibration and application prohibit the use of many existing multimetric indices (MMIs) in large-scale bioassessments. We describe an approach to develop...

  20. The assessment of the readiness of five countries to implement child maltreatment prevention programs on a large scale.

    PubMed

    Mikton, Christopher; Power, Mick; Raleva, Marija; Makoae, Mokhantso; Al Eissa, Majid; Cheah, Irene; Cardia, Nancy; Choo, Claire; Almuneef, Maha

    2013-12-01

    This study aimed to systematically assess the readiness of five countries - Brazil, the Former Yugoslav Republic of Macedonia, Malaysia, Saudi Arabia, and South Africa - to implement evidence-based child maltreatment prevention programs on a large scale. To this end, it applied a recently developed method called Readiness Assessment for the Prevention of Child Maltreatment based on two parallel 100-item instruments. The first measures the knowledge, attitudes, and beliefs concerning child maltreatment prevention of key informants; the second, completed by child maltreatment prevention experts using all available data in the country, produces a more objective assessment of readiness. The instruments cover all of the main aspects of readiness including, for instance, availability of scientific data on the problem, legislation and policies, will to address the problem, and material resources. Key informant scores ranged from 31.2 (Brazil) to 45.8/100 (the Former Yugoslav Republic of Macedonia) and expert scores, from 35.2 (Brazil) to 56/100 (Malaysia). Major gaps identified in almost all countries included a lack of professionals with the skills, knowledge, and expertise to implement evidence-based child maltreatment programs and of institutions to train them; inadequate funding, infrastructure, and equipment; extreme rarity of outcome evaluations of prevention programs; and lack of national prevalence surveys of child maltreatment. In sum, the five countries are in a low to moderate state of readiness to implement evidence-based child maltreatment prevention programs on a large scale. Such an assessment of readiness - the first of its kind - allows gaps to be identified and then addressed to increase the likelihood of program success. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Quality of Evidence-Based Guidelines for Transfusion of Red Blood Cells and Plasma: A Systematic Review.

    PubMed

    Pavenski, Katerina; Stanworth, Simon; Fung, Mark; Wood, Erica M; Pink, Joanne; Murphy, Michael F; Hume, Heather; Nahirniak, Susan; Webert, Kathryn E; Tanael, Susano; Landry, Denise; Shehata, Nadine

    2018-06-01

    Many transfusion guidelines are available, but little appraisal of their quality has been undertaken. The quality of guidelines may potentially influence adoption. Our aim was to determine the quality of evidence-based transfusion guidelines (EBG) for red cells and plasma, using the Appraisal of Guidelines for Research and Evaluation (AGREE II) instrument, and assess duplication and consistency of recommendations. MEDLINE and EMBASE were systematically searched for EBG from 2005 to June 3, 2016. Citations were reviewed for inclusion in duplicate. A guideline was included if it had a specified clinical question, described a systematic search strategy, included critical appraisal of the literature and a description of how recommendations were developed. Four to six physicians used AGREE II to appraise each guideline. Median and scaled scores were calculated, with each item scored on a scale of one to seven, seven representing the highest score. Of 6174 citations, 30 guidelines met inclusion criteria. Twenty-six guidelines had recommendations for red cells and 18 included recommendations for plasma use. The median score, the scaled score and the interquartile range of the scaled score were: scope and purpose: median score 5, scaled score 60%, IQR (49-74%); stakeholder involvement 4, 43%, (33-49%); rigor of development 4, 41%, (19-59%); clarity of presentation 5, 69%, (52-81%); applicability 1, 16%, (9-23%); editorial independence 3, 43%, (20-58%). Sixteen guidelines were evaluated to have a scaled domain score of 50% or less. Variations in recommendations were found for the use of hemoglobin triggers for red cell transfusion in patients with acute coronary syndromes and for plasma use for patients with bleeding. Our findings document limited rigor in guideline development, as well as duplication and inconsistencies in recommendations on the same topics. The process of developing guidelines for red cells and plasma transfusion can be enhanced to improve implementation.
    Copyright © 2018 Elsevier Inc. All rights reserved.
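
    The scaled domain scores reported here follow the standard AGREE II calculation: (obtained score − minimum possible) / (maximum possible − minimum possible) × 100, where each item in a domain is rated 1-7 by each appraiser. A minimal sketch with illustrative numbers:

```python
def agree_scaled(obtained, n_items, n_appraisers):
    """AGREE II scaled domain score (standard formula).

    obtained: sum of all 1-7 item ratings in the domain, over all appraisers
    minimum possible: every rating = 1; maximum possible: every rating = 7
    """
    minimum = 1 * n_items * n_appraisers
    maximum = 7 * n_items * n_appraisers
    return 100.0 * (obtained - minimum) / (maximum - minimum)

# Illustrative: a 3-item domain rated by 4 appraisers with a total of 48
# points scales to 50%, i.e. the cut-off used above for "50% or less".
print(agree_scaled(48, 3, 4))  # 50.0
```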

  2. Test-retest reliability of the Clinical Learning Environment, Supervision and Nurse Teacher (CLES + T) scale.

    PubMed

    Gustafsson, Margareta; Blomberg, Karin; Holmefur, Marie

    2015-07-01

    The Clinical Learning Environment, Supervision and Nurse Teacher (CLES + T) scale evaluates the student nurses' perception of the learning environment and supervision within the clinical placement. It has never been tested in a replication study. The aim of the present study was to evaluate the test-retest reliability of the CLES + T scale. The CLES + T scale was administered twice to a group of 42 student nurses, with a one-week interval. Test-retest reliability was determined by calculations of Intraclass Correlation Coefficients (ICCs) and weighted Kappa coefficients. Standard Error of Measurements (SEM) and Smallest Detectable Difference (SDD) determined the precision of individual scores. Bland-Altman plots were created for analyses of systematic differences between the test occasions. The results of the study showed that the stability over time was good to excellent (ICC 0.88-0.96) in the sub-dimensions "Supervisory relationship", "Pedagogical atmosphere on the ward" and "Role of the nurse teacher". Measurements of "Premises of nursing on the ward" and "Leadership style of the manager" had lower but still acceptable stability (ICC 0.70-0.75). No systematic differences occurred between the test occasions. This study supports the usefulness of the CLES + T scale as a reliable measure of the student nurses' perception of the learning environment within the clinical placement at a hospital. Copyright © 2015 Elsevier Ltd. All rights reserved.
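
    The precision measures in this record follow standard formulas: SEM = SD × √(1 − ICC) and SDD = 1.96 × √2 × SEM. A minimal sketch; the SD and ICC values in the example call are illustrative, not taken from the study:

```python
import math

def sem_sdd(sd, icc):
    """Standard Error of Measurement and Smallest Detectable Difference.

    sd:  standard deviation of the scale scores in the sample
    icc: test-retest intraclass correlation coefficient
    """
    sem = sd * math.sqrt(1.0 - icc)          # SEM = SD * sqrt(1 - ICC)
    sdd = 1.96 * math.sqrt(2.0) * sem        # SDD = 1.96 * sqrt(2) * SEM
    return sem, sdd

# Illustrative: SD = 10 points, ICC = 0.91
sem, sdd = sem_sdd(10.0, 0.91)
```

    The SDD is the smallest change in an individual's score that exceeds measurement error at the 95% level, which is why it is reported alongside the ICCs.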

  3. A systematic meta-review of evaluations of youth violence prevention programs: Common and divergent findings from 25 years of meta-analyses and systematic reviews☆

    PubMed Central

    Matjasko, Jennifer L.; Vivolo-Kantor, Alana M.; Massetti, Greta M.; Holland, Kristin M.; Holt, Melissa K.; Cruz, Jason Dela

    2018-01-01

    Violence among youth is a pervasive public health problem. In order to make progress in reducing the burden of injury and mortality that result from youth violence, it is imperative to identify evidence-based programs and strategies that have a significant impact on violence. There have been many rigorous evaluations of youth violence prevention programs. However, the literature is large, and it is difficult to draw conclusions about what works across evaluations from different disciplines, contexts, and types of programs. The current study reviews the meta-analyses and systematic reviews published prior to 2009 that synthesize evaluations of youth violence prevention programs. This meta-review reports the findings from 37 meta-analyses and 15 systematic reviews; the included reviews were coded on measures of the social ecology, prevention approach, program type, and study design. A majority of the meta-analyses and systematic reviews were found to demonstrate moderate program effects. Meta-analyses yielded marginally smaller effect sizes compared to systematic reviews, and those that included programs targeting family factors showed marginally larger effects than those that did not. In addition, there are a wide range of individual/family, program, and study moderators of program effect sizes. Implications of these findings and suggestions for future research are discussed. PMID:29503594

  4. Investigation of rock samples by neutron diffraction and ultrasonic sounding

    NASA Astrophysics Data System (ADS)

    Burilichev, D. E.; Ivankina, T. I.; Klima, K.; Locajicek, T.; Nikitin, A. N.; Pros, Z.

    2000-03-01

    The interpretation of large-scale geophysical anisotropies largely depends upon the knowledge of rock anisotropies of any kind (compositions, foliations, grain shape, physical properties). Almost all physical rock properties (e.g. elastic, thermal, magnetic properties) are related to the textures of the rock constituents, since these are anisotropic for the single crystal. Although anisotropy determinations are numerous, systematic investigations are scarce. Therefore, several rock samples with different microfabrics were selected for texture analysis and for determination of their P-wave distributions at various confining pressures.

  5. PERFORMANCE OF SOLAR HOT WATER COLLECTORS FOR ELECTRICITY PRODUCTION AND CLIMATE CONTROL

    EPA Science Inventory

    We will systematically evaluate commercially available solar thermal collectors and thermal storage systems for use in residential scale co-generative heat and electrical power systems. Currently, reliable data is unavailable over the range of conditions and installations thes...

  6. Evaluation of Scaling Methods for Rotorcraft Icing

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching; Kreeger, Richard E.

    2010-01-01

    This paper reports results of an experimental study in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the current recommended scaling methods developed for fixed-wing unprotected-surface icing applications might apply to representative rotor blades at finite angle of attack. Unlike the fixed-wing case, there is no single scaling method that has been systematically developed and evaluated for rotorcraft icing applications. In the present study, scaling was based on the modified Ruff method with scale velocity determined by maintaining constant Weber number. Models were unswept NACA 0012 wing sections. The reference model had a chord of 91.4 cm and the scale model a chord of 35.6 cm. Reference tests were conducted with velocities of 76 and 100 kt (39 and 52 m/s), droplet MVDs of 150 and 195 μm, and with stagnation-point freezing fractions of 0.3 and 0.5 at angles of attack of 0deg and 5deg. It was shown that good ice shape scaling was achieved for NACA 0012 airfoils with angle of attack up to 5deg.

  7. Towards stellar effective temperatures and diameters at 1 per cent accuracy for future surveys

    NASA Astrophysics Data System (ADS)

    Casagrande, L.; Portinari, L.; Glass, I. S.; Laney, D.; Silva Aguirre, V.; Datson, J.; Andersen, J.; Nordström, B.; Holmberg, J.; Flynn, C.; Asplund, M.

    2014-04-01

    The apparent size of stars is a crucial benchmark for fundamental stellar properties such as effective temperatures, radii and surface gravities. While interferometric measurements of stellar angular diameters are the most direct method to gauge these, they are still limited to relatively nearby and bright stars, which are saturated in most of the modern photometric surveys. This dichotomy prevents us from safely extending well-calibrated relations to the faint stars targeted in large spectroscopic and photometric surveys. Here, we alleviate this obstacle by presenting South African Astronomical Observatory near-infrared JHK observations of 55 stars: 16 of them have interferometric angular diameters and the rest are in common with the 2 Micron All Sky Survey (2MASS, unsaturated) data set, allowing us to tie the effective temperatures and angular diameters derived via the infrared flux method to the interferometric scale. We extend the test to recent interferometric measurements of unsaturated 2MASS stars, including giants, and the metal-poor benchmark target HD122563. With a critical evaluation of the systematics involved, we conclude that a 1 per cent accuracy in fundamental stellar parameters is usually within reach. Caution, however, must be used when indirectly testing a Teff scale via colour relations, as well as when assessing the reliability of interferometric measurements, especially at the submilliarcsec level. As a result, rather different effective temperature scales can be compatible with a given subset of interferometric data. We highlight some caveats to be aware of in such a quest and suggest a simple method to check against systematics in fundamental measurements. A new diagnostic combining seismic radii with astrometric distances is also presented.

  8. Hierarchical coarse-graining strategy for protein-membrane systems to access mesoscopic scales

    PubMed Central

    Ayton, Gary S.; Lyman, Edward

    2014-01-01

    An overall multiscale simulation strategy for large scale coarse-grain simulations of membrane protein systems is presented. The protein is modeled as a heterogeneous elastic network, while the lipids are modeled using the hybrid analytic-systematic (HAS) methodology, where in both cases atomistic level information obtained from molecular dynamics simulation is used to parameterize the model. A feature of this approach is that from the outset liposome length scales are employed in the simulation (i.e., on the order of ½ a million lipids plus protein). A route to develop highly coarse-grained models from molecular-scale information is proposed and results for N-BAR domain protein remodeling of a liposome are presented. PMID:20158037

  9. Effect of inventory method on niche models: random versus systematic error

    Treesearch

    Heather E. Lintz; Andrew N. Gray; Bruce McCune

    2013-01-01

    Data from large-scale biological inventories are essential for understanding and managing Earth's ecosystems. The Forest Inventory and Analysis Program (FIA) of the U.S. Forest Service is the largest biological inventory in North America; however, the FIA inventory recently changed from an amalgam of different approaches to a nationally-standardized approach in...

  10. An Exploratory Analysis of the Longitudinal Impact of Principal Change on Elementary School Achievement

    ERIC Educational Resources Information Center

    Hochbein, Craig; Cunningham, Brittany C.

    2013-01-01

    Recent reform initiatives, such as the Title I School Improvement Grants and Race to the Top, recommended a principal change to jump-start school turnaround. Yet, few educational researchers have examined principal change as a way to improve schools in a state of systematic reform; furthermore, no large-scale quantitative study has determined the…

  11. Discussing the Flynn Effect: From Causes and Interpretation to Implications

    ERIC Educational Resources Information Center

    Kanaya, Tomoe

    2016-01-01

    Clark, Lawlor-Savage, and Goghari (this issue) point out that evidence of IQ rises had been documented decades before it was named the Flynn effect. These previous studies, however, were conducted sporadically and in isolated samples. Flynn (1984, 1987) examined them in a large-scale manner and was able to show their systematic and global nature.…

  12. What Works to Improve Reading Outcomes in Latin-America? A Systematic Review of the Evidence

    ERIC Educational Resources Information Center

    de Hoop, Thomas; Klochikin, Evgeny; Stone, Rebecca

    2016-01-01

    Improvements in students' learning achievement have lagged behind in low-and middle-income countries despite significant progress in school enrollment numbers. Large-scale early grade reading assessments (e.g., "Annual Status of Education Report" [ASER], 2013; EdData II, n.d.) have shown low reading rates and worryingly high…

  13. Overview of the OGAP Formative Assessment Project and CPRE's Large-Scale Experimental Study of Implementation and Impacts

    ERIC Educational Resources Information Center

    Supovitz, Jonathan

    2016-01-01

    In this brief abstracted report, the author describes an ongoing partnership with the Philadelphia School District (PSD) to implement and research the Ongoing Assessment Project (OGAP). OGAP is a systematic, intentional and iterative formative assessment system grounded in the research on how students learn…

  14. Guide for preparing active solar heating systems operation and maintenance manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    This book presents a systematic and standardized approach to the preparation of operation and maintenance manuals for active solar heating systems. Provides an industry consensus of the best operating and maintenance procedures for large commercial-scale solar service water and space heating systems. A sample O&M manual is included. 3-ring binder included.

  15. Systematic analysis of microfauna indicator values for treatment performance in a full-scale municipal wastewater treatment plant.

    PubMed

    Hu, Bo; Qi, Rong; Yang, Min

    2013-07-01

    The indicator values of microfauna functional groups and species for treatment performance were systematically evaluated based on continuous monitoring of the entire microfauna communities, including both protozoa and metazoa, over a period of 14 months in two parallel full-scale municipal wastewater treatment systems in a plant in Beijing, China. A total of 57 species of ciliates, 14 species (units) of amoebae, 14 species (units) of flagellates and 4 classes of small metazoa were identified, with Arcella hemisphaerica, Vorticella striata, Vorticella convallaria, Epistylis plicatilis and small flagellates (e.g. Bodo spp.) as the dominant protozoa, and rotifers as the dominant metazoa. The abundance of the sessile ciliates was correlated with the removals of BOD5 (Pearson's r = 0.410, p < 0.05) and CODcr (r = 0.397, p < 0.05), while the testate amoebae were significantly positively related to nitrification (r = 0.523, p < 0.01). At the same time, some other associations were also identified: the abundances of the large flagellates (r = 0.447, p < 0.01), the metazoa (r = 0.718, p < 0.01) and the species Aspidisca sulcata (r = 0.337, p < 0.05) were positively related to nitrification; the abundance of Aspidisca costata was correlated with the TN (total nitrogen) removal (r = -0.374, p < 0.05); and the abundances of the sessile species Carchesium polypinum (r = 0.458, p < 0.01) and E. plicatilis (r = 0.377, p < 0.05) were correlated with the removal of suspended solids.
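
    The associations listed above are Pearson product-moment correlations between microfauna abundances and removal efficiencies. As a self-contained reminder of the standard formula (the data in the test are illustrative, not from the study):

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))   # covariance term
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))          # spread of x
    sy = math.sqrt(sum((b - my) ** 2 for b in y))          # spread of y
    return cov / (sx * sy)
```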

  16. Quality Assessment of Studies Published in Open Access and Subscription Journals: Results of a Systematic Evaluation.

    PubMed

    Pastorino, Roberta; Milovanovic, Sonja; Stojanovic, Jovana; Efremov, Ljupcho; Amore, Rosarita; Boccia, Stefania

    2016-01-01

    Along with the proliferation of Open Access (OA) publishing, the interest for comparing the scientific quality of studies published in OA journals versus subscription journals has also increased. With our study we aimed to compare the methodological quality and the quality of reporting of primary epidemiological studies and systematic reviews and meta-analyses published in OA and non-OA journals. In order to identify the studies to appraise, we listed all OA and non-OA journals which published in 2013 at least one primary epidemiologic study (case-control or cohort study design), and at least one systematic review or meta-analysis in the field of oncology. For the appraisal, we picked up the first studies published in 2013 with case-control or cohort study design from OA journals (Group A; n = 12), and in the same time period from non-OA journals (Group B; n = 26); the first systematic reviews and meta-analyses published in 2013 from OA journals (Group C; n = 15), and in the same time period from non-OA journals (Group D; n = 32). We evaluated the methodological quality of studies by assessing the compliance of case-control and cohort studies to Newcastle and Ottawa Scale (NOS) scale, and the compliance of systematic reviews and meta-analyses to Assessment of Multiple Systematic Reviews (AMSTAR) scale. The quality of reporting was assessed considering the adherence of case-control and cohort studies to STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) checklist, and the adherence of systematic reviews and meta-analyses to Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) checklist. Among case-control and cohort studies published in OA and non-OA journals, we did not observe significant differences in the median value of NOS score (Group A: 7 (IQR 7-8) versus Group B: 8 (7-9); p = 0.5) and in the adherence to STROBE checklist (Group A, 75% versus Group B, 80%; p = 0.1). 
    The results did not change after adjustment for impact factor. The compliance with AMSTAR and the adherence to the PRISMA checklist were comparable between systematic reviews and meta-analyses published in OA and non-OA journals (AMSTAR: Group C, 46.0% versus Group D, 55.0%; p = 0.06; PRISMA: Group C, 72.0% versus Group D, 76.0%; p = 0.1). The epidemiological studies published in OA journals in the field of oncology approach the same methodological quality and quality of reporting as studies published in non-OA journals.

  17. Quality Assessment of Studies Published in Open Access and Subscription Journals: Results of a Systematic Evaluation

    PubMed Central

    Pastorino, Roberta; Milovanovic, Sonja; Stojanovic, Jovana; Efremov, Ljupcho; Amore, Rosarita; Boccia, Stefania

    2016-01-01

    Introduction Along with the proliferation of Open Access (OA) publishing, the interest for comparing the scientific quality of studies published in OA journals versus subscription journals has also increased. With our study we aimed to compare the methodological quality and the quality of reporting of primary epidemiological studies and systematic reviews and meta-analyses published in OA and non-OA journals. Methods In order to identify the studies to appraise, we listed all OA and non-OA journals which published in 2013 at least one primary epidemiologic study (case-control or cohort study design), and at least one systematic review or meta-analysis in the field of oncology. For the appraisal, we picked up the first studies published in 2013 with case-control or cohort study design from OA journals (Group A; n = 12), and in the same time period from non-OA journals (Group B; n = 26); the first systematic reviews and meta-analyses published in 2013 from OA journals (Group C; n = 15), and in the same time period from non-OA journals (Group D; n = 32). We evaluated the methodological quality of studies by assessing the compliance of case-control and cohort studies to Newcastle and Ottawa Scale (NOS) scale, and the compliance of systematic reviews and meta-analyses to Assessment of Multiple Systematic Reviews (AMSTAR) scale. The quality of reporting was assessed considering the adherence of case-control and cohort studies to STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) checklist, and the adherence of systematic reviews and meta-analyses to Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) checklist. Results Among case-control and cohort studies published in OA and non-OA journals, we did not observe significant differences in the median value of NOS score (Group A: 7 (IQR 7–8) versus Group B: 8 (7–9); p = 0.5) and in the adherence to STROBE checklist (Group A, 75% versus Group B, 80%; p = 0.1). 
    The results did not change after adjustment for impact factor. The compliance with AMSTAR and the adherence to the PRISMA checklist were comparable between systematic reviews and meta-analyses published in OA and non-OA journals (AMSTAR: Group C, 46.0% versus Group D, 55.0%; p = 0.06; PRISMA: Group C, 72.0% versus Group D, 76.0%; p = 0.1). Conclusion The epidemiological studies published in OA journals in the field of oncology approach the same methodological quality and quality of reporting as studies published in non-OA journals. PMID:27167982

  18. General practitioners' continuing education: a review of policies, strategies and effectiveness, and their implications for the future.

    PubMed Central

    Smith, F; Singleton, A; Hilton, S

    1998-01-01

    BACKGROUND: The accreditation and provision of continuing education for general practitioners (GPs) is set to change with new proposals from the General Medical Council, the Government, and the Chief Medical Officer. AIM: To review the theories, policies, strategies, and effectiveness in GP continuing education in the past 10 years. METHOD: A systematic review of the literature by computerized and manual searches of relevant journals and books. RESULTS: Educational theory suggests that continuing education (CE) should be work-based and use the learner's experiences. Audit can play an important role in determining performance and needs assessment, but at present is largely a separate activity. Educational and professional support, such as through mentors or co-tutors, has been successfully piloted but awaits larger scale evaluation. Most accredited educational events are still the postgraduate centre lecture, and GP Tutors have a variable role in CE management and provision. Controlled trials of CE strategies suggest effectiveness is enhanced by personal feedback and work prompts. Qualitative studies have demonstrated that education plays only a small part in influencing doctors' behavior. CONCLUSION: Maintaining good clinical practice is on many stakeholders' agendas. A variety of methods may be effective in CE, and larger scale trials or evaluations are needed. PMID:10071406

  19. Clean fuels for resource-poor settings: A systematic review of barriers and enablers to adoption and sustained use.

    PubMed

    Puzzolo, Elisa; Pope, Daniel; Stanistreet, Debbi; Rehfuess, Eva A; Bruce, Nigel G

    2016-04-01

    Access to, and sustained adoption of, clean household fuels at scale remains an aspirational goal to achieve sufficient reductions in household air pollution (HAP) in order to impact on the substantial global health burden caused by reliance on solid fuels. To systematically appraise the current evidence base to identify: (i) which factors enable or limit adoption and sustained use of clean fuels (namely liquefied petroleum gas (LPG), biogas, solar cooking and alcohol fuels) in low- and middle-income countries; (ii) lessons learnt concerning equitable scaling-up of programmes of cleaner cooking fuels in relation to poverty, urban-rural settings and gender. A mixed-methods systematic review was conducted using established review methodology and extensive searches of published and grey literature sources. Data extraction and quality appraisal of quantitative, qualitative and case studies meeting inclusion criteria were conducted using standardised methods with reliability checking. Forty-four studies from Africa, Asia and Latin America met the inclusion criteria (17 on biogas, 12 on LPG, 9 on solar, 6 on alcohol fuels). A broad range of inter-related enabling and limiting factors were identified for all four types of intervention, operating across seven pre-specified domains (i.e. fuel and technology characteristics, household and setting characteristics, knowledge and perceptions, financial, tax and subsidy aspects, market development, regulation, legislation and standards, and programme and policy mechanisms) and multiple levels (i.e. household, community, national). All domains matter and the majority of factors are common to all clean fuels interventions reviewed although some are fuel and technology-specific. All factors should therefore be taken into account and carefully assessed during planning and implementation of any small- and large-scale initiative aiming at promoting clean fuels for household cooking. 
Despite limitations in quantity and quality of the evidence this systematic review provides a useful starting point for the design, delivery and evaluation of programmes to ensure more effective adoption and use of LPG, biogas, alcohol fuels and solar cooking. This review was funded by the Department for International Development (DfID) of the United Kingdom. The authors would also like to thank the Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre) for their technical support. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Residential road traffic noise as a risk factor for hypertension in adults: Systematic review and meta-analysis of analytic studies published in the period 2011-2017.

    PubMed

    Dzhambov, Angel M; Dimitrova, Donka D

    2018-05-07

    Multiple cross-sectional studies indicated an association between hypertension and road traffic noise, and they were recently synthesized in a WHO systematic evidence review. However, recent years have seen a growing body of high-quality, large-scale research, which is missing from the WHO review. Therefore, we aimed to close that gap by conducting an updated systematic review and meta-analysis on the exposure-response relationship between residential road traffic noise and the risk of hypertension in adults. Studies were identified by searching MEDLINE, EMBASE, the Internet, conference proceedings, reference lists, and expert archives in English, Russian, and Spanish through August 5, 2017. The risk of bias for each extracted estimate and the overall quality of evidence were evaluated using a list of predefined safeguards against bias related to different study characteristics and the Grading of Recommendations Assessment, Development and Evaluation system, respectively. The inverse variance heterogeneity (IVhet) model was used for meta-analysis. The possibility of publication bias was evaluated by funnel and Doi plots, and asymmetry in these was tested with Egger's test and the Luis Furuya-Kanamori index, respectively. Sensitivity analyses included leave-one-out meta-analysis, subgroup meta-analysis with meta-regressions, and non-linear exposure-response meta-analysis. Based on seven cohort and two case-control studies (n = 5 514 555; 14 estimates; Lden range ≈ 25-90 dB(A)), we found "low" evidence of RR per 10 dB(A) = 1.018 (95% CI: 0.984, 1.053), moderate heterogeneity (I² = 46%), and no publication bias. In the subgroup of cohort studies, we found "moderate" evidence of RR per 10 dB(A) = 1.018 (95% CI: 0.987, 1.049), I² = 31%, and no publication bias. In conclusion, residential road traffic noise was associated with higher risk of hypertension in adults, but the risk was lower than previously reported in the systematic review literature.

  1. Constant-Murley Score: systematic review and standardized evaluation in different shoulder pathologies.

    PubMed

    Vrotsou, Kalliopi; Ávila, Mónica; Machón, Mónica; Mateo-Abad, Maider; Pardo, Yolanda; Garin, Olatz; Zaror, Carlos; González, Nerea; Escobar, Antonio; Cuéllar, Ricardo

    2018-05-10

    The objective of this study was to evaluate the psychometric properties of the Constant-Murley Score (CMS) in various shoulder pathologies, based on a systematic review and expert standardized evaluations. A systematic review was performed in MEDLINE and EMBASE databases. Titles and abstracts were reviewed and finally the included articles were grouped according to patients' pathologies. Two expert evaluators independently assessed the CMS properties of reliability, validity, responsiveness to change, interpretability and burden score in each group, using the EMPRO (Evaluating Measures of Patient Reported Outcomes) tool. The CMS properties were assessed per attribute and overall for each considered group. Only the concept and measurement model was assessed globally. Five individual pathologies (i.e. subacromial, fractures, arthritis, instability and frozen shoulder) and two additional groups (i.e. various pathologies and healthy subjects) were considered. Overall EMPRO scores ranged from 58.6 points for subacromial to 30.6 for instability. Responsiveness to change was the only quality to obtain at least 50 points across all groups, except for frozen shoulder. Insufficient information was obtained in relation to the concept and measurement model, and great variability was seen in the other evaluated attributes. The current evidence does not support the CMS as a gold standard in shoulder evaluation. Its use is advisable for subacromial pathology, but data are inconclusive for other shoulder conditions. Prospective studies exploring the psychometric properties of the scale, particularly for fractures, arthritis, instability and frozen shoulder, are needed. Systematic review.

  2. Assessing communication quality of consultations in primary care: initial reliability of the Global Consultation Rating Scale, based on the Calgary-Cambridge Guide to the Medical Interview.

    PubMed

    Burt, Jenni; Abel, Gary; Elmore, Natasha; Campbell, John; Roland, Martin; Benson, John; Silverman, Jonathan

    2014-03-06

    To investigate initial reliability of the Global Consultation Rating Scale (GCRS: an instrument to assess the effectiveness of communication across an entire doctor-patient consultation, based on the Calgary-Cambridge guide to the medical interview), in simulated patient consultations. Multiple ratings of simulated general practitioner (GP)-patient consultations by trained GP evaluators. UK primary care. 21 GPs and six trained GP evaluators. GCRS score. 6 GP raters used GCRS to rate randomly assigned video recordings of GP consultations with simulated patients. Each of the 42 consultations was rated separately by four raters. We considered whether a fixed difference between scores had the same meaning at all levels of performance. We then examined the reliability of GCRS using mixed linear regression models. We augmented our regression model to also examine whether there were systematic biases between the scores given by different raters and to look for possible order effects. Assessing the communication quality of individual consultations, GCRS achieved a reliability of 0.73 (95% CI 0.44 to 0.79) for two raters, 0.80 (0.54 to 0.85) for three and 0.85 (0.61 to 0.88) for four. We found an average difference of 1.65 (on a 0-10 scale) in the scores given by the least and most generous raters: adjusting for this evaluator bias increased reliability to 0.78 (0.53 to 0.83) for two raters; 0.85 (0.63 to 0.88) for three and 0.88 (0.69 to 0.91) for four. There were considerable order effects, with later consultations (after 15-20 ratings) receiving, on average, scores more than one point higher on a 0-10 scale. GCRS shows good reliability with three raters assessing each consultation. We are currently developing the scale further by assessing a large sample of real-world consultations.
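    The study estimates reliability from mixed linear models, but the reported pattern across rater counts can be cross-checked with the classical Spearman-Brown relation, which predicts the reliability of a mean over k raters from the single-rater value (function names here are ours, and this is a simplification of the paper's analysis):

```python
def spearman_brown(r_single, k):
    """Reliability of the mean score of k raters, given the
    reliability of a single rater."""
    return k * r_single / (1 + (k - 1) * r_single)

def single_rater(r_k, k):
    """Invert Spearman-Brown: single-rater reliability implied by
    a k-rater reliability."""
    return r_k / (k - (k - 1) * r_k)

# the reported two-rater reliability of 0.73 implies:
r1 = single_rater(0.73, 2)   # about 0.57 for one rater
r3 = spearman_brown(r1, 3)   # about 0.80, matching the reported value
r4 = spearman_brown(r1, 4)   # about 0.84, close to the reported 0.85
```

    The projected three- and four-rater values line up closely with the reliabilities the study reports, which is what one would expect if rater disagreement behaves like exchangeable noise.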

  3. The non-evaluative circumplex of personality adjectives.

    PubMed

    Saucier, G; Ostendorf, F; Peabody, D

    2001-08-01

    In judgments about personality, descriptive and evaluative aspects are ordinarily combined; separating them can be important both theoretically and practically. Study 1 showed that two similar descriptive factors can be found in analyses of personality terms, selected independently in English and in German and using different methods to control for evaluation. The factors relate to two pairs of independent axes suggested by previous work: Assertive-Unassertive and Tight-Loose, or alternatively, Interactional Orientation (Extraversion-Introversion) and Affective Orientation. These two pairs of axes are shown to be rotations of each other, and to form the prime non-evaluative circumplex. As in previous studies, non-evaluative scales elicited higher levels of self-peer agreement than did more typical evaluation-confounded scales. Study 2 showed that adjective scales for the octants of this circumplex have circular ordering, can fit even very stringent constraints of a circumplex model, have mild to strong isomorphism with the interpersonal circumplex, but represent somewhat broader constructs, and are systematically related to the Big Five and the Big Three personality factors.
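    The "circular ordering" requirement of a circumplex can be checked directly from each scale's loadings on the two non-evaluative axes: the scales should spread around the full circle in a fixed order. A minimal sketch with idealized, evenly spaced octant markers (not the study's data):

```python
import math

def scale_angles(loadings):
    """Angular position (degrees) of each scale in the plane spanned
    by two orthogonal factors; a circumplex requires the scales to be
    spread around the full circle in a fixed circular order."""
    return sorted(math.degrees(math.atan2(y, x)) % 360 for x, y in loadings)

# eight idealized octant markers, 45 degrees apart on the unit circle
octants = [(math.cos(math.radians(a)), math.sin(math.radians(a)))
           for a in range(0, 360, 45)]
angles = scale_angles(octants)
gaps = [b - a for a, b in zip(angles, angles[1:])]
```

    Real adjective-scale loadings will of course not be perfectly spaced; formal circumplex tests constrain how far the empirical angles may drift from this ideal.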

  4. Accuracy improvement in laser stripe extraction for large-scale triangulation scanning measurement system

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Liu, Wei; Li, Xiaodong; Yang, Fan; Gao, Peng; Jia, Zhenyuan

    2015-10-01

    Large-scale triangulation scanning measurement systems are widely used to measure the three-dimensional profile of large-scale components and parts. The accuracy and speed of the laser stripe center extraction are essential for guaranteeing the accuracy and efficiency of the measuring system. However, in the process of large-scale measurement, multiple factors can cause deviation of the laser stripe center, including the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. A center extraction method is proposed for improving the accuracy of the laser stripe center extraction based on image evaluation of Gaussian fitting structural similarity and analysis of the multiple source factors. First, according to the features of the gray distribution of the laser stripe, evaluation of the Gaussian fitting structural similarity is estimated to provide a threshold value for center compensation. Then using the relationships between the gray distribution of the laser stripe and the multiple source factors, a compensation method of center extraction is presented. Finally, measurement experiments for a large-scale aviation composite component are carried out. The experimental results for this specific implementation verify the feasibility of the proposed center extraction method and the improved accuracy for large-scale triangulation scanning measurements.
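    Sub-pixel Gaussian center extraction of the kind this paper builds on is commonly implemented as a three-point parabolic fit in log-intensity space (a Gaussian is a parabola after taking logs). The sketch below shows only that baseline step; the paper's structural-similarity evaluation and multi-factor compensation are more elaborate:

```python
import numpy as np

def stripe_center(profile):
    """Sub-pixel laser stripe center from a 1-D gray-level profile:
    fit a parabola through the log intensities of the peak pixel and
    its two neighbours (exact for a noise-free Gaussian)."""
    p = np.asarray(profile, dtype=float)
    i = int(np.argmax(p))
    if i == 0 or i == len(p) - 1:
        return float(i)          # peak on the border: no refinement
    ln = np.log(p[i - 1:i + 2])
    denom = ln[0] - 2.0 * ln[1] + ln[2]
    if denom >= 0.0:             # degenerate (flat or non-peaked) profile
        return float(i)
    return i + 0.5 * (ln[0] - ln[2]) / denom

# synthetic Gaussian stripe centered at 10.3 pixels
x = np.arange(21)
profile = np.exp(-(x - 10.3) ** 2 / 8.0)
center = stripe_center(profile)   # ~10.3 for a noise-free Gaussian
```

    In practice the deviations the paper analyzes (light intensity distribution, surface reflectivity, transmission effects) distort the profile away from this ideal Gaussian, which is precisely what their compensation targets.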

  5. Developments in advanced and energy saving thermal isolations for cryogenic applications

    NASA Astrophysics Data System (ADS)

    Shu, Q. S.; Demko, J. A.; Fesmire, J. E.

    2015-12-01

    The cooling power consumption in large scale superconducting systems is huge, and cryogenic devices used in space applications often require an extremely long cryogen holding time. To economically maintain the device at its operating temperature and minimize refrigeration losses, high-performance thermal insulation is essential. Radiation from warm surrounding surfaces and conductive heat leaks through supports and penetrations are the dominant heat loads to the cold mass under vacuum conditions. The advanced developments in various cryogenic applications to successfully reduce the heat loads through radiation and conduction are briefly and systematically discussed and evaluated in this review paper. These include: (1) thermal insulation for different applications (foams, perlites, glass bubbles, aerogel and MLI), (2) sophisticated low-heat-leak supports (cryogenic tension straps, trolley bars and posts with dedicated thermal intercepts), and (3) novel cryogenic heat switches.

  6. A Systematic Review of Group Social Skills Interventions, and Meta-analysis of Outcomes, for Children with High Functioning ASD.

    PubMed

    Wolstencroft, J; Robinson, L; Srinivasan, R; Kerry, E; Mandy, W; Skuse, D

    2018-07-01

    Group social skills interventions (GSSIs) are a commonly offered treatment for children with high functioning ASD. We critically evaluated GSSI randomised controlled trials for those aged 6-25 years. Our meta-analysis of outcomes emphasised internal validity, thus was restricted to trials that used the parent-report social responsiveness scale (SRS) or the social skills rating system (SSRS). Large positive effect sizes were found for the SRS total score, plus the social communication and restricted interests and repetitive behaviours subscales. The SSRS social skills subscale improved with moderate effect size. Moderator analysis of the SRS showed that GSSIs that include parent-groups, and are of greater duration or intensity, obtained larger effect sizes. We recommend future trials distinguish gains in children's social knowledge from social performance.

  7. A scoping review of crisis teams managing dementia in older people.

    PubMed

    Streater, Amy; Coleston-Shields, Donna Maria; Yates, Jennifer; Stanyon, Miriam; Orrell, Martin

    2017-01-01

    Research on crisis teams for older adults with dementia is limited. This scoping review aimed to 1) conduct a systematic literature review reporting on the effectiveness of crisis interventions for older people with dementia and 2) conduct a scoping survey with dementia crisis teams mapping services across England to understand operational procedures and identify what is currently occurring in practice. For the systematic literature review, included studies were graded using the Critical Appraisal Skills Programme checklist. For the scoping survey, Trusts across England were contacted and relevant services were identified that work with people with dementia experiencing a mental health crisis. The systematic literature review demonstrated limited evidence in support of crisis teams reducing the rate of hospital admissions, and despite the increase in number of studies, methodological limitations remain. For the scoping review, only half (51.8%) of the teams had a care pathway to manage crises and the primary need for referral was behavioral or psychological factors. Evidence in the literature for the effectiveness of crisis teams for older adults with dementia remains limited. Because the included studies were mainly cohort designs, it can be difficult to evaluate the effectiveness of the intervention. In practice, it appears that the care pathway for managing crises for people with dementia varies widely across services in England. There was a wide range of names given to the provision of teams managing crises for people with dementia, which may reflect differences in the setup and procedures of the services. To provide evidence on crisis intervention teams, a comprehensive protocol is required to deliver a standardized care pathway and measurable intervention as part of a large-scale evaluation of effectiveness.

  8. A scoping review of crisis teams managing dementia in older people

    PubMed Central

    Streater, Amy; Coleston-Shields, Donna Maria; Yates, Jennifer; Stanyon, Miriam; Orrell, Martin

    2017-01-01

    Background Research on crisis teams for older adults with dementia is limited. This scoping review aimed to 1) conduct a systematic literature review reporting on the effectiveness of crisis interventions for older people with dementia and 2) conduct a scoping survey with dementia crisis teams mapping services across England to understand operational procedures and identify what is currently occurring in practice. Methods For the systematic literature review, included studies were graded using the Critical Appraisal Skills Programme checklist. For the scoping survey, Trusts across England were contacted and relevant services were identified that work with people with dementia experiencing a mental health crisis. Results The systematic literature review demonstrated limited evidence in support of crisis teams reducing the rate of hospital admissions, and despite the increase in number of studies, methodological limitations remain. For the scoping review, only half (51.8%) of the teams had a care pathway to manage crises and the primary need for referral was behavioral or psychological factors. Conclusion Evidence in the literature for the effectiveness of crisis teams for older adults with dementia remains limited. Because the included studies were mainly cohort designs, it can be difficult to evaluate the effectiveness of the intervention. In practice, it appears that the care pathway for managing crises for people with dementia varies widely across services in England. There was a wide range of names given to the provision of teams managing crises for people with dementia, which may reflect differences in the setup and procedures of the services. To provide evidence on crisis intervention teams, a comprehensive protocol is required to deliver a standardized care pathway and measurable intervention as part of a large-scale evaluation of effectiveness. PMID:29042760

  9. Strategies to improve treatment coverage in community-based public health programs: A systematic review of the literature.

    PubMed

    Deardorff, Katrina V; Rubin Means, Arianna; Ásbjörnsdóttir, Kristjana H; Walson, Judd

    2018-02-01

    Community-based public health campaigns, such as those used in mass deworming, vitamin A supplementation and child immunization programs, provide key healthcare interventions to targeted populations at scale. However, these programs often fall short of established coverage targets. The purpose of this systematic review was to evaluate the impact of strategies used to increase treatment coverage in community-based public health campaigns. We systematically searched CAB Direct, Embase, and PubMed archives for studies utilizing specific interventions to increase coverage of community-based distribution of drugs, vaccines, or other public health services. We identified 5,637 articles, from which 79 full texts were evaluated according to pre-defined inclusion and exclusion criteria. Twenty-eight articles met inclusion criteria and data were abstracted regarding strategy-specific changes in coverage from these sources. Strategies used to increase coverage included community-directed treatment (n = 6, pooled percent change in coverage: +26.2%), distributor incentives (n = 2, +25.3%), distribution along kinship networks (n = 1, +24.5%), intensified information, education, and communication activities (n = 8, +21.6%), fixed-point delivery (n = 1, +21.4%), door-to-door delivery (n = 1, +14.0%), integrated service distribution (n = 9, +12.7%), conversion from school- to community-based delivery (n = 3, +11.9%), and management by a non-governmental organization (n = 1, +5.8%). Strategies that target improving community member ownership of distribution appear to have a large impact on increasing treatment coverage. However, all strategies used to increase coverage successfully did so. These results may be useful to National Ministries, programs, and implementing partners in optimizing treatment coverage in community-based public health programs.

  10. Strategies to improve treatment coverage in community-based public health programs: A systematic review of the literature

    PubMed Central

    2018-01-01

    Background Community-based public health campaigns, such as those used in mass deworming, vitamin A supplementation and child immunization programs, provide key healthcare interventions to targeted populations at scale. However, these programs often fall short of established coverage targets. The purpose of this systematic review was to evaluate the impact of strategies used to increase treatment coverage in community-based public health campaigns. Methodology/principal findings We systematically searched CAB Direct, Embase, and PubMed archives for studies utilizing specific interventions to increase coverage of community-based distribution of drugs, vaccines, or other public health services. We identified 5,637 articles, from which 79 full texts were evaluated according to pre-defined inclusion and exclusion criteria. Twenty-eight articles met inclusion criteria and data were abstracted regarding strategy-specific changes in coverage from these sources. Strategies used to increase coverage included community-directed treatment (n = 6, pooled percent change in coverage: +26.2%), distributor incentives (n = 2, +25.3%), distribution along kinship networks (n = 1, +24.5%), intensified information, education, and communication activities (n = 8, +21.6%), fixed-point delivery (n = 1, +21.4%), door-to-door delivery (n = 1, +14.0%), integrated service distribution (n = 9, +12.7%), conversion from school- to community-based delivery (n = 3, +11.9%), and management by a non-governmental organization (n = 1, +5.8%). Conclusions/significance Strategies that target improving community member ownership of distribution appear to have a large impact on increasing treatment coverage. However, all strategies used to increase coverage successfully did so. These results may be useful to National Ministries, programs, and implementing partners in optimizing treatment coverage in community-based public health programs. PMID:29420534

  11. Implications of the Observed Mesoscale Variations of Clouds for Earth's Radiation Budget

    NASA Technical Reports Server (NTRS)

    Rossow, William B.; Delo, Carl; Cairns, Brian; Hansen, James E. (Technical Monitor)

    2001-01-01

    The effect of small-spatial-scale cloud variations on radiative transfer in cloudy atmospheres currently receives a lot of research attention, but the available studies are not very clear about which spatial scales are important and report a very large range of estimates of the magnitude of the effects. Also, there have been no systematic investigations of how to measure and represent these cloud variations. We exploit the cloud climatology produced by the International Satellite Cloud Climatology Project (ISCCP) to: (1) define and test different methods of representing cloud variation statistics, (2) investigate the range of spatial scales that should be included, (3) characterize cloud variations over a range of space and time scales covering the mesoscale (30-300 km, 3-12 hr) and the lower part of the synoptic scale (300-3000 km, 1-30 days), (4) obtain a climatology of the optical thickness, emissivity and cloud top temperature variability of clouds that can be used in weather and climate GCMs, together with the parameterization proposed by Cairns et al. (1999), to account for the effects of small-scale cloud variations on radiative fluxes, and (5) evaluate the effect of observed cloud variations on Earth's radiation budget. These results lead to the formulation of a revised conceptual model of clouds for use in radiative transfer calculations in GCMs. The complete variability climatology can be obtained from the ISCCP Web site at http://isccp.giss.nasa.gov.

  12. Systematic effects of foreground removal in 21-cm surveys of reionization

    NASA Astrophysics Data System (ADS)

    Petrovic, Nada; Oh, S. Peng

    2011-05-01

    21-cm observations have the potential to revolutionize our understanding of the high-redshift Universe. Whilst extremely bright radio continuum foregrounds exist at these frequencies, their spectral smoothness can be exploited to allow efficient foreground subtraction. It is well known that - regardless of other instrumental effects - this removes power on scales comparable to the survey bandwidth. We investigate associated systematic biases. We show that removing line-of-sight fluctuations on large scales aliases into suppression of the 3D power spectrum across a broad range of scales. This bias can be dealt with by correctly marginalizing over small wavenumbers in the 1D power spectrum; however, the unbiased estimator will have unavoidably larger variance. We also show that Gaussian realizations of the power spectrum permit accurate and extremely rapid Monte Carlo simulations for error analysis; repeated realizations of the fully non-Gaussian field are unnecessary. We perform Monte Carlo maximum likelihood simulations of foreground removal which yield unbiased, minimum variance estimates of the power spectrum in agreement with Fisher matrix estimates. Foreground removal also distorts the 21-cm probability distribution function (PDF), reducing the contrast between neutral and ionized regions, with potentially serious consequences for efforts to extract information from the PDF. We show that it is the subtraction of large-scale modes which is responsible for this distortion, and that it is less severe in the earlier stages of reionization. It can be reduced by using larger bandwidths. In the late stages of reionization, identification of the largest ionized regions (which consist of foreground emission only) provides calibration points which potentially allow recovery of large-scale modes. Finally, we also show that (i) the broad frequency response of synchrotron and free-free emission will smear out any features in the electron momentum distribution and ensure spectrally smooth foregrounds and (ii) extragalactic radio recombination lines should be negligible foregrounds.
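    The core aliasing effect described above is easy to reproduce in a toy 1D Monte Carlo: subtracting a smooth fit along the line of sight (here a quadratic, standing in for the spectrally smooth foreground model) removes most of the power in the largest-scale Fourier modes while leaving small scales nearly untouched. The setup below is an illustration, not the paper's simulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_real = 256, 200
x = np.linspace(-1.0, 1.0, n)
p_raw = np.zeros(n // 2 + 1)
p_sub = np.zeros(n // 2 + 1)
for _ in range(n_real):
    signal = rng.standard_normal(n)          # toy line-of-sight 21-cm field
    # "foreground cleaning": subtract a smooth quadratic fit
    fit = np.polynomial.Polynomial.fit(x, signal, 2)
    cleaned = signal - fit(x)
    p_raw += np.abs(np.fft.rfft(signal)) ** 2
    p_sub += np.abs(np.fft.rfft(cleaned)) ** 2
p_raw /= n_real
p_sub /= n_real
# largest surviving mode loses most of its power; small scales survive
suppression_large = p_sub[1] / p_raw[1]
suppression_small = p_sub[20] / p_raw[20]
```

    In a full 3D analysis this line-of-sight suppression mixes into a broad range of 3D wavenumbers, which is the bias the paper quantifies.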

  13. Qualitative Collection Analysis: The Conspectus Methodology. SPEC Kit 151.

    ERIC Educational Resources Information Center

    Jakubs, Deborah

    The introduction to this Systems and Procedures Exchange Center (SPEC) kit explains the Conspectus method, which was developed in 1980 by the Research Libraries Group (RLG) as a means of systematically and qualitatively evaluating large library collections. The discussion considers advantages and disadvantages of this tool, which evaluates past…

  14. Regional Evaluation of Groundwater Age Distributions Using Lumped Parameter Models with Large, Sparse Datasets: Example from the Central Valley, California, USA

    NASA Astrophysics Data System (ADS)

    Jurgens, B. C.; Bohlke, J. K.; Voss, S.; Fram, M. S.; Esser, B.

    2015-12-01

    Tracer-based, lumped parameter models (LPMs) are an appealing way to estimate the distribution of age for groundwater because the cost of sampling wells is often less than building numerical groundwater flow models sufficiently complex to provide groundwater age distributions. In practice, however, tracer datasets are often incomplete because of anthropogenic or terrigenic contamination of tracers, or analytical limitations. While age interpretations using such datasets can have large uncertainties, it may still be possible to identify key parts of the age distribution if LPMs are carefully chosen to match hydrogeologic conceptualization and the degree of age mixing is reasonably estimated. We developed a systematic approach for evaluating groundwater age distributions using LPMs with a large but incomplete set of tracer data (³H, tritiogenic ³He, ¹⁴C, and CFCs) from 535 wells, mostly used for public supply, in the Central Valley, California, USA that were sampled by the USGS for the California State Water Resources Control Board Groundwater Ambient Monitoring and Assessment or the USGS National Water Quality Assessment Programs. In addition to mean ages, LPMs gave estimates of unsaturated zone travel times, recharge rates for pre- and post-development groundwater, the degree of age mixing in wells, proportion of young water (<60 yrs), and the depth of the boundary between post-development and predevelopment groundwater throughout the Central Valley. Age interpretations were evaluated by comparing past nitrate trends with LPM predicted trends, and whether the presence or absence of anthropogenic organic compounds was consistent with model results. This study illustrates a practical approach for assessing groundwater age information at a large scale to reveal important characteristics about the age structure of a major aquifer, and of the water supplies being derived from it.
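    One of the standard LPMs referred to above, the exponential mixing model, amounts to convolving the tracer input history with an exponential age distribution g(τ) = exp(−τ/T)/T, with radioactive decay applied along the way. A minimal sketch with hypothetical inputs (the tritium decay constant is the standard ≈0.0563/yr; everything else is illustrative):

```python
import numpy as np

def exponential_model(c_in, mean_age, lam=0.0, dt=1.0):
    """Output tracer concentration of a well under the exponential
    mixing model: convolve the input history (oldest value first)
    with g(tau) = exp(-tau/T)/T, applying radioactive decay lam."""
    c_in = np.asarray(c_in, dtype=float)
    tau = np.arange(len(c_in)) * dt
    weights = np.exp(-tau / mean_age) / mean_age * np.exp(-lam * tau) * dt
    return float(np.sum(c_in[::-1] * weights))

# hypothetical constant tritium input; decay constant ~0.0563 per yr
history = np.ones(10000)            # 1000 years of input at dt = 0.1 yr
young = exponential_model(history, mean_age=10.0, lam=0.0563, dt=0.1)
old = exponential_model(history, mean_age=50.0, lam=0.0563, dt=0.1)
# older mean age -> more decay; steady state tends to 1 / (1 + lam * T)
```

    Fitting T (and, in practice, mixtures of such models) to multiple tracers measured in the same well is what constrains the age distribution when individual tracer records are incomplete.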

  15. The impact of Lyman-α radiative transfer on large-scale clustering in the Illustris simulation

    NASA Astrophysics Data System (ADS)

    Behrens, C.; Byrohl, C.; Saito, S.; Niemeyer, J. C.

    2018-06-01

    Context. Lyman-α emitters (LAEs) are a promising probe of the large-scale structure at high redshift, z ≳ 2. In particular, the Hobby-Eberly Telescope Dark Energy Experiment aims at observing LAEs at 1.9 < z < 3.5 to measure the baryon acoustic oscillation (BAO) scale and the redshift-space distortion (RSD). However, it has been pointed out that the complicated radiative transfer (RT) of the resonant Lyman-α emission line generates an anisotropic selection bias in the LAE clustering on large scales, s ≳ 10 Mpc. This effect could potentially induce a systematic error in the BAO and RSD measurements. Also, there exists a recent claim to have observational evidence of the effect in the Lyman-α intensity map, albeit statistically insignificant. Aims: We aim at quantifying the impact of the Lyman-α RT on the large-scale galaxy clustering in detail. For this purpose, we study the correlations between the large-scale environment and the ratio of an apparent Lyman-α luminosity to an intrinsic one, which we call the "observed fraction", at 2 < z < 6. Methods: We apply our Lyman-α RT code by post-processing the full Illustris simulations. We simply assume that the intrinsic luminosity of the Lyman-α emission is proportional to the star formation rate of galaxies in Illustris, yielding a sufficiently large sample of LAEs to measure the anisotropic selection bias. Results: We find little correlation between large-scale environment and the observed fraction induced by the RT, and hence a smaller anisotropic selection bias than has previously been claimed. We argue that the anisotropy was overestimated in previous work due to insufficient spatial resolution; it is important to keep the resolution such that it resolves the high-density region down to the scale of the interstellar medium, that is, 1 physical kpc. We also find that the correlation can be further enhanced by assumptions in modeling intrinsic Lyman-α emission.

  16. Can we really use available scales for child and adolescent psychopathology across cultures? A systematic review of cross-cultural measurement invariance data.

    PubMed

    Stevanovic, Dejan; Jafari, Peyman; Knez, Rajna; Franic, Tomislav; Atilola, Olayinka; Davidovic, Nikolina; Bagheri, Zahra; Lakic, Aneta

    2017-02-01

    In this systematic review, we assessed available evidence for cross-cultural measurement invariance of assessment scales for child and adolescent psychopathology as an indicator of cross-cultural validity. A literature search was conducted using the Medline, PsychInfo, Scopus, Web of Science, and Google Scholar databases. Cross-cultural measurement invariance data was available for 26 scales. Based on the aggregation of the evidence from the studies under review, none of the evaluated scales have strong evidence for cross-cultural validity and suitability for cross-cultural comparison. A few of the studies showed a moderate level of measurement invariance for some scales (such as the Fear Survey Schedule for Children-Revised, Multidimensional Anxiety Scale for Children, Revised Child Anxiety and Depression Scale, Revised Children's Manifest Anxiety Scale, Mood and Feelings Questionnaire, and Disruptive Behavior Rating Scale), which may make them suitable in cross-cultural comparative studies. The remainder of the scales either showed weak or outright lack of measurement invariance. This review showed only limited testing for measurement invariance across cultural groups of scales for pediatric psychopathology, with evidence of cross-cultural validity for only a few scales. This study also revealed a need to improve practices of statistical analysis reporting in testing measurement invariance. Implications for future research are discussed.

  17. From the ORFeome concept to highly comprehensive, full-genome screening libraries.

    PubMed

    Rid, Raphaela; Abdel-Hadi, Omar; Maier, Richard; Wagner, Martin; Hundsberger, Harald; Hintner, Helmut; Bauer, Johann; Onder, Kamil

    2013-02-01

    Recombination-based cloning techniques have in recent times facilitated the establishment of genome-scale single-gene ORFeome repositories. Their further handling and downstream application in systematic fashion is, however, practically impeded because of logistical and economic challenges. At this juncture, simultaneously transferring entire gene collections in compiled pool format could represent an advanced compromise between systematic ORFeome (an organism's entire set of protein-encoding open reading frames) projects and traditional random library approaches, but has not yet been considered in great detail. In our endeavor to merge the comprehensiveness of ORFeomes with a basically simple, streamlined, and easily executable single-tube design, we have here produced five different pooled screening-ready libraries for both Staphylococcus aureus and Homo sapiens. By evaluating the parallel transfer efficiencies of differentially sized genes from initial polymerase chain reaction (PCR) product amplification to entry and final destination library construction via quantitative real-time PCR, we found that the complexity of the gene population is fairly stably maintained once an entry resource has been successfully established, and that no apparent size-selection loss of large inserts takes place. Recombinational transfer processes are hence robust enough for straightforwardly achieving such pooled screening libraries.

  18. Genetic testing in heritable cardiac arrhythmia syndromes: differentiating pathogenic mutations from background genetic noise.

    PubMed

    Giudicessi, John R; Ackerman, Michael J

    2013-01-01

    In this review, we summarize the basic principles governing rare variant interpretation in the heritable cardiac arrhythmia syndromes, focusing on recent advances that have led to disease-specific approaches to the interpretation of positive genetic testing results. Elucidation of the genetic substrates underlying heritable cardiac arrhythmia syndromes has unearthed new arrhythmogenic mechanisms and given rise to a number of clinically meaningful genotype-phenotype correlations. As such, genetic testing for these disorders now carries important diagnostic, prognostic, and therapeutic implications. Recent large-scale systematic studies designed to explore the background genetic 'noise' rate associated with these genetic tests have provided important insights and enhanced how positive genetic testing results are interpreted for these potentially lethal, yet highly treatable, cardiovascular disorders. Clinically available genetic tests for heritable cardiac arrhythmia syndromes allow the identification of potentially at-risk family members and contribute to the risk-stratification and selection of therapeutic interventions in affected individuals. The systematic evaluation of the 'signal-to-noise' ratio associated with these genetic tests has proven critical and essential to assessing the probability that a given variant represents a rare pathogenic mutation or an equally rare, yet innocuous, genetic bystander.

  19. Evaluation of RNAi and CRISPR technologies by large-scale gene expression profiling in the Connectivity Map.

    PubMed

    Smith, Ian; Greenside, Peyton G; Natoli, Ted; Lahr, David L; Wadden, David; Tirosh, Itay; Narayan, Rajiv; Root, David E; Golub, Todd R; Subramanian, Aravind; Doench, John G

    2017-11-01

    The application of RNA interference (RNAi) to mammalian cells has provided the means to perform phenotypic screens to determine the functions of genes. Although RNAi has revolutionized loss-of-function genetic experiments, it has been difficult to systematically assess the prevalence and consequences of off-target effects. The Connectivity Map (CMAP) represents an unprecedented resource to study the gene expression consequences of expressing short hairpin RNAs (shRNAs). Analysis of signatures for over 13,000 shRNAs applied in 9 cell lines revealed that microRNA (miRNA)-like off-target effects of RNAi are far stronger and more pervasive than generally appreciated. We show that mitigating off-target effects is feasible in these datasets via computational methodologies to produce a consensus gene signature (CGS). In addition, we compared RNAi technology to clustered regularly interspaced short palindromic repeat (CRISPR)-based knockout by analysis of 373 single guide RNAs (sgRNAs) in 6 cell lines and show that the on-target efficacies are comparable, but CRISPR technology is far less susceptible to systematic off-target effects. These results will help guide the proper use and analysis of loss-of-function reagents for the determination of gene function.
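
The consensus-signature idea can be sketched: aggregating z-scored expression signatures across independent shRNAs that target the same gene suppresses reagent-specific (seed-driven) off-target effects, because those effects differ between reagents while the on-target effect is shared. The median aggregation below is a deliberate simplification for illustration, not CMAP's actual weighted procedure:

```python
import numpy as np

def consensus_signature(shrna_signatures):
    """Aggregate per-shRNA differential-expression z-scores (rows = reagents,
    columns = genes) into one consensus profile for the shared target gene.
    The gene-wise median down-weights effects seen by only one reagent,
    the hallmark of a seed-sequence off-target artifact."""
    z = np.asarray(shrna_signatures, dtype=float)
    return np.median(z, axis=0)

# Three shRNAs against the same gene: the shared knockdown signal on the
# first gene survives, the single-reagent spike on the third does not.
sigs = [[-2.0, 0.1, 3.0],
        [-1.8, 0.0, 0.2],
        [-2.2, -0.1, 0.1]]
print(consensus_signature(sigs))
```

With more reagents per gene, weighted schemes (weighting each reagent by its agreement with the others) behave similarly but degrade more gracefully.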

  20. Automatic Evaluations and Exercising: Systematic Review and Implications for Future Research.

    PubMed

    Schinkoeth, Michaela; Antoniewicz, Franziska

    2017-01-01

    The general purpose of this systematic review was to summarize, structure, and evaluate the findings on automatic evaluations of exercising. Studies were eligible for inclusion if they reported measuring automatic evaluations of exercising with an implicit measure and assessed some kind of exercise variable. Fourteen nonexperimental and six experimental studies (out of a total N = 1,928) were identified and rated by two independent reviewers. The main study characteristics were extracted and the grade of evidence for each study evaluated. First, results revealed a large heterogeneity in the measures applied to assess automatic evaluations of exercising and in the exercise variables. Generally, small- to large-sized significant relations between automatic evaluations of exercising and exercise variables were identified in the vast majority of studies. The review offers a systematization of the various examined exercise variables and prompts a more careful differentiation between actually observed exercise behavior (proximal exercise indicator) and associated physiological or psychological variables (distal exercise indicator). Second, a lack of transparently reported reflection on the differing theoretical bases leading to the use of specific implicit measures was observed. Implicit measures should be applied purposefully, taking into consideration the individual advantages and disadvantages of the measures. Third, 12 studies were rated as providing first-grade evidence (lowest grade of evidence), five as second-grade, and three as third-grade evidence. There is a dramatic lack of experimental studies, which are essential for illustrating the cause-effect relation between automatic evaluations of exercising and exercise and for investigating under which conditions automatic evaluations of exercising influence behavior. Conclusions about the necessity of exercise interventions targeted at altering automatic evaluations of exercising should therefore not be drawn too hastily.

  1. Automatic Evaluations and Exercising: Systematic Review and Implications for Future Research

    PubMed Central

    Schinkoeth, Michaela; Antoniewicz, Franziska

    2017-01-01

    The general purpose of this systematic review was to summarize, structure, and evaluate the findings on automatic evaluations of exercising. Studies were eligible for inclusion if they reported measuring automatic evaluations of exercising with an implicit measure and assessed some kind of exercise variable. Fourteen nonexperimental and six experimental studies (out of a total N = 1,928) were identified and rated by two independent reviewers. The main study characteristics were extracted and the grade of evidence for each study evaluated. First, results revealed a large heterogeneity in the measures applied to assess automatic evaluations of exercising and in the exercise variables. Generally, small- to large-sized significant relations between automatic evaluations of exercising and exercise variables were identified in the vast majority of studies. The review offers a systematization of the various examined exercise variables and prompts a more careful differentiation between actually observed exercise behavior (proximal exercise indicator) and associated physiological or psychological variables (distal exercise indicator). Second, a lack of transparently reported reflection on the differing theoretical bases leading to the use of specific implicit measures was observed. Implicit measures should be applied purposefully, taking into consideration the individual advantages and disadvantages of the measures. Third, 12 studies were rated as providing first-grade evidence (lowest grade of evidence), five as second-grade, and three as third-grade evidence. There is a dramatic lack of experimental studies, which are essential for illustrating the cause-effect relation between automatic evaluations of exercising and exercise and for investigating under which conditions automatic evaluations of exercising influence behavior. Conclusions about the necessity of exercise interventions targeted at altering automatic evaluations of exercising should therefore not be drawn too hastily. PMID:29250022

  2. Precise Protein Photolithography (P3): High Performance Biopatterning Using Silk Fibroin Light Chain as the Resist

    PubMed Central

    Liu, Wanpeng; Zhou, Zhitao; Zhang, Shaoqing; Shi, Zhifeng; Tabarini, Justin; Lee, Woonsoo; Zhang, Yeshun; Gilbert Corder, S. N.; Li, Xinxin; Dong, Fei; Cheng, Liang; Liu, Mengkun; Kaplan, David L.; Omenetto, Fiorenzo G.

    2017-01-01

    Precise patterning of biomaterials has widespread applications, including drug release, degradable implants, tissue engineering, and regenerative medicine. Patterning of protein-based microstructures using UV-photolithography has been demonstrated using protein as the resist material. The Achilles heel of existing protein-based biophotoresists is the inevitably wide molecular weight distribution arising during the protein extraction/regeneration process, hindering their practical use in the semiconductor industry, where reliability and repeatability are paramount. Wafer-scale, high-resolution patterning of bio-microstructures using well-defined silk fibroin light chain as the resist material is presented, showing unprecedented performance. The lithographic and etching performance of silk fibroin light chain resists is evaluated systematically and the underlying mechanisms are thoroughly discussed. The micropatterned silk structures are tested as cellular substrates for the successful spatial guidance of fetal neural stem cells seeded on the patterned substrates. The enhanced patterning resolution, the improved etch resistance, and the inherent biocompatibility of such a protein-based photoresist provide new opportunities for fabricating large-scale biocompatible functional microstructures. PMID:28932678

  3. Networks and landscapes: a framework for setting goals and evaluating performance at the large landscape scale

    Treesearch

    R Patrick Bixler; Shawn Johnson; Kirk Emerson; Tina Nabatchi; Melly Reuling; Charles Curtin; Michele Romolini; Morgan Grove

    2016-01-01

    The objective of large landscape conservation is to mitigate complex ecological problems through interventions at multiple and overlapping scales. Implementation requires coordination among a diverse network of individuals and organizations to integrate local-scale conservation activities with broad-scale goals. This requires an understanding of the governance options...

  4. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high-technology and engineering problem solving has given rise to an emerging concept: reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity through a systematic and organized approach; efficiency and cost-effectiveness are thus the driving forces in organizing engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  5. Renormalization group analysis of turbulence

    NASA Technical Reports Server (NTRS)

    Smith, Leslie M.

    1989-01-01

    The objective is to understand and extend a recent theory of turbulence based on dynamic renormalization group (RNG) techniques. The application of RNG methods to hydrodynamic turbulence was explored most extensively by Yakhot and Orszag (1986). An eddy viscosity was calculated which was consistent with the Kolmogorov inertial range by systematic elimination of the small scales in the flow. Further, assumed smallness of the nonlinear terms in the redefined equations for the large scales results in predictions for important flow constants such as the Kolmogorov constant. It is emphasized that no adjustable parameters are needed. The parameterization of the small scales in a self-consistent manner has important implications for sub-grid modeling.

  6. Methodological Flaws, Conflicts of Interest, and Scientific Fallacies: Implications for the Evaluation of Antidepressants' Efficacy and Harm.

    PubMed

    Hengartner, Michael P

    2017-01-01

    In current psychiatric practice, antidepressants are widely and increasingly prescribed to patients. However, several scientific biases obfuscate estimates of antidepressants' efficacy and harm, and these are barely recognized in treatment guidelines. The aim of this mini-review is to critically evaluate the efficacy and harm of antidepressants for acute and maintenance treatment with respect to systematic biases related to industry funding and trial methodology. This narrative review is based on a comprehensive search of the literature. It is shown that the pooled efficacy of antidepressants is weak and below the threshold of a minimally clinically important change once publication and reporting biases are considered. Moreover, the small mean difference in symptom reductions relative to placebo is possibly attributable to observer effects in unblinded assessors and to patient expectancies. With respect to trial dropout rates, a hard outcome not subject to observer bias, no difference was observed between antidepressants and placebo. The discontinuation trials on the efficacy of antidepressants in maintenance therapy are systematically flawed, because in these studies spontaneous remitters are excluded, whereas half of all patients who remitted on antidepressants are abruptly switched to placebo. This can cause a severe withdrawal syndrome that is easily misdiagnosed as a relapse when assessed on subjective symptom rating scales. Accordingly, the findings of naturalistic long-term studies suggest that maintenance therapy has no clear benefit, and non-drug users do not show increased recurrence rates. Moreover, a growing body of evidence from hundreds of randomized controlled trials suggests that antidepressants cause suicidality, but this risk is underestimated because data from industry-funded trials are systematically flawed. 
Unselected, population-wide observational studies indicate that depressive patients who use antidepressants are at an increased risk of suicide and that they have a higher rate of all-cause mortality than matched controls. The strong reliance on industry-funded research results in an uncritical approval of antidepressants. Due to several flaws such as publication and reporting bias, unblinding of outcome assessors, concealment and recoding of serious adverse events, the efficacy of antidepressants is systematically overestimated, and harm is systematically underestimated. Therefore, I conclude that antidepressants are largely ineffective and potentially harmful.

  7. Lifetime evaluation of large format CMOS mixed signal infrared devices

    NASA Astrophysics Data System (ADS)

    Linder, A.; Glines, Eddie

    2015-09-01

    New large-scale foundry processes continue to produce reliable products. Devices built on these processes are screened for failure mechanisms and their long lifetimes validated using industry best practices. Failure-in-Time (FIT) analysis, in conjunction with foundry qualification information, can be used to evaluate large-format device lifetimes; this analysis is a helpful tool when zero-failure life tests are typical. The reliability of the device is estimated by applying the failure rate to the use conditions. JEDEC publications remain the industry-accepted methods.
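
Zero-failure life tests are typically summarized by an upper confidence bound on the failure rate rather than a point estimate. A minimal sketch of the chi-square estimator common in JEDEC-style qualification follows; all test parameters below are illustrative, not from any actual device program:

```python
import math

def fit_upper_bound_zero_fail(devices, hours, accel_factor=1.0, conf=0.60):
    """Upper confidence bound on the failure rate, in FIT (failures per
    1e9 device-hours), for a life test observing ZERO failures.

    With zero failures the chi-square quantile has 2 degrees of freedom,
    for which chi2_inv(p, 2) = -2*ln(1-p), so no stats library is needed.
    lambda_upper = chi2_inv(conf, 2) / (2 * equivalent device-hours).
    """
    chi2_quantile = -2.0 * math.log(1.0 - conf)
    equiv_hours = devices * hours * accel_factor  # field-equivalent hours
    return chi2_quantile / (2.0 * equiv_hours) * 1e9

# e.g. 77 parts stressed 1000 h with a thermal acceleration factor of ~80
# and no failures bounds the field failure rate at roughly 149 FIT (60% UCL):
print(fit_upper_bound_zero_fail(77, 1000, accel_factor=80.0, conf=0.60))
```

Longer tests, more parts, or a larger (justified) acceleration factor tighten the bound; observing any failures raises the chi-square degrees of freedom and loosens it.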

  8. Effect of Diabetes Mellitus Type 2 on Salivary Glucose – A Systematic Review and Meta-Analysis of Observational Studies

    PubMed Central

    Mascarenhas, Paulo; Fatela, Bruno; Barahona, Isabel

    2014-01-01

    Background Early screening of type 2 diabetes mellitus (DM) is essential for improved prognosis and effective delay of clinical complications. However, testing for high glycemia often requires invasive and painful blood testing, limiting its large-scale applicability. We have combined new, unpublished data with published data comparing salivary glucose levels in type 2 DM patients and controls and/or looked at the correlation between salivary glucose and glycemia/HbA1c to systematically review the effectiveness of salivary glucose to estimate glycemia and HbA1c. We further discuss salivary glucose as a biomarker for large-scale screening of diabetes or developing type 2 DM. Methods and Findings We conducted a meta-analysis of peer-reviewed published articles that reported data regarding mean salivary glucose levels and/or correlation between salivary glucose levels and glycemia or HbA1c for type 2 DM and non-diabetic individuals and combined them with our own unpublished results. Our global meta-analysis of standardized mean differences on salivary glucose levels shows an overall large positive effect of type 2 DM over salivary glucose (Hedge's g = 1.37). The global correlation coefficient (r) between salivary glucose and glycemia was large (r = 0.49), with subgroups ranging from medium (r = 0.30 in non-diabetics) to very large (r = 0.67 in diabetics). Meta-analysis of the global correlation between salivary glucose and HbA1c showed an overall association of medium strength (r = 0.37). Conclusions Our systematic review reports an overall meaningful salivary glucose concentration increase in type 2 DM and a significant overall relationship between salivary glucose concentration and associated glycemia/HbA1c values, with the strength of the correlation increasing for higher glycemia/HbA1c values. 
These results support the potential of salivary glucose levels as a biomarker for type 2 DM, providing a less painful/invasive method for screening type 2 DM, as well as for monitoring blood glucose levels in large cohorts of DM patients. PMID:25025218
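
The pooled effect size reported above is Hedges' g: Cohen's d multiplied by the small-sample bias correction J = 1 - 3/(4*df - 1). A sketch on invented group summaries (not the study's data):

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd              # Cohen's d
    j = 1.0 - 3.0 / (4.0 * (n1 + n2 - 2) - 1.0)  # bias-correction factor J
    return j * d

# Illustrative (NOT the study's) salivary glucose summaries,
# diabetics vs controls, 30 subjects each:
g = hedges_g(6.0, 2.0, 30, 3.5, 1.5, 30)
print(round(g, 2))
```

Values around 0.2, 0.5, and 0.8 are conventionally read as small, medium, and large, so the review's pooled g = 1.37 is a large effect.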

  9. Simulations of hypervelocity impacts for asteroid deflection studies

    NASA Astrophysics Data System (ADS)

    Heberling, T.; Ferguson, J. M.; Gisler, G. R.; Plesko, C. S.; Weaver, R.

    2016-12-01

    The possibility of kinetic-impact deflection of threatening near-Earth asteroids will be tested for the first time in the proposed AIDA (Asteroid Impact Deflection Assessment) mission, involving two independent spacecraft, NASA's DART (Double Asteroid Redirection Test) and ESA's AIM (Asteroid Impact Mission). The impact of the DART spacecraft onto the secondary of the binary asteroid 65803 Didymos, at a speed of 5 to 7 km/s, is expected to alter the mutual orbit by an observable amount. The velocity imparted to the secondary depends on the geometry and dynamics of the impact, and especially on the momentum enhancement factor, conventionally called beta. We use the Los Alamos hydrocodes Rage and Pagosa to estimate beta in laboratory-scale benchmark experiments and in the large-scale asteroid deflection test. Simulations are performed in two and three dimensions, using a variety of equations of state and strength models for both the lab-scale and large-scale cases. This work is being performed as part of a systematic benchmarking study for the AIDA mission that includes other hydrocodes.
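
In the simplest head-on approximation, the imparted velocity is dv = beta * m * U / M, where m and U are the impactor's mass and speed and M is the target mass; beta = 1 is pure momentum transfer, and beta > 1 reflects additional recoil from ejecta. A sketch with illustrative, roughly DART-like numbers (not mission values):

```python
def deflection_delta_v(beta, impactor_mass_kg, impact_speed_ms, target_mass_kg):
    """Velocity change imparted to the target for an idealized head-on
    kinetic impact: dv = beta * m * U / M."""
    return beta * impactor_mass_kg * impact_speed_ms / target_mass_kg

# Illustrative values: a ~500 kg impactor at 6 km/s into a ~5e9 kg secondary.
for beta in (1.0, 2.0, 4.0):
    dv = deflection_delta_v(beta, 500.0, 6000.0, 5e9)
    print(f"beta = {beta}: dv = {dv:.1e} m/s")
```

Even sub-mm/s velocity changes shift a binary's mutual orbital period by an observable amount over many orbits, which is why pinning down beta with hydrocode simulations matters.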

  10. The void spectrum in two-dimensional numerical simulations of gravitational clustering

    NASA Technical Reports Server (NTRS)

    Kauffmann, Guinevere; Melott, Adrian L.

    1992-01-01

    An algorithm for deriving a spectrum of void sizes from two-dimensional high-resolution numerical simulations of gravitational clustering is tested, and it is verified that it produces the correct results where those results can be anticipated. The method is used to study the growth of voids as clustering proceeds. It is found that the most stable indicator of the characteristic void 'size' in the simulations is the mean fractional area covered by voids of diameter d, in a density field smoothed at its correlation length. Very accurate scaling behavior is found in power-law numerical models as they evolve. Eventually, this scaling breaks down as the nonlinearity reaches larger scales. It is shown that this breakdown is a manifestation of the undesirable effect of boundary conditions on simulations, even with the very large dynamic range possible here. A simple criterion is suggested for deciding when simulations with modest large-scale power may systematically underestimate the frequency of larger voids.
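
The void indicator described above, the fractional area covered by underdense regions in a field smoothed at its correlation length, can be toy-modeled by smoothing a 2-D density grid and thresholding it. This sketch is a deliberate simplification of the paper's algorithm (boxcar smoothing with periodic boundaries, fixed threshold), intended only to show the shape of the measurement:

```python
import numpy as np

def void_area_fraction(density, threshold=1.0, smooth_cells=4):
    """Fraction of a 2-D grid covered by underdense ('void') cells after a
    crude periodic boxcar smoothing -- a toy stand-in for smoothing the
    density field before identifying voids."""
    smoothed = np.asarray(density, dtype=float)
    for axis in (0, 1):  # separable boxcar, periodic via np.roll
        acc = np.zeros_like(smoothed)
        for shift in range(-smooth_cells, smooth_cells + 1):
            acc += np.roll(smoothed, shift, axis=axis)
        smoothed = acc / (2 * smooth_cells + 1)
    return float(np.mean(smoothed < threshold))

rng = np.random.default_rng(0)
field = rng.lognormal(mean=0.0, sigma=1.0, size=(128, 128))  # mock density
print(void_area_fraction(field))  # a fraction in [0, 1]
```

Repeating the measurement over a range of smoothing scales yields the void spectrum; the paper's caution about boundary conditions corresponds here to the periodic wrap-around imposed by np.roll.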

  11. Is franchising in health care valuable? A systematic review.

    PubMed

    Nijmeijer, Karlijn J; Fabbricotti, Isabelle N; Huijsman, Robbert

    2014-03-01

    Franchising is an organizational form that originates from the business sector. It is increasingly used in the healthcare sector with the aim of enhancing quality and accessibility for patients, improving the efficiency and competitiveness of organizations and/or providing professionals with a supportive working environment. However, a structured overview of the scientific evidence for these claims is absent, whereas such an overview can be supportive to scholars, policy makers and franchise practitioners. This article provides a systematic review of literature on the outcomes of franchising in health care. Seven major databases were systematically searched. Peer-reviewed empirical journal articles focusing on the relationship between franchising and outcomes were included. Eventually, 15 articles were included and their findings were narratively synthesized. The level of evidence was rated by using the Grading of Recommendations Assessment, Development, and Evaluation scale. The review shows that outcomes of franchising in health care have primarily been evaluated in low- and middle-income countries in the reproductive health/family planning sector. Articles about high-income countries are largely absent, apart from three articles evaluating pharmacy franchises. Most studies focus on outcomes for customers/clients and less on organizations and professionals. The evidence is primarily of low quality. Based on this evidence, franchising is predominantly positively associated with client volumes, physical accessibility and some types of quality. Findings regarding utilization, customer loyalty, efficiency and results for providers are mixed. We conclude that franchising has the potential to improve outcomes in healthcare practices, but the evidence base is yet too weak for firm conclusions. Extensive research is needed to further determine the value of healthcare franchising in various contexts. 
We advocate more research in other healthcare sectors in both low- and middle-income countries and high-income countries, on more types of outcomes with attention to trade-offs, and on what factors produce those outcomes.

  12. The measurement of fatigue in chronic illness: a systematic review of unidimensional and multidimensional fatigue measures.

    PubMed

    Whitehead, Lisa

    2009-01-01

    Fatigue is a common symptom associated with a wide range of chronic diseases. A large number of instruments have been developed to measure fatigue. An assessment regarding the reliability, validity, and utility of fatigue measures is time-consuming for the clinician and researcher, and few reviews exist on which to draw such information. The aim of this article is to present a critical review of fatigue measures, the populations in which the scales have been used, and the extent to which the psychometric properties of each instrument have been evaluated to provide clinicians and researchers with information on which to base decisions. Seven databases were searched for all articles that measured fatigue and offered an insight into the psychometric properties of the scales used over the period 1980-2007. Criteria for judging the "ideal" measure were developed to encompass scale usability, clinical/research utility, and the robustness of psychometric properties. Twenty-two fatigue measures met the inclusion criteria and were evaluated. A further 17 measures met some of the criteria, but have not been tested beyond initial development, and are reviewed briefly at the end of the article. The review did not identify any instrument that met all the criteria of an ideal instrument. However, a small number of short instruments demonstrated good psychometric properties (Fatigue Severity Scale [FSS], Fatigue Impact Scale [FIS], and Brief Fatigue Inventory [BFI]), and three comprehensive instruments demonstrated the same (Fatigue Symptom Inventory [FSI], Multidimensional Assessment of Fatigue [MAF], and Multidimensional Fatigue Symptom Inventory [MFSI]). Only four measures (BFI, FSS, FSI, and MAF) demonstrated the ability to detect change over time. 
The clinician and researcher also should consider the populations in which the scale has been used previously to assess its validity with their own patient group, and assess the content of a scale to ensure that the key qualitative aspects of fatigue of the population of interest are covered.

  13. Risk of relapse after natalizumab withdrawal

    PubMed Central

    Vukusic, Sandra; Casey, Romain; Debard, Nadine; Stankoff, Bruno; Mrejen, Serge; Uhry, Zoe; Van Ganse, Eric; Castot, Anne; Clanet, Michel; Lubetzki, Catherine; Confavreux, Christian

    2016-01-01

    Objective: To assess disease activity within 12 months after natalizumab (NZ) discontinuation in a large French postmarketing cohort. Methods: In France, patients exposed at least once to NZ were included in the TYSEDMUS observational and multicenter cohort, part of the French NZ Risk Management Plan. Clinical disease activity during the year following NZ discontinuation was assessed in this cohort. Time to first relapse after NZ discontinuation was analyzed using the Kaplan-Meier method, and potentially associated factors were studied using a multivariate Cox model. Results: Of the 4,055 patients with multiple sclerosis (MS) included in TYSEDMUS, 1,253 discontinued NZ and 715 of them had relevant data for our study. The probability of relapse within the year after NZ discontinuation was estimated at 45% (95% confidence interval 0.41–0.49). Conclusions: This large and systematic survey of patients with MS after NZ withdrawal allows quantification of the risk of increased disease activity following treatment discontinuation. This study provides large-scale, multicenter, systematic data after NZ cessation in real-life settings. PMID:27844037
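
The 45% one-year relapse probability is read off a Kaplan-Meier curve as 1 - S(12 months). A minimal pure-Python sketch of the estimator on invented follow-up data (censored patients leave the risk set without counting as events):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate as a list of (t, S(t)) steps.

    times:  follow-up duration for each patient.
    events: True if the event (here, relapse) occurred at that time,
            False if the patient was censored then.
    """
    event_times = sorted({t for t, e in zip(times, events) if e})
    surv, curve = 1.0, []
    for t in event_times:
        at_risk = sum(1 for tt in times if tt >= t)
        relapsed = sum(1 for tt, e in zip(times, events) if tt == t and e)
        surv *= 1.0 - relapsed / at_risk   # product-limit update
        curve.append((t, surv))
    return curve

# Toy cohort (months of follow-up, relapse vs censoring) -- invented data:
months = [2, 3, 3, 5, 8, 12, 12, 12]
relapse = [True, True, False, True, True, False, False, False]
for t, s in kaplan_meier(months, relapse):
    print(f"month {t}: S(t) = {s:.3f}")
```

The estimator's point is visible in the censored patient at month 3: they inflate the at-risk denominator up to their censoring time without ever counting as a relapse.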

  14. Statistical detection of systematic election irregularities

    PubMed Central

    Klimek, Peter; Yegorov, Yuri; Hanel, Rudolf; Thurner, Stefan

    2012-01-01

    Democratic societies are built around the principles that elections are free and fair and that each citizen’s vote counts equally. National elections can be regarded as large-scale social experiments, in which people are grouped into usually large numbers of electoral districts and vote according to their preferences. The large number of samples implies statistical consequences for the polling results, which can be used to identify election irregularities. Using a suitable data representation, we find that vote distributions of elections with alleged fraud show a kurtosis substantially exceeding the kurtosis of normal elections, depending on the level of data aggregation. As an example, we show that reported irregularities in recent Russian elections are, indeed, well explained by systematic ballot stuffing. We develop a parametric model quantifying the extent to which fraudulent mechanisms are present, and formulate a parametric test detecting these statistical properties in election results. Remarkably, this technique produces robust outcomes with respect to the resolution of the data and therefore allows for cross-country comparisons. PMID:23010929
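
The fraud signal described above is excess kurtosis in the district-level vote distribution: ballot stuffing adds a heavy tail of near-unanimous districts. A sketch on synthetic data (illustrative only, not the paper's data or exact statistic):

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero for a normal distribution)."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return float(np.mean(z**4) - 3.0)

rng = np.random.default_rng(1)
# 10,000 districts with roughly normal vote shares around 55%:
clean = rng.normal(55.0, 8.0, 10_000)
# Simulated ballot stuffing: a spike of 400 districts reporting ~99%.
stuffed = np.concatenate([clean, np.full(400, 99.0)])

print("clean:  ", excess_kurtosis(clean))    # near 0
print("stuffed:", excess_kurtosis(stuffed))  # strongly positive
```

A handful of extreme districts dominates the fourth moment, which is why kurtosis is a sensitive, if blunt, screen; the paper's parametric model goes further and estimates the fraction of stuffed ballots.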

  15. Statistical detection of systematic election irregularities.

    PubMed

    Klimek, Peter; Yegorov, Yuri; Hanel, Rudolf; Thurner, Stefan

    2012-10-09

    Democratic societies are built around the principles that elections are free and fair and that each citizen's vote counts equally. National elections can be regarded as large-scale social experiments, in which people are grouped into usually large numbers of electoral districts and vote according to their preferences. The large number of samples implies statistical consequences for the polling results, which can be used to identify election irregularities. Using a suitable data representation, we find that vote distributions of elections with alleged fraud show a kurtosis substantially exceeding the kurtosis of normal elections, depending on the level of data aggregation. As an example, we show that reported irregularities in recent Russian elections are, indeed, well explained by systematic ballot stuffing. We develop a parametric model quantifying the extent to which fraudulent mechanisms are present, and formulate a parametric test detecting these statistical properties in election results. Remarkably, this technique produces robust outcomes with respect to the resolution of the data and therefore allows for cross-country comparisons.

  16. The Value of Large-Scale Randomised Control Trials in System-Wide Improvement: The Case of the Reading Catch-Up Programme

    ERIC Educational Resources Information Center

    Fleisch, Brahm; Taylor, Stephen; Schöer, Volker; Mabogoane, Thabo

    2017-01-01

    This article illustrates the value of large-scale impact evaluations with counterfactual components. It begins by exploring the limitations of small-scale impact studies, which do not allow reliable inference to a wider population or which do not use valid comparison groups. The paper then describes the design features of a recent large-scale…

  17. Individuals with chronic ankle instability exhibit dynamic postural stability deficits and altered unilateral landing biomechanics: A systematic review.

    PubMed

    Simpson, Jeffrey D; Stewart, Ethan M; Macias, David M; Chander, Harish; Knight, Adam C

    2018-06-13

    To evaluate the literature regarding unilateral landing biomechanics and dynamic postural stability in individuals with and without chronic ankle instability (CAI). Four online databases (PubMed, ScienceDirect, Scopus, and SportDiscus) were searched from the earliest records to 31 January 2018, as well as reference sections of related journal articles, to complete the systematic search. Studies investigating the influence of CAI on unilateral landing biomechanics and dynamic postural stability were systematically reviewed and evaluated. Twenty articles met the criteria and were included in the systematic review. Individuals with CAI were found to have deficits in dynamic postural stability on the affected limb, with medium to large effect sizes, and altered lower extremity kinematics, most notably at the ankle and knee, with medium to large effect sizes. Additionally, greater loading rates and peak ground reaction forces, as well as reductions in ankle muscle activity, were found in individuals with CAI during unilateral jump-landing tasks. Individuals with CAI demonstrate dynamic postural stability deficits, lower extremity kinematic alterations, and reduced neuromuscular control during unilateral jump-landings. These are likely factors that contribute to recurrent lateral ankle sprain injuries during dynamic activity in individuals with CAI. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Do large-scale hospital- and system-wide interventions improve patient outcomes: a systematic review.

    PubMed

    Clay-Williams, Robyn; Nosrati, Hadis; Cunningham, Frances C; Hillman, Kenneth; Braithwaite, Jeffrey

    2014-09-03

    While health care services are beginning to implement system-wide patient safety interventions, evidence on the efficacy of these interventions is sparse. We know that uptake can be variable, but we do not know the factors that affect uptake or how the interventions establish change and, in particular, whether they influence patient outcomes. We conducted a systematic review to identify how organisational and cultural factors mediate or are mediated by hospital-wide interventions, and to assess the effects of those factors on patient outcomes. A systematic review was conducted and reported in accordance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Database searches were conducted using MEDLINE from 1946, CINAHL from 1991, EMBASE from 1947, Web of Science from 1934, PsycINFO from 1967, and Global Health from 1910 to September 2012. The Lancet, JAMA, BMJ, BMJ Quality and Safety, The New England Journal of Medicine and Implementation Science were also hand searched for relevant studies published over the last 5 years. Eligible studies were required to focus on organisational determinants of hospital- and system-wide interventions, and to provide patient outcome data before and after implementation of the intervention. Empirical, peer-reviewed studies reporting randomised and non-randomised controlled trials, observational, and controlled before and after studies were included in the review. Six studies met the inclusion criteria. Improved outcomes were observed for studies where outcomes were measured at least two years after the intervention. Associations between organisational factors, intervention success and patient outcomes were undetermined: organisational culture and patient outcomes were rarely measured together, and measures for culture and outcome were not standardised. 
Common findings show the difficulty of introducing large-scale interventions, and that effective leadership and clinical champions, adequate financial and educational resources, and dedicated promotional activities appear to be common factors in successful system-wide change. The protocol has been registered in the international prospective register of systematic reviews, PROSPERO (Registration No. CRD42103003050).

  19. Assessing oral health-related quality of life in children and adolescents: a systematic review and standardized comparison of available instruments.

    PubMed

    Zaror, Carlos; Pardo, Yolanda; Espinoza-Espinoza, Gerardo; Pont, Àngels; Muñoz-Millán, Patricia; Martínez-Zapata, María José; Vilagut, Gemma; Forero, Carlos G; Garin, Olatz; Alonso, Jordi; Ferrer, Montse

    2018-03-22

To obtain a systematic and standardized evaluation of the current evidence on the development process, metric properties, and administration issues of oral health-related quality of life instruments available for children and adolescents. A systematic search until October 2016 was conducted in PubMed, Embase, Lilacs, SciELO, and Cochrane databases. Articles with information regarding the development process, metric properties, and administration issues of pediatric instruments measuring oral health-related quality of life were eligible for inclusion. Two researchers independently evaluated each instrument applying the Evaluating Measures of Patient-Reported Outcomes (EMPRO) tool. An overall and seven attribute-specific EMPRO scores were calculated (range 0-100, worst to best): measurement model, reliability, validity, responsiveness, interpretability, burden, and alternative forms. We identified 18 instruments evaluated in 132 articles. Of the five instruments designed for preschoolers, the Early Childhood Oral Health Impact Scale (ECOHIS) obtained the highest overall EMPRO score (82.2). Of the nine identified for schoolchildren and adolescents, the best-rated instrument was the Child Perceptions Questionnaire 11-14 (82.1). Among the four instruments developed for any age, the Family Impact Scale (FIS) obtained the highest scores (80.3). The evidence supports the use of the ECOHIS for preschoolers, while age is a key factor when choosing among the four recommended instruments for schoolchildren and adolescents. Instruments for specific conditions, symptoms, or treatments need further research on metric properties. Our results facilitate decision-making on correct oral health-related quality of life instrument selection for a given study purpose and population across childhood and adolescence.

  20. Systematic literature review of clinical trials evaluating pharmacotherapy for overactive bladder in elderly patients: An assessment of trial quality.

    PubMed

    Kistler, Kristin D; Xu, Yingxin; Zou, Kelly H; Ntanios, Fady; Chapman, Douglass S; Luo, Xuemei

    2018-01-01

Overactive bladder (OAB) disproportionately affects older-aged adults, yet most randomized controlled trials (RCTs) underrepresent patients ≥65 years. This systematic literature review (SLR) identified RCTs evaluating β-3 adrenergic agonists or muscarinic antagonists in elderly patients with OAB, and compared study quality across trials. MEDLINE®, Embase®, and Cochrane Collaboration Central Register of Clinical Trials databases were searched from inception through April 28, 2015 to identify published, peer-reviewed RCT reports evaluating β-3 adrenergic agonists or muscarinic antagonists in elderly OAB patients (either ≥65 years or study-described as "elderly"). To assess the study quality of RCT reports, we focused on internal/external validity, assessed via two scales: the validated Effective Public Health Practice Project (EPHPP) Quality Assessment Tool for Quantitative Studies, and a tool commissioned by the Agency for Healthcare Research and Quality (AHRQ). Database searches yielded 1380 records that were then screened according to predefined inclusion/exclusion criteria. We included eight papers meeting study criteria. Despite scientific community efforts to improve RCT reporting standards, published reports still include incomplete and inconsistent reporting of subject attrition, baseline patient characteristics, inclusion/exclusion criteria, and other important details. Only three of the eight OAB RCTs in this review received quality ratings of Strong (EPHPP) or Fair (AHRQ) and were multicenter with large samples. Despite the prevalence of OAB among older age individuals, relatively few RCTs evaluate OAB treatments explicitly among elderly subjects. The findings from this quality assessment suggest some areas for improvement in both conduct and reporting of future RCTs assessing OAB treatment in elderly patients. © 2017 Wiley Periodicals, Inc.

  1. The Regional Climate Model Evaluation System: A Systematic Evaluation Of CORDEX Simulations Using Obs4MIPs

    NASA Astrophysics Data System (ADS)

    Goodman, A.; Lee, H.; Waliser, D. E.; Guttowski, W.

    2017-12-01

Observation-based evaluations of global climate models (GCMs) have been a key element for identifying systematic model biases that can be targeted for model improvements and for establishing uncertainty associated with projections of global climate change. However, GCMs are limited in their ability to represent physical phenomena which occur on smaller, regional scales, including many types of extreme weather events. To help facilitate projections of changes in such phenomena, simulations from regional climate models (RCMs) for 14 different domains around the world are being provided by the Coordinated Regional Climate Downscaling Experiment (CORDEX; www.cordex.org). However, although CORDEX specifies standard simulation and archiving protocols, these simulations are conducted independently by individual research and modeling groups representing each of these domains, often with different output requirements and data archiving and exchange capabilities. Thus, compared with similar efforts using GCMs (e.g., the Coupled Model Intercomparison Project, CMIP), it is more difficult to achieve a standardized, systematic evaluation of the RCMs for each domain and across all the CORDEX domains. Using the Regional Climate Model Evaluation System (RCMES; rcmes.jpl.nasa.gov) developed at JPL, we are developing easy-to-use templates for performing systematic evaluations of CORDEX simulations. Results from the application of a number of evaluation metrics (e.g., biases, centered RMS, and pattern correlations) will be shown for a variety of physical quantities and CORDEX domains. These evaluations are performed using products from obs4MIPs, an activity initiated by DOE and NASA, and now shepherded by the World Climate Research Program's Data Advisory Council.
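The evaluation metrics named in this record (bias, centered RMS, pattern correlation) are standard field-comparison statistics, the same trio used in Taylor diagrams. A minimal NumPy sketch on toy arrays (not actual CORDEX or obs4MIPs data, and not the RCMES implementation):

```python
import numpy as np

def centered_rms(model, obs):
    """Centered (bias-removed) RMS difference between model and observed fields."""
    m = model - model.mean()
    o = obs - obs.mean()
    return float(np.sqrt(np.mean((m - o) ** 2)))

def pattern_correlation(model, obs):
    """Pearson pattern correlation between flattened model and observed fields."""
    return float(np.corrcoef(model.ravel(), obs.ravel())[0, 1])

# Toy 2-D "fields" standing in for a regional simulation and a reference product
rng = np.random.default_rng(0)
obs = rng.normal(size=(10, 10))
model = obs + 0.5 + 0.1 * rng.normal(size=(10, 10))  # biased, slightly noisy model

bias = float(model.mean() - obs.mean())
```

Because the centered RMS subtracts each field's mean first, a constant model bias contributes nothing to it; bias and pattern error are thus reported as separate diagnostics.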

  2. XLID-causing mutations and associated genes challenged in light of data from large-scale human exome sequencing.

    PubMed

    Piton, Amélie; Redin, Claire; Mandel, Jean-Louis

    2013-08-08

    Because of the unbalanced sex ratio (1.3-1.4 to 1) observed in intellectual disability (ID) and the identification of large ID-affected families showing X-linked segregation, much attention has been focused on the genetics of X-linked ID (XLID). Mutations causing monogenic XLID have now been reported in over 100 genes, most of which are commonly included in XLID diagnostic gene panels. Nonetheless, the boundary between true mutations and rare non-disease-causing variants often remains elusive. The sequencing of a large number of control X chromosomes, required for avoiding false-positive results, was not systematically possible in the past. Such information is now available thanks to large-scale sequencing projects such as the National Heart, Lung, and Blood (NHLBI) Exome Sequencing Project, which provides variation information on 10,563 X chromosomes from the general population. We used this NHLBI cohort to systematically reassess the implication of 106 genes proposed to be involved in monogenic forms of XLID. We particularly question the implication in XLID of ten of them (AGTR2, MAGT1, ZNF674, SRPX2, ATP6AP2, ARHGEF6, NXF5, ZCCHC12, ZNF41, and ZNF81), in which truncating variants or previously published mutations are observed at a relatively high frequency within this cohort. We also highlight 15 other genes (CCDC22, CLIC2, CNKSR2, FRMPD4, HCFC1, IGBP1, KIAA2022, KLF8, MAOA, NAA10, NLGN3, RPL10, SHROOM4, ZDHHC15, and ZNF261) for which replication studies are warranted. We propose that similar reassessment of reported mutations (and genes) with the use of data from large-scale human exome sequencing would be relevant for a wide range of other genetic diseases. Copyright © 2013 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  3. Neuroscience-related research in Ghana: a systematic evaluation of direction and capacity.

    PubMed

    Quansah, Emmanuel; Karikari, Thomas K

    2016-02-01

Neurological and neuropsychiatric diseases account for considerable healthcare, economic and social burdens in Ghana. In order to effectively address these burdens, appropriately-trained scientists who conduct high-impact neuroscience research will be needed. Additionally, research directions should be aligned with national research priorities. However, to provide information about current neuroscience research productivity and direction, the existing capacity and focus need to be identified. This would allow opportunities for collaborative research and training to be properly explored and developmental interventions to be better targeted. In this study, we sought to evaluate the existing capacity and direction of neuroscience-related research in Ghana. To do this, we examined publications reporting research investigations authored by scientists affiliated with Ghanaian institutions in specific areas of neuroscience over the last two decades (1995-2015). 127 articles that met our inclusion criteria were systematically evaluated in terms of research foci, annual publication trends and author affiliations. The most actively researched areas identified include neurocognitive impairments in non-nervous system disorders, depression and suicide, epilepsy and seizures, neurological impact of substance misuse, and neurological disorders. These studies were mostly hospital- and community-based surveys. About 60% of these articles were published in the last seven years, suggesting a recent increase in research productivity. However, data on experimental and clinical research outcomes were particularly lacking. We suggest that future investigations should focus on the following specific areas where information was lacking: large-scale disease epidemiology, effectiveness of diagnostic platforms and therapeutic treatments, and the genetic, genomic and molecular bases of diseases.

  4. Inference of RhoGAP/GTPase regulation using single-cell morphological data from a combinatorial RNAi screen.

    PubMed

    Nir, Oaz; Bakal, Chris; Perrimon, Norbert; Berger, Bonnie

    2010-03-01

    Biological networks are highly complex systems, consisting largely of enzymes that act as molecular switches to activate/inhibit downstream targets via post-translational modification. Computational techniques have been developed to perform signaling network inference using some high-throughput data sources, such as those generated from transcriptional and proteomic studies, but comparable methods have not been developed to use high-content morphological data, which are emerging principally from large-scale RNAi screens, to these ends. Here, we describe a systematic computational framework based on a classification model for identifying genetic interactions using high-dimensional single-cell morphological data from genetic screens, apply it to RhoGAP/GTPase regulation in Drosophila, and evaluate its efficacy. Augmented by knowledge of the basic structure of RhoGAP/GTPase signaling, namely, that GAPs act directly upstream of GTPases, we apply our framework for identifying genetic interactions to predict signaling relationships between these proteins. We find that our method makes mediocre predictions using only RhoGAP single-knockdown morphological data, yet achieves vastly improved accuracy by including original data from a double-knockdown RhoGAP genetic screen, which likely reflects the redundant network structure of RhoGAP/GTPase signaling. We consider other possible methods for inference and show that our primary model outperforms the alternatives. This work demonstrates the fundamental fact that high-throughput morphological data can be used in a systematic, successful fashion to identify genetic interactions and, using additional elementary knowledge of network structure, to infer signaling relations.

  5. Controls on fallen leaf chemistry and forest floor element masses in native and novel forests across a tropical island

    Treesearch

    H.E. Erickson; E.H. Helmer; T.J. Brandeis; A.E. Lugo

    2014-01-01

    Litter chemistry varies across landscapes according to factors rarely examined simultaneously. We analyzed 11 elements in forest floor (fallen) leaves and additional litter components from 143 forest inventory plots systematically located across Puerto Rico, a tropical island recovering from large-scale forest clearing. We assessed whether three existing, independently...

  6. Using Systematic Item Selection Methods to Improve Universal Design of Assessments. Policy Directions. Number 18

    ERIC Educational Resources Information Center

    Johnstone, Christopher; Thurlow, Martha; Moore, Michael; Altman, Jason

    2006-01-01

    The No Child Left Behind Act of 2001 (NCLB) and other recent changes in federal legislation have placed greater emphasis on accountability in large-scale testing. Included in this emphasis are regulations that require assessments to be accessible. States are accountable for the success of all students, and tests should be designed in a way that…

  7. Lessons Learned from PISA: A Systematic Review of Peer-Reviewed Articles on the Programme for International Student Assessment

    ERIC Educational Resources Information Center

    Hopfenbeck, Therese N.; Lenkeit, Jenny; El Masri, Yasmine; Cantrell, Kate; Ryan, Jeanne; Baird, Jo-Anne

    2018-01-01

    International large-scale assessments are on the rise, with the Programme for International Student Assessment (PISA) seen by many as having strategic prominence in education policy debates. The present article reviews PISA-related English-language peer-reviewed articles from the programme's first cycle in 2000 to its most current in 2015. Five…

  8. ScreenBEAM: a novel meta-analysis algorithm for functional genomics screens via Bayesian hierarchical modeling | Office of Cancer Genomics

    Cancer.gov

Functional genomics (FG) screens, using RNAi or CRISPR technology, have become a standard tool for systematic, genome-wide loss-of-function studies for therapeutic target discovery. As in many large-scale assays, however, off-target effects, variable reagent potency, and experimental noise must be accounted for to appropriately control for false positives.

  9. The Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey

    NASA Astrophysics Data System (ADS)

    Squires, Gordon K.; Lubin, L. M.; Gal, R. R.

    2007-05-01

We present the motivation, design, and latest results from the Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 Mpc around 20 known galaxy clusters at z > 0.6. When complete, the survey will cover nearly 5 square degrees, all targeted at high-density regions, making it complementary and comparable to field surveys such as DEEP2, GOODS, and COSMOS. For the survey, we are using the Large Format Camera on the Palomar 5-m and SuPRIME-Cam on the Subaru 8-m to obtain optical/near-infrared imaging of an approximately 30 arcmin region around previously studied high-redshift clusters. Colors are used to identify likely member galaxies which are targeted for follow-up spectroscopy with the DEep Imaging Multi-Object Spectrograph on the Keck 10-m. This technique has been used to successfully identify the Cl 1604 supercluster at z = 0.9, a large scale structure containing at least eight clusters (Gal & Lubin 2004; Gal, Lubin & Squires 2005). We present the most recent structures to be photometrically and spectroscopically confirmed through this program, discuss the properties of the member galaxies as a function of environment, and describe our planned multi-wavelength (radio, mid-IR, and X-ray) observations of these systems. The goal of this survey is to identify and examine a statistical sample of large scale structures during an active period in the assembly history of the most massive clusters. With such a sample, we can begin to constrain large scale cluster dynamics and determine the effect of the larger environment on galaxy evolution.

  10. Should we search Chinese biomedical databases when performing systematic reviews?

    PubMed

    Cohen, Jérémie F; Korevaar, Daniël A; Wang, Junfeng; Spijker, René; Bossuyt, Patrick M

    2015-03-06

    Chinese biomedical databases contain a large number of publications available to systematic reviewers, but it is unclear whether they are used for synthesizing the available evidence. We report a case of two systematic reviews on the accuracy of anti-cyclic citrullinated peptide for diagnosing rheumatoid arthritis. In one of these, the authors did not search Chinese databases; in the other, they did. We additionally assessed the extent to which Cochrane reviewers have searched Chinese databases in a systematic overview of the Cochrane Library (inception to 2014). The two diagnostic reviews included a total of 269 unique studies, but only 4 studies were included in both reviews. The first review included five studies published in the Chinese language (out of 151) while the second included 114 (out of 118). The summary accuracy estimates from the two reviews were comparable. Only 243 of the published 8,680 Cochrane reviews (less than 3%) searched one or more of the five major Chinese databases. These Chinese databases index about 2,500 journals, of which less than 6% are also indexed in MEDLINE. All 243 Cochrane reviews evaluated an intervention, 179 (74%) had at least one author with a Chinese affiliation; 118 (49%) addressed a topic in complementary or alternative medicine. Although searching Chinese databases may lead to the identification of a large amount of additional clinical evidence, Cochrane reviewers have rarely included them in their search strategy. We encourage future initiatives to evaluate more systematically the relevance of searching Chinese databases, as well as collaborative efforts to allow better incorporation of Chinese resources in systematic reviews.

  11. Disrupted Topological Patterns of Large-Scale Network in Conduct Disorder

    PubMed Central

    Jiang, Yali; Liu, Weixiang; Ming, Qingsen; Gao, Yidian; Ma, Ren; Zhang, Xiaocui; Situ, Weijun; Wang, Xiang; Yao, Shuqiao; Huang, Bingsheng

    2016-01-01

Regional abnormalities in brain structure and function, as well as disrupted connectivity, have been found repeatedly in adolescents with conduct disorder (CD). Yet, the large-scale brain topology associated with CD is not well characterized, and little is known about the systematic neural mechanisms of CD. We employed graph theory to systematically investigate the structural connectivity derived from cortical thickness correlations in a group of patients with CD (N = 43) and healthy controls (HCs, N = 73). Nonparametric permutation tests were applied for between-group comparisons of graph metrics. Compared with HCs, network measures including global/local efficiency and modularity all pointed to hypo-functioning in CD, despite preserved small-world organization in both groups. The distributions of highly connected network ‘hubs’ only partially overlapped between groups. These results indicate that CD is accompanied by impairments in both the integration and segregation patterns of brain networks, together with a distinct hub distribution. Such misconfiguration extends our understanding of how structural neural network disruptions may underlie behavioral disturbances in adolescents with CD and, potentially, implicates aberrant cytoarchitectonic profiles in the brains of CD patients. PMID:27841320
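The network measures this record cites (global/local efficiency, modularity, small-worldness) have standard graph-theoretic definitions. As one concrete example, global efficiency is the mean inverse shortest-path length over all node pairs (Latora & Marchiori). A minimal pure-Python sketch on a toy graph, not the study's cortical-thickness networks or analysis pipeline:

```python
from collections import deque

def bfs_distances(adj, source):
    """Shortest path lengths from source in an unweighted graph (dict of node -> set)."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def global_efficiency(adj):
    """Mean of 1/d(u, v) over all ordered node pairs; unreachable pairs contribute 0."""
    nodes = list(adj)
    n = len(nodes)
    if n < 2:
        return 0.0
    total = 0.0
    for u in nodes:
        dist = bfs_distances(adj, u)
        total += sum(1.0 / d for v, d in dist.items() if v != u)
    return total / (n * (n - 1))

# Toy graph: two triangles joined by a single bridge edge (2-3)
adj = {
    0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
    3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4},
}
eff = global_efficiency(adj)
```

Efficiency is 1.0 only for a complete graph; sparser, more bottlenecked topologies score lower, which is why a drop in global efficiency is read as impaired network integration.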

  12. A Systematic Evaluation of Websites Offering Information on Chronic Kidney Disease

    PubMed Central

    Lutz, Erin R.; Costello, Kaitlin L.; Jo, Minjeong; Gilet, Constance A.; Hawley, Jennifer M.; Bridgman, Jessica C.; Song, Mi-Kyung

    2014-01-01

In this study, we described the content and characteristics of 40 non-proprietary websites offering information about chronic kidney disease (CKD) and evaluated their information quality using the DISCERN scale and their readability using the Flesch Reading Ease and Flesch-Kincaid grade level. The areas in which the websites scored the lowest on the DISCERN scale were whether the website discussed knowledge gaps, presented balanced information, and was clear about the information source. Websites rated higher in quality on the DISCERN scale were more difficult to read. The quality and readability of many CKD websites remain inadequate for use as meaningful educational resources for patients who desire to learn more about CKD and treatment options. PMID:25244890
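The Flesch Reading Ease and Flesch-Kincaid grade level used in this study are standard closed-form readability formulas based on words per sentence and syllables per word. A sketch with a crude vowel-group syllable heuristic (the heuristic is our simplification for illustration; production tools use dictionary-based syllable counts):

```python
import re

def count_syllables(word):
    """Rough heuristic: count vowel groups, treating a lone trailing 'e' as silent."""
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1 and not word.endswith(("le", "ee")):
        n -= 1
    return max(1, n)

def readability(text):
    """Return (Flesch Reading Ease, Flesch-Kincaid grade level) for a text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences   # words per sentence
    spw = syllables / len(words)   # syllables per word
    flesch = 206.835 - 1.015 * wps - 84.6 * spw
    fk_grade = 0.39 * wps + 11.8 * spw - 15.59
    return flesch, fk_grade
```

Higher Flesch scores mean easier text (roughly 60-70 for plain English), while the Flesch-Kincaid result maps the same two ratios onto a US school-grade scale, which is why the two scores move in opposite directions.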

  13. An Innovative Method for Monitoring Food Quality and the Healthfulness of Consumers’ Grocery Purchases

    PubMed Central

    Tran, Le-Thuy T.; Brewster, Philip J.; Chidambaram, Valliammai; Hurdle, John F.

    2017-01-01

    This study presents a method laying the groundwork for systematically monitoring food quality and the healthfulness of consumers’ point-of-sale grocery purchases. The method automates the process of identifying United States Department of Agriculture (USDA) Food Patterns Equivalent Database (FPED) components of grocery food items. The input to the process is the compact abbreviated descriptions of food items that are similar to those appearing on the point-of-sale sales receipts of most food retailers. The FPED components of grocery food items are identified using Natural Language Processing techniques combined with a collection of food concept maps and relationships that are manually built using the USDA Food and Nutrient Database for Dietary Studies, the USDA National Nutrient Database for Standard Reference, the What We Eat In America food categories, and the hierarchical organization of food items used by many grocery stores. We have established the construct validity of the method using data from the National Health and Nutrition Examination Survey, but further evaluation of validity and reliability will require a large-scale reference standard with known grocery food quality measures. Here we evaluate the method’s utility in identifying the FPED components of grocery food items available in a large sample of retail grocery sales data (~190 million transaction records). PMID:28475153

  14. An Innovative Method for Monitoring Food Quality and the Healthfulness of Consumers' Grocery Purchases.

    PubMed

    Tran, Le-Thuy T; Brewster, Philip J; Chidambaram, Valliammai; Hurdle, John F

    2017-05-05

    This study presents a method laying the groundwork for systematically monitoring food quality and the healthfulness of consumers' point-of-sale grocery purchases. The method automates the process of identifying United States Department of Agriculture (USDA) Food Patterns Equivalent Database (FPED) components of grocery food items. The input to the process is the compact abbreviated descriptions of food items that are similar to those appearing on the point-of-sale sales receipts of most food retailers. The FPED components of grocery food items are identified using Natural Language Processing techniques combined with a collection of food concept maps and relationships that are manually built using the USDA Food and Nutrient Database for Dietary Studies, the USDA National Nutrient Database for Standard Reference, the What We Eat In America food categories, and the hierarchical organization of food items used by many grocery stores. We have established the construct validity of the method using data from the National Health and Nutrition Examination Survey, but further evaluation of validity and reliability will require a large-scale reference standard with known grocery food quality measures. Here we evaluate the method's utility in identifying the FPED components of grocery food items available in a large sample of retail grocery sales data (~190 million transaction records).
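The pipeline this record describes maps compact, abbreviated receipt descriptions to USDA FPED components via curated concept maps built from FNDDS, the SR database, and WWEIA categories. As a purely hypothetical illustration of the matching step (the lexicon and function below are invented for this sketch and are not the authors' method or data):

```python
# Hypothetical keyword lexicon; the real system derives its concept maps
# from USDA FNDDS/SR and WWEIA food categories, not a hand-typed dict.
LEXICON = {
    "vegetables": {"broccoli", "spinach", "carrot", "brocc", "spin"},
    "fruits": {"apple", "banana", "appl", "ban"},
    "dairy": {"milk", "yogurt", "cheese", "chz", "ygt"},
}

def classify_receipt_item(description):
    """Match abbreviated receipt tokens against the lexicon; None if no group matches."""
    tokens = description.lower().replace("-", " ").split()
    for group, keywords in LEXICON.items():
        if any(tok in keywords for tok in tokens):
            return group
    return None

print(classify_receipt_item("ORG BROCC CROWNS"))  # "brocc" matches -> vegetables
```

Even this toy version shows why abbreviation handling dominates the problem: receipt strings like "ORG BROCC CROWNS" rarely contain a full dictionary word, so the lexicon must enumerate truncated forms.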

  15. Large-scale tissue clearing (PACT): Technical evaluation and new perspectives in immunofluorescence, histology, and ultrastructure.

    PubMed

    Neckel, Peter H; Mattheus, Ulrich; Hirt, Bernhard; Just, Lothar; Mack, Andreas F

    2016-09-29

Novel techniques, like CLARITY and PACT, render large tissue specimens transparent and thereby suitable for microscopic analysis. We used these techniques to evaluate their potential in the intestine as an exemplary organ with a complex tissue composition. Immunohistochemistry, light-sheet microscopy, and confocal scanning microscopy enabled us to follow complex three-dimensional structures, like nerve fibers, vessels, and epithelial barriers throughout the entire organ. Moreover, in a systematic electron microscopic study, we analyzed the morphology and preservation of tissue at the ultrastructural level during the clearing process. We also connect tissue clearing with classical histology and demonstrate that cleared tissues can be stained with Hematoxylin-Eosin and Heidenhain's Azan stain, suggesting potential use in histopathology. These experiments showed that a neutral pH during the clearing process results in much better preservation of tissue ultrastructure and standard stainability. Volume changes of specimens were monitored and quantified during the course of the protocol. Additionally, we employed the technique to visualize the enteric nervous system and the epithelial barrier in post mortem human gut preparations. Our data show the high potential of tissue clearing across different tissue types, supporting its usefulness in research and diagnosis, and contribute to the technical discussion of ultrastructural tissue retention.

  16. Large-scale tissue clearing (PACT): Technical evaluation and new perspectives in immunofluorescence, histology, and ultrastructure

    PubMed Central

    Neckel, Peter H.; Mattheus, Ulrich; Hirt, Bernhard; Just, Lothar; Mack, Andreas F.

    2016-01-01

Novel techniques, like CLARITY and PACT, render large tissue specimens transparent and thereby suitable for microscopic analysis. We used these techniques to evaluate their potential in the intestine as an exemplary organ with a complex tissue composition. Immunohistochemistry, light-sheet microscopy, and confocal scanning microscopy enabled us to follow complex three-dimensional structures, like nerve fibers, vessels, and epithelial barriers throughout the entire organ. Moreover, in a systematic electron microscopic study, we analyzed the morphology and preservation of tissue at the ultrastructural level during the clearing process. We also connect tissue clearing with classical histology and demonstrate that cleared tissues can be stained with Hematoxylin-Eosin and Heidenhain’s Azan stain, suggesting potential use in histopathology. These experiments showed that a neutral pH during the clearing process results in much better preservation of tissue ultrastructure and standard stainability. Volume changes of specimens were monitored and quantified during the course of the protocol. Additionally, we employed the technique to visualize the enteric nervous system and the epithelial barrier in post mortem human gut preparations. Our data show the high potential of tissue clearing across different tissue types, supporting its usefulness in research and diagnosis, and contribute to the technical discussion of ultrastructural tissue retention. PMID:27680942

  17. Electronic self-monitoring of mood using IT platforms in adult patients with bipolar disorder: A systematic review of the validity and evidence.

    PubMed

    Faurholt-Jepsen, Maria; Munkholm, Klaus; Frost, Mads; Bardram, Jakob E; Kessing, Lars Vedel

    2016-01-15

Various paper-based mood charting instruments are used in the monitoring of symptoms in bipolar disorder. During recent years an increasing number of electronic self-monitoring tools have been developed. The objectives of this systematic review were 1) to evaluate the validity of electronic self-monitoring tools as a method of evaluating mood compared to clinical rating scales for depression and mania and 2) to investigate the effect of electronic self-monitoring tools on clinically relevant outcomes in bipolar disorder. A systematic review of the scientific literature, reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, was conducted. MEDLINE, Embase, PsycINFO and The Cochrane Library were searched and supplemented by hand search of reference lists. Databases were searched for 1) studies on electronic self-monitoring tools in patients with bipolar disorder reporting on validity of electronically self-reported mood ratings compared to clinical rating scales for depression and mania and 2) randomized controlled trials (RCT) evaluating electronic mood self-monitoring tools in patients with bipolar disorder. A total of 13 published articles were included. Seven articles were RCTs and six were longitudinal studies. Electronic self-monitoring of mood was considered valid compared to clinical rating scales for depression in six out of six studies, and in two out of seven studies compared to clinical rating scales for mania. The included RCTs primarily investigated the effect of heterogeneous electronically delivered interventions; none of the RCTs investigated the sole effect of electronic mood self-monitoring tools. Methodological issues with risk of bias at different levels limited the evidence in the majority of studies. Electronic self-monitoring of mood in depression appears to be a valid measure of mood in contrast to self-monitoring of mood in mania.
There are as yet few studies on the effect of electronic self-monitoring of mood in bipolar disorder. The evidence for electronic self-monitoring is limited by methodological issues and by a lack of RCTs. Although the idea of electronic self-monitoring of mood seems appealing, studies using rigorous methodology investigating the beneficial as well as possible harmful effects of electronic self-monitoring are needed.

  18. Shifts in tree functional composition amplify the response of forest biomass to climate

    NASA Astrophysics Data System (ADS)

    Zhang, Tao; Niinemets, Ülo; Sheffield, Justin; Lichstein, Jeremy W.

    2018-04-01

    Forests have a key role in global ecosystems, hosting much of the world’s terrestrial biodiversity and acting as a net sink for atmospheric carbon. These and other ecosystem services that are provided by forests may be sensitive to climate change as well as climate variability on shorter time scales (for example, annual to decadal). Previous studies have documented responses of forest ecosystems to climate change and climate variability, including drought-induced increases in tree mortality rates. However, relationships between forest biomass, tree species composition and climate variability have not been quantified across a large region using systematically sampled data. Here we use systematic forest inventories from the 1980s and 2000s across the eastern USA to show that forest biomass responds to decadal-scale changes in water deficit, and that this biomass response is amplified by concurrent changes in community-mean drought tolerance, a functionally important aspect of tree species composition. The amplification of the direct effects of water stress on biomass occurs because water stress tends to induce a shift in tree species composition towards species that are more tolerant to drought but are slower growing. These results demonstrate concurrent changes in forest species composition and biomass carbon storage across a large, systematically sampled region, and highlight the potential for climate-induced changes in forest ecosystems across the world, resulting from both direct effects of climate on forest biomass and indirect effects mediated by shifts in species composition.

  19. Shifts in tree functional composition amplify the response of forest biomass to climate.

    PubMed

    Zhang, Tao; Niinemets, Ülo; Sheffield, Justin; Lichstein, Jeremy W

    2018-04-05

    Forests have a key role in global ecosystems, hosting much of the world's terrestrial biodiversity and acting as a net sink for atmospheric carbon. These and other ecosystem services that are provided by forests may be sensitive to climate change as well as climate variability on shorter time scales (for example, annual to decadal). Previous studies have documented responses of forest ecosystems to climate change and climate variability, including drought-induced increases in tree mortality rates. However, relationships between forest biomass, tree species composition and climate variability have not been quantified across a large region using systematically sampled data. Here we use systematic forest inventories from the 1980s and 2000s across the eastern USA to show that forest biomass responds to decadal-scale changes in water deficit, and that this biomass response is amplified by concurrent changes in community-mean drought tolerance, a functionally important aspect of tree species composition. The amplification of the direct effects of water stress on biomass occurs because water stress tends to induce a shift in tree species composition towards species that are more tolerant to drought but are slower growing. These results demonstrate concurrent changes in forest species composition and biomass carbon storage across a large, systematically sampled region, and highlight the potential for climate-induced changes in forest ecosystems across the world, resulting from both direct effects of climate on forest biomass and indirect effects mediated by shifts in species composition.

  20. A Systematic Review of the Psychometric Properties of Composite LGBT Prejudice and Discrimination Scales.

    PubMed

    Morrison, Melanie A; Bishop, C J; Morrison, Todd G

    2018-01-08

    Prejudice and discrimination against LGBT individuals is widespread and has been shown to have negative consequences for sexual and gender minority persons' physical and psychological wellbeing. A recent and problematic trend in the literature is to compositely measure prejudice toward and discrimination against LGBT persons. As such, a review of the psychometric properties of scales assessing, in a combinatory fashion, negative attitudes and/or behaviors toward LGBT persons is warranted. In the current study, 32 scales were identified, and their psychometric properties were evaluated. Most of the scales reviewed did not provide sufficient information regarding item development and refinement, scale dimensionality, scale score reliability, or validity. Properties of the reviewed scales are summarized, and recommendations for better measurement practice are articulated.

  1. Lagrangian or Eulerian; real or Fourier? Not all approaches to large-scale structure are created equal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tassev, Svetlin, E-mail: tassev@astro.princeton.edu

    We present a pedagogical systematic investigation of the accuracy of Eulerian and Lagrangian perturbation theories of large-scale structure. We show that significant differences exist between them, especially when trying to model the Baryon Acoustic Oscillations (BAO). We find that the best available model of the BAO in real space is the Zel'dovich Approximation (ZA), giving an accuracy of ≲3% at redshift z = 0 in modelling the matter 2-pt function around the acoustic peak. All corrections to the ZA around the BAO scale are perfectly perturbative in real space. Any attempt to achieve better precision requires calibrating the theory to simulations because of the need to renormalize those corrections. In contrast, theories which do not fully preserve the ZA as their solution receive O(1) corrections around the acoustic peak in real space at z = 0, and are thus of suspicious convergence at low redshift around the BAO. As an example, we find that a similar accuracy of 3% for the acoustic peak is achieved by Eulerian Standard Perturbation Theory (SPT) at linear order only at z ≈ 4. Thus even when SPT is perturbative, one needs to include loop corrections for z ≲ 4 in real space. In Fourier space, all models perform similarly, and are controlled by the overdensity amplitude, thus recovering standard results. However, that comes at a price. Real space cleanly separates the BAO signal from non-linear dynamics. In contrast, Fourier space mixes signal from short mildly non-linear scales with the linear signal from the BAO to the level that non-linear contributions from short scales dominate. Therefore, one has little hope of constructing a systematic theory for the BAO in Fourier space.
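
    For reference, the Zel'dovich Approximation singled out above maps Lagrangian positions q to Eulerian positions x via the linear displacement field (standard cosmology notation, not taken from this record):

```latex
\mathbf{x}(\mathbf{q},\tau) = \mathbf{q} + D(\tau)\,\boldsymbol{\Psi}_{1}(\mathbf{q}),
\qquad
\nabla_{\mathbf{q}} \cdot \boldsymbol{\Psi}_{1}(\mathbf{q}) = -\,\delta_{L}(\mathbf{q}),
```

    with the nonlinear density following from mass conservation, $1+\delta(\mathbf{x}) = \left|\det(\partial\mathbf{x}/\partial\mathbf{q})\right|^{-1}$.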

  2. The Olympic Regeneration in East London (ORiEL) study: protocol for a prospective controlled quasi-experiment to evaluate the impact of urban regeneration on young people and their families

    PubMed Central

    Smith, Neil R; Clark, Charlotte; Fahy, Amanda E; Tharmaratnam, Vanathi; Lewis, Daniel J; Thompson, Claire; Renton, Adrian; Moore, Derek G; Bhui, Kamaldeep S; Taylor, Stephanie J C; Eldridge, Sandra; Petticrew, Mark; Greenhalgh, Tricia; Stansfeld, Stephen A; Cummins, Steven

    2012-01-01

    Introduction Recent systematic reviews suggest that there is a dearth of evidence on the effectiveness of large-scale urban regeneration programmes in improving health and well-being and alleviating health inequalities. The development of the Olympic Park in Stratford for the London 2012 Olympic and Paralympic Games provides the opportunity to take advantage of a natural experiment to examine the impact of large-scale urban regeneration on the health and well-being of young people and their families. Design and methods A prospective school-based survey of adolescents (11–12 years) with parent data collected through face-to-face interviews at home. Adolescents will be recruited from six randomly selected schools in an area receiving large-scale urban regeneration (London Borough of Newham) and compared with adolescents in 18 schools in three comparison areas with no equivalent regeneration (London Boroughs of Tower Hamlets, Hackney and Barking & Dagenham). Baseline data will be completed prior to the start of the London Olympics (July 2012) with follow-up at 6 and 18 months postintervention. Primary outcomes are: pre–post change in adolescent and parent mental health and well-being, physical activity and parental employment status. Secondary outcomes include: pre–post change in social cohesion, smoking, alcohol use, diet and body mass index. The study will account for individual and environmental contextual effects in evaluating changes to identified outcomes. A nested longitudinal qualitative study will explore families’ experiences of regeneration in order to unpack the process by which regeneration impacts on health and well-being. Ethics and dissemination The study has approval from Queen Mary University of London Ethics Committee (QMREC2011/40), the Association of Directors of Children's Services (RGE110927) and the London Boroughs Research Governance Framework (CERGF113). Fieldworkers have had advanced Criminal Records Bureau clearance. 
Findings will be disseminated through peer-reviewed publications, national and international conferences, through participating schools and the study website (http://www.orielproject.co.uk). PMID:22936822

  3. Sense of competence in dementia care staff (SCIDS) scale: development, reliability, and validity.

    PubMed

    Schepers, Astrid Kristine; Orrell, Martin; Shanahan, Niamh; Spector, Aimee

    2012-07-01

    Sense of competence in dementia care staff (SCIDS) may be associated with more positive attitudes to dementia among care staff and better outcomes for those being cared for. There is a need for a reliable and valid measure of sense of competence specific to dementia care staff. This study describes the development and evaluation of a measure to assess "sense of competence" in dementia care staff and reports on its psychometric properties. The systematic measure development process involved care staff and experts. For item selection and assessment of psychometric properties, a pilot study (N = 37) and a large-scale study (N = 211) with a test-retest reliability (N = 58) sub-study were undertaken. The final measure consists of 17 items across four subscales with acceptable to good internal consistency and moderate to substantial test-retest reliability. As predicted, the measure was positively associated with work experience, job satisfaction, and person-centered approaches to dementia care, giving a first indication for its validity. The SCIDS scale provides a useful and user-friendly means of measuring sense of competence in care staff. It has been developed using a robust process and has adequate psychometric properties. Further exploration of the construct and the scale's validity is warranted. It may be useful to assess the impact of training and perceived abilities and skills in dementia care.

  4. Systematic review of the multidimensional fatigue symptom inventory-short form.

    PubMed

    Donovan, Kristine A; Stein, Kevin D; Lee, Morgan; Leach, Corinne R; Ilozumba, Onaedo; Jacobsen, Paul B

    2015-01-01

    Fatigue is a subjective complaint that is believed to be multifactorial in its etiology and multidimensional in its expression. Fatigue may be experienced by individuals in different dimensions as physical, mental, and emotional tiredness. The purposes of this study were to review and characterize the use of the 30-item Multidimensional Fatigue Symptom Inventory-Short Form (MFSI-SF) in published studies and to evaluate the available evidence for its psychometric properties. A systematic review was conducted to identify published articles reporting results for the MFSI-SF. Data were analyzed to characterize internal consistency reliability of multi-item MFSI-SF scales and test-retest reliability. Correlation coefficients were summarized to characterize concurrent, convergent, and divergent validity. Standardized effect sizes were calculated to characterize the discriminative validity of the MFSI-SF and its sensitivity to change. Seventy articles were identified. Sample sizes reported ranged from 10 to 529 and nearly half consisted exclusively of females. More than half the samples were composed of cancer patients; of those, 59% were breast cancer patients. Mean alpha coefficients for MFSI-SF fatigue subscales ranged from 0.84 for physical fatigue to 0.93 for general fatigue. The MFSI-SF demonstrated moderate test-retest reliability in a small number of studies. Correlations with other fatigue and vitality measures were moderate to large in size and in the expected direction. The MFSI-SF fatigue subscales were positively correlated with measures of distress, depressive, and anxious symptoms. Effect sizes for discriminative validity ranged from medium to large, while effect sizes for sensitivity to change ranged from small to large. Findings demonstrate the positive psychometric properties of the MFSI-SF, provide evidence for its usefulness in medically ill and nonmedically ill individuals, and support its use in future studies.
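
    The internal-consistency figures quoted above (alpha coefficients of 0.84 to 0.93) are Cronbach's alpha values. As a reminder of how that statistic is computed, here is a minimal sketch on synthetic data (the function and data are illustrative, not the MFSI-SF items):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Synthetic 3-item scale driven by one common factor plus noise
rng = np.random.default_rng(0)
factor = rng.normal(size=(200, 1))
scores = factor + 0.5 * rng.normal(size=(200, 3))
print(round(cronbach_alpha(scores), 2))
```

    With items dominated by one common factor, as here, alpha comes out high; uncorrelated items would drive it toward zero.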

  5. Digital stereo photogrammetry for grain-scale monitoring of fluvial surfaces: Error evaluation and workflow optimisation

    NASA Astrophysics Data System (ADS)

    Bertin, Stephane; Friedrich, Heide; Delmas, Patrice; Chan, Edwin; Gimel'farb, Georgy

    2015-03-01

    Grain-scale monitoring of fluvial morphology is important for the evaluation of river system dynamics. Significant progress in remote sensing and computer performance allows rapid high-resolution data acquisition; however, applications in fluvial environments remain challenging. Even in a controlled environment, such as a laboratory, the extensive acquisition workflow is prone to the propagation of errors in digital elevation models (DEMs). This is valid for both common surface recording techniques: digital stereo photogrammetry and terrestrial laser scanning (TLS). The optimisation of the acquisition process, an effective way to reduce the occurrence of errors, is generally limited by the use of commercial software. Therefore, the removal of evident blunders during post-processing is regarded as standard practice, although this may introduce new errors. This paper presents a detailed evaluation of a digital stereo-photogrammetric workflow developed for fluvial hydraulic applications. The introduced workflow is user-friendly and can be adapted to various close-range measurements: imagery is acquired with two Nikon D5100 cameras and processed using non-proprietary "on-the-job" calibration and dense scanline-based stereo matching algorithms. Novel ground truth evaluation studies were designed to identify the DEM errors, which resulted from a combination of calibration errors, inaccurate image rectifications and stereo-matching errors. To ensure optimum DEM quality, we show that systematic DEM errors must be minimised by ensuring a good distribution of control points throughout the image format during calibration. DEM quality is then largely dependent on the imagery utilised. We evaluated the open access multi-scale Retinex algorithm to facilitate the stereo matching, and quantified its influence on DEM quality. Occlusions, inherent to any roughness element, are still a major limiting factor to DEM accuracy. 
We show that a careful selection of the camera-to-object and baseline distance reduces errors in occluded areas and that realistic ground truths help to quantify those errors.
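
    The error decomposition described above, systematic bias versus random scatter against a ground truth, can be summarized with simple statistics. A minimal sketch, with toy surfaces standing in for the DEM and the ground truth:

```python
import numpy as np

def dem_error_stats(dem: np.ndarray, truth: np.ndarray) -> dict:
    """Summarize DEM errors against a ground-truth surface on the same grid.

    The mean error captures systematic bias; the standard deviation and
    RMSE capture random scatter and overall accuracy. NaN cells (e.g.
    occluded areas) are ignored.
    """
    err = np.asarray(dem, float) - np.asarray(truth, float)
    err = err[np.isfinite(err)]
    return {
        "mean_error": float(err.mean()),            # systematic bias
        "std_error": float(err.std(ddof=1)),        # random scatter
        "rmse": float(np.sqrt(np.mean(err ** 2))),  # overall accuracy
    }

# Toy example: a flat truth surface, a DEM with +2 units of bias,
# and a few occluded (NaN) cells
truth = np.zeros((50, 50))
dem = truth + 2.0
dem[0, :5] = np.nan
print(dem_error_stats(dem, truth)["mean_error"])  # → 2.0
```

    On real data, a large mean error points at calibration or rectification problems, while a large scatter points at stereo-matching noise.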

  6. Influence of a large-scale field on energy dissipation in magnetohydrodynamic turbulence

    NASA Astrophysics Data System (ADS)

    Zhdankin, Vladimir; Boldyrev, Stanislav; Mason, Joanne

    2017-07-01

    In magnetohydrodynamic (MHD) turbulence, the large-scale magnetic field sets a preferred local direction for the small-scale dynamics, altering the statistics of turbulence from the isotropic case. This happens even in the absence of a total magnetic flux, since MHD turbulence forms randomly oriented large-scale domains of strong magnetic field. It is therefore customary to study small-scale magnetic plasma turbulence by assuming a strong background magnetic field relative to the turbulent fluctuations. This is done, for example, in reduced models of plasmas, such as reduced MHD, reduced-dimension kinetic models, gyrokinetics, etc., which make theoretical calculations easier and numerical computations cheaper. Recently, however, it has become clear that the turbulent energy dissipation is concentrated in the regions of strong magnetic field variations. A significant fraction of the energy dissipation may be localized in very small volumes corresponding to the boundaries between strongly magnetized domains. In these regions, the reduced models are not applicable. This has important implications for studies of particle heating and acceleration in magnetic plasma turbulence. The goal of this work is to systematically investigate the relationship between local magnetic field variations and magnetic energy dissipation, and to understand its implications for modelling energy dissipation in realistic turbulent plasmas.

  7. MESUR: USAGE-BASED METRICS OF SCHOLARLY IMPACT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BOLLEN, JOHAN; RODRIGUEZ, MARKO A.; VAN DE SOMPEL, HERBERT

    2007-01-30

    The evaluation of scholarly communication items is now largely a matter of expert opinion or metrics derived from citation data. Both approaches can fail to take into account the myriad of factors that shape scholarly impact. Usage data have emerged as a promising complement to existing methods of assessment, but the formal groundwork to reliably and validly apply usage-based metrics of scholarly impact is lacking. The Andrew W. Mellon Foundation-funded MESUR project constitutes a systematic effort to define, validate and cross-validate a range of usage-based metrics of scholarly impact by creating a semantic model of the scholarly communication process. The constructed model will serve as the basis for creating a large-scale semantic network that seamlessly relates citation, bibliographic and usage data from a variety of sources. A subsequent program that uses the established semantic network as a reference data set will determine the characteristics and semantics of a variety of usage-based metrics of scholarly impact. This paper outlines the architecture and methodology adopted by the MESUR project and its future direction.
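
    As a toy illustration of a usage-based metric over such a network (a generic sketch, not MESUR's actual model), a power-iteration PageRank over a citation/usage adjacency matrix:

```python
import numpy as np

def pagerank(adj: np.ndarray, d: float = 0.85, iters: int = 100) -> np.ndarray:
    """Power-iteration PageRank over an adjacency matrix where
    adj[i, j] = 1 if item i cites (or was used alongside) item j."""
    n = adj.shape[0]
    outdeg = adj.sum(axis=1, keepdims=True)
    # Row-stochastic transition matrix; dangling items spread uniformly.
    p = np.divide(adj, outdeg, out=np.full_like(adj, 1.0 / n), where=outdeg > 0)
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - d) / n + d * (p.T @ r)
    return r

# Three items; item 2 is referenced by both others
adj = np.array([[0.0, 0.0, 1.0],
                [0.0, 0.0, 1.0],
                [0.0, 0.0, 0.0]])
ranks = pagerank(adj)
print(ranks.argmax())  # → 2
```

    The point of metrics of this family is that an item's score depends on the whole network of citation or usage events, not on raw counts alone.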

  8. Large-scale adverse effects related to treatment evidence standardization (LAERTES): an open scalable system for linking pharmacovigilance evidence sources with clinical data.

    PubMed

    2017-03-07

    Integrating multiple sources of pharmacovigilance evidence has the potential to advance the science of safety signal detection and evaluation. In this regard, there is a need for more research on how to integrate multiple disparate evidence sources while making the evidence computable from a knowledge representation perspective (i.e., semantic enrichment). Existing frameworks suggest promising outcomes for such integration but employ a rather limited number of sources. In particular, none have been specifically designed to support both regulatory and clinical use cases, nor have any been designed to add new resources and use cases through an open architecture. This paper discusses the architecture and functionality of a system called Large-scale Adverse Effects Related to Treatment Evidence Standardization (LAERTES) that aims to address these shortcomings. LAERTES provides a standardized, open, and scalable architecture for linking evidence sources relevant to the association of drugs with health outcomes of interest (HOIs). Standard terminologies are used to represent different entities: for example, drugs and HOIs are represented in RxNorm and Systematized Nomenclature of Medicine -- Clinical Terms, respectively. At the time of this writing, six evidence sources have been loaded into the LAERTES evidence base and are accessible through a prototype evidence-exploration user interface and a set of Web application programming interface services. This system operates within a larger software stack provided by the Observational Health Data Sciences and Informatics clinical research framework, including the relational Common Data Model for observational patient data created by the Observational Medical Outcomes Partnership. Elements of the Linked Data paradigm facilitate the systematic and scalable integration of relevant evidence sources. The prototype LAERTES system provides useful functionality while creating opportunities for further research. Future work will involve improving the method for normalizing drug and HOI concepts across the integrated sources, aggregating evidence at different levels of a hierarchy of HOI concepts, and developing a more advanced user interface for drug-HOI investigations.
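
    To make the linking idea concrete, here is a hypothetical sketch of aggregating drug-HOI evidence records keyed by shared terminology codes. The field names and codes below are placeholders, not the actual LAERTES schema or real RxNorm/SNOMED CT identifiers:

```python
from collections import defaultdict

# Hypothetical evidence records from heterogeneous sources, each keyed
# by placeholder drug (RxNorm-style) and HOI (SNOMED-CT-style) codes.
records = [
    {"drug_rxnorm": "12345", "hoi_snomed": "67890",
     "source": "literature", "evidence_count": 12},
    {"drug_rxnorm": "12345", "hoi_snomed": "67890",
     "source": "product_label", "evidence_count": 1},
    {"drug_rxnorm": "12345", "hoi_snomed": "99999",
     "source": "spontaneous_reports", "evidence_count": 4},
]

# Link the sources through the shared terminology keys: one evidence
# summary per (drug, HOI) pair, broken down by source.
evidence = defaultdict(dict)
for rec in records:
    key = (rec["drug_rxnorm"], rec["hoi_snomed"])
    evidence[key][rec["source"]] = rec["evidence_count"]

print(sorted(evidence[("12345", "67890")]))  # → ['literature', 'product_label']
```

    Using shared standard codes as join keys is what makes adding a new evidence source cheap: each source only needs to be mapped to the common terminologies once.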

  9. Revisiting Grodzins systematics of B(E2) values

    DOE PAGES

    Pritychenko, B.; Birch, M.; Singh, B.

    2017-04-03

    Using Grodzins formalism, we analyze systematics of our latest evaluated B(E2) data for all the even–even nuclei in Z=2–104. The analysis indicates a low predictive power of systematics for a large number of cases, and a strong correlation between B(E2) fit values and nuclear structure effects. These findings provide a strong rationale for introduction of individual or elemental (grouped by Z) fit parameters. The current estimates of quadrupole collectivities for systematics of even–even nuclei yield complementary values for comparison with experimental results and theoretical calculations. Furthermore, the lists of fit parameters and predicted B(E2) values are given and possible implications are discussed.
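
    A Grodzins-type estimate has the functional form B(E2)↑ ∝ Z²/(A·E(2⁺)). A hedged sketch of that scaling (the default coefficient below is illustrative only; as the paper argues, realistic use requires individually or elementally fitted parameters):

```python
def grodzins_be2_estimate(z: int, a: int, e2plus_kev: float,
                          coeff: float = 2.57) -> float:
    """Grodzins-type estimate of B(E2; 0+ -> 2+).

    Functional form: coeff * Z**2 / (A * E(2+)), with E(2+) in keV.
    The coefficient is an illustrative global value, not a fitted
    parameter from the evaluation above.
    """
    return coeff * z ** 2 / (a * e2plus_kev)

# Scaling check: doubling Z quadruples the estimate
base = grodzins_be2_estimate(50, 120, 500.0)
print(round(grodzins_be2_estimate(100, 120, 500.0) / base, 6))  # → 4.0
```

    The inverse dependence on E(2⁺) encodes the empirical anticorrelation between first 2⁺ excitation energy and quadrupole collectivity.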

  10. Bino variations: Effective field theory methods for dark matter direct detection

    NASA Astrophysics Data System (ADS)

    Berlin, Asher; Robertson, Denis S.; Solon, Mikhail P.; Zurek, Kathryn M.

    2016-05-01

    We apply effective field theory methods to compute bino-nucleon scattering, in the case where tree-level interactions are suppressed and the leading contribution is at loop order via heavy flavor squarks or sleptons. We find that leading log corrections to fixed-order calculations can increase the bino mass reach of direct detection experiments by a factor of 2 in some models. These effects are particularly large for the bino-sbottom coannihilation region, where bino dark matter as heavy as 5-10 TeV may be detected by near future experiments. For the case of stop- and selectron-loop mediated scattering, an experiment reaching the neutrino background will probe thermal binos as heavy as 500 and 300 GeV, respectively. We present three key examples that illustrate in detail the framework for determining weak scale coefficients, and for mapping onto a low-energy theory at hadronic scales, through a sequence of effective theories and renormalization group evolution. For the case of a squark degenerate with the bino, we extend the framework to include a squark degree of freedom at low energies using heavy particle effective theory, thus accounting for large logarithms through a "heavy-light current." Benchmark predictions for scattering cross sections are evaluated, including complete leading order matching onto quark and gluon operators, and a systematic treatment of perturbative and hadronic uncertainties.

  11. Facile and Green Production of Impurity-Free Aqueous Solutions of WS2 Nanosheets by Direct Exfoliation in Water.

    PubMed

    Pan, Long; Liu, Yi-Tao; Xie, Xu-Ming; Ye, Xiong-Ying

    2016-12-01

    To obtain 2D materials with large quantity, low cost, and little pollution, liquid-phase exfoliation of their bulk form in water is a particularly fascinating concept. However, the current strategies for water-borne exfoliation exclusively employ stabilizers, such as surfactants, polymers, or inorganic salts, to minimize the extremely high surface energy of these nanosheets and stabilize them by steric repulsion. It is worth noting, however, that the remaining impurities inevitably bring about adverse effects to the ultimate performances of 2D materials. Here, a facile and green route to large-scale production of impurity-free aqueous solutions of WS2 nanosheets is reported by direct exfoliation in water. Crucial parameters such as initial concentration, sonication time, centrifugation speed, and centrifugation time are systematically evaluated to screen out an optimized condition for scaling up. Statistics based on morphological characterization prove that a substantial fraction (66%) of the obtained WS2 nanosheets are one to five layers. X-ray diffraction and Raman characterizations reveal a high quality with few, if any, structural distortions. The water-borne exfoliation route opens up new opportunities for easy, clean processing of WS2-based film devices that may shine in the fields of, e.g., energy storage and functional nanocomposites owing to their excellent electrochemical, mechanical, and thermal properties. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Bino variations: Effective field theory methods for dark matter direct detection

    DOE PAGES

    Berlin, Asher; Robertson, Denis S.; Solon, Mikhail P.; ...

    2016-05-10

    We apply effective field theory methods to compute bino-nucleon scattering, in the case where tree-level interactions are suppressed and the leading contribution is at loop order via heavy flavor squarks or sleptons. We find that leading log corrections to fixed-order calculations can increase the bino mass reach of direct detection experiments by a factor of 2 in some models. These effects are particularly large for the bino-sbottom coannihilation region, where bino dark matter as heavy as 5–10 TeV may be detected by near future experiments. For the case of stop- and selectron-loop mediated scattering, an experiment reaching the neutrino background will probe thermal binos as heavy as 500 and 300 GeV, respectively. We present three key examples that illustrate in detail the framework for determining weak scale coefficients, and for mapping onto a low-energy theory at hadronic scales, through a sequence of effective theories and renormalization group evolution. For the case of a squark degenerate with the bino, we extend the framework to include a squark degree of freedom at low energies using heavy particle effective theory, thus accounting for large logarithms through a “heavy-light current.” Finally, benchmark predictions for scattering cross sections are evaluated, including complete leading order matching onto quark and gluon operators, and a systematic treatment of perturbative and hadronic uncertainties.

  13. BEHAVIOR ANALYSTS IN THE WAR ON POVERTY: A REVIEW OF THE USE OF FINANCIAL INCENTIVES TO PROMOTE EDUCATION AND EMPLOYMENT

    PubMed Central

    Holtyn, August F.; Jarvis, Brantley P.; Silverman, Kenneth

    2017-01-01

    Poverty is a pervasive risk factor underlying poor health. Many interventions that have sought to reduce health disparities associated with poverty have focused on improving health-related behaviors of low-income adults. Poverty itself could be targeted to improve health, but this approach would require programs that can consistently move poor individuals out of poverty. Governments and other organizations in the United States have tested a diverse range of antipoverty programs, generally on a large scale and in conjunction with welfare reform initiatives. This paper reviews antipoverty programs that used financial incentives to promote education and employment among welfare recipients and other low-income adults. The incentive-based, antipoverty programs had small or no effects on the target behaviors; they were implemented on large scales from the outset, without systematic development and evaluation of their components; and they did not apply principles of operant conditioning that have been shown to determine the effectiveness of incentive or reinforcement interventions. By applying basic principles of operant conditioning, behavior analysts could help address poverty and improve health through development of effective antipoverty programs. This paper describes a potential framework for a behavior-analytic antipoverty program, with the goal of illustrating that behavior analysts could be uniquely suited to make substantial contributions to the war on poverty. PMID:28078664

  14. Behavior analysts in the war on poverty: A review of the use of financial incentives to promote education and employment.

    PubMed

    Holtyn, August F; Jarvis, Brantley P; Silverman, Kenneth

    2017-01-01

    Poverty is a pervasive risk factor underlying poor health. Many interventions that have sought to reduce health disparities associated with poverty have focused on improving health-related behaviors of low-income adults. Poverty itself could be targeted to improve health, but this approach would require programs that can consistently move poor individuals out of poverty. Governments and other organizations in the United States have tested a diverse range of antipoverty programs, generally on a large scale and in conjunction with welfare reform initiatives. This paper reviews antipoverty programs that used financial incentives to promote education and employment among welfare recipients and other low-income adults. The incentive-based, antipoverty programs had small or no effects on the target behaviors; they were implemented on large scales from the outset, without systematic development and evaluation of their components; and they did not apply principles of operant conditioning that have been shown to determine the effectiveness of incentive or reinforcement interventions. By applying basic principles of operant conditioning, behavior analysts could help address poverty and improve health through development of effective antipoverty programs. This paper describes a potential framework for a behavior-analytic antipoverty program, with the goal of illustrating that behavior analysts could be uniquely suited to make substantial contributions to the war on poverty. © 2017 Society for the Experimental Analysis of Behavior.

  15. Disorder in the Disk: The Influence of Accretion Disk Thickness on the Large-scale Magnetic Dynamo.

    NASA Astrophysics Data System (ADS)

    Hogg, J. Drew; Reynolds, Christopher S.

    2018-01-01

    The evolution of the magnetic field from the enigmatic large-scale dynamo is often considered a central feature of the accretion disk around a black hole. The resulting low-frequency oscillations introduced from the growth and decay of the field strength, along with the change in field orientation, are thought to be intimately tied to variability from the disk. Several factors are at play, but the dynamo can either be directly tied to observable signatures through modulation of the heating rate, or indirectly as the source of quasiperiodic oscillations, the driver of nonlinear structure from propagating fluctuations in mass accretion rate, or even the trigger of state transitions. We present a selection of results from a recent study of this process using a suite of four global, high-resolution, MHD accretion disk simulations. We systematically vary the scale height ratio and find the large-scale dynamo fails to develop above a scale height ratio of h/r ≥ 0.2. Using “butterfly” diagrams of the azimuthal magnetic field, we show the large-scale dynamo exists in the thinner accretion disk models, but fails to excite when the scale height ratio is increased, a feature which is also reflected in 2D Fourier transforms. Additionally, we calculate the dynamo α-parameter through correlations in the averaged magnetic field and turbulent electromotive force, and also generate synthetic light curves from the disk cooling. Using our emission proxy, we find the disks have markedly different characters as photometric fluctuations are larger and less ordered when the disk is thicker and the dynamo is absent.
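
    A "butterfly" diagram of the kind described above is typically built by averaging the toroidal field over azimuth and radius, leaving a time–height plane in which dynamo cycles appear as sign-flipping stripes. A sketch on synthetic data (the array layout is an assumption for this sketch, not the simulations' actual output format):

```python
import numpy as np

def butterfly_diagram(b_phi: np.ndarray) -> np.ndarray:
    """Collapse B_phi(t, z, r, phi) onto the (t, z) plane by averaging
    over the radial and azimuthal axes."""
    return b_phi.mean(axis=(2, 3))

# Synthetic field: a large-scale component that flips sign across the
# midplane and oscillates in time, plus small-scale turbulent noise
t = np.linspace(0.0, 4.0 * np.pi, 64)[:, None, None, None]
z = np.linspace(-1.0, 1.0, 32)[None, :, None, None]
noise = 0.1 * np.random.default_rng(1).normal(size=(64, 32, 8, 16))
b_phi = np.sin(t) * np.sign(z) + noise

bf = butterfly_diagram(b_phi)
print(bf.shape)  # → (64, 32)
```

    Plotting `bf` with time on one axis and height on the other reveals the organized sign-flip pattern when a large-scale dynamo is active, and only disordered noise when it is absent.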

  16. The role of the large-scale coronal magnetic field in the eruption of prominence/cavity systems

    NASA Astrophysics Data System (ADS)

    de Toma, G.; Gibson, S. E.; Fan, Y.; Torok, T.

    2013-12-01

    Prominence/cavity systems are large-scale coronal structures that can live for many weeks and even months and often end their life in the form of large coronal eruptions. We investigate the role of the surrounding ambient coronal field in stabilizing these systems against eruption. In particular, we examine the extent to which the decline with height of the external coronal magnetic field influences the evolution of these coronal systems and their likelihood to erupt. We study prominence/cavity systems during the rising phase of cycle 24 in 2010-2013, when a significant number of CMEs were associated with polar crown or large filament eruptions. We use EUV observations from SDO/AIA to identify stable and eruptive coronal cavities, and SDO/HMI magnetograms as boundary conditions to PFSS extrapolation to derive the ambient coronal field. We compute the decay index of the potential field for the two groups and find that systematic differences exist between eruptive and non-eruptive systems.
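
    The decay index referred to above is defined as n = -d ln B / d ln h, measuring how fast the external (strapping) field declines with height. A minimal numerical sketch on a synthetic field profile (not the PFSS output used in the study):

```python
import numpy as np

def decay_index(heights: np.ndarray, b_ext: np.ndarray) -> np.ndarray:
    """Decay index n(h) = -d ln B / d ln h of the external field,
    evaluated numerically on a sampled height profile."""
    return -np.gradient(np.log(b_ext), np.log(heights))

# Synthetic strapping field falling as B ~ h**-2: the index is exactly 2
h = np.linspace(1.0, 3.0, 100)
n = decay_index(h, h ** -2.0)
print(round(float(n.mean()), 3))  # → 2.0
```

    A threshold of roughly n ≳ 1.5 is commonly cited for the torus instability, which is why the rate of decline of the ambient field with height bears on whether a prominence/cavity system erupts.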

  17. Antipsychotic Drug Side Effects for Persons with Intellectual Disability

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Mahan, Sara

    2010-01-01

    Antipsychotic drugs are the most frequently prescribed of the psychotropic drugs among the intellectually disabled (ID) population. Given their widespread use, efforts to systematically assess and report side effects are warranted. Specific scaling methods such as the "Matson Evaluation of Side Effects" ("MEDS"), the "Abnormal Inventory Movement…

  18. Developing a Standardized Letter of Recommendation

    ERIC Educational Resources Information Center

    Walters, Alyssa M.; Kyllonen, Patrick C.; Plante, Janice W.

    2006-01-01

    The Standardized Letter of Recommendation (SLR) is a Web-based admission tool designed to replace traditional, narrative letters of recommendation with a more systematic and equitable source of information about applicants to institutions of higher education. The SLR includes a rating scale and open-ended response space that prompt evaluators to…

  19. Quality of systematic reviews in pediatric oncology--a systematic review.

    PubMed

    Lundh, Andreas; Knijnenburg, Sebastiaan L; Jørgensen, Anders W; van Dalen, Elvira C; Kremer, Leontien C M

    2009-12-01

    To ensure evidence-based decision making in pediatric oncology, systematic reviews are necessary. The objective of our study was to evaluate the methodological quality of all currently existing systematic reviews in pediatric oncology. We identified eligible systematic reviews through a systematic search of the literature. Data on clinical and methodological characteristics of the included systematic reviews were extracted. Their methodological quality was assessed using the Overview Quality Assessment Questionnaire, a validated 10-item quality assessment tool, and the quality of systematic reviews published in regular journals was compared with that of Cochrane systematic reviews. We included 117 systematic reviews: 99 published in regular journals and 18 Cochrane systematic reviews. The average methodological quality was low across all ten items, but the quality of Cochrane systematic reviews was significantly higher than that of systematic reviews published in regular journals. On a 1-7 scale, the median overall quality score for all systematic reviews was 2 (range 1-7), with a score of 1 (range 1-7) for systematic reviews in regular journals compared to 6 (range 3-7) for Cochrane systematic reviews (p<0.001). Most systematic reviews in the field of pediatric oncology seem to have serious methodological flaws leading to a high risk of bias. While Cochrane systematic reviews were of higher methodological quality than systematic reviews in regular journals, some of them also had methodological problems. Therefore, the methodology of each individual systematic review should be scrutinized before accepting its results.

  20. Environmental risks in the developing world: exposure indicators for evaluating interventions, programmes, and policies.

    PubMed

    Ezzati, Majid; Utzinger, Jürg; Cairncross, Sandy; Cohen, Aaron J; Singer, Burton H

    2005-01-01

    Monitoring and empirical evaluation are essential components of evidence based public health policies and programmes. Consequently, there is a growing interest in monitoring of, and indicators for, major environmental health risks, particularly in the developing world. Current large scale data collection efforts are generally disconnected from micro-scale studies in health sciences, which in turn have insufficiently investigated the behavioural and socioeconomic factors that influence exposure. A basic framework is proposed for development of indicators of exposure to environmental health risks that would facilitate the (a) assessment of the health effects of risk factors, (b) design and evaluation of interventions and programmes to deliver the interventions, and (c) appraisal and quantification of inequalities in health effects of risk factors, and benefits of intervention programmes and policies. Specific emphasis is put on the features of environmental risks that should guide the choice of indicators, in particular the interactions of technology, the environment, and human behaviour in determining exposure. The indicators are divided into four categories: (a) access and infrastructure, (b) technology, (c) agents and vectors, and (d) behaviour. The study used water and sanitation, indoor air pollution from solid fuels, urban ambient air pollution, and malaria as illustrative examples for this framework. Organised and systematic indicator selection and monitoring can provide an evidence base for design and implementation of more effective and equitable technological interventions, delivery programmes, and policies for environmental health risks in resource poor settings.

  1. Training for Quality: Improving Early Childhood Programs through Systematic Inservice Training. Monographs of the High/Scope Educational Research Foundation, Number Nine.

    ERIC Educational Resources Information Center

    Epstein, Ann S.

    The Training of Trainers (ToT) Evaluation investigated the efficacy of the High/Scope model for improving the quality of early childhood programs on a national scale. To address this question, the High/Scope Foundation undertook a multimethod evaluation that collected anecdotal records from the consultants and 793 participants in 40 ToT projects,…

  2. Advanced DInSAR analysis for building damage assessment in large urban areas: an application to the city of Roma, Italy

    NASA Astrophysics Data System (ADS)

    D'Aranno, Peppe J. V.; Marsella, Maria; Scifoni, Silvia; Scutti, Marianna; Sonnessa, Alberico; Bonano, Manuela

    2015-10-01

    Remote sensing data play an important role in environmental monitoring because they provide systematic information over very large areas and for long periods of time. Such information must be analyzed, validated and incorporated into proper modeling tools in order to become useful for risk assessment analysis. These approaches have already been applied in the field of natural hazard evaluation (i.e., for monitoring seismic and volcanic areas and landslides). However, not enough attention has been devoted to the development of validated methods for implementing quantitative analysis on civil structures. This work is dedicated to the comprehensive use of the ESA archive of ERS/ENVISAT SAR data to detect deformation trends and perform back-analysis of the investigated structures, which is useful for calibrating the damage assessment models. After this preliminary analysis, SAR data from the new satellite mission (i.e., COSMO-SkyMed) were adopted to monitor the evolution of existing surface deformation processes and to detect new occurrences. The specific objective was to set up a data processing and analysis chain tailored to a service that sustains the safe maintenance of the built-up environment, including critical constructions such as public buildings (schools, hospitals, etc.), strategic infrastructure (dams, highways, etc.) and cultural heritage sites. The analysis of the test area, in the southeastern sector of Roma, has provided three different levels and sub-levels of products, from metropolitan area scale (territorial analysis) and settlement scale (aggregated analysis) to single structure scale (damage degree associated with the structure).

  3. Modeling Atmospheric Transport for Greenhouse Gas Observations within the Urban Dome

    NASA Astrophysics Data System (ADS)

    Nehrkorn, T.; Sargent, M. R.; Wofsy, S. C.

    2016-12-01

    Observations of CO2, CH4, and other greenhouse gases (GHGs) within the urban dome of major cities generally show large enhancements over background values, and large sensitivity to surface fluxes (as measured by the footprints computed by Lagrangian Particle Dispersion Models, LPDMs) within the urban dome. However, their use in top-down inversion studies to constrain urban emission estimates is complicated by difficulties in properly modeling the atmospheric transport. We are conducting experiments with the Weather Research and Forecasting model (WRF) coupled to the STILT LPDM to improve model simulation of atmospheric transport on spatial scales of a few km in urban domains, because errors in transport on short time/space scales are amplified by the patchiness of GHG emissions and may engender systematic errors in simulated concentrations. We are evaluating the quality of the meteorological simulations from model configurations with different resolutions and PBL packages, using both standard and non-standard (lidar PBL height and ACARS aircraft profile) observations. To take into account the effect of building-scale eddies for observations located on top of buildings, we are modifying the basic STILT algorithm for the computation of footprints by replacing the nominal receptor height with an effective sampling height. In addition, the footprint computations for near-field emissions make use of the vertical particle spread within the LPDM to arrive at a more appropriate estimate of mixing heights in the immediate vicinity of receptors. We present the effect of these and similar modifications on simulated concentrations and their level of agreement with observed values.

  4. Lagrangian Flow Network: a new tool to evaluate connectivity and understand the structural complexity of marine populations

    NASA Astrophysics Data System (ADS)

    Rossi, V.; Dubois, M.; Ser-Giacomi, E.; Monroy, P.; Lopez, C.; Hernandez-Garcia, E.

    2016-02-01

    Assessing the spatial structure and dynamics of marine populations is still a major challenge for ecologists. The necessity of managing marine resources from a large-scale perspective, considering the whole ecosystem, is now recognized, but the absence of appropriate tools to address these objectives limits the implementation of globally pertinent conservation planning. Inspired by network theory, we present a new methodological framework, the Lagrangian Flow Network, which allows a systematic characterization of multi-scale dispersal and connectivity of the early life history stages of marine organisms. The network is constructed by subdividing the basin into an ensemble of equal-area subregions which are interconnected through the transport of propagules by ocean currents. The present version allows the identification of hydrodynamical provinces and the computation of various connectivity proxies measuring retention and exchange of larvae. Thanks to this spatial discretization and network representation, as well as the Lagrangian approach, further methodological improvements are readily accessible. These future developments include a parametrization of habitat patchiness, the implementation of realistic larval traits and the consideration of abiotic variables (e.g. temperature, salinity, planktonic resources) and their effects on larval production and survival. While the model can in principle be tuned to any species whose biological traits and ecological preferences are precisely known, it can also be used in a more generic configuration by efficiently computing and analyzing a large number of experiments with relevant ecological parameters. It permits a better characterization of population connectivity at multiple scales and informs its ecological and managerial interpretations.

  5. Evaluating service user pedagogy in UK higher education: Validating the Huddersfield Service User Pedagogy Scale.

    PubMed

    Tobbell, Jane; Boduszek, Daniel; Kola-Palmer, Susanna; Vaughan, Joanne; Hargreaves, Janet

    2018-04-01

    There is global recognition that the inclusion of service users in the education of health and social care students in higher education can lead to more compassionate professional identities which will enable better decision making. However, to date there is no systematic tool to explore learning and service user involvement in the curriculum. To generate and validate a psychometric instrument which will allow educators to evaluate service user pedagogy. Construction and validation of a new scale. 365 undergraduate students from health and social care departments in two universities. A two correlated factor scale. Factor 1 - perceived presence of service users in the taught curriculum and factor 2 - professionals and service users working together (correlation between factor 1 and factor 2 - r = 0.32). The Huddersfield Service User Pedagogy Scale provides a valid instrument for educators to evaluate student learning. In addition, the tool can contribute to student reflections on their shifting professional identities as they progress through their studies. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. [Land use and land cover change (LUCC) and landscape service: Evaluation, mapping and modeling].

    PubMed

    Song, Zhang-jian; Cao, Yu; Tan, Yong-zhong; Chen, Xiao-dong; Chen, Xian-peng

    2015-05-01

    Studies of ecosystem services from a landscape-scale perspective have received increasing attention from researchers all over the world. Compared with the ecosystem scale, the landscape scale is more suitable for exploring the influence of human activities on land use and land cover change (LUCC) and for interpreting the mechanisms and processes of sustainable landscape dynamics. Based on a comprehensive and systematic analysis of research on landscape services, this paper first discusses the basic concepts and classification of landscape services. Methods for the evaluation, mapping and modeling of landscape services are then analyzed and summarized. Finally, future trends for landscape service research are proposed: further exploring the connotation and classification system of landscape services, improving methods and quantitative indicators for their evaluation, mapping and modelling, carrying out long-term integrated research on landscape pattern-process-service-scale relationships, and enhancing the application of theories and methods from landscape economics and landscape ecology are all important fields for future research on landscape services.

  7. Post-trial follow-up methodology in large randomized controlled trials: a systematic review protocol.

    PubMed

    Llewellyn-Bennett, Rebecca; Bowman, Louise; Bulbulia, Richard

    2016-12-15

    Clinical trials typically have a relatively short follow-up period, and may both underestimate potential benefits of the treatments investigated and fail to detect hazards, which can take much longer to emerge. Prolonged follow-up of trial participants after the end of the scheduled trial period can provide important information on both efficacy and safety outcomes. This protocol describes a systematic review to qualitatively compare methods of post-trial follow-up used in large randomized controlled trials. A systematic search of electronic databases and clinical trial registries will use a predefined search strategy. All large (more than 1000 adult participants) randomized controlled trials will be evaluated. Two reviewers will screen and extract data according to this protocol, with the aim of 95% concordance of papers checked; discrepancies will be resolved by a third reviewer. Trial methods, participant retention rates and the prevalence of missing data will be recorded and compared. The potential for bias will be evaluated using the Cochrane Risk of Bias tool (applied to the methods used during the in-trial period), with the aim of investigating whether the quality of the post-trial follow-up methodology might be predicted by the quality of the methods used for the original trial. Post-trial follow-up can provide valuable information about the long-term benefits and hazards of medical interventions. However, it can be logistically challenging and costly. The aim of this systematic review is to describe how trial participants have been followed up post-trial in order to inform future post-trial follow-up designs. Not applicable for PROSPERO registration.

  8. The efficacy of cognitive prosthetic technology for people with memory impairments: a systematic review and meta-analysis.

    PubMed

    Jamieson, Matthew; Cullen, Breda; McGee-Lennon, Marilyn; Brewster, Stephen; Evans, Jonathan J

    2014-01-01

    Technology can compensate for memory impairment. The efficacy of assistive technology for people with memory difficulties and the methodology of selected studies are assessed. A systematic search was performed and all studies that investigated the impact of technology on memory performance for adults with impaired memory resulting from acquired brain injury (ABI) or a degenerative disease were included. Two 10-point scales were used to compare each study to an ideally reported single case experimental design (SCED) study (SCED scale; Tate et al., 2008 ) or randomised control group study (PEDro-P scale; Maher, Sherrington, Herbert, Moseley, & Elkins, 2003 ). Thirty-two SCED (mean = 5.9 on the SCED scale) and 11 group studies (mean = 4.45 on the PEDro-P scale) were found. Baseline and intervention performance for each participant in the SCED studies was re-calculated using non-overlap of all pairs (Parker & Vannest, 2009 ) giving a mean score of 0.85 on a 0 to 1 scale (17 studies, n = 36). A meta-analysis of the efficacy of technology vs. control in seven group studies gave a large effect size (d = 1.27) (n = 147). It was concluded that prosthetic technology can improve performance on everyday tasks requiring memory. There is a specific need for investigations of technology for people with degenerative diseases.
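    The non-overlap of all pairs (NAP) statistic cited in this record (Parker & Vannest, 2009) has a simple definition: the proportion of all baseline-phase/intervention-phase score pairs in which the intervention score exceeds the baseline score, with ties counted as half. A small sketch with made-up scores:

    ```python
    from itertools import product

    def nap(baseline, intervention):
        """Non-overlap of All Pairs: fraction of (baseline, intervention)
        score pairs where the intervention score is higher; ties count 0.5."""
        pairs = list(product(baseline, intervention))
        wins = sum(1.0 for b, t in pairs if t > b)
        ties = sum(0.5 for b, t in pairs if t == b)
        return (wins + ties) / len(pairs)

    # Hypothetical memory-task scores before and after an aid is introduced.
    before = [2, 3, 3, 4]
    after = [5, 6, 4, 7]
    score = nap(before, after)   # → 0.96875
    ```

    A NAP near 1 indicates almost complete separation of the two phases, which is why the mean of 0.85 reported across the re-calculated SCED studies supports the efficacy conclusion.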

  9. The accurate particle tracer code

    NASA Astrophysics Data System (ADS)

    Wang, Yulei; Liu, Jian; Qin, Hong; Yu, Zhi; Yao, Yicun

    2017-11-01

    The Accurate Particle Tracer (APT) code is designed for systematic large-scale applications of geometric algorithms to particle dynamical simulations. Based on a large variety of advanced geometric algorithms, APT possesses long-term numerical accuracy and stability, which are critical for solving multi-scale and nonlinear problems. To provide a flexible and convenient I/O interface, the Lua and HDF5 libraries are used. Following a three-step procedure, users can efficiently extend the libraries of electromagnetic configurations, external non-electromagnetic forces, particle pushers, and initialization approaches through the extendible module. APT has been used in simulations of key physical problems, such as runaway electrons in tokamaks and energetic particles in the Van Allen belt. As an important realization, the APT-SW version has been successfully deployed on the world's fastest computer, the Sunway TaihuLight supercomputer, by supporting the master-slave architecture of the Sunway many-core processors. Based on large-scale simulations of a runaway beam under ITER tokamak parameters, it is revealed that the magnetic ripple field can significantly disperse the pitch-angle distribution and at the same time improve the confinement of the energetic runaway beam.
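    To illustrate the kind of structure-preserving particle pusher such codes rely on, here is the classic Boris scheme, a widely used volume-preserving integrator for charged particles in electromagnetic fields. This is a generic textbook example, not APT's specific algorithm or API:

    ```python
    import numpy as np

    def boris_push(x, v, e_field, b_field, q_over_m, dt):
        """One step of the Boris pusher: half electric kick, magnetic
        rotation, half electric kick, then a position drift."""
        v_minus = v + 0.5 * q_over_m * dt * e_field
        t = 0.5 * q_over_m * dt * b_field
        s = 2.0 * t / (1.0 + np.dot(t, t))
        v_prime = v_minus + np.cross(v_minus, t)
        v_plus = v_minus + np.cross(v_prime, s)
        v_new = v_plus + 0.5 * q_over_m * dt * e_field
        return x + dt * v_new, v_new

    # Gyration in a uniform B field with E = 0: the speed (and hence
    # kinetic energy) is conserved to machine precision over many steps.
    x = np.array([1.0, 0.0, 0.0])
    v = np.array([0.0, 1.0, 0.0])
    e = np.zeros(3)
    b = np.array([0.0, 0.0, 1.0])
    for _ in range(1000):
        x, v = boris_push(x, v, e, b, 1.0, 0.1)
    speed = np.linalg.norm(v)
    ```

    The magnetic step is a pure rotation of the velocity, which is what gives this family of geometric integrators their long-term stability for multi-scale problems.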

  10. Slipping Anchor? Testing the Vignettes Approach to Identification and Correction of Reporting Heterogeneity

    PubMed Central

    d’Uva, Teresa Bago; Lindeboom, Maarten; O’Donnell, Owen; van Doorslaer, Eddy

    2011-01-01

    We propose tests of the two assumptions under which anchoring vignettes identify heterogeneity in reporting of categorical evaluations. Systematic variation in the perceived difference between any two vignette states is sufficient to reject vignette equivalence. Response consistency (the respondent uses the same response scale to evaluate the vignette and herself) is testable given sufficiently comprehensive objective indicators that independently identify response scales. Both assumptions are rejected for reporting of cognitive and physical functioning in a sample of older English individuals, although a weaker test resting on less stringent assumptions does not reject response consistency for cognition. PMID:22184479

  11. Estimating the Effectiveness of Special Education Using Large-Scale Assessment Data

    ERIC Educational Resources Information Center

    Ewing, Katherine Anne

    2009-01-01

    The inclusion of students with disabilities in large scale assessment and accountability programs has provided new opportunities to examine the impact of special education services on student achievement. Hanushek, Kain, and Rivkin (1998, 2002) evaluated the effectiveness of special education programs by examining students' gains on a large-scale…

  12. Evaluating the Performance of the Goddard Multi-Scale Modeling Framework against GPM, TRMM and CloudSat/CALIPSO Products

    NASA Astrophysics Data System (ADS)

    Chern, J. D.; Tao, W. K.; Lang, S. E.; Matsui, T.; Mohr, K. I.

    2014-12-01

    Four six-month (March-August 2014) experiments with the Goddard Multi-scale Modeling Framework (MMF) were performed to study the impacts of different Goddard one-moment bulk microphysical schemes and large-scale forcings on the performance of the MMF. Recently, a new Goddard one-moment bulk microphysics scheme with four ice classes (cloud ice, snow, graupel, and frozen drops/hail) was developed based on cloud-resolving model simulations with large-scale forcings from field campaign observations. The new scheme has been successfully implemented in the MMF, and two MMF experiments were carried out with the new scheme and the old three-ice-class (cloud ice, snow, graupel) scheme. The MMF has global coverage and can rigorously evaluate microphysics performance for different cloud regimes. The results show that the MMF with the new scheme outperformed the old one. The MMF simulations are also strongly affected by the interaction between large-scale and cloud-scale processes. Two MMF sensitivity experiments, with and without nudging large-scale forcings to those of the ERA-Interim reanalysis, were carried out to study the impacts of large-scale forcings. The simulated mean and variability of surface precipitation, cloud types, and cloud properties (cloud amount, hydrometeor vertical profiles, cloud water contents, etc.) in different geographic locations and climate regimes are evaluated against GPM, TRMM, and CloudSat/CALIPSO satellite observations. The Goddard MMF has also been coupled with the Goddard Satellite Data Simulation Unit (G-SDSU), a system with multi-satellite, multi-sensor, and multi-spectrum satellite simulators. The statistics of MMF-simulated radiances and backscattering can be directly compared with satellite observations to assess the strengths and deficiencies of MMF simulations and to provide guidance on how to improve the MMF and its microphysics.

  13. Spatial Variability of Snowpack Properties On Small Slopes

    NASA Astrophysics Data System (ADS)

    Pielmeier, C.; Kronholm, K.; Schneebeli, M.; Schweizer, J.

    The spatial variability of alpine snowpacks is created by a variety of processes such as deposition, wind erosion, sublimation, melting, temperature, radiation and metamorphism of the snow. Spatial variability is thought to strongly control the avalanche initiation and failure propagation processes. Local snowpack measurements are currently the basis for avalanche warning services, and there exist contradicting hypotheses about the spatial continuity of avalanche-active snow layers and interfaces. Very little is known so far about the spatial variability of the snowpack; we have therefore developed a systematic and objective method to measure the spatial variability of snowpack properties and layering and its relation to stability. For complete coverage, the analysis of spatial variability has to entail all scales from mm to km. In this study the small to medium scale spatial variability is investigated, i.e. the range from centimeters to tens of meters. During the winter 2000/2001 we took systematic measurements in lines and grids on a flat snow test field with grid distances from 5 cm to 0.5 m. Furthermore, we measured systematic grids with grid distances between 0.5 m and 2 m in undisturbed flat fields and on small slopes above the tree line at the Choerbschhorn, in the region of Davos, Switzerland. On 13 days we measured the spatial pattern of the snowpack stratigraphy with more than 110 snow micro-penetrometer measurements on slopes and flat fields. Within this measuring grid we placed 1 rutschblock and 12 stuffblock tests to measure the stability of the snowpack. With the large number of measurements we are able to use geostatistical methods to analyse the spatial variability of the snowpack. Typical correlation lengths are calculated from semivariograms. Discerning systematic trends from random spatial variability is analysed using statistical models. Scale dependencies are shown and recurring scaling patterns are outlined.
    The importance of small and medium scale spatial variability for the larger (kilometer) scale spatial variability, as well as for avalanche formation, is discussed. Finally, an outlook on spatial models for snowpack variability is given.
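    The semivariogram analysis described above can be sketched briefly: the empirical semivariogram is the mean of half the squared differences between all point pairs within each separation (lag) bin, and the correlation length is read off where it levels out. The transect data below are synthetic, illustrative values, not the Davos measurements:

    ```python
    import numpy as np

    def empirical_semivariogram(coords, values, bins):
        """Empirical semivariogram: mean of 0.5*(z_i - z_j)^2 over all
        point pairs whose separation distance falls in each lag bin."""
        coords = np.asarray(coords, dtype=float)
        values = np.asarray(values, dtype=float)
        i, j = np.triu_indices(len(values), k=1)
        dists = np.linalg.norm(coords[i] - coords[j], axis=1)
        sqdiff = 0.5 * (values[i] - values[j]) ** 2
        which = np.digitize(dists, bins)
        return np.array([sqdiff[which == k].mean() if np.any(which == k)
                         else np.nan
                         for k in range(1, len(bins))])

    # Hypothetical 1-D transect of penetration-resistance values every 0.5 m.
    coords = np.arange(0, 10, 0.5).reshape(-1, 1)
    values = np.sin(coords[:, 0]) + \
        0.1 * np.random.default_rng(1).standard_normal(20)
    gamma = empirical_semivariogram(
        coords, values, bins=np.array([0.0, 1.0, 2.0, 3.0]))
    ```

    With dense grids such as those in this study, fitting a model (e.g. spherical or exponential) to such an empirical semivariogram yields the typical correlation lengths the authors report.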

  14. Large Scale Processes and Extreme Floods in Brazil

    NASA Astrophysics Data System (ADS)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in these regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced-dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling versus teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. We investigate, for individual sites, the exceedance probability at which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs. large scale).

  15. [Producing know-how and making recommendations for promoting high blood pressure management in Colombia, 1998-2005].

    PubMed

    Ortega-Bolaños, Jesús

    2008-01-01

    To produce know-how and make recommendations concerning the most effective community health-promotion interventions for managing high blood pressure in Colombia, a systematic review was made of the Cochrane, Lilacs, Ovid, Proquest and Pubmed databases, focusing on the most effective community interventions around the world for managing high blood pressure. The following search terms were used: systematic review, community intervention, cost effectiveness, health promotion and high blood pressure. Studies published in Spanish, English and Portuguese were reviewed. The research strategies used were derived from defining the most pertinent methodological approach for involving individual, interpersonal and community levels in developing the project. The systematic review of the literature yielded 1,041 articles: 246 abstracts, 197 articles about educational interventions for preventing and controlling high blood pressure, and 53 articles adopting different approaches regarding informative interventions and communication. Only 11 fully referenced articles from this body of literature fulfilled the levels of evidence and evaluation criteria necessary for producing recommendations. The available evidence concerning effective, culturally suitable programmes for promoting a reduction in these risk factors is limited. Greater evidence regarding community interventions for reducing risk factors is required, directed towards special population groups and adapted to the cultural characteristics of the participating population. This must involve determinants of the social and physical context related to social practices, developed on a large scale within the daily settings in which the subjects and their families live.

  16. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 2. First Year Poststocking Results. Volume VII. A Model for Evaluation of the Response of the Lake Conway, Florida, Ecosystem to Introduction of the White Amur.

    DTIC Science & Technology

    1981-11-01

  17. Systematic high-resolution assessment of global hydropower potential.

    PubMed

    Hoes, Olivier A C; Meijer, Lourens J J; van der Ent, Ruud J; van de Giesen, Nick C

    2017-01-01

    Population growth, increasing energy demand and the depletion of fossil fuel reserves necessitate a search for sustainable alternatives for electricity generation. Hydropower could replace a large part of the contribution of gas and oil to the present energy mix. However, previous high-resolution estimates of hydropower potential have been local, and have yet to be applied on a global scale. This study is the first to formally present a detailed evaluation of the hydropower potential of each location, based on the slope and discharge of each river in the world. The gross theoretical hydropower potential is approximately 52 PWh/year, divided over 11.8 million locations. This 52 PWh/year equals 33% of the annually required energy, while the present energy production by hydropower plants is just 3% of the annually required energy. The results of this study (all potentially interesting locations for hydroelectric power plants) are available online.
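    The per-location gross potential behind such an assessment follows the standard formula P = ρ·g·Q·H, with ρ the water density, g gravity, Q the discharge and H the head (drop in elevation over the reach, i.e. slope times reach length). A minimal sketch with a hypothetical river reach (the study's actual per-river workflow is far more detailed):

    ```python
    RHO_WATER = 1000.0   # density of water, kg/m^3
    G = 9.81             # gravitational acceleration, m/s^2

    def gross_hydropower_w(discharge_m3s, head_m, efficiency=1.0):
        """Gross theoretical hydropower P = rho * g * Q * H in watts;
        an efficiency < 1 would give a technical rather than gross figure."""
        return efficiency * RHO_WATER * G * discharge_m3s * head_m

    # Hypothetical reach: 100 m^3/s discharge dropping 20 m.
    p_watts = gross_hydropower_w(100.0, 20.0)   # 1000*9.81*100*20 = 19.62 MW
    annual_gwh = p_watts * 8760 / 1e9           # watts to GWh per year
    ```

    Summing such estimates over every river segment worldwide is what yields the aggregate figure of roughly 52 PWh/year quoted in the abstract.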

  18. Systematically Ranking the Tightness of Membrane Association for Peripheral Membrane Proteins (PMPs)*

    PubMed Central

    Gao, Liyan; Ge, Haitao; Huang, Xiahe; Liu, Kehui; Zhang, Yuanya; Xu, Wu; Wang, Yingchun

    2015-01-01

    Large-scale quantitative evaluation of the tightness of membrane association for nontransmembrane proteins is important for identifying true peripheral membrane proteins with functional significance. Herein, we simultaneously ranked more than 1000 proteins of the photosynthetic model organism Synechocystis sp. PCC 6803 for their relative tightness of membrane association using a proteomic approach. Using multiple precisely ranked and experimentally verified peripheral subunits of photosynthetic protein complexes as the landmarks, we found that proteins involved in two-component signal transduction systems and transporters are overall tightly associated with the membranes, whereas the associations of ribosomal proteins are much weaker. Moreover, we found that hypothetical proteins containing the same domains generally have similar tightness. This work provided a global view of the structural organization of the membrane proteome with respect to divergent functions, and built the foundation for future investigation of the dynamic membrane proteome reorganization in response to different environmental or internal stimuli. PMID:25505158

  19. Systematic high-resolution assessment of global hydropower potential

    PubMed Central

    van de Giesen, Nick C.

    2017-01-01

    Population growth, increasing energy demand and the depletion of fossil fuel reserves necessitate a search for sustainable alternatives for electricity generation. Hydropower could replace a large part of the contribution of gas and oil to the present energy mix. However, previous high-resolution estimates of hydropower potential have been local, and have yet to be applied on a global scale. This study is the first to formally present a detailed evaluation of the hydropower potential of each location, based on the slope and discharge of each river in the world. The gross theoretical hydropower potential is approximately 52 PWh/year, divided over 11.8 million locations. This 52 PWh/year equals 33% of the annually required energy, while the present energy production by hydropower plants is just 3% of the annually required energy. The results of this study, all potentially interesting locations for hydroelectric power plants, are available online. PMID:28178329
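The percentages quoted in this abstract pin down the implied global energy demand; a back-of-the-envelope check (not part of the original study):

```python
# Back-of-the-envelope check of the figures quoted in the abstract.
gross_potential_pwh = 52.0   # gross theoretical hydropower potential, PWh/year
share_of_demand = 0.33       # stated fraction of annually required energy

implied_demand_pwh = gross_potential_pwh / share_of_demand   # implied global demand
current_hydro_pwh = 0.03 * implied_demand_pwh                # present hydropower output (3%)

print(round(implied_demand_pwh))      # 158 PWh/year implied global demand
print(round(current_hydro_pwh, 1))    # 4.7 PWh/year produced by hydropower today
```

The stated figures are thus internally consistent: current hydropower output is roughly a tenth of the gross theoretical potential.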

  20. Atmospheric energetics in regions of intense convective activity

    NASA Technical Reports Server (NTRS)

    Fuelberg, H. E.

    1977-01-01

    Synoptic-scale budgets of kinetic and total potential energy are computed using 3- and 6-h data at nine times from NASA's fourth Atmospheric Variability Experiment (AVE IV). Two intense squall lines occurred during the period. Energy budgets for areas that enclose regions of intense convection are shown to have systematic changes that relate to the life cycles of the convection. Some of the synoptic-scale energy processes associated with the convection are found to be larger than those observed in the vicinity of mature cyclones. Volumes enclosing intense convection are found to have large values of cross-contour conversion of potential to kinetic energy and large horizontal export of kinetic energy. Although small net vertical transport of kinetic energy is observed, values at individual layers indicate large upward transport. Transfer of kinetic energy from grid to subgrid scales of motion occurs in the volumes. Latent heat release is large in the middle and upper troposphere and is thought to be the cause of the observed cyclic changes in the budget terms. Total potential energy is found to be imported horizontally in the lower half of the atmosphere, transported aloft, and then exported horizontally. Although local changes of kinetic energy and total potential energy are small, interaction between volumes enclosing convection with surrounding larger volumes is quite large.

  1. Friction in debris flows: inferences from large-scale flume experiments

    USGS Publications Warehouse

    Iverson, Richard M.; LaHusen, Richard G.; ,

    1993-01-01

    A recently constructed flume, 95 m long and 2 m wide, permits systematic experimentation with unsteady, nonuniform flows of poorly sorted geological debris. Preliminary experiments with water-saturated mixtures of sand and gravel show that they flow in a manner consistent with Coulomb frictional behavior. The Coulomb flow model of Savage and Hutter (1989, 1991), modified to include quasi-static pore-pressure effects, predicts flow-front velocities and flow depths reasonably well. Moreover, simple scaling analyses show that grain friction, rather than liquid viscosity or grain collisions, probably dominates shear resistance and momentum transport in the experimental flows. The same scaling indicates that grain friction is also important in many natural debris flows.

  2. Development of renormalization group analysis of turbulence

    NASA Technical Reports Server (NTRS)

    Smith, L. M.

    1990-01-01

    The renormalization group (RG) procedure for nonlinear, dissipative systems is now quite standard, and its applications to the problem of hydrodynamic turbulence are becoming well known. In summary, the RG method isolates self-similar behavior and provides a systematic procedure to describe scale-invariant dynamics in terms of large-scale variables only. The parameterization of the small scales in a self-consistent manner has important implications for sub-grid modeling. This paper develops the RG treatment of homogeneous, isotropic turbulence and addresses the meaning and consequences of the epsilon-expansion. The theory is then extended to include a weak mean flow, and application of the RG method to a sequence of models is shown to converge to the Navier-Stokes equations.

  3. Combining states without scale hierarchies with ordered parton showers

    DOE PAGES

    Fischer, Nadine; Prestel, Stefan

    2017-09-12

    Here, we present a parameter-free scheme to combine fixed-order multi-jet results with parton-shower evolution. The scheme produces jet cross sections with leading-order accuracy in the complete phase space of multiple emissions, resumming large logarithms when appropriate, while not arbitrarily enforcing ordering on momentum configurations beyond the reach of the parton-shower evolution equation. This then requires the development of a matrix-element correction scheme for complex phase-spaces including ordering conditions as well as a systematic scale-setting procedure for unordered phase-space points. Our algorithm does not require a merging-scale parameter. We implement the new method in the Vincia framework and compare to LHC data.

  4. Stress tolerance and growth physiology of yeast strains from the Brazilian fuel ethanol industry.

    PubMed

    Della-Bianca, B E; Gombert, A K

    2013-12-01

    Improved biofuels production requires a better understanding of industrial microorganisms. Some wild Saccharomyces cerevisiae strains, isolated from the fuel ethanol industry in Brazil, present exceptional fermentation performance, persistence and prevalence in the harsh industrial environment. Nevertheless, their physiology has not yet been systematically investigated. Here we present a first systematic evaluation of the widely used industrial strains PE-2, CAT-1, BG-1 and JP1, in terms of their tolerance towards process-related stressors. We also analyzed their growth physiology under heat stress. These strains were evaluated in parallel to laboratory and baker's strains. Whereas the industrial strains performed in general better than the laboratory strains under ethanol or acetic acid stresses and on industrial media, high sugar stress was tolerated equally by all strains. Heat and low pH stresses clearly distinguished fuel ethanol strains from the others, indicating that these conditions might be the ones that mostly exert selective pressure on cells in the industrial environment. During shake-flask cultivations using a synthetic medium at 37 °C, industrial strains presented higher ethanol yields on glucose than the laboratory strains, indicating that they could have been selected for this trait: a response to energy-demanding fermentation conditions. These results might be useful to guide future improvements of large-scale fuel ethanol production via engineering of stress tolerance traits in other strains, and eventually also for promoting the use of these fuel ethanol strains in different industrial bioprocesses.

  5. Estimating the coverage of mental health programmes: a systematic review

    PubMed Central

    De Silva, Mary J; Lee, Lucy; Fuhr, Daniela C; Rathod, Sujit; Chisholm, Dan; Schellenberg, Joanna; Patel, Vikram

    2014-01-01

    Background The large treatment gap for people suffering from mental disorders has led to initiatives to scale up mental health services. In order to track progress, estimates of programme coverage, and changes in coverage over time, are needed. Methods Systematic review of mental health programme evaluations that assess coverage, measured either as the proportion of the target population in contact with services (contact coverage) or as the proportion of the target population who receive appropriate and effective care (effective coverage). We performed a search of electronic databases and grey literature up to March 2013 and contacted experts in the field. Methods to estimate the numerator (service utilization) and the denominator (target population) were reviewed to explore methods which could be used in programme evaluations. Results We identified 15 735 unique records of which only seven met the inclusion criteria. All studies reported contact coverage. No study explicitly measured effective coverage, but it was possible to estimate this for one study. In six studies the numerator of coverage, service utilization, was estimated using routine clinical information, whereas one study used a national community survey. The methods for estimating the denominator, the population in need of services, were more varied and included national prevalence surveys, case registers, and estimates from the literature. Conclusions Very few coverage estimates are available. Coverage could be estimated at low cost by combining routine programme data with population prevalence estimates from national surveys. PMID:24760874
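The contact-coverage measure reviewed above is a simple proportion of service users over the population in need; a minimal sketch (the function name and all figures are illustrative, not taken from the review):

```python
def contact_coverage(n_in_contact: int, population: int, prevalence: float) -> float:
    """Proportion of the population in need who are in contact with services."""
    target_population = population * prevalence   # denominator: from a prevalence survey
    return n_in_contact / target_population       # numerator: from routine service data

# Illustrative (made-up) figures: 12,000 service users in a district of
# 1,000,000 people where a national survey estimates 10% prevalence.
print(round(contact_coverage(12_000, 1_000_000, 0.10), 2))   # 0.12
```

This mirrors the review's low-cost recommendation: routine programme data supply the numerator, and a national prevalence estimate supplies the denominator.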

  6. Altering micro-environments to change population health behaviour: towards an evidence base for choice architecture interventions

    PubMed Central

    2013-01-01

    Background The idea that behaviour can be influenced at population level by altering the environments within which people make choices (choice architecture) has gained traction in policy circles. However, empirical evidence to support this idea is limited, especially its application to changing health behaviour. We propose an evidence-based definition and typology of choice architecture interventions that have been implemented within small-scale micro-environments and evaluated for their effects on four key sets of health behaviours: diet, physical activity, alcohol and tobacco use. Discussion We argue that the limitations of the evidence base are due not simply to an absence of evidence, but also to a prior lack of definitional and conceptual clarity concerning applications of choice architecture to public health intervention. This has hampered the potential for systematic assessment of existing evidence. By seeking to address this issue, we demonstrate how our definition and typology have enabled systematic identification and preliminary mapping of a large body of available evidence for the effects of choice architecture interventions. We discuss key implications for further primary research, evidence synthesis and conceptual development to support the design and evaluation of such interventions. Summary This conceptual groundwork provides a foundation for future research to investigate the effectiveness of choice architecture interventions within micro-environments for changing health behaviour. The approach we used may also serve as a template for mapping other under-explored fields of enquiry. PMID:24359583

  7. Large-scale image region documentation for fully automated image biomarker algorithm development and evaluation.

    PubMed

    Reeves, Anthony P; Xie, Yiting; Liu, Shuang

    2017-04-01

    With the advent of fully automated image analysis and modern machine learning methods, there is a need for very large image datasets having documented segmentations for both computer algorithm training and evaluation. This paper presents a method and implementation for facilitating such datasets that addresses the critical issue of size scaling for algorithm validation and evaluation; current evaluation methods that are usually used in academic studies do not scale to large datasets. This method includes protocols for the documentation of many regions in very large image datasets; the documentation may be incrementally updated by new image data and by improved algorithm outcomes. This method has been used for 5 years in the context of chest health biomarkers from low-dose chest CT images that are now being used with increasing frequency in lung cancer screening practice. The lung scans are segmented into over 100 different anatomical regions, and the method has been applied to a dataset of over 20,000 chest CT images. Using this framework, the computer algorithms have been developed to achieve over 90% acceptable image segmentation on the complete dataset.
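The incremental documentation protocol described above can be pictured as a keyed store of best-accepted segmentations per scan and region; the following is a hypothetical sketch (all names and fields are assumptions, not the authors' implementation):

```python
from dataclasses import dataclass

@dataclass
class RegionDoc:
    algorithm: str   # which segmentation algorithm produced this outcome
    version: int     # monotonically increasing outcome version
    accepted: bool   # passed the acceptance (validation) check

# Documentation store keyed by (scan id, anatomical region).
docs: dict[tuple[str, str], RegionDoc] = {}

def update(scan_id: str, region: str, candidate: RegionDoc) -> None:
    """Incremental update: keep the newest accepted outcome,
    never overwrite an entry with a rejected or older one."""
    current = docs.get((scan_id, region))
    if candidate.accepted and (current is None or candidate.version > current.version):
        docs[(scan_id, region)] = candidate

update("scan001", "left_lung", RegionDoc("seg_v2", 2, True))
update("scan001", "left_lung", RegionDoc("seg_v1", 1, True))   # older: ignored
print(docs[("scan001", "left_lung")].algorithm)                # seg_v2
```

The point of the pattern is that documentation only improves over time, which is what lets a dataset of 20,000+ scans be curated incrementally rather than re-reviewed wholesale.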

  8. Conservation of lynx in the United States: A systematic approach to closing critical knowledge gaps [Chapter 17

    Treesearch

    Keith B. Aubry; Leonard F. Ruggiero; John R. Squires; Kevin S. McKelvey; Gary M. Koehler; Steven W. Buskirk; Charles J. Krebs

    2000-01-01

    Large-scale ecological studies and assessments are often implemented only after the focus of study generates substantial social, political, or legal pressure to take action (e.g., Thomas et al. 1990; Ruggiero et al. 1991; FEMAT 1993). In such a funding environment, the coordinated planning of research may suffer as the pressure to produce results escalates. To avoid...

  9. U.S. Regional Aquifer Analysis Program

    NASA Astrophysics Data System (ADS)

    Johnson, Ivan

    As a result of the severe 1976-1978 drought, Congress in 1978 requested that the U.S. Geological Survey (USGS) initiate studies of the nation's aquifers on a regional scale. This continuing USGS project, the Regional Aquifer System Analysis (RASA) Program, consists of systematic studies of the quality and quantity of water in the regional groundwater systems that supply a large part of the nation's water.

  10. Neutrino footprint in large scale structure

    NASA Astrophysics Data System (ADS)

    Garay, Carlos Peña; Verde, Licia; Jimenez, Raul

    2017-03-01

    Recent constrains on the sum of neutrino masses inferred by analyzing cosmological data, show that detecting a non-zero neutrino mass is within reach of forthcoming cosmological surveys. Such a measurement will imply a direct determination of the absolute neutrino mass scale. Physically, the measurement relies on constraining the shape of the matter power spectrum below the neutrino free streaming scale: massive neutrinos erase power at these scales. However, detection of a lack of small-scale power from cosmological data could also be due to a host of other effects. It is therefore of paramount importance to validate neutrinos as the source of power suppression at small scales. We show that, independent on hierarchy, neutrinos always show a footprint on large, linear scales; the exact location and properties are fully specified by the measured power suppression (an astrophysical measurement) and atmospheric neutrinos mass splitting (a neutrino oscillation experiment measurement). This feature cannot be easily mimicked by systematic uncertainties in the cosmological data analysis or modifications in the cosmological model. Therefore the measurement of such a feature, up to 1% relative change in the power spectrum for extreme differences in the mass eigenstates mass ratios, is a smoking gun for confirming the determination of the absolute neutrino mass scale from cosmological observations. It also demonstrates the synergy between astrophysics and particle physics experiments.
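The small-scale power suppression the abstract relies on can be estimated with the standard linear-theory rule of thumb ΔP/P ≈ -8 f_ν (a textbook approximation, not taken from this abstract; the cosmological parameter values below are illustrative):

```python
# Rule-of-thumb linear-theory suppression: Delta P / P ~ -8 * f_nu, with
# f_nu = Omega_nu / Omega_m and Omega_nu * h^2 ~ sum(m_nu) / 93.14 eV.
def power_suppression(sum_mnu_ev: float, omega_m: float = 0.31, h: float = 0.674) -> float:
    omega_nu = sum_mnu_ev / (93.14 * h ** 2)   # neutrino density parameter
    f_nu = omega_nu / omega_m                  # neutrino fraction of matter
    return -8.0 * f_nu

# Minimal sum of masses allowed by oscillations (normal ordering), ~0.06 eV:
print(round(power_suppression(0.06), 3))   # -0.037, i.e. roughly 4% suppression
```

Even the minimal mass sum thus produces a percent-level effect, which is why the authors' 1%-level large-scale feature matters as an independent cross-check.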

  11. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data.

    PubMed

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H

    2012-11-06

    Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven not only by geospatial problems in numerous fields, but also by emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging provides high potential to support image based computer aided diagnosis. One major requirement for this is effective querying of such enormous amounts of data with fast response, which faces two major challenges: the "big data" challenge and high computational complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on-demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for micro-anatomic objects. To reduce query response time, we propose cost-based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce.
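The partition-merge idea can be illustrated with a toy, single-machine sketch (hypothetical code, not the authors' MapReduce implementation): objects are hashed to grid tiles, as a mapper would partition them, and the join is then performed tile by tile. Real systems also replicate objects near tile boundaries so cross-tile neighbors are not missed; this sketch omits that step.

```python
from collections import defaultdict

def tile_of(x: float, y: float, tile_size: float = 10.0) -> tuple[int, int]:
    """Hash a point to its grid tile (the partition key)."""
    return (int(x // tile_size), int(y // tile_size))

def partition(points, tile_size: float = 10.0):
    """Map step: group points by tile so each tile can be joined independently."""
    tiles = defaultdict(list)
    for p in points:
        tiles[tile_of(p[0], p[1], tile_size)].append(p)
    return tiles

def join_nearby(set_a, set_b, radius: float = 1.0):
    """Merge step: find pairs within `radius`, comparing only same-tile objects."""
    tiles_a, tiles_b = partition(set_a), partition(set_b)
    pairs = []
    for tile, pts_a in tiles_a.items():
        for a in pts_a:
            for b in tiles_b.get(tile, []):
                if (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 <= radius ** 2:
                    pairs.append((a, b))
    return pairs

print(join_nearby([(1, 1)], [(1.5, 1.0), (9.0, 9.0)]))   # [((1, 1), (1.5, 1.0))]
```

Because each tile is processed independently, this structure maps directly onto parallel reducers, which is the fit with MapReduce the abstract describes.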

  12. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data

    PubMed Central

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H.

    2013-01-01

    Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven not only by geospatial problems in numerous fields, but also by emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging provides high potential to support image based computer aided diagnosis. One major requirement for this is effective querying of such enormous amounts of data with fast response, which faces two major challenges: the “big data” challenge and high computational complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on-demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for micro-anatomic objects. To reduce query response time, we propose cost-based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce. PMID:24501719

  13. In Situ Cross-Linking of Stimuli-Responsive Hemicellulose Microgels during Spray Drying

    PubMed Central

    2015-01-01

    Chemical cross-linking during spray drying offers the potential for green fabrication of microgels with a rapid stimuli response and good blood compatibility and provides a platform for stimuli-responsive hemicellulose microgels (SRHMGs). The cross-linking reaction occurs rapidly in situ at elevated temperature during spray drying, enabling the production of microgels in a large scale within a few minutes. The SRHMGs with an average size range of ∼1–4 μm contain O-acetyl-galactoglucomannan as a matrix and poly(acrylic acid), aniline pentamer (AP), and iron as functional additives, which are responsive to external changes in pH, electrochemical stimuli, magnetic field, or dual-stimuli. The surface morphologies, chemical compositions, charge, pH, and mechanical properties of these smart microgels were evaluated using scanning electron microscopy, IR, zeta potential measurements, pH evaluation, and quantitative nanomechanical mapping, respectively. Different oxidation states were observed when AP was introduced, as confirmed by UV spectroscopy and cyclic voltammetry. Systematic blood compatibility evaluations revealed that the SRHMGs have good blood compatibility. This bottom-up strategy to synthesize SRHMGs enables a new route to the production of smart microgels for biomedical applications. PMID:25630464

  14. In situ cross-linking of stimuli-responsive hemicellulose microgels during spray drying.

    PubMed

    Zhao, Weifeng; Nugroho, Robertus Wahyu N; Odelius, Karin; Edlund, Ulrica; Zhao, Changsheng; Albertsson, Ann-Christine

    2015-02-25

    Chemical cross-linking during spray drying offers the potential for green fabrication of microgels with a rapid stimuli response and good blood compatibility and provides a platform for stimuli-responsive hemicellulose microgels (SRHMGs). The cross-linking reaction occurs rapidly in situ at elevated temperature during spray drying, enabling the production of microgels in a large scale within a few minutes. The SRHMGs with an average size range of ∼ 1-4 μm contain O-acetyl-galactoglucomannan as a matrix and poly(acrylic acid), aniline pentamer (AP), and iron as functional additives, which are responsive to external changes in pH, electrochemical stimuli, magnetic field, or dual-stimuli. The surface morphologies, chemical compositions, charge, pH, and mechanical properties of these smart microgels were evaluated using scanning electron microscopy, IR, zeta potential measurements, pH evaluation, and quantitative nanomechanical mapping, respectively. Different oxidation states were observed when AP was introduced, as confirmed by UV spectroscopy and cyclic voltammetry. Systematic blood compatibility evaluations revealed that the SRHMGs have good blood compatibility. This bottom-up strategy to synthesize SRHMGs enables a new route to the production of smart microgels for biomedical applications.

  15. Do school based food and nutrition policies improve diet and reduce obesity?

    PubMed

    Jaime, Patricia Constante; Lock, Karen

    2009-01-01

    To review the effectiveness of school food and nutrition policies worldwide in improving the school food environment and students' dietary intake, and in decreasing overweight and obesity. Systematic review of published and unpublished literature up to November 2007 of three categories of nutrition policy: nutrition guidelines, regulation of food and/or beverage availability, and price interventions applied in preschools, primary and secondary schools. 18 studies met the inclusion criteria. Most evidence of effectiveness was found for the impact of both nutrition guidelines and price interventions on intake and availability of food and drinks, with less conclusive research on product regulation. Despite the introduction of school food policies worldwide, few large-scale or national policies have been evaluated, and all included studies were from the USA and Europe. Some current school policies have been effective in improving the food environment and dietary intake in schools, but there is little evaluation of their impact on BMI. As schools have been proposed worldwide as a major setting for tackling childhood obesity, it is essential that future policy evaluations measure the long-term effectiveness of a range of school food policies in tackling both dietary intake and overweight and obesity.

  16. METHODOLOGICAL QUALITY OF ECONOMIC EVALUATIONS ALONGSIDE TRIALS OF KNEE PHYSIOTHERAPY.

    PubMed

    García-Pérez, Lidia; Linertová, Renata; Arvelo-Martín, Alejandro; Guerra-Marrero, Carolina; Martínez-Alberto, Carlos Enrique; Cuéllar-Pompa, Leticia; Escobar, Antonio; Serrano-Aguilar, Pedro

    2017-01-01

    The methodological quality of an economic evaluation performed alongside a clinical trial can be underestimated if the paper does not report key methodological features. This study discusses methodological assessment issues on the example of a systematic review on cost-effectiveness of physiotherapy for knee osteoarthritis. Six economic evaluation studies included in the systematic review and related clinical trials were assessed using the 10-question check-list by Drummond and the Physiotherapy Evidence Database (PEDro) scale. All economic evaluations were performed alongside a clinical trial but the studied interventions were too heterogeneous to be synthesized. Methodological quality of the economic evaluations reported in the papers was not free of drawbacks, and in some cases, it improved when information from the related clinical trial was taken into account. Economic evaluation papers dedicate little space to methodological features of related clinical trials; therefore, the methodological quality can be underestimated if evaluated separately from the trials. Future economic evaluations should follow more strictly the recommendations about methodology and the authors should pay special attention to the quality of reporting.

  17. Antitoxin Treatment of Inhalation Anthrax: A Systematic Review

    PubMed Central

    Huang, Eileen; Pillai, Satish K.; Bower, William A.; Hendricks, Katherine A.; Guarnizo, Julie T.; Hoyle, Jamechia D.; Gorman, Susan E.; Boyer, Anne E.; Quinn, Conrad P.; Meaney-Delman, Dana

    2016-01-01

    Concern about use of anthrax as a bioweapon prompted development of novel anthrax antitoxins for treatment. Clinical guidelines for the treatment of anthrax recommend antitoxin therapy in combination with intravenous antimicrobials; however, a large-scale or mass anthrax incident may exceed antitoxin availability and create a need for judicious antitoxin use. We conducted a systematic review of antitoxin treatment of inhalation anthrax in humans and experimental animals to inform antitoxin recommendations during a large-scale or mass anthrax incident. A comprehensive search of 11 databases and the FDA website was conducted to identify relevant animal studies and human reports: 28 animal studies and 3 human cases were identified. Antitoxin monotherapy at or shortly after symptom onset demonstrates increased survival compared to no treatment in animals. With early treatment, survival did not differ between antimicrobial monotherapy and antimicrobial-antitoxin therapy in nonhuman primates and rabbits. With delayed treatment, antitoxin-antimicrobial treatment increased rabbit survival. Among human cases, addition of antitoxin to combination antimicrobial treatment was associated with survival in 2 of the 3 cases treated. Despite the paucity of human data, limited animal data suggest that adjunctive antitoxin therapy may improve survival. Delayed treatment studies suggest improved survival with combined antitoxin-antimicrobial therapy, although a survival difference compared with antimicrobial therapy alone was not demonstrated statistically. In a mass anthrax incident with limited antitoxin supplies, antitoxin treatment of individuals who have not demonstrated a clinical benefit from antimicrobials, or those who present with more severe illness, may be warranted. Additional pathophysiology studies are needed, and a point-of-care assay correlating toxin levels with clinical status may provide important information to guide antitoxin use during a large-scale anthrax incident. PMID:26690378

  18. Antitoxin Treatment of Inhalation Anthrax: A Systematic Review.

    PubMed

    Huang, Eileen; Pillai, Satish K; Bower, William A; Hendricks, Katherine A; Guarnizo, Julie T; Hoyle, Jamechia D; Gorman, Susan E; Boyer, Anne E; Quinn, Conrad P; Meaney-Delman, Dana

    2015-01-01

    Concern about use of anthrax as a bioweapon prompted development of novel anthrax antitoxins for treatment. Clinical guidelines for the treatment of anthrax recommend antitoxin therapy in combination with intravenous antimicrobials; however, a large-scale or mass anthrax incident may exceed antitoxin availability and create a need for judicious antitoxin use. We conducted a systematic review of antitoxin treatment of inhalation anthrax in humans and experimental animals to inform antitoxin recommendations during a large-scale or mass anthrax incident. A comprehensive search of 11 databases and the FDA website was conducted to identify relevant animal studies and human reports: 28 animal studies and 3 human cases were identified. Antitoxin monotherapy at or shortly after symptom onset demonstrates increased survival compared to no treatment in animals. With early treatment, survival did not differ between antimicrobial monotherapy and antimicrobial-antitoxin therapy in nonhuman primates and rabbits. With delayed treatment, antitoxin-antimicrobial treatment increased rabbit survival. Among human cases, addition of antitoxin to combination antimicrobial treatment was associated with survival in 2 of the 3 cases treated. Despite the paucity of human data, limited animal data suggest that adjunctive antitoxin therapy may improve survival. Delayed treatment studies suggest improved survival with combined antitoxin-antimicrobial therapy, although a survival difference compared with antimicrobial therapy alone was not demonstrated statistically. In a mass anthrax incident with limited antitoxin supplies, antitoxin treatment of individuals who have not demonstrated a clinical benefit from antimicrobials, or those who present with more severe illness, may be warranted. Additional pathophysiology studies are needed, and a point-of-care assay correlating toxin levels with clinical status may provide important information to guide antitoxin use during a large-scale anthrax incident.

  19. Assessing Technical Competence in Surgical Trainees: A Systematic Review.

    PubMed

    Szasz, Peter; Louridas, Marisa; Harris, Kenneth A; Aggarwal, Rajesh; Grantcharov, Teodor P

    2015-06-01

    To systematically examine the literature describing the methods by which technical competence is assessed in surgical trainees. The last decade has witnessed an evolution away from time-based surgical education. In response, governing bodies worldwide have implemented competency-based education paradigms. The definition of competence, however, remains elusive, and the impact of these education initiatives in terms of assessment methods remains unclear. A systematic review examining the methods by which technical competence is assessed was conducted by searching MEDLINE, EMBASE, PsychINFO, and the Cochrane database of systematic reviews. Abstracts of retrieved studies were reviewed and those meeting inclusion criteria were selected for full review. Data were retrieved in a systematic manner, the validity and reliability of the assessment methods were evaluated, and quality was assessed using the Grading of Recommendations Assessment, Development and Evaluation classification. Of the 6814 studies identified, 85 studies involving 2369 surgical residents were included in this review. The methods used to assess technical competence were categorized into 5 groups: Likert scales (37), benchmarks (31), binary outcomes (11), novel tools (4), and surrogate outcomes (2). Their validity and reliability were mostly previously established. The overall Grading of Recommendations Assessment, Development and Evaluation rating was high for the randomized controlled trials and low for the observational studies. The definition of technical competence continues to be debated within the medical literature. The methods used to evaluate technical competence predominantly include instruments that were originally created to assess technical skill. Very few studies identify standard-setting approaches that differentiate competent versus noncompetent performers; subsequently, this has been identified as an area with great research potential.

  20. A Systematic Review on the Existing Screening Pathways for Lynch Syndrome Identification.

    PubMed

    Tognetto, Alessia; Michelazzo, Maria Benedetta; Calabró, Giovanna Elisa; Unim, Brigid; Di Marco, Marco; Ricciardi, Walter; Pastorino, Roberta; Boccia, Stefania

    2017-01-01

    Lynch syndrome (LS) is the most common hereditary colon cancer syndrome, accounting for 3-5% of colorectal cancer (CRC) cases, and it is associated with the development of other cancers. Early detection of individuals with LS is relevant, since they can take advantage of life-saving intensive surveillance. The debate regarding the best screening policy, however, is far from concluded. This prompted us to conduct a systematic review of the existing screening pathways for LS. We performed a systematic search of MEDLINE, ISI Web of Science, and SCOPUS online databases for the existing screening pathways for LS. The eligibility criteria for inclusion in this review required that the studies evaluated a structured and permanent screening pathway for the identification of LS carriers. The effectiveness of the pathways was analyzed in terms of LS detection rate. We identified five eligible studies. All the LS screening pathways started from CRC cases, of which three followed a universal screening approach. Concerning the laboratory procedures, the pathways used immunohistochemistry and/or microsatellite instability testing. If the test results indicated a risk for LS, genetic counseling, performed by a geneticist or a genetic counselor, was required before DNA genetic testing. The overall LS detection rate ranged from 0 to 5.2%. This systematic review reported different existing pathways for the identification of LS patients. Although current clinical guidelines recommend testing all CRC cases to identify LS, the actual implementation of pathways for LS identification has not yet been widely realized. Large-scale screening programs for LS have the potential to reduce morbidity and mortality for CRC, but coordinated efforts in educating all key stakeholders and addressing public needs are still required.

  1. Effect of Gender on the Knowledge of Medicinal Plants: Systematic Review and Meta-Analysis

    PubMed Central

    Torres-Avilez, Wendy; de Medeiros, Patrícia Muniz

    2016-01-01

    Knowledge of medicinal plants is not only one of the main components in the structure of knowledge in local medical systems but also one of the most studied resources. This study uses a systematic review and meta-analysis of a compilation of ethnobiological studies with a medicinal plant component and the variable of gender to evaluate whether there is a gender-based pattern in medicinal plant knowledge on different scales (national, continental, and global). In this study, three types of meta-analysis are conducted on different scales. We detect no significant differences at the global level; women and men have equally rich knowledge. At the national and continental levels, significant differences are observed in both directions (favoring men in some cases and women in others), and a lack of significant difference between the genders is also observed. This finding demonstrates that there is no consistent gender-based pattern of knowledge across scales. PMID:27795730
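
    The abstract describes pooling effect sizes across studies at several scales but does not give the model. A common choice for pooling study-level effects while allowing between-study heterogeneity is the DerSimonian-Laird random-effects estimator; the sketch below is illustrative (the function name and interface are not from the paper):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Pool study-level effect sizes with a random-effects model.

    effects, variances: per-study effect estimates and their sampling
    variances. Returns (pooled_effect, pooled_se, tau2), where tau2 is
    the DerSimonian-Laird estimate of between-study variance.
    """
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                          # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    # Cochran's Q heterogeneity statistic and the tau^2 estimate
    q = np.sum(w * (effects - fixed) ** 2)
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights incorporate the between-study variance
    w_re = 1.0 / (variances + tau2)
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2
```

    When all studies report the same effect, Q is zero, tau2 collapses to zero, and the estimator reduces to the fixed-effect pooled mean.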

  2. Measuring HIV-related stigma among healthcare providers: a systematic review.

    PubMed

    Alexandra Marshall, S; Brewington, Krista M; Kathryn Allison, M; Haynes, Tiffany F; Zaller, Nickolas D

    2017-11-01

    In the United States, HIV-related stigma in the healthcare setting is known to affect the utilization of prevention and treatment services. Multiple HIV/AIDS stigma scales have been developed to assess the attitudes and behaviors of the general population in the U.S. towards people living with HIV/AIDS, but fewer scales have been developed to assess HIV-related stigma among healthcare providers. This systematic review aimed to identify and evaluate the measurement tools used to assess HIV stigma among healthcare providers in the U.S. The five studies selected quantitatively assessed the perceived HIV stigma among healthcare providers from the patient or provider perspective, included HIV stigma as a primary outcome, and were conducted in the U.S. These five studies used adapted forms of four HIV stigma scales. No standardized measure was identified. Assessment of HIV stigma among providers is valuable to better understand how this phenomenon may impact health outcomes and to inform interventions aiming to improve healthcare delivery and utilization.

  3. Development and evaluation of the INSPIRE measure of staff support for personal recovery.

    PubMed

    Williams, Julie; Leamy, Mary; Bird, Victoria; Le Boutillier, Clair; Norton, Sam; Pesola, Francesca; Slade, Mike

    2015-05-01

    No individualised standardised measure of staff support for mental health recovery exists. The aim was to develop and evaluate a measure of staff support for recovery. An initial draft of the measure was based on a systematic review of recovery processes, consultation (n = 61), and piloting (n = 20). Psychometric evaluation comprised three rounds of data collection from mental health service users (n = 92). INSPIRE has two sub-scales. The 20-item Support sub-scale has convergent validity (0.60) and adequate sensitivity to change. Exploratory factor analysis (variance 71.4-85.1 %, Kaiser-Meyer-Olkin 0.65-0.78) and internal consistency (range 0.82-0.85) indicate each recovery domain is adequately assessed. The 7-item Relationship sub-scale has convergent validity 0.69, test-retest reliability 0.75, internal consistency 0.89, a one-factor solution (variance 70.5 %, KMO 0.84) and adequate sensitivity to change. A 5-item Brief INSPIRE was also evaluated. INSPIRE and Brief INSPIRE demonstrate adequate psychometric properties, and can be recommended for research and clinical use.
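
    The internal-consistency figures quoted above (0.82-0.89) are typically Cronbach's alpha. As a reminder of what that statistic computes, here is a minimal sketch (the function name is illustrative; the paper does not provide code):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)
```

    Perfectly parallel items (every respondent scoring identically on all items) yield alpha = 1, while uncorrelated items drive alpha toward zero.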

  4. Viscous decay of nonlinear oscillations of a spherical bubble at large Reynolds number

    NASA Astrophysics Data System (ADS)

    Smith, W. R.; Wang, Q. X.

    2017-08-01

    The long-time viscous decay of large-amplitude bubble oscillations is considered in an incompressible Newtonian fluid, based on the Rayleigh-Plesset equation. At large Reynolds numbers, this is a multi-scaled problem with a short time scale associated with inertial oscillation and a long time scale associated with viscous damping. A multi-scaled perturbation method is thus employed to solve the problem. The leading-order analytical solution of the bubble radius history is obtained to the Rayleigh-Plesset equation in a closed form including both viscous and surface tension effects. Some important formulae are derived including the following: the average energy loss rate of the bubble system during each cycle of oscillation, an explicit formula for the dependence of the oscillation frequency on the energy, and an implicit formula for the amplitude envelope of the bubble radius as a function of the energy. Our theory shows that the energy of the bubble system and the frequency of oscillation do not change on the inertial time scale at leading order, the energy loss rate on the long viscous time scale being inversely proportional to the Reynolds number. These asymptotic predictions remain valid during each cycle of oscillation whether or not compressibility effects are significant. A systematic parametric analysis is carried out using the above formula for the energy of the bubble system, frequency of oscillation, and minimum/maximum bubble radii in terms of the Reynolds number, the dimensionless initial pressure of the bubble gases, and the Weber number. Our results show that the frequency and the decay rate have substantial variations over the lifetime of a decaying oscillation. The results also reveal that large-amplitude bubble oscillations are very sensitive to small changes in the initial conditions through large changes in the phase shift.
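
    For reference, the incompressible Rayleigh-Plesset equation underlying this analysis is, in its standard dimensional form (with R(t) the bubble radius, ρ the liquid density, σ the surface tension, μ the dynamic viscosity, p_B the bubble pressure, and p_∞ the far-field pressure; the paper's own nondimensionalization may differ):

```latex
\rho \left( R \ddot{R} + \tfrac{3}{2} \dot{R}^{2} \right)
  = p_B(t) - p_\infty - \frac{2\sigma}{R} - \frac{4\mu \dot{R}}{R}
```

    The viscous term scales as 1/Re after nondimensionalization, which is why the energy decays only on the long time scale while the oscillation itself is inertial at leading order.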

  5. A Systematic Review of Challenging Behaviors in Children Exposed Prenatally to Substances of Abuse

    ERIC Educational Resources Information Center

    Dixon, Dennis R.; Kurtz, Patricia F.; Chin, Michelle D.

    2008-01-01

    A review of the existing literature on the occurrence of challenging behavior among children with prenatal drug exposure was conducted. While a large number of studies were identified that evaluated various outcomes of prenatal drug exposure, only 37 were found that directly evaluated challenging behaviors. Of the 37 studies, 23 focused on…

  6. An Evaluation Model To Select an Integrated Learning System in a Large, Suburban School District.

    ERIC Educational Resources Information Center

    Curlette, William L.; And Others

    The systematic evaluation process used in Georgia's DeKalb County School System to purchase comprehensive instructional software--an integrated learning system (ILS)--is described, and the decision-making model for selection is presented. Selection and implementation of an ILS were part of an instructional technology plan for the DeKalb schools…

  7. Atmospheric forcing of the upper ocean transport in the Gulf of Mexico: From seasonal to diurnal scales

    NASA Astrophysics Data System (ADS)

    Judt, Falko; Chen, Shuyi S.; Curcic, Milan

    2016-06-01

    The 2010 Deepwater Horizon oil spill in the Gulf of Mexico (GoM) was an environmental disaster, which highlighted the urgent need to predict the transport and dispersion of hydrocarbons. Although the variability of the atmospheric forcing plays a major role in the upper ocean circulation and transport of the pollutants, the air-sea interaction on various time scales is not well understood. This study provides a comprehensive overview of the atmospheric forcing and upper ocean response in the GoM from seasonal to diurnal time scales, using climatologies derived from long-term observations, in situ observations from two field campaigns, and a coupled model. The atmospheric forcing in the GoM is characterized by striking seasonality. In the summer, the time-average large-scale forcing is weak, despite occasional extreme winds associated with hurricanes. In the winter, the atmospheric forcing is much stronger, and dominated by synoptic variability on time scales of 3-7 days associated with winter storms and cold air outbreaks. The diurnal cycle is more pronounced during the summer, when sea breeze circulations affect the coastal regions and nighttime wind maxima occur over the offshore waters. Real-time predictions from a high-resolution atmosphere-wave-ocean coupled model were evaluated for both summer and winter conditions during the Grand LAgrangian Deployment (GLAD) in July-August 2012 and the Surfzone Coastal Oil Pathways Experiment (SCOPE) in November-December 2013. The model generally captured the variability of atmospheric forcing on all scales, but suffered from some systematic errors.

  8. SU-E-J-257: A PCA Model to Predict Adaptive Changes for Head&neck Patients Based On Extraction of Geometric Features From Daily CBCT Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chetvertkov, M; Henry Ford Health System, Detroit, MI; Siddiqui, F

    2015-06-15

    Purpose: Using daily cone beam CTs (CBCTs) to develop principal component analysis (PCA) models of anatomical changes in head and neck (H&N) patients and to assess the possibility of using these prospectively in adaptive radiation therapy (ART). Methods: Planning CT (pCT) images of 4 H&N patients were deformed to model several different systematic changes in patient anatomy during the course of the radiation therapy (RT). A Pinnacle plugin was used to linearly interpolate the systematic change in patient anatomy over the 35-fraction RT course and to generate a set of 35 synthetic CBCTs. Each synthetic CBCT represents the systematic change in patient anatomy for one fraction. Deformation vector fields (DVFs) were acquired between the pCT and the synthetic CBCTs, and random fraction-to-fraction changes were superimposed on the DVFs. A patient-specific PCA model was built using these DVFs containing systematic plus random changes. It was hypothesized that the resulting eigenDVFs (EDVFs) with the largest eigenvalues represent the major anatomical deformations during the course of treatment. Results: For all 4 patients, the PCA model provided different results depending on the type and size of the systematic change in the patient’s body. PCA was more successful in capturing the systematic changes early in the treatment course when these were of a larger scale with respect to the random fraction-to-fraction changes in the patient’s anatomy. For smaller-scale systematic changes, random changes could completely “hide” the systematic change. Conclusion: The leading EDVF from the patient-specific PCA models could tentatively be identified as a major systematic change during treatment if the systematic change is large enough with respect to random fraction-to-fraction changes. Otherwise, the leading EDVF could not represent systematic changes reliably.
    This work is expected to facilitate development of population-based PCA models that can be used to prospectively identify significant anatomical changes early in treatment. This work is supported in part by a grant from Varian Medical Systems, Palo Alto, CA.
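
    The eigenDVF construction described above amounts to a PCA of the flattened per-fraction DVFs. A minimal sketch under that reading (the array layout and names are assumptions, not from the abstract):

```python
import numpy as np

def eigen_dvfs(dvfs, n_modes=3):
    """Patient-specific PCA of deformation vector fields (DVFs).

    dvfs: array of shape (n_fractions, n_voxels * 3), each row a
    flattened DVF from the planning CT to one fraction's CBCT.
    Returns (mean_dvf, eigendvfs, eigenvalues) with eigenDVFs sorted
    by decreasing explained variance.
    """
    dvfs = np.asarray(dvfs, dtype=float)
    mean_dvf = dvfs.mean(axis=0)
    centered = dvfs - mean_dvf
    # SVD of the centered data matrix; rows of vt are the eigenDVFs
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    eigenvalues = s ** 2 / (dvfs.shape[0] - 1)
    return mean_dvf, vt[:n_modes], eigenvalues[:n_modes]
```

    With a dominant systematic deformation plus small random changes, the first eigenvalue is well separated from the rest; when the random changes are comparable in scale, the separation disappears, mirroring the paper's conclusion.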

  9. Analysis and correlation of the test data from an advanced technology rotor system

    NASA Technical Reports Server (NTRS)

    Jepson, D.; Moffitt, R.; Hilzinger, K.; Bissell, J.

    1983-01-01

    Comparisons were made of the performance and blade vibratory loads characteristics for an advanced rotor system as predicted by analysis and as measured in a 1/5 scale model wind tunnel test, a full scale model wind tunnel test and flight test. The accuracy with which the tools available at the various stages of the design/development process (analysis, model tests, etc.) could predict the final characteristics measured on the aircraft was determined. The accuracy of the analyses in predicting the effects of systematic tip planform variations investigated in the full scale wind tunnel test was evaluated.

  10. Challenges in evaluating cancer as a clinical outcome in postapproval studies of drug safety

    PubMed Central

    Pinheiro, Simone P.; Rivera, Donna R.; Graham, David J.; Freedman, Andrew N.; Major, Jacqueline M.; Penberthy, Lynne; Levenson, Mark; Bradley, Marie C.; Wong, Hui-Lee; Ouellet-Hellstrom, Rita

    2017-01-01

    Pharmaceuticals approved in the United States are largely not known human carcinogens. However, cancer signals associated with pharmaceuticals may be hypothesized or arise after product approval. There are many study designs that can be used to evaluate cancer as an outcome in the postapproval setting. Because prospective systematic collection of cancer outcomes from a large number of individuals may be lengthy, expensive, and challenging, leveraging data from large existing databases is an integral approach. Such studies have the capability to evaluate the clinical experience of a large number of individuals, yet there are unique methodological challenges involved in their use to evaluate cancer outcomes. To discuss methodological challenges and potential solutions, the Food and Drug Administration and the National Cancer Institute convened a two-day public meeting in 2014. This commentary summarizes the most salient issues discussed at the meeting. PMID:27663208

  11. Challenges in evaluating cancer as a clinical outcome in postapproval studies of drug safety.

    PubMed

    Pinheiro, Simone P; Rivera, Donna R; Graham, David J; Freedman, Andrew N; Major, Jacqueline M; Penberthy, Lynne; Levenson, Mark; Bradley, Marie C; Wong, Hui-Lee; Ouellet-Hellstrom, Rita

    2016-11-01

    Pharmaceuticals approved in the United States are largely not known human carcinogens. However, cancer signals associated with pharmaceuticals may be hypothesized or arise after product approval. There are many study designs that can be used to evaluate cancer as an outcome in the postapproval setting. Because prospective systematic collection of cancer outcomes from a large number of individuals may be lengthy, expensive, and challenging, leveraging data from large existing databases is an integral approach. Such studies have the capability to evaluate the clinical experience of a large number of individuals, yet there are unique methodological challenges involved in their use to evaluate cancer outcomes. To discuss methodological challenges and potential solutions, the Food and Drug Administration and the National Cancer Institute convened a two-day public meeting in 2014. This commentary summarizes the most salient issues discussed at the meeting. Published by Elsevier Inc.

  12. Nonlinear effects of locally heterogeneous hydraulic conductivity fields on regional stream-aquifer exchanges

    NASA Astrophysics Data System (ADS)

    Zhu, J.; Winter, C. L.; Wang, Z.

    2015-08-01

    Computational experiments are performed to evaluate the effects of locally heterogeneous conductivity fields on regional exchanges of water between stream and aquifer systems in the Middle Heihe River Basin (MHRB) of northwestern China. The effects are found to be nonlinear in the sense that simulated discharges from aquifers to streams are systematically lower than discharges produced by a base model parameterized with relatively coarse effective conductivity. A similar, but weaker, effect is observed for stream leakage. The study is organized around three hypotheses: (H1) small-scale spatial variations of conductivity significantly affect regional exchanges of water between streams and aquifers in river basins, (H2) aggregating small-scale heterogeneities into regional effective parameters systematically biases estimates of stream-aquifer exchanges, and (H3) the biases result from slow-paths in groundwater flow that emerge due to small-scale heterogeneities. The hypotheses are evaluated by comparing stream-aquifer fluxes produced by the base model to fluxes simulated using realizations of the MHRB characterized by local (grid-scale) heterogeneity. Levels of local heterogeneity are manipulated as control variables by adjusting coefficients of variation. All models are implemented using the MODFLOW simulation environment, and the PEST tool is used to calibrate effective conductivities defined over 16 zones within the MHRB. The effective parameters are also used as expected values to develop log-normally distributed conductivity (K) fields on local grid scales. Stream-aquifer exchanges are simulated with K fields at both scales and then compared. Results show that the effects of small-scale heterogeneities significantly influence exchanges with simulations based on local-scale heterogeneities always producing discharges that are less than those produced by the base model. 
Although aquifer heterogeneities are uncorrelated at local scales, they appear to induce coherent slow-paths in groundwater fluxes that in turn reduce aquifer-stream exchanges. Since surface water-groundwater exchanges are critical hydrologic processes in basin-scale water budgets, these results also have implications for water resources management.
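
    The field-generation step described above (log-normally distributed K around a zonal effective value, with the coefficient of variation as the control variable) can be sketched as follows; this is one plausible construction for illustration only, and the actual MODFLOW/PEST workflow is not reproduced:

```python
import numpy as np

def lognormal_k_field(shape, k_eff, cv, seed=0):
    """Grid-scale conductivity realization around a zonal effective value.

    k_eff : effective (geometric-mean) conductivity for the zone
    cv    : coefficient of variation of K, the control variable used
            to manipulate the level of local heterogeneity
    """
    rng = np.random.default_rng(seed)
    # moments of ln K implied by the CV of a lognormal K
    sigma2 = np.log(1.0 + cv ** 2)
    mu = np.log(k_eff)          # geometric mean pinned at k_eff
    return np.exp(rng.normal(mu, np.sqrt(sigma2), size=shape))
```

    Pinning the geometric mean keeps the realization consistent with the calibrated effective parameter while the CV dials the grid-scale heterogeneity up or down.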

  13. Pharyngeal Residue Severity Rating Scales Based on Fiberoptic Endoscopic Evaluation of Swallowing: A Systematic Review.

    PubMed

    Neubauer, Paul D; Hersey, Denise P; Leder, Steven B

    2016-06-01

    Identification of pharyngeal residue severity located in the valleculae and pyriform sinuses has always been a primary goal during fiberoptic endoscopic evaluation of swallowing (FEES). Pharyngeal residue is a clinical sign of potential prandial aspiration, making an accurate description of its severity an important but difficult challenge. A reliable, validated, and generalizable pharyngeal residue severity rating scale for FEES would be beneficial. A systematic review of the published English language literature since 1995 was conducted to determine the quality of existing pharyngeal residue severity rating scales based on FEES. Databases were searched using controlled vocabulary words and synonymous free text words for topics of interest (deglutition disorders, pharyngeal residue, endoscopy, videofluoroscopy, fiberoptic technology, aspiration, etc.) and outcomes of interest (scores, scales, grades, tests, FEES, etc.). Search strategies were adjusted for syntax appropriate for each database/platform. Data sources searched included MEDLINE (OvidSP, 1946 to April week 3, 2015), Embase (OvidSP, 1974 to April 20, 2015), Scopus (Elsevier), and the unindexed material in PubMed (NLM/NIH). Supplementary efforts to identify studies included checking reference lists of articles retrieved. Scales were compared using qualitative properties (sample size, severity definitions, number of raters, and raters' experience and training) and psychometric analyses (randomization, intra- and inter-rater reliability, and construct validity). Seven articles describing pharyngeal residue severity rating scales met inclusion criteria. Six of seven scales had insufficient data to support their use, as evidenced by methodological weaknesses in both qualitative properties and psychometric analyses. 
    There is a need for qualitatively sound and psychometrically reliable, validated, and generalizable pharyngeal residue severity rating scales that are anatomically specific, image-based, and easily learned by both novice and experienced clinicians. Only the Yale Pharyngeal Residue Severity Rating Scale, an anatomically defined and image-based tool, met all qualitative and psychometric criteria necessary for a valid, reliable, and generalizable vallecula and pyriform sinus severity rating scale based on FEES.

  14. Topics in geophysical fluid dynamics: Atmospheric dynamics, dynamo theory, and climate dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghil, M.; Childress, S.

    1987-01-01

    This text is the first study to apply systematically the successive bifurcations approach to complex time-dependent processes in large scale atmospheric dynamics, geomagnetism, and theoretical climate dynamics. The presentation of recent results on planetary-scale phenomena in the earth's atmosphere, ocean, cryosphere, mantle and core provides an integral account of mathematical theory and methods together with physical phenomena and processes. The authors address a number of problems in rapidly developing areas of geophysics, bringing into closer contact the modern tools of nonlinear mathematics and the novel problems of global change in the environment.

  15. The hubble constant.

    PubMed

    Huchra, J P

    1992-04-17

    The Hubble constant is the constant of proportionality between recession velocity and distance in the expanding universe. It is a fundamental property of cosmology that sets both the scale and the expansion age of the universe. It is determined by measuring the distances and recession velocities of galaxies. Despite the development of new techniques for the measurement of galaxy distances, both calibration uncertainties and debates over systematic errors remain. Current determinations still range over nearly a factor of 2; the higher values favored by most local measurements are not consistent with many theories of the origin of large-scale structure and stellar evolution.
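
    The quantities at issue can be made concrete with Hubble's law, v = H0·d, and the corresponding expansion age of order 1/H0. The values below are illustrative only; the abstract notes determinations spanning nearly a factor of 2 (roughly 50-100 km/s/Mpc at the time):

```python
# Hubble's law v = H0 * d, and the expansion age ~ 1/H0.
KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
SECONDS_PER_GYR = 3.1557e16  # seconds in one gigayear

def recession_velocity(h0_km_s_mpc, distance_mpc):
    """Recession velocity in km/s for a galaxy at distance_mpc."""
    return h0_km_s_mpc * distance_mpc

def hubble_time_gyr(h0_km_s_mpc):
    """Expansion age 1/H0 in gigayears (empty-universe approximation)."""
    h0_per_s = h0_km_s_mpc / KM_PER_MPC
    return 1.0 / h0_per_s / SECONDS_PER_GYR
```

    Because the age scales as 1/H0, a factor-of-2 disagreement in H0 translates directly into a factor-of-2 disagreement in the inferred expansion age, which is why the tension with stellar-evolution ages matters.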

  16. Beyond student ratings: peer observation of classroom and clinical teaching.

    PubMed

    Berk, Ronald A; Naumann, Phyllis L; Appling, Susan E

    2004-01-01

    Peer observation of classroom and clinical teaching has received increased attention over the past decade in schools of nursing to augment student ratings of teaching effectiveness. One essential ingredient is the scale used to evaluate performance. A five-step systematic procedure for adapting, writing, and building any peer observation scale is described. The differences between the development of a classroom observation scale and an appraisal scale to observe clinical instructors are examined. Psychometric issues peculiar to observation scales are discussed in terms of content validity, eight types of response bias, and interobserver reliability. The applications of the scales in one school of nursing as part of the triangulation of methods with student ratings and the teaching portfolio are illustrated. Copies of the scales are also provided.

  17. Cosmology from Cosmic Microwave Background and large-scale structure

    NASA Astrophysics Data System (ADS)

    Xu, Yongzhong

    2003-10-01

    This dissertation consists of a series of studies, constituting four published papers, involving the Cosmic Microwave Background and large-scale structure, which help constrain cosmological parameters and potential systematic errors. First, we present a method for comparing and combining maps with different resolutions and beam shapes, and apply it to the Saskatoon, QMAP and COBE/DMR data sets. Although the Saskatoon and QMAP maps detect signal at the 21σ and 40σ levels, respectively, their difference is consistent with pure noise, placing strong limits on possible systematic errors. In particular, we obtain quantitative upper limits on relative calibration and pointing errors. Splitting the combined data by frequency shows similar consistency between the Ka- and Q-bands, placing limits on foreground contamination. The visual agreement between the maps is equally striking. Our combined QMAP+Saskatoon map, nicknamed QMASK, is publicly available at www.hep.upenn.edu/~xuyz/qmask.html together with its 6495 x 6495 noise covariance matrix. This thoroughly tested data set covers a large enough area (648 square degrees, at the time the largest degree-scale map available) to allow a statistical comparison with COBE/DMR, showing good agreement. By band-pass-filtering the QMAP and Saskatoon maps, we are also able to spatially compare them scale-by-scale to check for beam- and pointing-related systematic errors. Using the QMASK map, we then measure the cosmic microwave background (CMB) power spectrum on angular scales ℓ ~ 30-200 (1°-6°), and we test it for non-Gaussianity using morphological statistics known as Minkowski functionals. We conclude that the QMASK map is neither a very typical nor a very exceptional realization of a Gaussian random field. At least about 20% of the 1000 Gaussian Monte Carlo maps differ more than the QMASK map from the mean morphological parameters of the Gaussian fields. 
    Finally, we compute the real-space power spectrum and the redshift-space distortions of galaxies in the 2dF 100k galaxy redshift survey using pseudo-Karhunen-Loève eigenmodes and the stochastic bias formalism. Our results agree well with those published by the 2dFGRS team, and have the added advantage of producing easy-to-interpret uncorrelated minimum-variance measurements of the galaxy-galaxy, galaxy-velocity and velocity-velocity power spectra in 27 k-bands, with narrow and well-behaved window functions in the range 0.01 h/Mpc < k < 0.8 h/Mpc. We find no significant detection of baryonic wiggles. We measure the galaxy-matter correlation coefficient r > 0.4 and the redshift-distortion parameter β = 0.49 ± 0.16 for r = 1.

  18. Direct infusion mass spectrometry metabolomics dataset: a benchmark for data processing and quality control

    PubMed Central

    Kirwan, Jennifer A; Weber, Ralf J M; Broadhurst, David I; Viant, Mark R

    2014-01-01

    Direct-infusion mass spectrometry (DIMS) metabolomics is an important approach for characterising molecular responses of organisms to disease, drugs and the environment. Increasingly large-scale metabolomics studies are being conducted, necessitating improvements in both bioanalytical and computational workflows to maintain data quality. This dataset represents a systematic evaluation of the reproducibility of a multi-batch DIMS metabolomics study of cardiac tissue extracts. It comprises twenty biological samples (cow vs. sheep) that were analysed repeatedly, in 8 batches across 7 days, together with a concurrent set of quality control (QC) samples. Data are presented from each step of the workflow and are available in MetaboLights. The strength of the dataset is that intra- and inter-batch variation can be corrected using QC spectra and the quality of this correction assessed independently using the repeatedly-measured biological samples. Originally designed to test the efficacy of a batch-correction algorithm, it will enable others to evaluate novel data processing algorithms. Furthermore, this dataset serves as a benchmark for DIMS metabolomics, derived using best-practice workflows and rigorous quality assessment. PMID:25977770
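
    The QC-based intra- and inter-batch correction this dataset is designed to test can be sketched in a simplified form as scaling each feature to the batch's QC median (corrections used in practice are usually regression-based; the names and array layout here are assumptions):

```python
import numpy as np

def qc_batch_correct(intensities, batch, is_qc):
    """Simple QC-based batch correction for a DIMS intensity matrix.

    intensities : (n_samples, n_features) peak-intensity matrix
    batch       : (n_samples,) batch label per sample
    is_qc       : (n_samples,) True for pooled-QC injections
    Each feature is rescaled so the QC median is 1.0 within every
    batch, a simplified stand-in for regression-based corrections.
    """
    x = np.asarray(intensities, dtype=float).copy()
    batch = np.asarray(batch)
    is_qc = np.asarray(is_qc, dtype=bool)
    for b in np.unique(batch):
        in_batch = batch == b
        qc_median = np.median(x[in_batch & is_qc], axis=0)
        x[in_batch] /= qc_median
    return x
```

    After correction, a biological sample measured in two batches with a systematic intensity offset between them lands on the same corrected values, which is exactly the property the repeatedly-measured samples in the dataset let one verify independently of the QCs.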

  19. Genome-wide association analysis of secondary imaging phenotypes from the Alzheimer's disease neuroimaging initiative study.

    PubMed

    Zhu, Wensheng; Yuan, Ying; Zhang, Jingwen; Zhou, Fan; Knickmeyer, Rebecca C; Zhu, Hongtu

    2017-02-01

    The aim of this paper is to systematically evaluate a biased sampling issue associated with genome-wide association analysis (GWAS) of imaging phenotypes for most imaging genetic studies, including the Alzheimer's Disease Neuroimaging Initiative (ADNI). Specifically, the original sampling scheme of these imaging genetic studies is primarily the retrospective case-control design, whereas most existing statistical analyses of these studies ignore such sampling scheme by directly correlating imaging phenotypes (called the secondary traits) with genotype. Although it has been well documented in genetic epidemiology that ignoring the case-control sampling scheme can produce highly biased estimates, and subsequently lead to misleading results and suspicious associations, such findings are not well documented in imaging genetics. We use extensive simulations and a large-scale imaging genetic data analysis of the Alzheimer's Disease Neuroimaging Initiative (ADNI) data to evaluate the effects of the case-control sampling scheme on GWAS results based on some standard statistical methods, such as linear regression methods, while comparing it with several advanced statistical methods that appropriately adjust for the case-control sampling scheme. Copyright © 2016 Elsevier Inc. All rights reserved.
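
    One standard way to adjust a secondary-trait regression for case-control ascertainment, of the kind the paper compares against naive linear regression, is inverse-probability weighting by an external estimate of disease prevalence. A minimal sketch (the interface is illustrative, not the paper's method):

```python
import numpy as np

def ipw_regression(phenotype, genotype, is_case, pop_case_rate):
    """Weighted least squares for a secondary (imaging) phenotype.

    Cases are typically oversampled in a case-control study, so each
    subject is weighted by the ratio of the population frequency of
    their disease status to its frequency in the sample. The
    population prevalence pop_case_rate must come from external data.
    Returns the estimated genotype effect on the phenotype.
    """
    y = np.asarray(phenotype, dtype=float)
    g = np.asarray(genotype, dtype=float)
    is_case = np.asarray(is_case, dtype=bool)
    sample_case_rate = is_case.mean()
    w = np.where(is_case,
                 pop_case_rate / sample_case_rate,
                 (1 - pop_case_rate) / (1 - sample_case_rate))
    # weighted least squares via a sqrt(w)-scaled design matrix
    X = np.column_stack([np.ones_like(g), g])
    coefs = np.linalg.lstsq(X * np.sqrt(w)[:, None],
                            y * np.sqrt(w), rcond=None)[0]
    return coefs[1]
```

    When cases are heavily oversampled relative to the population, the unweighted regression mixes the within-case and within-control genotype-phenotype relationships in the wrong proportions; the weights restore the population mixture.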

  20. Converting Data to Knowledge: One District's Experience Using Large-Scale Proficiency Assessment

    ERIC Educational Resources Information Center

    Davin, Kristin J.; Rempert, Tania A.; Hammerand, Amy A.

    2014-01-01

    The present study reports data from a large-scale foreign language proficiency assessment to explore trends across a large urban school district. These data were used in conjunction with data from teacher and student questionnaires to make recommendations for foreign language programs across the district. This evaluation process resulted in…
