Sample records for estimation overview methodology

  1. Methodological approaches in conducting overviews: current state in HTA agencies.

    PubMed

    Pieper, Dawid; Antoine, Sunya-Lee; Morfeld, Jana-Carina; Mathes, Tim; Eikermann, Michaela

    2014-09-01

    Overviews search for reviews rather than for primary studies. They might have the potential to support decision making within a shorter time frame by reducing production time. We aimed to summarize available instructions for authors intending to conduct overviews as well as the currently applied methodology of overviews in international Health Technology Assessment (HTA) agencies. We identified 127 HTA agencies and scanned their websites for methodological handbooks as well as published overviews as HTA reports. Additionally, we contacted HTA agencies by e-mail to retrieve possible unidentified handbooks or other related sources. In total, eight HTA agencies providing methodological support were found. Thirteen HTA agencies were found to have produced overviews since 2007, but only six of them published more than four overviews. Overviews were mostly employed in HTA products related to rapid assessment. Additional searches for primary studies published after the last review are often mentioned in order to update results. Although the interest in overviews is rising, little methodological guidance for the conduct of overviews is provided by HTA agencies. Overviews are of special interest in the context of rapid assessments to support policy-making within a short time frame. Therefore, empirical work on overviews needs to be extended. National strategies and experience should be disclosed and discussed. Copyright © 2013 John Wiley & Sons, Ltd.

  2. Methodological Approaches in Conducting Overviews: Current State in HTA Agencies

    ERIC Educational Resources Information Center

    Pieper, Dawid; Antoine, Sunya-Lee; Morfeld, Jana-Carina; Mathes, Tim; Eikermann, Michaela

    2014-01-01

    Objectives: Overviews search for reviews rather than for primary studies. They might have the potential to support decision making within a shorter time frame by reducing production time. We aimed to summarize available instructions for authors intending to conduct overviews as well as the currently applied methodology of overviews in…

  3. ASSESSMENT OF TOXICANT-INDUCED ALTERATIONS IN OVARIAN STEROIDOGENESIS: A METHODOLOGICAL OVERVIEW

    EPA Science Inventory

    RTD-03-035

    Assessment of Toxicant-induced Alterations in Ovarian Steroidogenesis:
    A Methodological Overview

    Jerome M. Goldman, Susan C. Laws and Ralph L. Cooper

    Abstract

    A variety of methodological approaches have been used for the assessment of tox...

  4. Risk of bias in overviews of reviews: a scoping review of methodological guidance and four-item checklist.

    PubMed

    Ballard, Madeleine; Montgomery, Paul

    2017-03-01

    To assess the conditions under which employing an overview of systematic reviews is likely to lead to a high risk of bias. To synthesise existing guidance concerning overview practice, a scoping review was conducted. Four electronic databases were searched with a pre-specified strategy (PROSPERO 2015:CRD42015027592) ending October 2015. Included studies needed to describe or develop overview methodology. Data were narratively synthesised to delineate areas highlighted as outstanding challenges or where methodological recommendations conflict. Twenty-four papers met the inclusion criteria. There is emerging debate regarding overlapping systematic reviews; systematic review scope; quality of included research; updating; and synthesizing and reporting results. While three functions for overviews have been proposed-identify gaps, explore heterogeneity, summarize evidence-overviews cannot perform the first; are unlikely to achieve the second and third simultaneously; and can only perform the third under specific circumstances. Namely, when identified systematic reviews meet the following four conditions: (1) include primary trials that do not substantially overlap, (2) match overview scope, (3) are of high methodological quality, and (4) are up-to-date. Considering the intended function of proposed overviews with the corresponding methodological conditions may improve the quality of this burgeoning publication type. Copyright © 2017 John Wiley & Sons, Ltd.

  5. Evaluation of AMSTAR to assess the methodological quality of systematic reviews in overviews of reviews of healthcare interventions.

    PubMed

    Pollock, Michelle; Fernandes, Ricardo M; Hartling, Lisa

    2017-03-23

    Overviews of reviews (overviews) compile information from multiple systematic reviews (SRs) to provide a single synthesis of relevant evidence for decision-making. It is recommended that authors assess and report the methodological quality of SRs in overviews-for example, using A MeaSurement Tool to Assess systematic Reviews (AMSTAR). Currently, there is variation in whether and how overview authors assess and report SR quality, and limited guidance is available. Our objectives were to: examine methodological considerations involved in using AMSTAR to assess the quality of Cochrane and non-Cochrane SRs in overviews of healthcare interventions; identify challenges (and develop potential decision rules) when using AMSTAR in overviews; and examine the potential impact of considering methodological quality when making inclusion decisions in overviews. We selected seven overviews of healthcare interventions and included all SRs meeting each overview's inclusion criteria. For each SR, two reviewers independently conducted AMSTAR assessments with consensus and discussed challenges encountered. We also examined the correlation between AMSTAR assessments and SR results/conclusions. Ninety-five SRs were included (30 Cochrane, 65 non-Cochrane). Mean AMSTAR assessments (9.6/11 vs. 5.5/11; p < 0.001) and inter-rater reliability (AC1 statistic: 0.84 vs. 0.69; "almost perfect" vs. "substantial" using the Landis & Koch criteria) were higher for Cochrane compared to non-Cochrane SRs. Four challenges were identified when applying AMSTAR in overviews: the scope of the SRs and overviews often differed; SRs examining similar topics sometimes made different methodological decisions; reporting of non-Cochrane SRs was sometimes poor; and some non-Cochrane SRs included other SRs as well as primary studies. Decision rules were developed to address each challenge. We found no evidence that AMSTAR assessments were correlated with SR results/conclusions. Results indicate that the AMSTAR
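
    As a rough illustration of the inter-rater statistic this record reports, the sketch below computes Gwet's AC1 for two raters scoring dichotomous items (e.g., AMSTAR items judged met or not met). The ratings are invented for demonstration, and only the two-rater binary form of the formula is shown.

    ```python
    # Hypothetical AMSTAR item ratings from two reviewers; 1 = criterion met.
    import numpy as np

    def gwet_ac1(r1, r2):
        r1, r2 = np.asarray(r1), np.asarray(r2)
        pa = np.mean(r1 == r2)                # observed agreement
        pi = (r1.mean() + r2.mean()) / 2      # mean "yes" propensity
        pe = 2 * pi * (1 - pi)                # chance agreement under AC1
        return (pa - pe) / (1 - pe)

    rater1 = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1]   # 11 AMSTAR items, one SR
    rater2 = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1]
    print(f"AC1 = {gwet_ac1(rater1, rater2):.2f}")
    ```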

  6. OVERVIEW OF THE DRAFT METHODOLOGY FOR CONDUCTING BIOLOGICAL EVALUATIONS OF AQUATIC LIFE CRITERIA

    EPA Science Inventory

    This presentation will provide an overview of the draft methodology used in developing biological evaluations of aquatic life criteria specifically addressing aquatic and aquatic-dependent threatened and endangered species.

  7. Methodology to Estimate the Quantity, Composition, and ...

    EPA Pesticide Factsheets

    This report, Methodology to Estimate the Quantity, Composition and Management of Construction and Demolition Debris in the US, was developed to expand access to data on CDD in the US and to support research on CDD and sustainable materials management. Since past US EPA CDD estimates have been limited to building-related CDD, a goal in the development of this methodology was to use data originating from CDD facilities and contractors to better capture the current picture of total CDD management, including materials from roads, bridges and infrastructure.

  8. Methodology for Estimating Total Automotive Manufacturing Costs

    DOT National Transportation Integrated Search

    1983-04-01

    A number of methodologies for estimating manufacturing costs have been developed. This report discusses the different approaches and shows that an approach to estimating manufacturing costs in the automobile industry based on surrogate plants is pref...

  9. A review on human reinstatement studies: an overview and methodological challenges.

    PubMed

    Haaker, Jan; Golkar, Armita; Hermans, Dirk; Lonsdorf, Tina B

    2014-09-01

    In human research, studies of return of fear (ROF) phenomena, and reinstatement in particular, began only a decade ago and recently are more widely used, e.g., as outcome measures for fear/extinction memory manipulations (e.g., reconsolidation). As reinstatement research in humans is still in its infancy, providing an overview of its stability and boundary conditions and summarizing methodological challenges is timely to foster fruitful future research. As a translational endeavor, clarifying the circumstances under which (experimental) reinstatement occurs may offer a first step toward understanding relapse as a clinical phenomenon and pave the way for the development of new pharmacological or behavioral ways to prevent ROF. The current state of research does not yet allow pinpointing these circumstances in detail and we hope this review will aid the research field to advance in this direction. As an introduction, we begin with a synopsis of rodent work on reinstatement and theories that have been proposed to explain the findings. The review however mainly focuses on reinstatement in humans. We first describe details and variations of the experimental setup in reinstatement studies in humans and give a general overview of results. We continue with a compilation of possible experimental boundary conditions and end with the role of individual differences and behavioral and/or pharmacological manipulations. Furthermore, we compile important methodological and design details on the published studies in humans and end with open research questions and some important methodological and design recommendations as a guide for future research. © 2014 Haaker et al.; Published by Cold Spring Harbor Laboratory Press.

  10. A review on human reinstatement studies: an overview and methodological challenges

    PubMed Central

    Haaker, Jan; Golkar, Armita; Hermans, Dirk

    2014-01-01

    In human research, studies of return of fear (ROF) phenomena, and reinstatement in particular, began only a decade ago and recently are more widely used, e.g., as outcome measures for fear/extinction memory manipulations (e.g., reconsolidation). As reinstatement research in humans is still in its infancy, providing an overview of its stability and boundary conditions and summarizing methodological challenges is timely to foster fruitful future research. As a translational endeavor, clarifying the circumstances under which (experimental) reinstatement occurs may offer a first step toward understanding relapse as a clinical phenomenon and pave the way for the development of new pharmacological or behavioral ways to prevent ROF. The current state of research does not yet allow pinpointing these circumstances in detail and we hope this review will aid the research field to advance in this direction. As an introduction, we begin with a synopsis of rodent work on reinstatement and theories that have been proposed to explain the findings. The review however mainly focuses on reinstatement in humans. We first describe details and variations of the experimental setup in reinstatement studies in humans and give a general overview of results. We continue with a compilation of possible experimental boundary conditions and end with the role of individual differences and behavioral and/or pharmacological manipulations. Furthermore, we compile important methodological and design details on the published studies in humans and end with open research questions and some important methodological and design recommendations as a guide for future research. PMID:25128533

  11. New Methodology for Natural Gas Production Estimates

    EIA Publications

    2010-01-01

    A new methodology is implemented with the monthly natural gas production estimates from the EIA-914 survey this month. The estimates, to be released April 29, 2010, include revisions for all of 2009. The fundamental changes in the new process include the timeliness of the historical data used for estimation and the frequency of sample updates, both of which are improved.

  12. A regressive methodology for estimating missing data in rainfall daily time series

    NASA Astrophysics Data System (ADS)

    Barca, E.; Passarella, G.

    2009-04-01

    the multivariate approach. Another approach follows the paradigm of "multiple imputation" (Rubin, 1987; Rubin, 1988), which consists in using a set of "similar stations" instead of only the most similar one. In this way, a sort of estimation range can be determined, allowing the introduction of uncertainty. Finally, time series can be grouped on the basis of monthly rainfall rates defining classes of wetness (i.e., dry, moderately rainy and rainy), in order to achieve the estimation using homogeneous data subsets. We expect that integrating these enhancements into the methodology will improve its reliability. The methodology was applied to the daily rainfall time series data registered in the Candelaro River Basin (Apulia - South Italy) from 1970 to 2001. REFERENCES: D.B. Rubin, 1976. Inference and Missing Data. Biometrika 63, 581-592. D.B. Rubin, 1987. Multiple Imputation for Nonresponse in Surveys. New York: John Wiley & Sons, Inc. D.B. Rubin, 1988. An overview of multiple imputation. In Survey Research Section, pp. 79-84, American Statistical Association. J.L. Schafer, 1997. Analysis of Incomplete Multivariate Data. Chapman & Hall. J. Scheffer, 2002. Dealing with Missing Data. Res. Lett. Inf. Math. Sci. 3, 153-160. Available online at http://www.massey.ac.nz/~wwiims/research/letters/ H. Theil, 1950. A rank-invariant method of linear and polynomial regression analysis. Indagationes Mathematicae, 12, pp. 85-91.
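
    The record above cites Theil's (1950) rank-invariant regression as a basis for infilling. The sketch below shows one plausible single-reference-station variant: fit a Theil-Sen line between the target gauge and its most similar neighbour, then fill gaps from the fitted line. All data are synthetic and the station pairing is an assumption, not the authors' exact procedure.

    ```python
    import numpy as np

    def theil_sen_fit(x, y):
        """Slope as the median of all pairwise slopes (Theil 1950); median intercept."""
        slopes = [(y[j] - y[i]) / (x[j] - x[i])
                  for i in range(len(x)) for j in range(i + 1, len(x)) if x[j] != x[i]]
        slope = np.median(slopes)
        return slope, np.median(y - slope * x)

    def impute_missing(target, reference):
        """Fill NaNs in `target` from `reference` via the fitted rank-invariant line."""
        both = ~np.isnan(target) & ~np.isnan(reference)
        slope, intercept = theil_sen_fit(reference[both], target[both])
        filled = target.copy()
        gaps = np.isnan(target) & ~np.isnan(reference)
        filled[gaps] = np.clip(slope * reference[gaps] + intercept, 0.0, None)
        return filled

    # Synthetic daily totals (mm); NaN marks gaps at the target gauge.
    rng = np.random.default_rng(0)
    ref = rng.gamma(0.6, 8.0, size=365)
    tgt = np.clip(0.9 * ref + rng.normal(0.0, 1.0, size=365), 0.0, None)
    tgt[[10, 50, 200]] = np.nan
    print(impute_missing(tgt, ref)[[10, 50, 200]])
    ```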

  13. VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.

    PubMed

    Little, Todd D; Wang, Eugene W; Gorrall, Britt K

    2017-06-01

    This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.

  14. Effect of body composition methodology on heritability estimation of body fatness

    USDA-ARS?s Scientific Manuscript database

    Heritability estimates of human body fatness vary widely and the contribution of body composition methodology to this variability is unknown. The effect of body composition methodology on estimations of genetic and environmental contributions to body fatness variation was examined in 78 adult male ...

  15. Overview of T.E.S.T. (Toxicity Estimation Software Tool)

    EPA Science Inventory

    This talk provides an overview of T.E.S.T. (Toxicity Estimation Software Tool). T.E.S.T. predicts toxicity values and physical properties using a variety of different QSAR (quantitative structure activity relationship) approaches including hierarchical clustering, group contribut...

  16. New Methodology for Estimating Fuel Economy by Vehicle Class

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling

    2011-01-01

    Office of Highway Policy Information to develop a new methodology to generate annual estimates of average fuel efficiency and number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology developed under this study takes a two-step approach. First, the preliminary fuel efficiency rates are estimated based on vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models and match the VMT information for each vehicle class and the reported total fuel consumption. This reconciliation model utilizes a systematic approach that produces documentable and reproducible results. The basic framework utilizes a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumptions for different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year Highway Statistics. The results generated from this new approach provide a smoother time series for the fuel economies by vehicle class. It also utilizes the most up-to-date and best available data with sound econometric models to generate MPG estimates by vehicle class.
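
    A minimal sketch of the reconciliation step described in this record, under assumed numbers: adjust the preliminary per-class fuel economies as little as possible while forcing implied fuel use (VMT divided by MPG) to sum to the published total. The class figures and total below are illustrative, not actual Highway Statistics values.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    prior_mpg = np.array([22.0, 17.5, 6.5])    # prior-year MPG by vehicle class
    vmt = np.array([1.6e12, 6.0e11, 1.8e11])   # annual vehicle-miles travelled
    total_fuel = 1.35e11                        # gallons (stand-in for Table MF-21)

    def deviation(mpg):
        """Penalize relative deviation from the vehicle stock model estimates."""
        return np.sum(((mpg - prior_mpg) / prior_mpg) ** 2)

    fuel_balance = {"type": "eq",
                    "fun": lambda mpg: np.sum(vmt / mpg) - total_fuel}

    res = minimize(deviation, prior_mpg, method="SLSQP",
                   constraints=[fuel_balance], bounds=[(1.0, 60.0)] * 3)
    print("reconciled MPG by class:", np.round(res.x, 2))
    ```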

  17. The ECCO Family of State Estimates: An Overview

    NASA Astrophysics Data System (ADS)

    Wunsch, C.

    2008-12-01

    The idea of ECCO (Estimating the Circulation and Climate of the Ocean) originated in the middle 1980s, when it became apparent that a global oceanographic observing system for the general circulation would become a reality, as it did through the World Ocean Circulation Experiment. Observational design involved extremely diverse technologies and oceanic flow regimes. To be physically interpretable, these diverse data and physical processes would need to be combined into a useful, coherent whole. Such a synthesis can only be done with a skillful GCM having useful resolution. ECCO originated as an experiment to demonstrate the technical feasibility of such a synthesis and to determine if any of several possible methods was preferable. In contrast to a number of other superficially similar efforts, mainly derived from weather forecasting methods, the ECCO goal was to estimate the long-term circulation mean and its variability on climate (decadal and longer) time scales in a form exactly satisfying known equations of motion. ECCO was made feasible with the simultaneous construction of a new GCM (MIT) along with the development of an automatic differentiation (AD) software tool (now called TAF) which rendered practical the method of Lagrange multipliers (called the adjoint method in oceanography). Parallel developments of simplified sequential methods (smoothers) provided an alternative, also practical, methodology. One can now use the existing (publicly available) machinery to discuss the ocean circulation and its variability. The huge variety of issues connected with the global circulation has meant that an entire family of estimates has grown up, each having different emphases (primarily global, but some primarily regional---the tropics, the Southern Ocean; some focussed on physics---the role of eddies or sea ice). The methodology leads, usefully, to intense scrutiny of data and model errors and spatio-temporal coverage. As with any estimation problem, no uniquely

  18. Deciphering the complex: methodological overview of statistical models to derive OMICS-based biomarkers.

    PubMed

    Chadeau-Hyam, Marc; Campanella, Gianluca; Jombart, Thibaut; Bottolo, Leonardo; Portengen, Lutzen; Vineis, Paolo; Liquet, Benoit; Vermeulen, Roel C H

    2013-08-01

    Recent technological advances in molecular biology have given rise to numerous large-scale datasets whose analysis imposes serious methodological challenges mainly relating to the size and complex structure of the data. Considerable experience in analyzing such data has been gained over the past decade, mainly in genetics, from the Genome-Wide Association Study era, and more recently in transcriptomics and metabolomics. Building upon the corresponding literature, we provide here a nontechnical overview of well-established methods used to analyze OMICS data within three main types of regression-based approaches: univariate models including multiple testing correction strategies, dimension reduction techniques, and variable selection models. Our methodological description focuses on methods for which ready-to-use implementations are available. We describe the main underlying assumptions, the main features, and advantages and limitations of each of the models. This descriptive summary constitutes a useful tool for driving methodological choices while analyzing OMICS data, especially in environmental epidemiology, where the emergence of the exposome concept clearly calls for unified methods to analyze marginally and jointly complex exposure and OMICS datasets. Copyright © 2013 Wiley Periodicals, Inc.

  19. A new methodology for estimating nuclear casualties as a function of time.

    PubMed

    Zirkle, Robert A; Walsh, Terri J; Disraelly, Deena S; Curling, Carl A

    2011-09-01

    The Human Response Injury Profile (HRIP) nuclear methodology provides an estimate of casualties occurring as a consequence of nuclear attacks against military targets for planning purposes. The approach develops user-defined, time-based casualty and fatality estimates based on progressions of underlying symptoms and their severity changes over time. This paper provides a description of the HRIP nuclear methodology and its development, including inputs, human response and the casualty estimation process.

  20. Overview of systematic reviews of therapeutic ranges: methodologies and recommendations for practice.

    PubMed

    Cooney, Lewis; Loke, Yoon K; Golder, Su; Kirkham, Jamie; Jorgensen, Andrea; Sinha, Ian; Hawcutt, Daniel

    2017-06-02

    Many medicines are dosed to achieve a particular therapeutic range, and monitored using therapeutic drug monitoring (TDM). The evidence base for a therapeutic range can be evaluated using systematic reviews, to ensure it continues to reflect current indications, doses, routes and formulations, as well as updated adverse effect data. There is no consensus on the optimal methodology for systematic reviews of therapeutic ranges. An overview of systematic reviews of therapeutic ranges was undertaken. The following databases were used: Cochrane Database of Systematic Reviews (CDSR), Database of Abstracts and Reviews of Effects (DARE) and MEDLINE. The published methodologies used when systematically reviewing the therapeutic range of a drug were analyzed. Step-by-step recommendations to optimize such systematic reviews are proposed. Ten systematic reviews that investigated the correlation between serum concentrations and clinical outcomes encompassing a variety of medicines and indications were assessed. There were significant variations in the methodologies used (including the search terms used, data extraction methods, assessment of bias, and statistical analyses undertaken). Therapeutic ranges should be population and indication specific and based on clinically relevant outcomes. Recommendations for future systematic reviews based on these findings have been developed. Evidence based therapeutic ranges have the potential to improve TDM practice. Current systematic reviews investigating therapeutic ranges have highly variable methodologies and there is no consensus on best practice when undertaking systematic reviews in this field. These recommendations meet a need not addressed by standard protocols.

  1. CONCEPTUAL DESIGNS FOR A NEW HIGHWAY VEHICLE EMISSIONS ESTIMATION METHODOLOGY

    EPA Science Inventory

    The report discusses six conceptual designs for a new highway vehicle emissions estimation methodology and summarizes the recommendations of each design for improving the emissions and activity factors in the emissions estimation process. The complete design reports are included a...

  2. Methodology for estimating helicopter performance and weights using limited data

    NASA Technical Reports Server (NTRS)

    Baserga, Claudio; Ingalls, Charles; Lee, Henry; Peyran, Richard

    1990-01-01

    Methodology is developed and described for estimating the flight performance and weights of a helicopter for which limited data are available. The methodology is based on assumptions which couple knowledge of the technology of the helicopter under study with detailed data from well documented helicopters thought to be of similar technology. The approach, analysis assumptions, technology modeling, and the use of reference helicopter data are discussed. Application of the methodology is illustrated with an investigation of the Agusta A129 Mangusta.

  3. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  4. Expanded uncertainty estimation methodology in determining the sandy soils filtration coefficient

    NASA Astrophysics Data System (ADS)

    Rusanova, A. D.; Malaja, L. D.; Ivanov, R. N.; Gruzin, A. V.; Shalaj, V. V.

    2018-04-01

    A methodology for estimating the combined standard uncertainty in determining the sandy soils filtration coefficient has been developed. Laboratory research was carried out, resulting in determination of the filtration coefficient and an estimate of its combined uncertainty.

  5. A hierarchical clustering methodology for the estimation of toxicity.

    PubMed

    Martin, Todd M; Harten, Paul; Venkatapathy, Raghuraman; Das, Shashikala; Young, Douglas M

    2008-01-01

    A quantitative structure-activity relationship (QSAR) methodology based on hierarchical clustering was developed to predict toxicological endpoints. This methodology utilizes Ward's method to divide a training set into a series of structurally similar clusters. The structural similarity is defined in terms of 2-D physicochemical descriptors (such as connectivity and E-state indices). A genetic algorithm-based technique is used to generate statistically valid QSAR models for each cluster (using the pool of descriptors described above). The toxicity for a given query compound is estimated using the weighted average of the predictions from the closest cluster from each step in the hierarchical clustering, assuming that the compound is within the domain of applicability of the cluster. The hierarchical clustering methodology was tested using a Tetrahymena pyriformis acute toxicity data set containing 644 chemicals in the training set and with two prediction sets containing 339 and 110 chemicals. The results from the hierarchical clustering methodology were compared to the results from several different QSAR methodologies.
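
    A compact sketch of the general scheme this record describes (not the authors' T.E.S.T. implementation): Ward clustering of training-set descriptors, one regression model per cluster, and prediction from the nearest cluster. The descriptors and endpoint below are random placeholders, and nearest-centroid prediction stands in for the paper's weighted average over closest clusters.

    ```python
    import numpy as np
    from sklearn.cluster import AgglomerativeClustering
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    X_train = rng.normal(size=(200, 8))        # stand-in 2-D descriptors
    y_train = X_train @ rng.normal(size=8) + rng.normal(scale=0.1, size=200)

    labels = AgglomerativeClustering(n_clusters=5, linkage="ward").fit_predict(X_train)
    models, centroids = {}, {}
    for k in range(5):
        members = labels == k
        models[k] = LinearRegression().fit(X_train[members], y_train[members])
        centroids[k] = X_train[members].mean(axis=0)

    def predict_toxicity(x):
        # Nearest-centroid stand-in for the paper's weighted average over clusters.
        k = min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
        return models[k].predict(x.reshape(1, -1))[0]

    print(predict_toxicity(rng.normal(size=8)))
    ```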

  6. General multiyear aggregation technology: Methodology and software documentation. [estimating seasonal crop acreage proportions

    NASA Technical Reports Server (NTRS)

    Baker, T. C. (Principal Investigator)

    1982-01-01

    A general methodology is presented for estimating a stratum's at-harvest crop acreage proportion for a given crop year (target year) from the crop's estimated acreage proportion for sample segments from within the stratum. Sample segments from crop years other than the target year are (usually) required for use in conjunction with those from the target year. In addition, the stratum's (identifiable) crop acreage proportion may be estimated for times other than at-harvest in some situations. A by-product of the procedure is a methodology for estimating the change in the stratum's at-harvest crop acreage proportion from crop year to crop year. An implementation of the proposed procedure as a Statistical Analysis System (SAS) routine using the system's matrix language module, PROC MATRIX, is described and documented. Three examples illustrating use of the methodology and algorithm are provided.

  7. Methodologies for Estimating Cumulative Human Exposures to Current-Use Pyrethroid Pesticides

    EPA Science Inventory

    We estimated cumulative residential pesticide exposures for a group of nine young children (4–6 years) using three different methodologies developed by the US Environmental Protection Agency and compared the results with estimates derived from measured urinary metabolite concentr...

  8. [Methodologies for estimating the indirect costs of traffic accidents].

    PubMed

    Carozzi, Soledad; Elorza, María Eugenia; Moscoso, Nebel Silvana; Ripari, Nadia Vanina

    2017-01-01

    Traffic accidents generate multiple costs to society, including those associated with the loss of productivity. However, there is no consensus about the most appropriate methodology for estimating those costs. The aim of this study was to review methods for estimating indirect costs applied in crash cost studies. A thematic review of the literature was carried out between 1995 and 2012 in PubMed with the terms cost of illness, indirect cost, road traffic injuries, productivity loss. For the assessment of costs we used the human capital method, on the basis of the wage income lost during the time of treatment and recovery of patients and caregivers. In the case of premature death or total disability, the discount rate was applied to obtain the present value of lost future earnings. The years counted were obtained by subtracting from life expectancy at birth the average age of those affected who had not yet entered economically active life. The interest in minimizing the problem is reflected in the evolution of the implemented methodologies. We expect that this review will be useful for efficiently estimating the real indirect costs of traffic accidents.

  9. Effect of Body Composition Methodology on Heritability Estimation of Body Fatness

    PubMed Central

    Elder, Sonya J.; Roberts, Susan B.; McCrory, Megan A.; Das, Sai Krupa; Fuss, Paul J.; Pittas, Anastassios G.; Greenberg, Andrew S.; Heymsfield, Steven B.; Dawson-Hughes, Bess; Bouchard, Thomas J.; Saltzman, Edward; Neale, Michael C.

    2014-01-01

    Heritability estimates of human body fatness vary widely and the contribution of body composition methodology to this variability is unknown. The effect of body composition methodology on estimations of genetic and environmental contributions to body fatness variation was examined in 78 adult male and female monozygotic twin pairs reared apart or together. Body composition was assessed by six methods – body mass index (BMI), dual energy x-ray absorptiometry (DXA), underwater weighing (UWW), total body water (TBW), bioelectric impedance (BIA), and skinfold thickness. Body fatness was expressed as percent body fat, fat mass, and fat mass/height2 to assess the effect of body fatness expression on heritability estimates. Model-fitting multivariate analyses were used to assess the genetic and environmental components of variance. Mean BMI was 24.5 kg/m2 (range of 17.8–43.4 kg/m2). There was a significant effect of body composition methodology (p<0.001) on heritability estimates, with UWW giving the highest estimate (69%) and BIA giving the lowest estimate (47%) for fat mass/height2. Expression of body fatness as percent body fat resulted in significantly higher heritability estimates (on average 10.3% higher) compared to expression as fat mass/height2 (p=0.015). DXA and TBW methods expressing body fatness as fat mass/height2 gave the least biased heritability assessments, based on the small contribution of specific genetic factors to their genetic variance. A model combining DXA and TBW methods resulted in a relatively low FM/ht2 heritability estimate of 60%, and significant contributions of common and unique environmental factors (22% and 18%, respectively). The body fatness heritability estimate of 60% indicates a smaller contribution of genetic variance to total variance than many previous studies using less powerful research designs have indicated. The results also highlight the importance of environmental factors and possibly genotype by environmental

  10. Breastfeeding and Postpartum Depression: An Overview and Methodological Recommendations for Future Research

    PubMed Central

    Pope, Carley J.; Mazmanian, Dwight

    2016-01-01

    Emerging research suggests that a relationship exists between breastfeeding and postpartum depression; however, the direction and precise nature of this relationship are not yet clear. The purpose of this paper is to provide an overview of the relationship between breastfeeding and postpartum depression as it has been examined in the empirical literature. The potential mechanisms of action that have been implicated in this relationship are also explored. PubMed and PsycINFO were searched using the keywords: breastfeeding with postpartum depression, perinatal depression, postnatal depression. Results of this search showed that researchers have examined this relationship in diverse ways using diverse methodology. In particular, researchers have examined the relationships between postpartum depression and breastfeeding intention, initiation, duration, and dose. Due to a number of methodological differences among past studies we make some recommendations for future research that will better facilitate an integration of findings. Future research should (1) use standardized assessment protocols; (2) confirm diagnosis through established clinical interview when possible; (3) provide a clear operationalized definition for breastfeeding variables; (4) clearly define the postpartum period interval assessed and time frame for onset of symptoms; (5) be prospective or longitudinal in nature; and (6) take into consideration other potential risk factors identified in the empirical literature. PMID:27148457

  11. Methodology for estimating human perception to tremors in high-rise buildings

    NASA Astrophysics Data System (ADS)

    Du, Wenqi; Goh, Key Seng; Pan, Tso-Chien

    2017-07-01

    Human perception to tremors during earthquakes in high-rise buildings is usually associated with psychological discomfort such as fear and anxiety. This paper presents a methodology for estimating the level of perception to tremors for occupants living in high-rise buildings subjected to ground motion excitations. Unlike other approaches based on empirical or historical data, the proposed methodology performs a regression analysis using the analytical results of two generic models of 15 and 30 stories. The recorded ground motions in Singapore are collected and modified for structural response analyses. Simple predictive models are then developed to estimate the perception level to tremors based on a proposed ground motion intensity parameter—the average response spectrum intensity in the period range between 0.1 and 2.0 s. These models can be used to predict the percentage of occupants in high-rise buildings who may perceive the tremors at a given ground motion intensity. Furthermore, the models are validated with two recent tremor events reportedly felt in Singapore. It is found that the estimated results match reasonably well with the reports in the local newspapers and from the authorities. The proposed methodology is applicable to urban regions where people living in high-rise buildings might feel tremors during earthquakes.
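
    The intensity parameter named in this record, the average response spectrum intensity over the 0.1-2.0 s period range, reduces to a simple integral of the spectrum divided by the period range. A sketch under assumed (synthetic) spectral values:

    ```python
    import numpy as np

    periods = np.linspace(0.05, 4.0, 80)                  # period grid (s)
    sa = 0.08 * np.exp(-(np.log(periods / 0.5) ** 2))     # spectral accel. (g), synthetic

    def avg_spectrum_intensity(periods, sa, t_min=0.1, t_max=2.0):
        """Mean spectral ordinate over [t_min, t_max], via numerical integration."""
        grid = np.linspace(t_min, t_max, 200)
        return np.trapz(np.interp(grid, periods, sa), grid) / (t_max - t_min)

    print(f"average SI over 0.1-2.0 s: {avg_spectrum_intensity(periods, sa):.4f} g")
    ```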

  12. Methodology for estimating soil carbon for the forest carbon budget model of the United States, 2001

    Treesearch

    L. S. Heath; R. A. Birdsey; D. W. Williams

    2002-01-01

    The largest carbon (C) pool in United States forests is the soil C pool. We present methodology and soil C pool estimates used in the FORCARB model, which estimates and projects forest carbon budgets for the United States. The methodology balances knowledge, uncertainties, and ease of use. The estimates are calculated using the USDA Natural Resources Conservation...

  13. CO2 storage capacity estimation: Methodology and gaps

    USGS Publications Warehouse

    Bachu, S.; Bonijoly, D.; Bradshaw, J.; Burruss, R.; Holloway, S.; Christensen, N.P.; Mathiassen, O.M.

    2007-01-01

    Implementation of CO2 capture and geological storage (CCGS) technology at the scale needed to achieve a significant and meaningful reduction in CO2 emissions requires knowledge of the available CO2 storage capacity. CO2 storage capacity assessments may be conducted at various scales, in decreasing order of size and increasing order of resolution: country, basin, regional, local and site-specific. Estimation of the CO2 storage capacity in depleted oil and gas reservoirs is straightforward and is based on recoverable reserves, reservoir properties and in situ CO2 characteristics. In the case of CO2-EOR, the CO2 storage capacity can be roughly evaluated on the basis of worldwide field experience or more accurately through numerical simulations. Determination of the theoretical CO2 storage capacity in coal beds is based on coal thickness and CO2 adsorption isotherms, and recovery and completion factors. Evaluation of the CO2 storage capacity in deep saline aquifers is very complex because four trapping mechanisms that act at different rates are involved and, at times, all mechanisms may be operating simultaneously. The level of detail and resolution required in the data make reliable and accurate estimation of CO2 storage capacity in deep saline aquifers practical only at the local and site-specific scales. This paper follows a previous one on issues and development of standards for CO2 storage capacity estimation, and provides a clear set of definitions and methodologies for the assessment of CO2 storage capacity in geological media. Notwithstanding the defined methodologies suggested for estimating CO2 storage capacity, major challenges lie ahead because of lack of data, particularly for coal beds and deep saline aquifers, lack of knowledge about the coefficients that reduce storage capacity from theoretical to effective and to practical, and lack of knowledge about the interplay between various trapping mechanisms at work in deep saline aquifers. © 2007 Elsevier Ltd

  14. Health Insurance Dynamics: Methodological Considerations and a Comparison of Estimates from Two Surveys.

    PubMed

    Graves, John A; Mishra, Pranita

    2016-10-01

    To highlight key methodological issues in studying insurance dynamics and to compare estimates across two commonly used surveys. Nonelderly uninsured adults and children sampled between 2001 and 2011 in the Medical Expenditure Panel Survey and the Survey of Income and Program Participation. We utilized nonparametric Kaplan-Meier methods to estimate quantiles (25th, 50th, and 75th percentiles) in the distribution of uninsured spells. We compared estimates obtained across surveys and across different methodological approaches to address issues like attrition, seam bias, censoring and truncation, and survey weighting method. All data were drawn from publicly available household surveys. Estimated uninsured spell durations in the MEPS were longer than those observed in the SIPP. There were few changes in spell durations between 2001 and 2011, with median durations of 14 months among adults and 5-7 months among children in the MEPS, and 8 months (adults) and 4 months (children) in the SIPP. The use of panel survey data to study insurance dynamics presents a unique set of methodological challenges. Researchers should consider key analytic and survey design trade-offs when choosing which survey can best suit their research goals. © Health Research and Educational Trust.
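
    For readers unfamiliar with the nonparametric machinery this record relies on, below is a minimal Kaplan-Meier sketch that estimates spell-duration quantiles from right-censored data. The spell data are simulated, not MEPS or SIPP records; `observed` is False where a spell was still ongoing at last observation (censored).

    ```python
    import numpy as np

    def km_quantile(durations, observed, q):
        """Smallest time t with Kaplan-Meier survival S(t) <= 1 - q."""
        s = 1.0
        for t in np.sort(np.unique(durations[observed])):
            d = np.sum((durations == t) & observed)   # spells ending at t
            n = np.sum(durations >= t)                # at risk just before t
            s *= 1.0 - d / n
            if s <= 1.0 - q:
                return t
        return np.inf                                 # not reached (censoring)

    rng = np.random.default_rng(2)
    spells = rng.geometric(0.08, size=500).astype(float)   # true spell lengths (months)
    censor = rng.uniform(1, 48, size=500)                  # panel observation limits
    durations = np.minimum(spells, censor)
    observed = spells <= censor
    for q in (0.25, 0.50, 0.75):
        print(f"{int(q * 100)}th percentile: {km_quantile(durations, observed, q):.0f} months")
    ```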

  15. A methodology for estimating risks associated with landslides of contaminated soil into rivers.

    PubMed

    Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars

    2014-02-15

    Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslide in contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. The overall probabilities for failure are low; however, if a landslide occurs the probabilities of exceeding EQS are high and the probability of having at least a 10% increase in the contamination load

  16. Peaks Over Threshold (POT): A methodology for automatic threshold estimation using goodness of fit p-value

    NASA Astrophysics Data System (ADS)

    Solari, Sebastián; Egüen, Marta; Polo, María José; Losada, Miguel A.

    2017-04-01

    Threshold estimation in the Peaks Over Threshold (POT) method and the impact of the estimation method on the calculation of high return period quantiles and their uncertainty (or confidence intervals) are issues that are still unresolved. In the past, methods based on goodness of fit tests and EDF-statistics have yielded satisfactory results, but their use has not yet been systematized. This paper proposes a methodology for automatic threshold estimation, based on the Anderson-Darling EDF-statistic and goodness of fit test. When combined with bootstrapping techniques, this methodology can be used to quantify both the uncertainty of threshold estimation and its impact on the uncertainty of high return period quantiles. This methodology was applied to several simulated series and to four precipitation/river flow data series. The results obtained confirmed its robustness. For the measured series, the estimated thresholds corresponded to those obtained by nonautomatic methods. Moreover, even though the uncertainty of the threshold estimation was high, this did not have a significant effect on the width of the confidence intervals of high return period quantiles.
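
    A hedged sketch of the kind of automatic threshold selection this record describes: scan candidate thresholds, fit a Generalized Pareto distribution to the exceedances, compute the Anderson-Darling statistic, and approximate its p-value by parametric bootstrap. The series, candidate grid, and acceptance level are illustrative assumptions, not the authors' exact procedure.

    ```python
    import numpy as np
    from scipy.stats import genpareto

    def ad_statistic(u):
        """Anderson-Darling A^2 for probability-transformed, fitted data."""
        u = np.sort(np.clip(u, 1e-12, 1 - 1e-12))
        i = np.arange(1, len(u) + 1)
        return -len(u) - np.mean((2 * i - 1) * (np.log(u) + np.log(1 - u[::-1])))

    def ad_pvalue(exc, n_boot=200, seed=0):
        """Parametric-bootstrap p-value for a GPD fit to threshold exceedances."""
        rng = np.random.default_rng(seed)
        c, loc, scale = genpareto.fit(exc, floc=0.0)
        a2 = ad_statistic(genpareto.cdf(exc, c, loc, scale))
        boot = []
        for _ in range(n_boot):
            sim = genpareto.rvs(c, loc, scale, size=len(exc), random_state=rng)
            cs, ls, ss = genpareto.fit(sim, floc=0.0)
            boot.append(ad_statistic(genpareto.cdf(sim, cs, ls, ss)))
        return np.mean(np.array(boot) >= a2)

    x = np.random.default_rng(3).gumbel(10.0, 3.0, size=2000)  # stand-in daily series
    for u in np.quantile(x, [0.80, 0.90, 0.95, 0.98]):         # candidate thresholds
        p = ad_pvalue(x[x > u] - u)
        print(f"threshold {u:6.2f}  approx. AD p-value {p:.2f}")
        if p > 0.10:                                           # first acceptable fit
            print(f"selected threshold: {u:.2f}")
            break
    ```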

  17. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1996-01-01

    A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. Therefore, a thermal stress analysis requires knowledge of the temperature distributions within the structures which consequently necessitates the need for accurate knowledge of the thermal properties, boundary conditions and thermal interface conditions associated with the structural materials. The goal of this proposed multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify these methodologies to analyze complex structures. This can be thought of as a building block approach. This strategy was intended to promote maximum usability of the resulting estimation procedure by NASA-LaRC researchers through the design of in-house experimentation procedures and through the use of existing general-purpose finite element software.

  18. The IDF Diabetes Atlas methodology for estimating global prevalence of hyperglycaemia in pregnancy.

    PubMed

    Linnenkamp, U; Guariguata, L; Beagley, J; Whiting, D R; Cho, N H

    2014-02-01

    Hyperglycaemia is one of the most prevalent metabolic disorders occurring during pregnancy. Limited data are available on the global prevalence of hyperglycaemia in pregnancy. The International Diabetes Federation (IDF) has developed a methodology for generating estimates of the prevalence of hyperglycaemia in pregnancy, including hyperglycaemia first detected in pregnancy and live births to women with known diabetes, among women of childbearing age (20-49 years). A systematic review of the literature for studies reporting the prevalence of gestational diabetes was conducted. Studies were evaluated and scored to favour those that were representative of a large population, conducted recently, reported age-specific estimates, and case identification was based on blood test. Age-specific prevalence data from studies were entered to produce estimates for five-year age groups using logistic regression to smooth curves, with age as the independent variable. The derived age-specific prevalence was adjusted for differences in diagnostic criteria in the underlying data. Cases of hyperglycaemia in pregnancy were derived from age-specific estimates of fertility and age-specific population estimates. Country-specific estimates were generated for countries with available data. Regional and global estimates were generated based on aggregation and extrapolation for 219 countries and territories. Available fertility rates and diabetes prevalence estimates were used to estimate the proportion of hyperglycaemia in pregnancy that may be due to total diabetes in pregnancy - pregnancy in women with known diabetes and diabetes first detected in pregnancy. The literature review identified 199 studies that were eligible for characterisation and selection. After scoring and exclusion requirements, 46 studies were selected representing 34 countries. More than 50% of selected studies came from Europe and North America and Caribbean. The smallest number of identified studies came from sub
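
    The age-smoothing step described in this record (logistic regression with age as the independent variable) can be sketched as a grouped binomial GLM with a logit link; the counts below are invented for illustration and are not IDF data.

    ```python
    import numpy as np
    import statsmodels.api as sm

    mid_age = np.array([22.0, 27.0, 32.0, 37.0, 42.0, 47.0])  # study age midpoints
    cases = np.array([12, 25, 41, 60, 74, 70])                # hyperglycaemia cases
    women = np.array([600, 640, 610, 580, 520, 400])          # women screened

    # Grouped binomial GLM with a logit link: logit(prevalence) ~ age.
    X = sm.add_constant(mid_age)
    fit = sm.GLM(np.column_stack([cases, women - cases]), X,
                 family=sm.families.Binomial()).fit()

    ages = np.arange(20, 50, 5) + 2.5                         # 5-year group midpoints
    for a, p in zip(ages, fit.predict(sm.add_constant(ages))):
        print(f"age {a - 2.5:.0f}-{a + 1.5:.0f}: {100 * p:.1f}%")
    ```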

  19. Using State Estimation Residuals to Detect Abnormal SCADA Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Jian; Chen, Yousu; Huang, Zhenyu

    2010-04-30

    Detection of abnormal supervisory control and data acquisition (SCADA) data is critically important for safe and secure operation of modern power systems. In this paper, a methodology for abnormal SCADA data detection based on state estimation residuals is presented. Preceded by a brief overview of outlier detection methods and bad SCADA data detection for state estimation, the framework of the proposed methodology is described. Instead of using original SCADA measurements as the bad data sources, the residuals calculated based on the results of the state estimator are used as the input for the outlier detection algorithm. The BACON algorithm is applied to the outlier detection task. The IEEE 118-bus system is used as a test base to evaluate the effectiveness of the proposed methodology. The accuracy of the BACON method is compared with that of the 3-σ method for the simulated SCADA measurements and residuals.
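
    Below is a simplified univariate sketch of the BACON outlier detector (Billor, Hadi and Velleman, 2000) applied to simulated residuals, with the 3-σ rule shown for comparison. The real study ran on IEEE 118-bus state-estimation residuals; the data, subset size, and cutoff here are assumptions.

    ```python
    import numpy as np
    from scipy.stats import chi2

    def bacon_outliers(x, m=None, alpha=0.05):
        """Simplified univariate BACON: grow a clean subset out from the median."""
        n = len(x)
        m = m or max(4, n // 20)
        subset = np.sort(np.argsort(np.abs(x - np.median(x)))[:m])  # basic subset
        cutoff = np.sqrt(chi2.ppf(1 - alpha / n, df=1))             # corrected cutoff
        for _ in range(100):                                        # to a fixed point
            mu, sd = x[subset].mean(), x[subset].std(ddof=1)
            new = np.flatnonzero(np.abs(x - mu) <= cutoff * sd)
            if np.array_equal(new, subset):
                break
            subset = new
        return np.setdiff1d(np.arange(n), subset)                   # flagged indices

    rng = np.random.default_rng(4)
    resid = rng.normal(0.0, 1.0, 500)                     # simulated residuals
    resid[[5, 42, 99]] += np.array([8.0, -7.0, 10.0])     # injected bad SCADA data
    print("BACON flags:  ", bacon_outliers(resid))
    print("3-sigma flags:", np.flatnonzero(np.abs(resid - resid.mean()) > 3 * resid.std()))
    ```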

  20. Work-based physiological assessment of physically-demanding trades: a methodological overview.

    PubMed

    Taylor, Nigel A S; Groeller, Herb

    2003-03-01

    Technological advances, modified work practices, altered employment strategies, work-related injuries, and the rise in work-related litigation and compensation claims necessitate ongoing trade analysis research. Such research enables the identification and development of gender- and age-neutral skills, physiological attributes and employment standards required to satisfactorily perform critical trade tasks. This paper overviews a methodological approach which may be adopted when seeking to establish trade-specific physiological competencies for physically-demanding trades (occupations). A general template is presented for conducting a trade analysis within physically-demanding trades, such as those encountered within military or emergency service occupations. Two streams of analysis are recommended: the trade analysis and the task analysis. The former involves a progressive dissection of activities and skills into a series of specific tasks (elements), and results in a broad approximation of the types of trade duties, and the links between trade tasks. The latter will lead to the determination of how a task is performed within a trade, and the physiological attributes required to satisfactorily perform that task. The approach described within this paper is designed to provide research outcomes which have high content, criterion-related and construct validities.

  1. The Development of a Methodology for Estimating the Cost of Air Force On-the-Job Training.

    ERIC Educational Resources Information Center

    Samers, Bernard N.; And Others

    The Air Force uses a standardized costing methodology for resident technical training schools (TTS); no comparable methodology exists for computing the cost of on-the-job training (OJT). This study evaluates three alternative survey methodologies and a number of cost models for estimating the cost of OJT for airmen training in the Administrative…

  2. Utility Estimation for Pediatric Vesicoureteral Reflux: Methodological Considerations Using an Online Survey Platform.

    PubMed

    Tejwani, Rohit; Wang, Hsin-Hsiao S; Lloyd, Jessica C; Kokorowski, Paul J; Nelson, Caleb P; Routh, Jonathan C

    2017-03-01

    The advent of online task distribution has opened a new avenue for efficiently gathering community perspectives needed for utility estimation. Methodological consensus for estimating pediatric utilities is lacking, with disagreement over whom to sample, what perspective to use (patient vs parent) and whether instrument induced anchoring bias is significant. We evaluated what methodological factors potentially impact utility estimates for vesicoureteral reflux. Cross-sectional surveys using a time trade-off instrument were conducted via the Amazon Mechanical Turk® (https://www.mturk.com) online interface. Respondents were randomized to answer questions from child, parent or dyad perspectives on the utility of a vesicoureteral reflux health state and 1 of 3 "warm-up" scenarios (paralysis, common cold, none) before a vesicoureteral reflux scenario. Utility estimates and potential predictors were fitted to a generalized linear model to determine what factors most impacted utilities. A total of 1,627 responses were obtained. Mean respondent age was 34.9 years. Of the respondents 48% were female, 38% were married and 44% had children. Utility values were uninfluenced by child/personal vesicoureteral reflux/urinary tract infection history, income or race. Utilities were affected by perspective and were higher in the child group (34% lower in parent vs child, p <0.001, and 13% lower in dyad vs child, p <0.001). Vesicoureteral reflux utility was not significantly affected by the presence or type of time trade-off warm-up scenario (p = 0.17). Time trade-off perspective affects utilities when estimated via an online interface. However, utilities are unaffected by the presence, type or absence of warm-up scenarios. These findings could have significant methodological implications for future utility elicitations regarding other pediatric conditions. Copyright © 2017 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  3. Methodology and Implications of Maximum Paleodischarge Estimates for Mountain Channels

    USGS Publications Warehouse

    Pruess, J.; Wohl, E.E.; Jarrett, R.D.

    1998-01-01

    Historical and geologic records may be used to enhance magnitude estimates for extreme floods along mountain channels, as demonstrated in this study from the San Juan Mountains of Colorado. Historical photographs and local newspaper accounts from the October 1911 flood indicate the likely extent of flooding and damage. A checklist designed to organize and numerically score evidence of flooding was used in 15 field reconnaissance surveys in the upper Animas River valley of southwestern Colorado. Step-backwater flow modeling estimated the discharges necessary to create longitudinal flood bars observed at 6 additional field sites. According to these analyses, maximum unit discharge peaks at approximately 1.3 m³ s⁻¹ km⁻² around 2200 m elevation, with decreased unit discharges at both higher and lower elevations. These results (1) are consistent with Jarrett's (1987, 1990, 1993) maximum 2300-m elevation limit for flash-flooding in the Colorado Rocky Mountains, and (2) suggest that current Probable Maximum Flood (PMF) estimates based on a 24-h rainfall of 30 cm at elevations above 2700 m are unrealistically large. The methodology used for this study should be readily applicable to other mountain regions where systematic streamflow records are of short duration or nonexistent. © 1998 Regents of the University of Colorado.
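
    As a hedged stand-in for the hydraulic step in this record, the sketch below uses a one-section slope-area (Manning) estimate of the discharge needed to inundate a flood bar; the study itself used full step-backwater modeling, and the channel geometry here is invented.

    ```python
    def manning_discharge(area_m2, wetted_perimeter_m, slope, n=0.05):
        """Q = (1/n) * A * R^(2/3) * S^(1/2), SI units; n typical of boulder beds."""
        r = area_m2 / wetted_perimeter_m               # hydraulic radius
        return (1.0 / n) * area_m2 * r ** (2.0 / 3.0) * slope ** 0.5

    q = manning_discharge(area_m2=38.0, wetted_perimeter_m=24.0, slope=0.02)
    print(f"estimated peak discharge: {q:.0f} m^3/s")
    print(f"unit discharge for a 90 km^2 basin: {q / 90.0:.2f} m^3/s/km^2")
    ```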

  4. An overview of age estimation in forensic anthropology: perspectives and practical considerations.

    PubMed

    Márquez-Grant, Nicholas

    2015-01-01

    Information on methods of age estimation in physical anthropology, in particular with regard to age-at-death from human skeletal remains, is widely available in the literature. However, the practicalities and real challenges faced in forensic casework are not always highlighted. To provide a practitioner's perspective regarding age estimation in forensic anthropology (both in the living as well as the dead), with an emphasis on the types of cases, the value of such work and its challenges and limitations. The paper reviews the current literature on age estimation with a focus on forensic anthropology, but it also brings the author's personal perspective derived from a number of forensic cases. Although much is known about which methods to use, if not always how to apply them, little attention has been given in the literature to the real practicalities faced by forensic anthropologists, for example: the challenges in different types of scenarios; how to report age estimations; responsibilities; and ethical concerns. This paper gathers some of these aspects into one overview which includes the value of such work and the practical challenges, not necessarily with the methods themselves, but also with regard to how these are applied in the different cases where age estimation is required.

  5. A Brief Introduction to Q Methodology

    ERIC Educational Resources Information Center

    Yang, Yang

    2016-01-01

    Q methodology is a method to systematically study subjective matters such as thoughts and beliefs on any given topic. Q methodology can be used for both theory building and theory testing. The purpose of this paper was to give a brief overview of Q methodology to readers with various backgrounds. This paper discussed several advantages of Q…

  6. Uterotonic use immediately following birth: using a novel methodology to estimate population coverage in four countries.

    PubMed

    Ricca, Jim; Dwivedi, Vikas; Varallo, John; Singh, Gajendra; Pallipamula, Suranjeen Prasad; Amade, Nazir; de Luz Vaz, Maria; Bishanga, Dustan; Plotkin, Marya; Al-Makaleh, Bushra; Suhowatsky, Stephanie; Smith, Jeffrey Michael

    2015-01-22

    Postpartum hemorrhage (PPH) is the leading cause of maternal mortality in developing countries. While incidence of PPH can be dramatically reduced by uterotonic use immediately following birth (UUIFB) in both community and facility settings, national coverage estimates are rare. Most national health systems have no indicator to track this, and community-based measurements are even more scarce. To fill this information gap, a methodology for estimating national coverage for UUIFB was developed and piloted in four settings. The rapid estimation methodology consisted of convening a group of national technical experts and using the Delphi method to come to consensus on key data elements that were applied to a simple algorithm, generating a non-precise national estimate of coverage of UUIFB. Data elements needed for the calculation were the distribution of births by location and estimates of UUIFB in each of those settings, adjusted to take account of stockout rates and potency of uterotonics. This exercise was conducted in 2013 in Mozambique, Tanzania, the state of Jharkhand in India, and Yemen. Available data showed that deliveries in public health facilities account for approximately half of births in Mozambique and Tanzania, 16% in Jharkhand and 24% of births in Yemen. Significant proportions of births occur in private facilities in Jharkhand and faith-based facilities in Tanzania. Estimated uterotonic use for facility births ranged from 70 to 100%. Uterotonics are not used routinely for PPH prevention at home births in any of the settings. National UUIFB coverage estimates of all births were 43% in Mozambique, 40% in Tanzania, 44% in Jharkhand, and 14% in Yemen. This methodology for estimating coverage of UUIFB was found to be feasible and acceptable. While the exercise produces imprecise estimates whose validity cannot be assessed objectively in the absence of a gold standard estimate, stakeholders felt they were accurate enough to be actionable. The exercise
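
    The algorithm described above reduces to a births-weighted sum across delivery settings, with facility-based use discounted for stockouts and uterotonic potency. A minimal sketch with invented inputs (the study's actual data elements came from national expert consensus via the Delphi method):

    ```python
    # Minimal sketch of the UUIFB coverage algorithm: national coverage as a
    # births-weighted sum across delivery settings, adjusted for stockouts and
    # potency. All numbers below are invented for illustration.
    settings = [
        # (setting, share_of_births, uterotonic_use_rate, stockout_rate, potency)
        ("public facility", 0.50, 0.90, 0.10, 0.95),
        ("private facility", 0.10, 0.70, 0.05, 0.95),
        ("home", 0.40, 0.00, 0.00, 1.00),
    ]

    coverage = sum(
        share * use * (1.0 - stockout) * potency
        for _, share, use, stockout, potency in settings
    )
    print(f"estimated national UUIFB coverage: {coverage:.0%}")
    ```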

  7. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1994-01-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles at NASA-LaRC. These analyses require knowledge of the temperature distributions within the vehicle structures, which in turn necessitates accurate thermal property data. The overall goal of this ongoing research effort is to develop methodologies for the estimation of the thermal property data needed to describe the temperature responses of these complex structures. The research strategy undertaken utilizes a building block approach: first focus on the development of property estimation methodologies for relatively simple conditions, such as isotropic materials at constant temperatures, and then systematically modify the technique for the analysis of increasingly complex systems, such as anisotropic multi-component systems. The estimation methodology utilized is a statistically based method which incorporates experimental data and a mathematical model of the system. Several aspects of this overall research effort were investigated during the time of the ASEE summer program. One important aspect involved the calibration of the estimation procedure for the estimation of the thermal properties through the thickness of a standard material. Transient experiments were conducted using a Pyrex standard at various temperatures, and then the thermal properties (thermal conductivity and volumetric heat capacity) were estimated at each temperature. Confidence regions for the estimated values were also determined. These results were then compared to documented values. Another set of experimental tests was conducted on carbon composite samples at different temperatures. Again, the thermal properties were estimated for each temperature, and the results were compared with values obtained using another technique. In both sets of experiments, a 10-15 percent offset between the estimated values and the previously determined values was found. Another effort
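
    As a rough illustration of the statistically based estimation idea (a mathematical model fitted to transient experimental data), the sketch below recovers thermal conductivity and volumetric heat capacity by least squares against a 1-D finite-difference conduction model. The geometry, boundary conditions, noise level, and material values are all assumptions for illustration, not the NASA-LaRC procedure.

    ```python
    # Sketch: estimate k (conductivity) and C (volumetric heat capacity) from a
    # synthetic transient experiment on a slab heated at the front face and
    # insulated at the back. All parameter values are invented.
    import numpy as np
    from scipy.optimize import least_squares

    L_THICK, N = 0.01, 21              # slab thickness (m), grid nodes
    Q_FLUX = 2000.0                    # constant front-face heat flux (W/m^2)
    times = np.linspace(5.0, 60.0, 25) # measurement times (s)

    def back_face_temperature(k, c_vol, t_grid):
        """Back-face temperature rise via an explicit finite-difference scheme."""
        dx = L_THICK / (N - 1)
        alpha = k / c_vol
        dt = 0.4 * dx * dx / alpha     # satisfies the explicit stability limit
        n_steps = int(t_grid[-1] / dt) + 1
        T = np.zeros(N)
        ts = np.arange(n_steps + 1) * dt
        back = np.zeros(n_steps + 1)
        r = alpha * dt / dx**2
        for i in range(1, n_steps + 1):
            Tn = T.copy()
            Tn[1:-1] = T[1:-1] + r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
            # flux boundary at the front face (half-cell energy balance)
            Tn[0] = T[0] + 2.0 * r * (T[1] - T[0]) + 2.0 * dt * Q_FLUX / (c_vol * dx)
            Tn[-1] = T[-1] + 2.0 * r * (T[-2] - T[-1])   # insulated back face
            T = Tn
            back[i] = T[-1]
        return np.interp(t_grid, ts, back)

    # Synthetic "experiment": Pyrex-like truth plus measurement noise.
    rng = np.random.default_rng(0)
    k_true, c_true = 1.1, 1.8e6
    data = back_face_temperature(k_true, c_true, times)
    data = data + rng.normal(0.0, 0.05, size=data.shape)

    def residuals(theta):
        return back_face_temperature(theta[0], theta[1], times) - data

    fit = least_squares(residuals, x0=[0.5, 1.0e6], x_scale=[1.0, 1e6],
                        bounds=([0.05, 1e5], [20.0, 2e7]))
    print("estimated k (W/m/K), C (J/m^3/K):", fit.x)
    ```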

  8. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source, and it is supported by a large community of users who have generated an extensive collection of well-documented packages and functions. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  9. Model Parameter Estimation Experiment (MOPEX): An overview of science strategy and major results from the second and third workshops

    USGS Publications Warehouse

    Duan, Q.; Schaake, J.; Andreassian, V.; Franks, S.; Goteti, G.; Gupta, H.V.; Gusev, Y.M.; Habets, F.; Hall, A.; Hay, L.; Hogue, T.; Huang, M.; Leavesley, G.; Liang, X.; Nasonova, O.N.; Noilhan, J.; Oudin, L.; Sorooshian, S.; Wagener, T.; Wood, E.F.

    2006-01-01

    The Model Parameter Estimation Experiment (MOPEX) is an international project aimed at developing enhanced techniques for the a priori estimation of parameters in hydrologic models and in land surface parameterization schemes of atmospheric models. The MOPEX science strategy involves three major steps: data preparation, a priori parameter estimation methodology development, and demonstration of parameter transferability. A comprehensive MOPEX database has been developed that contains historical hydrometeorological data and land surface characteristics data for many hydrologic basins in the United States (US) and in other countries. This database is being continuously expanded to include more basins in all parts of the world. A number of international MOPEX workshops have been convened to bring together interested hydrologists and land surface modelers from all over the world to exchange knowledge and experience in developing a priori parameter estimation techniques. This paper describes the results from the second and third MOPEX workshops. The specific objective of these workshops is to examine the state of a priori parameter estimation techniques and how they can be potentially improved with observations from well-monitored hydrologic basins. Participants of the second and third MOPEX workshops were provided with data from 12 basins in the southeastern US and were asked to carry out a series of numerical experiments using a priori parameters as well as calibrated parameters developed for their respective hydrologic models. Different modeling groups carried out all the required experiments independently using eight different models, and the results from these models have been assembled for analysis in this paper. This paper presents an overview of the MOPEX experiment and its design. The main experimental results are analyzed. A key finding is that existing a priori parameter estimation procedures are problematic and need improvement. Significant improvement of these

  10. Chinese herbal medicine for the treatment of primary hypertension: a methodology overview of systematic reviews.

    PubMed

    Xinke, Zhao; Yingdong, Li; Mingxia, Feng; Kai, Liu; Kaibing, Chen; Yuqing, Lu; Shaobo, Sun; Peng, Song; Bin, Liu

    2016-10-20

    Chinese herbal medicine has been used to treat hypertension in China and East Asia for centuries. In this study, we conducted an overview of systematic reviews of Chinese herbal medicine in the treatment of primary hypertension to 1) summarize the conclusions of these reviews, 2) evaluate the methodological quality of these reviews, and 3) rate the confidence in the effect on each outcome. We comprehensively searched six databases to retrieve systematic reviews of Chinese herbal medicine for primary hypertension from inception to December 31, 2015. We used AMSTAR to evaluate the methodological quality of included reviews, and we classified the quality of evidence for each outcome in included reviews using the GRADE approach. A total of 12 systematic reviews with 31 outcomes were included, among which 11 systematic reviews focused on the therapeutic effect of Chinese herbal medicine combined with conventional medicine, or of Chinese herbal medicine alone, versus conventional medicine alone. Among the 11 AMSTAR items, compliance was lowest for "providing an a priori design", which no review met, followed by "stating the conflict of interest", which only three reviews met. Five reviews scored less than seven on AMSTAR, meaning that the overall methodological quality was fairly poor. For GRADE, of the 31 outcomes, the quality of evidence was high in none (0%), moderate in three (10%), low in 19 (61%), and very low in nine (29%). Of the five downgrading factors, risk of bias (100%) was the most common downgrading factor in the included reviews, followed by imprecision (42%), inconsistency (39%), publication bias (39%), and indirectness (0%). The methodological quality of systematic reviews about Chinese herbal medicine for primary hypertension is fairly poor, and the quality of evidence level is low. Physicians should be cautious when applying the interventions in these reviews for primary hypertension patients in

  11. GPS system simulation methodology

    NASA Technical Reports Server (NTRS)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  12. The Migrant Border Crossing Study: A methodological overview of research along the Sonora-Arizona border.

    PubMed

    Martínez, Daniel E; Slack, Jeremy; Beyerlein, Kraig; Vandervoet, Prescott; Klingman, Kristin; Molina, Paola; Manning, Shiras; Burham, Melissa; Walzak, Kylie; Valencia, Kristen; Gamboa, Lorenzo

    2017-07-01

    Increased border enforcement efforts have redistributed unauthorized Mexican migration to the United States (US) away from traditional points of crossing, such as San Diego and El Paso, and into more remote areas along the US-Mexico border, including southern Arizona. Yet relatively little quantitative scholarly work exists examining Mexican migrants' crossing, apprehension, and repatriation experiences in southern Arizona. We contend that if scholars truly want to understand the experiences of unauthorized migrants in transit, such migrants should be interviewed either at the border after being removed from the US, or during their trajectories across the border, or both. This paper provides a methodological overview of the Migrant Border Crossing Study (MBCS), a unique data source on Mexican migrants who attempted an unauthorized crossing along the Sonora-Arizona border, were apprehended, and repatriated to Nogales, Sonora in 2007-09. We also discuss substantive and theoretical contributions of the MBCS.

  13. Evaluating the effects of dam breach methodologies on Consequence Estimation through Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Kalyanapu, A. J.; Thames, B. A.

    2013-12-01

    Dam breach modeling often includes application of models that are sophisticated, yet computationally intensive, to compute flood propagation at high temporal and spatial resolutions. This results in a significant need for computational capacity and has driven the development of newer flood models using multi-processor and graphics processing techniques. Recently, a comprehensive benchmark exercise, the 12th Benchmark Workshop on Numerical Analysis of Dams, was organized by the International Commission on Large Dams (ICOLD) to evaluate the performance of the various tools used for dam break risk assessment. The ICOLD workshop focused on estimating the consequences of failure of a hypothetical dam near a hypothetical populated area with complex demographics and economic activity. The current study uses this hypothetical case study and focuses on evaluating the effects of dam breach methodologies on consequence estimation and analysis. It uses the ICOLD hypothetical data, including the topography, dam geometric and construction information, and land use/land cover data, along with socio-economic and demographic data. The objective of this study is to evaluate the impacts of using four different dam breach methods on the consequence estimates used in the risk assessments. The four methodologies used are: i) Froehlich (1995), ii) MacDonald and Langridge-Monopolis 1984 (MLM), iii) Von Thun and Gillette 1990 (VTG), and iv) Froehlich (2008). To achieve this objective, three different modeling components were used. First, using HEC-RAS v.4.1, dam breach discharge hydrographs are developed. These hydrographs are then provided as flow inputs into a two-dimensional flood model named Flood2D-GPU, which leverages the computer's graphics card for greatly improved computational performance. Lastly, outputs from Flood2D-GPU, including inundated areas, depth grids, velocity grids, and flood wave arrival time grids, are input into HEC-FIA, which provides the
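
    For concreteness, the sketch below implements two of the four breach-parameter regressions named above, with coefficients as commonly quoted in the dam-safety literature (SI units). The coefficients should be verified against the original Froehlich papers before any real use.

    ```python
    # Hedged sketch of two breach-parameter regressions, with coefficients as
    # commonly quoted in the literature (verify against the original papers).
    # v_w: reservoir volume at failure (m^3); h_b: breach height (m).
    def froehlich_1995(v_w, h_b, overtopping=True):
        k_o = 1.4 if overtopping else 1.0
        width = 0.1803 * k_o * v_w**0.32 * h_b**0.19   # average breach width (m)
        t_f = 0.00254 * v_w**0.53 * h_b**-0.90         # failure time (hours)
        return width, t_f

    def froehlich_2008(v_w, h_b, overtopping=True):
        g = 9.81
        k_o = 1.3 if overtopping else 1.0
        width = 0.27 * k_o * v_w ** (1.0 / 3.0)                # breach width (m)
        t_f = 63.2 * (v_w / (g * h_b**2)) ** 0.5 / 3600.0      # seconds -> hours
        return width, t_f

    for name, fn in [("Froehlich 1995", froehlich_1995),
                     ("Froehlich 2008", froehlich_2008)]:
        b, t = fn(v_w=5.0e6, h_b=20.0)
        print(f"{name}: breach width {b:.0f} m, failure time {t:.2f} h")
    ```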

  14. A methodology to estimate uncertainty for emission projections through sensitivity analysis.

    PubMed

    Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación

    2015-04-01

    Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. Because uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically to the 12 highest emitting sectors for greenhouse gas and air pollutant emissions. Examples of methodology application for two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend in a low economic-growth perspective and official figures for 2010, showing a very good performance. A solid understanding and quantification of uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
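
    The core of the approach can be pictured as follows: project each sector's emissions from its driving factors, perturb those factors across assumed low/central/high values, and report the envelope of the resulting trajectories as a nonstatistical uncertainty band. A minimal sketch with invented growth rates:

    ```python
    # Sketch: derive an uncertainty band for an emission projection by perturbing
    # driving factors and taking the envelope of trajectories. All growth rates
    # and factors are invented for illustration.
    import itertools

    base_emission = 100.0                    # kt in the base year
    years = range(2015, 2021)

    activity_growth = [0.01, 0.02, 0.03]     # low / central / high activity
    ef_change = [-0.02, -0.01, 0.0]          # emission-factor improvement

    trajectories = []
    for g, e in itertools.product(activity_growth, ef_change):
        em, path = base_emission, []
        for _ in years:
            em *= (1 + g) * (1 + e)
            path.append(em)
        trajectories.append(path)

    for i, year in enumerate(years):
        lo = min(t[i] for t in trajectories)
        hi = max(t[i] for t in trajectories)
        print(year, f"band: {lo:.1f}-{hi:.1f} kt")
    ```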

  15. Methodological Framework for World Health Organization Estimates of the Global Burden of Foodborne Disease

    PubMed Central

    Devleesschauwer, Brecht; Haagsma, Juanita A.; Angulo, Frederick J.; Bellinger, David C.; Cole, Dana; Döpfer, Dörte; Fazil, Aamir; Fèvre, Eric M.; Gibb, Herman J.; Hald, Tine; Kirk, Martyn D.; Lake, Robin J.; Maertens de Noordhout, Charline; Mathers, Colin D.; McDonald, Scott A.; Pires, Sara M.; Speybroeck, Niko; Thomas, M. Kate; Torgerson, Paul R.; Wu, Felicia; Havelaar, Arie H.; Praet, Nicolas

    2015-01-01

    Background: The Foodborne Disease Burden Epidemiology Reference Group (FERG) was established in 2007 by the World Health Organization to estimate the global burden of foodborne diseases (FBDs). This paper describes the methodological framework developed by FERG's Computational Task Force to transform epidemiological information into FBD burden estimates. Methods and Findings: The global and regional burden of 31 FBDs was quantified, along with limited estimates for 5 other FBDs, using Disability-Adjusted Life Years in a hazard- and incidence-based approach. To accomplish this task, the following workflow was defined: outline of disease models and collection of epidemiological data; design and completion of a database template; development of an imputation model; identification of disability weights; probabilistic burden assessment; and estimating the proportion of the disease burden by each hazard that is attributable to exposure by food (i.e., source attribution). All computations were performed in R and the different functions were compiled in the R package 'FERG'. Traceability and transparency were ensured by sharing results and methods in an interactive way with all FERG members throughout the process. Conclusions: We developed a comprehensive framework for estimating the global burden of FBDs, in which methodological simplicity and transparency were key elements. All the tools developed have been made available and can be translated into a user-friendly national toolkit for studying and monitoring food safety at the local level. PMID:26633883
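
    The hazard- and incidence-based DALY arithmetic at the center of the framework combines years of life lost and years lived with disability, then applies a food-attributable fraction. A minimal sketch with invented inputs (the actual FERG computations live in the R package 'FERG'):

    ```python
    # Sketch of the DALY arithmetic: DALY = YLL + YLD, scaled by the fraction of
    # the hazard's burden attributable to food. All inputs below are invented.
    def dalys(deaths, life_expectancy_at_death, incidence, duration_years,
              disability_weight, foodborne_fraction):
        yll = deaths * life_expectancy_at_death                 # years of life lost
        yld = incidence * duration_years * disability_weight    # years with disability
        return (yll + yld) * foodborne_fraction

    burden = dalys(deaths=1_000, life_expectancy_at_death=35.0,
                   incidence=500_000, duration_years=0.02,
                   disability_weight=0.2, foodborne_fraction=0.4)
    print(f"attributable burden: {burden:,.0f} DALYs")
    ```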

  16. Chicago Area Transportation Study (CATS): Methodological Overview

    DOT National Transportation Integrated Search

    1994-04-01

    This report contains a methodological discussion of the Chicago Area Transportation Study (CATS) 1990 Household Travel Survey. It was prepared to assist those who are working with the Household Travel Survey database. This report concentrates o...

  17. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1993-01-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles such as the National Aero-Space Plane (NASP) and the High-Speed Civil Transport (HSCT) at NASA-LaRC. These analyses require knowledge of the temperature within the structures, which in turn necessitates thermal property data. The initial goal of this research effort was to develop a methodology for the estimation of thermal properties of aerospace structural materials at room temperature and to develop a procedure to optimize the estimation process. The estimation procedure was implemented utilizing a general purpose finite element code. In addition, an optimization procedure was developed and implemented to determine critical experimental parameters to optimize the estimation procedure. Finally, preliminary experiments were conducted at the Aircraft Structures Branch (ASB) laboratory.

  18. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    NASA Astrophysics Data System (ADS)

    Scott, Elaine P.

    1993-12-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles such as the National Aero-Space Plane (NASP) and the High-Speed Civil Transport (HSCT) at NASA-LaRC. These analyses require knowledge of the temperature within the structures, which in turn necessitates thermal property data. The initial goal of this research effort was to develop a methodology for the estimation of thermal properties of aerospace structural materials at room temperature and to develop a procedure to optimize the estimation process. The estimation procedure was implemented utilizing a general purpose finite element code. In addition, an optimization procedure was developed and implemented to determine critical experimental parameters to optimize the estimation procedure. Finally, preliminary experiments were conducted at the Aircraft Structures Branch (ASB) laboratory.

  19. Methodological Approaches for Estimating the Benefits and Costs of Smart Grid Demonstration Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Russell

    This report presents a comprehensive framework for estimating the benefits and costs of Smart Grid projects and a step-by-step approach for making these estimates. The framework identifies the basic categories of benefits, the beneficiaries of these benefits, and the Smart Grid functionalities that lead to different benefits and proposes ways to estimate these benefits, including their monetization. The report covers cost-effectiveness evaluation, uncertainty, and issues in estimating baseline conditions against which a project would be compared. The report also suggests metrics suitable for describing principal characteristics of a modern Smart Grid to which a project can contribute. The first section of the report presents background information on the motivation for the report and its purpose. Section 2 introduces the methodological framework, focusing on the definition of benefits and a sequential, logical process for estimating them. Beginning with the Smart Grid technologies and functions of a project, it maps these functions to the benefits they produce. Section 3 provides a hypothetical example to illustrate the approach. Section 4 describes each of the 10 steps in the approach. Section 5 covers issues related to estimating benefits of the Smart Grid. Section 6 summarizes the next steps. The methods developed in this study will help improve future estimates - both retrospective and prospective - of the benefits of Smart Grid investments. These benefits, including those to consumers, society in general, and utilities, can then be weighed against the investments. Such methods would be useful in total resource cost tests and in societal versions of such tests. As such, the report will be of interest not only to electric utilities, but also to a broad constituency of stakeholders. Significant aspects of the methodology were used by the U.S. Department of Energy (DOE) to develop its methods for estimating the benefits and costs of its renewable and

  20. A Methodological Framework to Estimate the Site Fidelity of Tagged Animals Using Passive Acoustic Telemetry

    PubMed Central

    Capello, Manuela; Robert, Marianne; Soria, Marc; Potin, Gael; Itano, David; Holland, Kim; Deneubourg, Jean-Louis; Dagorn, Laurent

    2015-01-01

    The rapid expansion of the use of passive acoustic telemetry technologies has facilitated unprecedented opportunities for studying the behavior of marine organisms in their natural environment. This technological advance would greatly benefit from the parallel development of dedicated methodologies accounting for the variety of timescales involved in the remote detection of tagged animals related to instrumental, environmental and behavioral events. In this paper we propose a methodological framework for estimating the site fidelity (“residence times”) of acoustic tagged animals at different timescales, based on the survival analysis of continuous residence times recorded at multiple receivers. Our approach is validated through modeling and applied on two distinct datasets obtained from a small coastal pelagic species (bigeye scad, Selar crumenophthalmus) and a large, offshore pelagic species (yellowfin tuna, Thunnus albacares), which show very distinct spatial scales of behavior. The methodological framework proposed herein allows estimating the most appropriate temporal scale for processing passive acoustic telemetry data depending on the scientific question of interest. Our method provides residence times free of the bias inherent to environmental and instrumental noise that can be used to study the small scale behavior of acoustic tagged animals. At larger timescales, it can effectively identify residence times that encompass the diel behavioral excursions of fish out of the acoustic detection range. This study provides a systematic framework for the analysis of passive acoustic telemetry data that can be employed for the comparative study of different species and study sites. The same methodology can be used each time discrete records of animal detections of any nature are employed for estimating the site fidelity of an animal at different timescales. PMID:26261985
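
    The first step of such a framework is to segment a receiver's detection record into continuous residence times (CRTs) at a chosen timescale (the maximum allowed gap between detections) and then examine the survival curve of CRT durations. A minimal sketch with an invented detection record and gap threshold:

    ```python
    # Sketch: derive continuous residence times (CRTs) from detection timestamps
    # at one receiver, then build the empirical survival curve of CRT durations.
    # The detection times and gap threshold below are invented.
    import numpy as np

    def continuous_residence_times(detection_times, max_gap):
        """Durations of detection bouts separated by gaps longer than max_gap."""
        t = np.sort(np.asarray(detection_times, dtype=float))
        breaks = np.where(np.diff(t) > max_gap)[0]
        starts = np.r_[0, breaks + 1]
        ends = np.r_[breaks, t.size - 1]
        return t[ends] - t[starts]

    detections = [0, 40, 95, 160, 3600, 3640, 3720, 9000]   # seconds
    crts = continuous_residence_times(detections, max_gap=600)
    durations = np.sort(crts)
    survival = 1.0 - np.arange(1, durations.size + 1) / durations.size
    print(list(zip(durations.tolist(), survival.round(2).tolist())))
    ```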

  1. A Methodological Framework to Estimate the Site Fidelity of Tagged Animals Using Passive Acoustic Telemetry.

    PubMed

    Capello, Manuela; Robert, Marianne; Soria, Marc; Potin, Gael; Itano, David; Holland, Kim; Deneubourg, Jean-Louis; Dagorn, Laurent

    2015-01-01

    The rapid expansion of the use of passive acoustic telemetry technologies has facilitated unprecedented opportunities for studying the behavior of marine organisms in their natural environment. This technological advance would greatly benefit from the parallel development of dedicated methodologies accounting for the variety of timescales involved in the remote detection of tagged animals related to instrumental, environmental and behavioral events. In this paper we propose a methodological framework for estimating the site fidelity ("residence times") of acoustic tagged animals at different timescales, based on the survival analysis of continuous residence times recorded at multiple receivers. Our approach is validated through modeling and applied on two distinct datasets obtained from a small coastal pelagic species (bigeye scad, Selar crumenophthalmus) and a large, offshore pelagic species (yellowfin tuna, Thunnus albacares), which show very distinct spatial scales of behavior. The methodological framework proposed herein allows estimating the most appropriate temporal scale for processing passive acoustic telemetry data depending on the scientific question of interest. Our method provides residence times free of the bias inherent to environmental and instrumental noise that can be used to study the small scale behavior of acoustic tagged animals. At larger timescales, it can effectively identify residence times that encompass the diel behavioral excursions of fish out of the acoustic detection range. This study provides a systematic framework for the analysis of passive acoustic telemetry data that can be employed for the comparative study of different species and study sites. The same methodology can be used each time discrete records of animal detections of any nature are employed for estimating the site fidelity of an animal at different timescales.

  2. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation.

    PubMed

    Ribba, B; Grimm, H P; Agoram, B; Davies, M R; Gadkar, K; Niederer, S; van Riel, N; Timmis, J; van der Graaf, P H

    2017-08-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development to focus discussions on two critical methodological aspects of QSP model development: optimal structural granularity and parameter estimation. Here we report, in a perspective article, a summary of the presentations and discussions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  3. Evaluative methodology for prioritizing transportation energy conservation strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pang, L.M.G.

    An analytical methodology was developed for the purpose of prioritizing a set of transportation energy conservation (TEC) strategies within an urban environment. Steps involved in applying the methodology consist of 1) defining the goals, objectives and constraints of the given urban community, 2) identifying potential TEC strategies, 3) assessing the impact of the strategies, 4) applying the TEC evaluation model, and 5) utilizing a selection process to determine the optimal set of strategies for implementation. This research provides an overview of 21 TEC strategies, a quick-response technique for estimating energy savings, a multiattribute utility theory approach for assessing subjective impacts, and a computer program for making the strategy evaluations, all of which assist in expediting the execution of the entire methodology procedure. The critical element of the methodology is the strategy evaluation model, which incorporates a number of desirable concepts including 1) a comprehensive accounting of all relevant impacts, 2) the application of multiobjective decision-making techniques, 3) an approach to assure compatibility among quantitative and qualitative impact measures, 4) the inclusion of the decision maker's preferences in the evaluation procedure, and 5) the cost-effectiveness concept. Application of the methodology to Salt Lake City, Utah demonstrated its utility, ease of use and acceptance by decision makers.

  4. Estimation of retired mobile phones generation in China: A comparative study on methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Bo; Yang, Jianxin, E-mail: yangjx@rcees.ac.cn; Lu, Bin

    Highlights: • The sales data of mobile phones in China were revised by considering the amount of smuggled and counterfeit mobile phones. • The estimation of retired mobile phones in China was made by comparing relevant methods. • The improved estimates can help improve policy-making. • The method suggested in this paper can also be used in other countries. • Some discussions on methodology are also conducted with a view to improvement. Abstract: Due to the rapid development of economy and technology, China has the biggest production and possession of mobile phones around the world. In general, mobile phones have a relatively short lifetime because the majority of users replace their mobile phones frequently. Retired mobile phones represent the most valuable electrical and electronic equipment (EEE) in the main waste stream because of such characteristics as large quantity, high reuse/recovery value and fast replacement frequency. Consequently, the huge amount of retired mobile phones in China calls for a sustainable management system. Generation estimates can provide fundamental information for constructing a sustainable management system for retired mobile phones and other waste electrical and electronic equipment (WEEE). However, reliable estimates are difficult to obtain and verify. The primary aim of this paper is to identify a proper approach for estimating the generation of retired mobile phones in China, by comparing relevant methods. The results show that the sales-and-new method is the most suitable for estimating retired mobile phones; it indicates that 47.92 million mobile phones were retired in 2002, rising to 739.98 million in 2012, an increasing tendency with some fluctuations. Furthermore, some discussions on methodology, such as the selection of improper approaches and error in the input data, are also conducted in
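
    A sales-based estimate of this kind amounts to convolving historical sales with a product-lifespan distribution. A minimal sketch with invented sales figures and lifespan probabilities (the paper's actual data and method details differ):

    ```python
    # Sketch of a "sales and new" style estimate: units retired in year t are
    # past sales weighted by a lifespan distribution. All numbers are invented.
    sales = {2008: 150, 2009: 180, 2010: 220, 2011: 260, 2012: 300}  # million units
    lifespan_pmf = {1: 0.05, 2: 0.25, 3: 0.40, 4: 0.20, 5: 0.10}     # P(retire after k yrs)

    def retired_in(year):
        return sum(sales.get(year - k, 0) * p for k, p in lifespan_pmf.items())

    print(f"retired in 2012: {retired_in(2012):.0f} million units")
    ```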

  5. Methodological Issues in Measuring the Development of Character

    ERIC Educational Resources Information Center

    Card, Noel A.

    2017-01-01

    In this article I provide an overview of the methodological issues involved in measuring constructs relevant to character development and education. I begin with a nontechnical overview of the 3 fundamental psychometric properties of measurement: reliability, validity, and equivalence. Developing and evaluating measures to ensure evidence of all 3…

  6. Estimating the Global Prevalence of Inadequate Zinc Intake from National Food Balance Sheets: Effects of Methodological Assumptions

    PubMed Central

    Wessells, K. Ryan; Singh, Gitanjali M.; Brown, Kenneth H.

    2012-01-01

    Background: The prevalence of inadequate zinc intake in a population can be estimated by comparing the zinc content of the food supply with the population's theoretical requirement for zinc. However, assumptions regarding the nutrient composition of foods, zinc requirements, and zinc absorption may affect prevalence estimates. These analyses were conducted to: (1) evaluate the effect of varying methodological assumptions on country-specific estimates of the prevalence of dietary zinc inadequacy and (2) generate a model considered to provide the best estimates. Methodology and Principal Findings: National food balance data were obtained from the Food and Agriculture Organization of the United Nations. Zinc and phytate contents of these foods were estimated from three nutrient composition databases. Zinc absorption was predicted using a mathematical model (Miller equation). Theoretical mean daily per capita physiological and dietary requirements for zinc were calculated using recommendations from the Food and Nutrition Board of the Institute of Medicine and the International Zinc Nutrition Consultative Group. The estimated global prevalence of inadequate zinc intake varied between 12–66%, depending on which methodological assumptions were applied. However, country-specific rank order of the estimated prevalence of inadequate intake was conserved across all models (r = 0.57–0.99, P<0.01). A "best-estimate" model, comprised of zinc and phytate data from a composite nutrient database and IZiNCG physiological requirements for absorbed zinc, estimated the global prevalence of inadequate zinc intake to be 17.3%. Conclusions and Significance: Given the multiple sources of uncertainty in this method, caution must be taken in the interpretation of the estimated prevalence figures. However, the results of all models indicate that inadequate zinc intake may be fairly common globally. Inferences regarding the relative likelihood of zinc deficiency as a public health
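
    One simple way to picture the cut-point logic behind such estimates is to compare an (assumed normal) distribution of absorbed zinc from the food supply against the mean physiological requirement. The parameters below are invented, and the Miller absorption equation itself is not reproduced here:

    ```python
    # Sketch of a cut-point style prevalence estimate: the fraction of the
    # population whose absorbed zinc falls below the mean requirement, assuming
    # a normal intake distribution. All parameter values are invented.
    from statistics import NormalDist

    mean_absorbed_mg = 2.4   # per capita absorbed zinc from the food supply
    sd_absorbed_mg = 0.6     # assumed inter-individual variation
    requirement_mg = 2.0     # mean physiological requirement for absorbed zinc

    prevalence = NormalDist(mean_absorbed_mg, sd_absorbed_mg).cdf(requirement_mg)
    print(f"estimated prevalence of inadequate intake: {prevalence:.0%}")
    ```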

  7. Quantifying automobile refinishing VOC air emissions - a methodology with estimates and forecasts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S.P.; Rubick, C.

    1996-12-31

    Automobile refinishing coatings (referred to as paints) and the paint thinners, reducers, hardeners, catalysts, and cleanup solvents used during their application contain volatile organic compounds (VOCs), which are precursors to ground-level ozone formation. Some of these painting compounds create hazardous air pollutants (HAPs), which are toxic. This paper documents the methodology, data sets, and the results of surveys (conducted in the fall of 1995) used to develop revised per capita emission factors for estimating and forecasting the VOC air emissions from the area source category of automobile refinishing. Emissions estimates, forecasts, trends, and reasons for these trends are presented. Future emissions inventory (EI) challenges are addressed in light of data availability and information networks.

  8. Methodology to estimate particulate matter emissions from certified commercial aircraft engines.

    PubMed

    Wayson, Roger L; Fleming, Gregg G; Lovinelli, Ralph

    2009-01-01

    Today, about one-fourth of U.S. commercial service airports, including 41 of the busiest 50, are either in nonattainment or maintenance areas per the National Ambient Air Quality Standards. U.S. aviation activity is forecasted to triple by 2025, while at the same time the U.S. Environmental Protection Agency (EPA) is evaluating stricter particulate matter (PM) standards on the basis of documented human health and welfare impacts. Stricter federal standards are expected to impede capacity and limit aviation growth if regulatory-mandated emission reductions occur as for other non-aviation sources (i.e., automobiles, power plants, etc.). In addition, strong interest exists as to the role aviation emissions play in air quality and climate change issues. These reasons underpin the need to quantify and understand PM emissions from certified commercial aircraft engines, and hence the need for a methodology to predict these emissions. Standardized sampling techniques to measure volatile and nonvolatile PM emissions from aircraft engines do not exist. As such, a first-order approximation (FOA) was derived to fill this need based on available information. FOA1.0 only allowed prediction of nonvolatile PM. FOA2.0 added volatile PM emissions on the basis of the ratio of nonvolatile to volatile emissions. Recent collaborative efforts by industry (manufacturers and airlines), research establishments, and regulators have begun to provide further insight into the estimation of PM emissions. The resultant PM measurement datasets are being analyzed to refine sampling techniques and progress towards standardized PM measurements. These preliminary measurement datasets also support the continued refinement of the FOA methodology. FOA3.0 disaggregated the prediction techniques to allow for independent prediction of nonvolatile and volatile emissions on a more theoretical basis. The Committee for Aviation Environmental Protection of the International Civil

  9. Total materials consumption; an estimation methodology and example using lead; a materials flow analysis

    USGS Publications Warehouse

    Biviano, Marilyn B.; Wagner, Lorie A.; Sullivan, Daniel E.

    1999-01-01

    Materials consumption estimates, such as apparent consumption of raw materials, can be important indicators of sustainability. Apparent consumption of raw materials does not account for material contained in manufactured products that are imported or exported and may thus under- or over-estimate total consumption of materials in the domestic economy. This report demonstrates a methodology to measure the amount of materials contained in net imports (imports minus exports), using lead as an example. The analysis presents illustrations of differences between apparent and total consumption of lead and distributes these differences into individual lead-consuming sectors.
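
    The adjustment described above is simple arithmetic: total consumption equals apparent consumption of the raw material plus the material contained in net imports of manufactured products. A minimal sketch with invented quantities:

    ```python
    # Sketch of the total-consumption arithmetic: apparent consumption of raw
    # lead plus lead contained in net imports (imports minus exports) of
    # manufactured products. All quantities below are invented.
    apparent_kt = 1_200.0    # production + raw imports - raw exports, kt of lead
    product_net_imports_kt = {"batteries": 80.0, "ammunition": 5.0}  # contained lead

    total_kt = apparent_kt + sum(product_net_imports_kt.values())
    print(f"total consumption: {total_kt:.0f} kt vs apparent {apparent_kt:.0f} kt")
    ```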

  10. Methods of albumin estimation in clinical biochemistry: Past, present, and future.

    PubMed

    Kumar, Deepak; Banerjee, Dibyajyoti

    2017-06-01

    Estimation of serum and urinary albumin is routinely performed in clinical biochemistry laboratories. In the past, precipitation-based methods were popular for estimation of human serum albumin (HSA). Currently, dye-binding or immunochemical methods are widely practiced. Each of these methods has its limitations. Research endeavors to overcome such limitations are on-going. The current trends in methodological aspects of albumin estimation guiding the field have not been reviewed. Therefore, it is the need of the hour to review several aspects of albumin estimation. The present review focuses on the modern trends of research from a conceptual point of view and gives an overview of recent developments to offer the readers a comprehensive understanding of the subject. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. A methodology for estimating health benefits of electricity generation using renewable technologies.

    PubMed

    Partridge, Ian; Gamkhar, Shama

    2012-02-01

    At Copenhagen, the developed countries agreed to provide up to $100 bn per year to finance climate change mitigation and adaptation by developing countries. Projects aimed at cutting greenhouse gas (GHG) emissions will need to be evaluated against dual criteria: from the viewpoint of the developed countries they must cut emissions of GHGs at reasonable cost, while host countries will assess their contribution to development, or simply their overall economic benefits. Co-benefits of some types of project will also be of interest to host countries: for example, some projects will contribute to reducing air pollution, thus improving the health of the local population. This paper uses a simple damage function methodology to quantify some of the health co-benefits of replacing coal-fired generation with wind or small hydro in China. We estimate the monetary value of these co-benefits and find that it is probably small compared to the added costs. We have not made a full cost-benefit analysis of renewable energy in China, as some likely co-benefits are omitted from our calculations. Our results are subject to considerable uncertainty; however, after careful consideration of their likely accuracy and comparisons with other studies, we believe that they provide a good first-cut estimate of co-benefits and are sufficiently robust to stand as a guide for policy makers. In addition to these empirical results, a key contribution made by the paper is to demonstrate a simple and reasonably accurate methodology for health benefits estimation that applies the most recent academic research in the field to the solution of an increasingly important problem. Copyright © 2011 Elsevier Ltd. All rights reserved.
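
    A damage-function calculation of this kind chains avoided emissions through population exposure and a concentration-response relationship to monetized health benefits. The sketch below shows the shape of that chain; every coefficient is an invented placeholder, not a value from the study:

    ```python
    # Minimal damage-function sketch for health co-benefits of displacing
    # coal-fired generation. Every coefficient is an invented placeholder.
    mwh_displaced = 1.0e6        # annual coal generation replaced by wind/hydro
    so2_per_mwh = 0.004          # tonnes SO2 emitted per MWh (placeholder)
    intake_factor = 2.0e-6       # exposure per tonne emitted (placeholder)
    population = 5.0e7           # exposed population
    crf = 1.0e-4                 # deaths per person per unit exposure (placeholder)
    value_per_death = 1.0e6      # monetized value of avoided mortality (placeholder)

    avoided_tonnes = mwh_displaced * so2_per_mwh
    avoided_deaths = avoided_tonnes * intake_factor * population * crf
    benefit = avoided_deaths * value_per_death
    print(f"avoided deaths: {avoided_deaths:.1f}; co-benefit: ${benefit:,.0f}")
    ```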

  12. Research MethodologyOverview of Qualitative Research

    PubMed Central

    GROSSOEHME, DANIEL H.

    2015-01-01

    Qualitative research methods are a robust tool for chaplaincy research questions. Similar to much of chaplaincy clinical care, qualitative research generally works with written texts, often transcriptions of individual interviews or focus group conversations, and seeks to understand the meaning of experience in a study sample. This article describes three common methodologies: ethnography, grounded theory, and phenomenology. Issues to consider relating to the study sample, design, and analysis are discussed. Enhancing the validity of the data, as well as reliability and ethical issues in qualitative research, are described. Qualitative research is an accessible way for chaplains to contribute new knowledge about the sacred dimension of people's lived experience. PMID:24926897

  13. Overview and Assessment of Antarctic Ice-Sheet Mass Balance Estimates: 1992-2009

    NASA Technical Reports Server (NTRS)

    Zwally, H. Jay; Giovinetto, Mario B.

    2011-01-01

    Mass balance estimates for the Antarctic Ice Sheet (AIS) in the 2007 report by the Intergovernmental Panel on Climate Change and in more recent reports lie between approximately +50 to -250 Gt/year for 1992 to 2009. The 300 Gt/year range is approximately 15% of the annual mass input and 0.8 mm/year Sea Level Equivalent (SLE). Two estimates from radar altimeter measurements of elevation change by European Remote-sensing Satellites (ERS) (+28 and -31 Gt/year) lie in the upper part, whereas estimates from the Input-minus-Output Method (IOM) and the Gravity Recovery and Climate Experiment (GRACE) lie in the lower part (-40 to -246 Gt/year). We compare the various estimates, discuss the methodology used, and critically assess the results. We also modify the IOM estimate using (1) an alternate extrapolation to estimate the discharge from the non-observed 15% of the periphery, and (2) substitution of input from a field data compilation for input from an atmospheric model in 6% of the area. The modified IOM estimate reduces the loss from 136 Gt/year to 13 Gt/year. Two ERS-based estimates, the modified IOM, and a GRACE-based estimate for observations within 1992-2005 lie in a narrowed range of +27 to -40 Gt/year, which is about 3% of the annual mass input and only 0.2 mm/year SLE. Our preferred estimate for 1992-2001 is -47 Gt/year for West Antarctica, +16 Gt/year for East Antarctica, and -31 Gt/year overall (+0.1 mm/year SLE), not including part of the Antarctic Peninsula (1.07% of the AIS area). Although recent reports of large and increasing rates of mass loss with time from GRACE-based studies cite agreement with IOM results, our evaluation does not support that conclusion.

  14. Model methodology for estimating pesticide concentration extremes based on sparse monitoring data

    USGS Publications Warehouse

    Vecchia, Aldo V.

    2018-03-22

    This report describes a new methodology for using sparse (weekly or less frequent) and potentially highly censored pesticide monitoring data to simulate daily pesticide concentrations and associated quantities used for acute and chronic exposure assessments, such as the annual maximum daily concentration. The new methodology is based on a statistical model that expresses log-transformed daily pesticide concentration in terms of a seasonal wave, flow-related variability, long-term trend, and serially correlated errors. Methods are described for estimating the model parameters, generating conditional simulations of daily pesticide concentration given sparse and potentially highly censored observations, and estimating concentration extremes based on the conditional simulations. The model can be applied to datasets with as few as 3 years of record, as few as 30 total observations, and as few as 10 uncensored observations. The model was applied to atrazine, carbaryl, chlorpyrifos, and fipronil data for U.S. Geological Survey pesticide sampling sites with sufficient data for applying the model. A total of 112 sites were analyzed for atrazine, 38 for carbaryl, 34 for chlorpyrifos, and 33 for fipronil. The results are summarized in this report, and R functions, described in this report and provided in an accompanying model archive, can be used to fit the model parameters and generate conditional simulations of daily concentrations for use in investigations involving pesticide exposure risk and uncertainty.
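
    The model structure described above (seasonal wave, flow-related variability, long-term trend, and serially correlated errors on the log scale) can be sketched as a simulation. All coefficients below are invented; the report's accompanying R functions implement the actual estimation and conditional simulation:

    ```python
    # Sketch of the model structure: log-concentration = seasonal wave +
    # flow-related term + trend + AR(1) errors. Coefficients are invented.
    import numpy as np

    rng = np.random.default_rng(42)
    days = np.arange(3 * 365)
    doy = 2 * np.pi * (days % 365) / 365

    seasonal = 0.8 * np.sin(doy) + 0.3 * np.cos(2 * doy)   # seasonal wave
    flow_anom = rng.normal(0, 1, days.size)                # stand-in log-flow anomaly
    trend = -0.0003 * days                                 # slow long-term decline

    errors = np.zeros(days.size)                           # AR(1) correlated errors
    for t in range(1, days.size):
        errors[t] = 0.9 * errors[t - 1] + rng.normal(0, 0.3)

    log_conc = -2.0 + seasonal + 0.5 * flow_anom + trend + errors
    conc = np.exp(log_conc)                                # daily concentration
    annual_max = conc.reshape(3, 365).max(axis=1)          # acute-exposure statistic
    print("annual maximum daily concentrations:", np.round(annual_max, 2))
    ```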

  15. An overview of maternal separation effects on behavioural outcomes in mice: Evidence from a four-stage methodological systematic review.

    PubMed

    Tractenberg, Saulo G; Levandowski, Mateus L; de Azeredo, Lucas Araújo; Orso, Rodrigo; Roithmann, Laura G; Hoffmann, Emerson S; Brenhouse, Heather; Grassi-Oliveira, Rodrigo

    2016-09-01

    The developmental effects of early life stress (ELS) have been widely studied by preclinical researchers. Despite the growing body of evidence from ELS models, such as the maternal separation paradigm, the reported results have marked inconsistencies. The maternal separation model has several methodological pitfalls that could influence the reliability of its results. Here, we critically review 94 mouse studies that addressed the effects of maternal separation on behavioural outcomes. We also discuss methodological issues related to the heterogeneity of separation protocols and the quality of reporting methods. Our findings indicate a lack of consistency in maternal separation effects: major studies of behavioural and biological phenotypes failed to find significant deleterious effects. Furthermore, we identified several specific variations in separation methodological procedures. These methodological variations could contribute to the inconsistency of maternal separation effects by producing different degrees of stress exposure in maternal separation-reared pups. These methodological problems, together with insufficient reporting, might lead to inaccurate and unreliable effect estimates in maternal separation studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Coal gasification systems engineering and analysis. Appendix E: Cost estimation and economic evaluation methodology

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The cost estimation and economic evaluation methodologies presented are consistent with industry practice for assessing capital investment requirements and operating costs of coal conversion systems. All values stated are based on January 1980 dollars, with appropriate recognition of the time value of money. Evaluation of project economic feasibility can be considered a two-step process (subject to considerable refinement): first, the costs of the project must be quantified, and second, the price at which the product can be manufactured must be determined. These two major categories are discussed. The summary of methodology is divided into five parts: (1) systems costs, (2) instant plant costs, (3) annual operating costs, (4) escalation and discounting process, and (5) product pricing.

  17. (Per)Forming Archival Research Methodologies

    ERIC Educational Resources Information Center

    Gaillet, Lynee Lewis

    2012-01-01

    This article raises multiple issues associated with archival research methodologies and methods. Based on a survey of recent scholarship and interviews with experienced archival researchers, this overview of the current status of archival research both complicates traditional conceptions of archival investigation and encourages scholars to adopt…

  18. Validating alternative methodologies to estimate the regime of temporary rivers when flow data are unavailable.

    PubMed

    Gallart, F; Llorens, P; Latron, J; Cid, N; Rieradevall, M; Prat, N

    2016-09-15

    Hydrological data for assessing the regime of temporary rivers are often non-existent or scarce. The scarcity of flow data makes it impossible to characterize the hydrological regime of temporary streams and, in consequence, to select the correct periods and methods to determine their ecological status. This is why the TREHS software is being developed, in the framework of the LIFE Trivers project; it will help managers to implement the European Water Framework Directive adequately in this kind of water body. TREHS, using the methodology described in Gallart et al. (2012), defines six transient 'aquatic states', based on hydrological conditions representing different mesohabitats, for a given reach at a particular moment. Because of its qualitative nature, this approach allows using alternative methodologies to assess the regime of temporary rivers when there are no observed flow data. These methods, based on interviews and high-resolution aerial photographs, were tested for estimating the aquatic regime of temporary rivers. All the gauging stations (13) belonging to the Catalan Internal Catchments (NE Spain) with recurrent zero-flow periods were selected to validate this methodology. On the one hand, non-structured interviews were conducted with inhabitants of villages near the gauging stations. On the other hand, the historical series of available orthophotographs were examined. Flow records measured at the gauging stations were used to validate the alternative methods. Flow permanence in the reaches was estimated reasonably by the interviews and adequately by aerial photographs, when compared with the values estimated using daily flows. The degree of seasonality was assessed only roughly by the interviews. The recurrence of disconnected pools was not detected by flow records but was estimated, with some divergences, by the two methods. The combination of the two alternative methods allows substituting or complementing flow records, to be updated in the future through

  19. Implementation of an acoustic-based methane flux estimation methodology in the Eastern Siberian Arctic Sea

    NASA Astrophysics Data System (ADS)

    Weidner, E. F.; Weber, T. C.; Mayer, L. A.

    2017-12-01

    Quantifying methane flux originating from marine seep systems in climatically sensitive regions is of critical importance for current and future climate studies. Yet the methane contribution from these systems has been difficult to estimate given the broad spatial scale of the ocean and the heterogeneity of seep activity. One such region is the Eastern Siberian Arctic Sea (ESAS), where bubble release into the shallow water column (<40 meters average depth) facilitates transport of methane to the atmosphere without oxidation. Quantifying the current seep methane flux from the ESAS is necessary not only to understand the total ocean methane budget, but also to provide baseline estimates against which future climate-induced changes can be measured. At the 2016 AGU fall meeting, we presented a new acoustic-based flux methodology using a calibrated broadband split-beam echosounder. The broad (14-24 kHz) bandwidth provides a vertical resolution of 10 cm, making possible the identification of single bubbles. After calibration using a 64 mm copper sphere of known backscatter, the acoustic backscatter of individual bubbles is measured and compared to analytical models to estimate bubble radius. Additionally, bubbles are precisely located and traced upwards through the water column to estimate rise velocity. The combination of radius and rise velocity allows for gas flux estimation. Here, we follow up with the completed implementation of this methodology applied to the Herald Canyon region of the western ESAS. From the 68 recognized seeps, bubble radii and rise velocities were computed for more than 550 individual bubbles. The range of bubble radii, 1-6 mm, is comparable to those published by other investigators, while the radius-dependent rise velocities are consistent with published models. Methane flux for the Herald Canyon region was estimated by extrapolation from individual seep flux values.
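
    The flux arithmetic is straightforward once bubble radii and observation effort are known: per-bubble volumes are summed, scaled to a volume rate, and converted to a mass flux with an assumed gas density at depth. A minimal sketch with invented values:

    ```python
    # Sketch of per-seep flux arithmetic from acoustically estimated bubble
    # radii. All numbers below are invented for illustration.
    import math

    bubble_radii_mm = [1.2, 2.5, 3.1, 4.0, 5.6]   # from backscatter inversion
    observation_time_s = 600.0                     # time the seep was ensonified
    ch4_density_kg_m3 = 2.5                        # methane gas at depth (assumed)

    volume_m3 = sum((4.0 / 3.0) * math.pi * (r / 1000.0) ** 3
                    for r in bubble_radii_mm)
    volume_rate = volume_m3 / observation_time_s           # m^3/s of free gas
    mass_flux_kg_per_day = volume_rate * ch4_density_kg_m3 * 86400.0
    print(f"seep methane flux: {mass_flux_kg_per_day:.4f} kg/day")
    ```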

  20. Mobile and Web 2.0 interventions for weight management: an overview of review evidence and its methodological quality

    PubMed Central

    Smith, Jane R.; Samaha, Laya; Abraham, Charles

    2016-01-01

    Background: The use of Internet and related technologies for promoting weight management (WM), physical activity (PA), or dietary-related behaviours has been examined in many articles and systematic reviews. This overview aims to summarize and assess the quality of the review evidence specifically focusing on mobile and Web 2.0 technologies, which are the most utilized, currently available technologies. Methods: Following a registered protocol (CRD42014010323), we searched 16 databases for articles published in English until 31 December 2014 discussing the use of either mobile or Web 2.0 technologies to promote WM or related behaviors, i.e. diet and physical activity (PA). Two reviewers independently selected reviews and assessed their methodological quality using the AMSTAR checklist. Citation matrices were used to determine the overlap among reviews. Results: Forty-four eligible reviews were identified, 39 of which evaluated the effects of interventions using mobile or Web 2.0 technologies. Methodological quality was generally low, with only 7 reviews (16%) meeting the highest standards. Suggestive evidence exists for positive effects of mobile technologies on weight-related outcomes and, to a lesser extent, PA. Evidence is inconclusive regarding Web 2.0 technologies. Conclusions: Reviews on mobile and Web 2.0 interventions for WM and related behaviors suggest that these technologies can, under certain circumstances, be effective, but conclusions are limited by poor review quality based on a heterogeneous evidence base. PMID:27335330

  1. A descriptive analysis of overviews of reviews published between 2000 and 2011.

    PubMed

    Hartling, Lisa; Chisholm, Annabritt; Thomson, Denise; Dryden, Donna M

    2012-01-01

    Overviews of systematic reviews compile data from multiple systematic reviews (SRs) and are a new method of evidence synthesis. To describe the methodological approaches in overviews of interventions, we conducted a descriptive study. We searched 4 databases from 2000 to July 2011; we handsearched Evidence-based Child Health: A Cochrane Review Journal. We defined an overview as a study that: stated a clear objective; examined an intervention; used explicit methods to identify SRs; collected and synthesized outcome data from the SRs; and intended to include only SRs. We did not restrict inclusion by population characteristics (e.g., adult or children only). Two researchers independently screened studies and applied eligibility criteria. One researcher extracted data with verification by a second. We conducted a descriptive analysis. From 2,245 citations, 75 overviews were included. The number of overviews increased from 1 in 2000 to 14 in 2010. The interventions were pharmacological (n = 20, 26.7%), non-pharmacological (n = 26, 34.7%), or both (n = 29, 38.7%). Inclusion criteria were clearly stated in 65 overviews. Thirty-three (44%) overviews searched at least 2 databases. The majority reported the years and databases searched (n = 46, 61%), and provided key words (n = 58, 77%). Thirty-nine (52%) overviews included Cochrane SRs only. Two reviewers independently screened and completed full-text review in 29 overviews (39%). Methods of data extraction were reported in 45 (60%). Information on quality of individual studies was extracted from the original SRs in 27 (36%) overviews. Quality assessment of the SRs was performed in 28 (37%) overviews; at least 9 different tools were used. Quality of the body of evidence was assessed in 13 (17%) overviews. Most overviews provided a narrative or descriptive analysis of the included SRs. One overview conducted indirect analyses and the other conducted mixed treatment comparisons. Publication bias was discussed in 18 (24%)

  2. Methodologies for the quantitative estimation of toxicant dose to cigarette smokers using physical, chemical and bioanalytical data.

    PubMed

    St Charles, Frank Kelley; McAughey, John; Shepperd, Christopher J

    2013-06-01

    Methodologies have been developed, described and demonstrated that convert mouth exposure estimates of cigarette smoke constituents to dose by accounting for smoke spilled from the mouth prior to inhalation (mouth-spill (MS)) and the respiratory retention (RR) during the inhalation cycle. The methodologies are applicable to just about any chemical compound in cigarette smoke that can be measured analytically and can be used with ambulatory population studies. Conversion of exposure to dose improves the relevancy for risk assessment paradigms. Except for urinary nicotine plus metabolites, biomarkers generally do not provide quantitative exposure or dose estimates. In addition, many smoke constituents have no reliable biomarkers. We describe methods to estimate the RR of chemical compounds in smoke based on their vapor pressure (VP) and to estimate the MS for a given subject. Data from two clinical studies were used to demonstrate dose estimation for 13 compounds, of which only 3 have urinary biomarkers. Compounds with VP > 10−5 Pa generally have RRs of 88% or greater, which do not vary appreciably with inhalation volume (IV). Compounds with VP < 10−7 Pa generally have RRs dependent on IV and lung exposure time. For MS, mean subject values from both studies were slightly greater than 30%. For constituents with urinary biomarkers, correlations with the calculated dose were significantly improved over correlations with mouth exposure. Of toxicological importance is that the dose correlations provide an estimate of the metabolic conversion of a constituent to its respective biomarker.
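
    To make the exposure-to-dose conversion concrete, here is a minimal sketch; the multiplicative combination dose = exposure x (1 - MS) x RR and all numerical values are illustrative assumptions, not the paper's exact equations:

    ```python
    # Hypothetical sketch: converting mouth-level exposure to retained dose.
    # The multiplicative form below is an assumption for illustration; the
    # paper's exact equations are not reproduced here.

    def estimate_dose(mouth_exposure_ug: float, mouth_spill: float, retention: float) -> float:
        """Retained dose (ug) from mouth-level exposure.

        mouth_exposure_ug: constituent mass reaching the mouth (ug)
        mouth_spill: fraction spilled before inhalation (abstract: ~0.30)
        retention: respiratory retention fraction (abstract: >= 0.88 for high-VP compounds)
        """
        return mouth_exposure_ug * (1.0 - mouth_spill) * retention

    print(estimate_dose(100.0, 0.30, 0.88))  # -> ~61.6 ug retained of 100 ug at the mouth
    ```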

  3. Statistical Methodology for Assigning Emissions to Industries in the United States, Revised Estimates: 1970 to 1997 (2001)

    EPA Pesticide Factsheets

    This report presents the results of a study that develops a methodology to assign emissions to the manufacturing and nonmanufacturing industries that comprise the industrial sector of the EPA’s national emission estimates for 1970 to 1997.

  4. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  5. A non-parametric automatic blending methodology to estimate rainfall fields from rain gauge and radar data

    NASA Astrophysics Data System (ADS)

    Velasco-Forero, Carlos A.; Sempere-Torres, Daniel; Cassiraga, Eduardo F.; Jaime Gómez-Hernández, J.

    2009-07-01

    Quantitative estimation of rainfall fields has been a crucial objective from early studies of the hydrological applications of weather radar. Previous studies have suggested that flow estimations are improved when radar and rain gauge data are combined to estimate input rainfall fields. This paper reports new research carried out in this field. Classical approaches for the selection and fitting of a theoretical correlogram (or semivariogram) model (needed to apply geostatistical estimators) are avoided in this study. Instead, a non-parametric technique based on FFT is used to obtain two-dimensional positive-definite correlograms directly from radar observations, dealing with both the natural anisotropy and the temporal variation of the spatial structure of the rainfall in the estimated fields. Because these correlation maps can be automatically obtained at each time step of a given rainfall event, this technique might easily be used in operational (real-time) applications. This paper describes the development of the non-parametric estimator exploiting the advantages of FFT for the automatic computation of correlograms and provides examples of its application on a case study using six rainfall events. The methodology is applied to three different alternatives for incorporating the radar information (as a secondary variable), and a comparison of performances is provided. In particular, their ability to reproduce in the estimated rainfall fields (i) the rain gauge observations (in a cross-validation analysis) and (ii) the spatial patterns of the radar fields is analyzed. Results indicate that kriging with external drift (KED), in combination with the technique of automatically computing 2-D spatial correlograms, provides merged rainfall fields in good agreement with rain gauges and closest to the spatial tendencies observed in the radar rainfall fields, among the alternatives analyzed.
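
    The FFT-based correlogram step lends itself to a compact illustration. Below is a minimal sketch (not the authors' code) that estimates a two-dimensional correlation map from a single radar field via the Wiener-Khinchin relation; the synthetic gamma-distributed field is an assumption:

    ```python
    # Sketch: non-parametric 2-D correlogram of a rainfall field via FFT.
    import numpy as np

    def correlogram_2d(field: np.ndarray) -> np.ndarray:
        """Correlation map of a demeaned field (circular/periodic lags;
        zero-padding would avoid wrap-around in a production version)."""
        anomaly = field - field.mean()
        spectrum = np.fft.fft2(anomaly)
        acov = np.fft.ifft2(spectrum * np.conj(spectrum)).real / anomaly.size
        acov = np.fft.fftshift(acov)      # put the zero lag at the center
        return acov / acov.max()          # normalize to a correlogram

    radar = np.random.gamma(2.0, 1.5, size=(128, 128))  # synthetic field
    rho = correlogram_2d(radar)
    print(rho.shape, rho[64, 64])  # zero-lag correlation = 1.0
    ```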

  6. Temperature-based estimation of global solar radiation using soft computing methodologies

    NASA Astrophysics Data System (ADS)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is indeed essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation is appealing owing to the broad availability of measured air temperatures. In this study, the potential of soft computing techniques is evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation between three methodologies of adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly) is performed. Five combinations of Tmax, Tmin, and Tavg serve as inputs to develop ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. Across all techniques, the higher accuracies are achieved by model (5), using Tmax − Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. The results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
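
    As an illustration of the SVR-rbf approach described above, here is a hedged sketch with synthetic temperatures standing in for the Iranian-city data; the feature combination mirrors model (5) (Tmax − Tmin and Tmax), while the target formula, hyperparameters, and data are assumptions:

    ```python
    # Sketch: RBF-kernel support vector regression on temperature features.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    t_max = rng.uniform(15, 40, 365)
    t_min = t_max - rng.uniform(5, 15, 365)
    # Synthetic DHGSR target loosely tied to the diurnal range (Hargreaves-like)
    dhgsr = 0.16 * np.sqrt(t_max - t_min) * 30 + rng.normal(0, 1.5, 365)

    X = np.column_stack([t_max - t_min, t_max])  # input combination (5) above
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
    model.fit(X[:300], dhgsr[:300])
    print(model.score(X[300:], dhgsr[300:]))     # R^2 on held-out days
    ```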

  7. Methodologies for the quantitative estimation of toxicant dose to cigarette smokers using physical, chemical and bioanalytical data

    PubMed Central

    McAughey, John; Shepperd, Christopher J.

    2013-01-01

    Methodologies have been developed, described and demonstrated that convert mouth exposure estimates of cigarette smoke constituents to dose by accounting for smoke spilled from the mouth prior to inhalation (mouth-spill (MS)) and the respiratory retention (RR) during the inhalation cycle. The methodologies are applicable to just about any chemical compound in cigarette smoke that can be measured analytically and can be used with ambulatory population studies. Conversion of exposure to dose improves the relevancy for risk assessment paradigms. Except for urinary nicotine plus metabolites, biomarkers generally do not provide quantitative exposure or dose estimates. In addition, many smoke constituents have no reliable biomarkers. We describe methods to estimate the RR of chemical compounds in smoke based on their vapor pressure (VP) and to estimate the MS for a given subject. Data from two clinical studies were used to demonstrate dose estimation for 13 compounds, of which only 3 have urinary biomarkers. Compounds with VP > 10−5 Pa generally have RRs of 88% or greater, which do not vary appreciably with inhalation volume (IV). Compounds with VP < 10−7 Pa generally have RRs dependent on IV and lung exposure time. For MS, mean subject values from both studies were slightly greater than 30%. For constituents with urinary biomarkers, correlations with the calculated dose were significantly improved over correlations with mouth exposure. Of toxicological importance is that the dose correlations provide an estimate of the metabolic conversion of a constituent to its respective biomarker. PMID:23742081

  8. Global inverse modeling of CH4 sources and sinks: an overview of methods

    NASA Astrophysics Data System (ADS)

    Houweling, Sander; Bergamaschi, Peter; Chevallier, Frederic; Heimann, Martin; Kaminski, Thomas; Krol, Maarten; Michalak, Anna M.; Patra, Prabir

    2017-01-01

    The aim of this paper is to present an overview of inverse modeling methods that have been developed over the years for estimating the global sources and sinks of CH4. It provides insight into how techniques and estimates have evolved over time and what the remaining shortcomings are. As such, it serves a didactic purpose of introducing newcomers to the field, but it also takes stock of developments so far and reflects on promising new directions. The main focus is on methodological aspects that are particularly relevant for CH4, such as its atmospheric oxidation, the use of methane isotopologues, and specific challenges in atmospheric transport modeling of CH4. The use of satellite retrievals receives special attention, as it is an active field of methodological development with special requirements on the sampling of the model and the treatment of data uncertainty. Regional-scale flux estimation and attribution remains a grand challenge, which calls for new methods capable of combining information from multiple data streams of different measured parameters. A process model representation of sources and sinks in atmospheric transport inversion schemes allows the integrated use of such data. These new developments are needed not only to improve our understanding of the main processes driving the observed global trend but also to support international efforts to reduce greenhouse gas emissions.
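
    Many of the reviewed techniques reduce, in the linear-Gaussian case, to a standard Bayesian update. As a hedged reference point (the textbook form, not any specific system from the review), the posterior flux estimate is

    $$ \hat{x} = x_b + \mathbf{B}\mathbf{H}^{\mathsf{T}}\left(\mathbf{H}\mathbf{B}\mathbf{H}^{\mathsf{T}} + \mathbf{R}\right)^{-1}\left(y - \mathbf{H}x_b\right), $$

    where x_b is the prior (bottom-up) flux estimate, B its error covariance, H the atmospheric-transport operator mapping fluxes to observations y, and R the observation-error covariance.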

  9. Probabilistic Methodology for Estimation of Number and Economic Loss (Cost) of Future Landslides in the San Francisco Bay Region, California

    USGS Publications Warehouse

    Crovelli, Robert A.; Coe, Jeffrey A.

    2008-01-01

    The Probabilistic Landslide Assessment Cost Estimation System (PLACES) presented in this report estimates the number and economic loss (cost) of landslides during a specified future time in individual areas, and then calculates the sum of those estimates. The analytic probabilistic methodology is based upon conditional probability theory and laws of expectation and variance. The probabilistic methodology is expressed in the form of a Microsoft Excel computer spreadsheet program. Using historical records, the PLACES spreadsheet is used to estimate the number of future damaging landslides and total damage, as economic loss, from future landslides caused by rainstorms in 10 counties of the San Francisco Bay region in California. Estimates are made for any future 5-year period of time. The estimated total number of future damaging landslides for the entire 10-county region during any future 5-year period of time is about 330. Santa Cruz County has the highest estimated number of damaging landslides (about 90), whereas Napa, San Francisco, and Solano Counties have the lowest estimated number of damaging landslides (5-6 each). Estimated direct costs from future damaging landslides for the entire 10-county region for any future 5-year period are about US $76 million (year 2000 dollars). San Mateo County has the highest estimated costs ($16.62 million), and Solano County has the lowest estimated costs (about $0.90 million). Estimated direct costs are also subdivided into public and private costs.
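
    The laws of expectation and variance invoked above correspond to a standard compound-sum identity; a hedged sketch of the form such a calculation can take (the report's exact formulation may differ): if county i experiences a random number N_i of damaging landslides, each with independent, identically distributed cost C_i, the county total T_i satisfies

    $$ E[T_i] = E[N_i]\,E[C_i], \qquad \operatorname{Var}(T_i) = E[N_i]\operatorname{Var}(C_i) + \operatorname{Var}(N_i)\,E[C_i]^2, $$

    and regional totals follow by summing over the ten counties (with variances additive under independence).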

  10. A Review on Human Reinstatement Studies: An Overview and Methodological Challenges

    ERIC Educational Resources Information Center

    Haaker, Jan; Golkar, Armita; Hermans, Dirk; Lonsdorf, Tina B.

    2014-01-01

    In human research, studies of return of fear (ROF) phenomena, and reinstatement in particular, began only a decade ago and recently are more widely used, e.g., as outcome measures for fear/extinction memory manipulations (e.g., reconsolidation). As reinstatement research in humans is still in its infancy, providing an overview of its stability and…

  11. Methodology for Estimating Ton-Miles of Goods Movements for U.S. Freight Multimodal Network System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliveira Neto, Francisco Moraes; Chin, Shih-Miao; Hwang, Ho-Ling

    2013-01-01

    Ton-miles is a commonly used measure of freight transportation output. Estimation of ton-miles in the U.S. transportation system requires freight flow data at a disaggregated level (either link flows, path flows, or origin-destination flows between small geographic areas). However, the sheer magnitude of the freight data system, as well as industrial confidentiality concerns in Census surveys, limits the freight data made available to the public. Through the years, the Center for Transportation Analysis (CTA) of the Oak Ridge National Laboratory (ORNL) has been working on the development of comprehensive national and regional freight databases and network flow models. One of the main products of this effort is the Freight Analysis Framework (FAF), a public database released by ORNL. FAF provides the general public a multidimensional matrix of freight flows (weight and dollar value) on the U.S. transportation system between states, major metropolitan areas, and the remainders of states. Recently, the CTA research team developed a methodology to estimate ton-miles by mode of transportation between the 2007 FAF regions. This paper describes the data disaggregation methodology. The method relies on the estimation of disaggregation factors related to measures of production, attractiveness, and average shipment distances by mode of service. Production and attractiveness of counties are captured by total employment payroll. Likely mileages for shipments between counties are calculated using a geographic database, i.e., the CTA multimodal network system. Results of validation experiments demonstrate the validity of the method. Moreover, the 2007 FAF ton-miles estimates are consistent with the major freight data programs for rail and water movements.
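
    A hedged sketch of the payroll-weighted disaggregation idea described above; county names, payrolls, and mileages are invented, and the actual CTA factors are likely more elaborate:

    ```python
    # Sketch: disaggregate a regional freight flow to county pairs by payroll
    # shares, then accumulate ton-miles = tons x miles.
    faf_flow_tons = 1_000.0  # tons moved between two FAF regions, one mode

    origin_payroll = {"county_a": 60.0, "county_b": 40.0}   # production proxy
    dest_payroll = {"county_x": 30.0, "county_y": 70.0}     # attraction proxy
    miles = {("county_a", "county_x"): 120, ("county_a", "county_y"): 150,
             ("county_b", "county_x"): 90,  ("county_b", "county_y"): 200}

    o_total = sum(origin_payroll.values())
    d_total = sum(dest_payroll.values())
    ton_miles = 0.0
    for o, o_pay in origin_payroll.items():
        for d, d_pay in dest_payroll.items():
            share = (o_pay / o_total) * (d_pay / d_total)  # disaggregation factor
            ton_miles += faf_flow_tons * share * miles[(o, d)]
    print(f"{ton_miles:.0f} ton-miles")
    ```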

  12. [Methodology for estimating total direct costs of comprehensive care for non-communicable diseases].

    PubMed

    Castillo, Nancy; Malo, Miguel; Villacres, Nilda; Chauca, José; Cornetero, Víctor; de Flores, Karin Roedel; Tapia, Rafaela; Ríos, Raúl

    2017-01-01

    RESUMEN Diseases like diabetes mellitus (DM) and hypertension (HT) generate high costs and are the most common cause of mortality in the Americas. In the case of Peru, given demographic and epidemiological changes, particularly the alarming increase in overweight and obesity, the burden of these diseases is constantly increasing, resulting in the need to budget more financial resources to the health services. The total care costs of these diseases and their complications represent a financial burden that should be considered very carefully by health institutions when they draft their budgets. With this aim, the Pan American Health Organization has assisted the Ministry of Health (MINSA) with a study to estimate these costs. This article graphically describes the methodology developed to estimate the direct costs of comprehensive care for DM and HT to the health services of MINSA and regional governments.

  13. A Descriptive Analysis of Overviews of Reviews Published between 2000 and 2011

    PubMed Central

    Hartling, Lisa; Chisholm, Annabritt; Thomson, Denise; Dryden, Donna M.

    2012-01-01

    Background Overviews of systematic reviews compile data from multiple systematic reviews (SRs) and are a new method of evidence synthesis. Objectives To describe the methodological approaches in overviews of interventions. Design Descriptive study. Methods We searched 4 databases from 2000 to July 2011; we handsearched Evidence-based Child Health: A Cochrane Review Journal. We defined an overview as a study that: stated a clear objective; examined an intervention; used explicit methods to identify SRs; collected and synthesized outcome data from the SRs; and intended to include only SRs. We did not restrict inclusion by population characteristics (e.g., adult or children only). Two researchers independently screened studies and applied eligibility criteria. One researcher extracted data with verification by a second. We conducted a descriptive analysis. Results From 2,245 citations, 75 overviews were included. The number of overviews increased from 1 in 2000 to 14 in 2010. The interventions were pharmacological (n = 20, 26.7%), non-pharmacological (n = 26, 34.7%), or both (n = 29, 38.7%). Inclusion criteria were clearly stated in 65 overviews. Thirty-three (44%) overviews searched at least 2 databases. The majority reported the years and databases searched (n = 46, 61%), and provided key words (n = 58, 77%). Thirty-nine (52%) overviews included Cochrane SRs only. Two reviewers independently screened and completed full text review in 29 overviews (39%). Methods of data extraction were reported in 45 (60%). Information on quality of individual studies was extracted from the original SRs in 27 (36%) overviews. Quality assessment of the SRs was performed in 28 (37%) overviews; at least 9 different tools were used. Quality of the body of evidence was assessed in 13 (17%) overviews. Most overviews provided a narrative or descriptive analysis of the included SRs. One overview conducted indirect analyses and the other conducted mixed treatment comparisons.

  14. STUDYING FOREST ROOT SYSTEMS - AN OVERVIEW OF METHODOLOGICAL PROBLEMS

    EPA Science Inventory

    The study of tree root systems is central to understanding forest ecosystem carbon and nutrient cycles, nutrient and water uptake, C allocation patterns by trees, soil microbial populations, adaptation of trees to stress, soil organic matter production, etc. Methodological probl...

  15. Methodological Framework for Analysis of Buildings-Related Programs with BEAMS, 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, Douglas B.; Dirks, James A.; Hostick, Donna J.

    The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official “benefits estimates” for each of its major programs using its Planning, Analysis, and Evaluation (PAE) Team. PAE conducts an annual integrated modeling and analysis effort to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. These estimates are part of EERE’s budget request and are also used in the formulation of EERE’s performance measures. Two of EERE’s major programs are the Building Technologies Program (BT) and the Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports PAE by developing the program characterizations and other market information necessary to provide input to the EERE integrated modeling analysis as part of PAE’s Portfolio Decision Support (PDS) effort. PNNL also supports BT by providing line-item estimates for the Program’s internal use. PNNL uses three modeling approaches to perform these analyses. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits using one of those methods: the Building Energy Analysis and Modeling System (BEAMS). BEAMS is a PC-based accounting model that was built in Visual Basic by PNNL specifically for estimating the benefits of buildings-related projects. It allows various types of projects to be characterized, including whole-building, envelope, lighting, and equipment projects. This document contains an overview section that describes the estimation process and the models used to estimate energy savings. The body of the document describes the algorithms used within the BEAMS software. This document serves both as stand-alone documentation for BEAMS and as a supplemental update of a previous document, Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort.

  16. Functional-Based Assessment of Social Behavior: Introduction and Overview.

    ERIC Educational Resources Information Center

    Lewis, Timothy J.; Sugai, George

    1994-01-01

    This introduction to and overview of a special issue on social behavior assessment within schools discusses the impact of function-based methodologies on assessment and intervention practices in identification and remediation of challenging social behaviors. (JDD)

  17. Methodology to Estimate the Quantity, Composition, and Management of Construction and Demolition Debris in the United States

    EPA Science Inventory

    This report, Methodology to Estimate the Quantity, Composition and Management of Construction and Demolition Debris in the US, was developed to expand access to data on CDD in the US and to support research on CDD and sustainable materials management. Since past US EPA CDD estima...

  18. 1995 American travel survey : an overview of the survey design and methodology

    DOT National Transportation Integrated Search

    1995-01-01

    This paper describes the methods used in the 1995 ATS. The introduction provides an overview of the purpose and objectives of the survey, followed by a description of the survey and sample designs, survey field operations, and processing of survey d...

  19. Evaluation of Methodology for Estimating the Cost of Air Force On-The-Job Training. Final Report.

    ERIC Educational Resources Information Center

    Samers, Bernard N.; And Others

    Described is the final phase of a study directed at the development of an on-the-job training (OJT) costing methodology. Utilizing a modification of survey techniques tested and evaluated during the previous phase, estimates were obtained for the cost of OJT for airman training from the 1-level (unskilled) to the 3-level (semiskilled) in five…

  20. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks among all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  1. Data Mining: A Hybrid Methodology for Complex and Dynamic Research

    ERIC Educational Resources Information Center

    Lang, Susan; Baehr, Craig

    2012-01-01

    This article provides an overview of the ways in which data and text mining have potential as research methodologies in composition studies. It introduces data mining in the context of the field of composition studies and discusses ways in which this methodology can complement and extend our existing research practices by blending the best of what…

  2. Methodology of automated ionosphere front velocity estimation for ground-based augmentation of GNSS

    NASA Astrophysics Data System (ADS)

    Bang, Eugene; Lee, Jiyun

    2013-11-01

    Ionospheric anomalies occurring during severe ionospheric storms can pose integrity threats to Global Navigation Satellite System (GNSS) Ground-Based Augmentation Systems (GBAS). Ionospheric anomaly threat models for each region of operation need to be developed to analyze the potential impact of these anomalies on GBAS users and to develop mitigation strategies. Along with the magnitude of ionospheric gradients, the speed of the ionosphere "fronts" in which these gradients are embedded is an important parameter for simulation-based GBAS integrity analysis. This paper presents a methodology for automated ionosphere front velocity estimation, which will be used to analyze a vast amount of ionospheric data, build ionospheric anomaly threat models for different regions, and monitor ionospheric anomalies continuously going forward. This procedure automatically selects stations that show a similar trend of ionospheric delays, computes the orientation of detected fronts using a three-station-based trigonometric method, and estimates speeds for the front using a two-station-based method. It also includes fine-tuning methods to make the estimation robust against faulty measurements and modeling errors. The paper demonstrates the performance of the algorithm by comparing the results of automated speed estimation to those manually computed previously. All speed estimates from the automated algorithm fall within error bars of ±30% of the manually computed speeds. In addition, this algorithm is used to populate the current threat space with newly generated threat points. A larger number of velocity estimates helps us to better understand the behavior of ionospheric gradients under geomagnetic storm conditions.
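
    To make the front-velocity idea concrete, here is a hedged sketch of a plane-front timing solve using three stations; the paper's actual trigonometric and two-station procedures may differ, and the coordinates and times below are invented:

    ```python
    # Sketch: solve for the slowness vector of a plane front from arrival-time
    # differences at three ground stations (x = east, y = north, km).
    import numpy as np

    p0, p1, p2 = np.array([0.0, 0.0]), np.array([50.0, 0.0]), np.array([0.0, 80.0])
    t0, t1, t2 = 0.0, 250.0, 640.0        # front crossing times (s), hypothetical

    A = np.vstack([p1 - p0, p2 - p0])     # station baselines
    b = np.array([t1 - t0, t2 - t0])      # arrival-time differences
    slowness = np.linalg.solve(A, b)      # s (s/km), points along the front normal
    speed_m_s = 1000.0 / np.linalg.norm(slowness)
    azimuth = np.degrees(np.arctan2(slowness[0], slowness[1]))  # deg from north
    print(f"front speed ~ {speed_m_s:.0f} m/s, normal azimuth {azimuth:.0f} deg")
    ```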

  3. A Hierarchical Clustering Methodology for the Estimation of Toxicity

    EPA Science Inventory

    A Quantitative Structure Activity Relationship (QSAR) methodology based on hierarchical clustering was developed to predict toxicological endpoints. This methodology utilizes Ward's method to divide a training set into a series of structurally similar clusters. The structural sim...

  4. Aromatherapy for health care: an overview of systematic reviews.

    PubMed

    Lee, Myeong Soo; Choi, Jiae; Posadzki, Paul; Ernst, Edzard

    2012-03-01

    Aromatherapy is the therapeutic use of essential oil from herbs, flowers, and other plants. The aim of this overview was to provide an overview of systematic reviews evaluating the effectiveness of aromatherapy. We searched 12 electronic databases and our departmental files without restrictions of time or language. The methodological quality of all systematic reviews was evaluated independently by two authors. Of 201 potentially relevant publications, 10 met our inclusion criteria. Most of the systematic reviews were of poor methodological quality. The clinical subject areas were hypertension, depression, anxiety, pain relief, and dementia. For none of the conditions was the evidence convincing. Several SRs of aromatherapy have recently been published. Due to a number of caveats, the evidence is not sufficiently convincing that aromatherapy is an effective therapy for any condition. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  5. The PHM-Ethics methodology: interdisciplinary technology assessment of personal health monitoring.

    PubMed

    Schmidt, Silke; Verweij, Marcel

    2013-01-01

    The contribution briefly introduces the PHM-Ethics project and the PHM methodology. Within the PHM-Ethics project, a set of tools and modules has been developed that may assist in the evaluation and assessment of new technologies for personal health monitoring, referred to as the "PHM methodology" or "PHM toolbox". An overview of this interdisciplinary methodology and its constituent modules is provided, and areas of application and intended target groups are indicated.

  6. Trip Energy Estimation Methodology and Model Based on Real-World Driving Data for Green Routing Applications: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holden, Jacob; Van Til, Harrison J; Wood, Eric W

    A data-informed model to predict energy use for a proposed vehicle trip has been developed in this paper. The methodology leverages nearly 1 million miles of real-world driving data to generate the estimation model. Driving is categorized at the sub-trip level by average speed, road gradient, and road network geometry, then aggregated by category. An average energy consumption rate is determined for each category, creating an energy rates look-up table. Proposed vehicle trips are then categorized in the same manner, and estimated energy rates are appended from the look-up table. The methodology is robust and applicable to almost any type of driving data. The model has been trained on vehicle global positioning system data from the Transportation Secure Data Center at the National Renewable Energy Laboratory and validated against on-road fuel consumption data from testing in Phoenix, Arizona. The estimation model has demonstrated an error range of 8.6% to 13.8%. The model results can be used to inform control strategies in routing tools, such as change in departure time, alternate routing, and alternate destinations to reduce energy consumption. This work provides a highly extensible framework that allows the model to be tuned to a specific driver or vehicle type.
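
    A minimal sketch of the energy-rates look-up logic described above; the category bins and kWh/mile values are invented illustrations, not NREL's actual table:

    ```python
    # Sketch: categorized energy-rate lookup for a proposed trip.
    rates_kwh_per_mile = {            # (speed bin, grade bin) -> energy rate
        ("0-30 mph", "flat"): 0.28,
        ("30-55 mph", "flat"): 0.24,
        ("55+ mph", "flat"): 0.30,
        ("30-55 mph", "uphill"): 0.38,
    }

    def trip_energy(segments):
        """segments: list of (speed_bin, grade_bin, miles) for a proposed route."""
        return sum(rates_kwh_per_mile[(s, g)] * miles for s, g, miles in segments)

    route = [("0-30 mph", "flat", 2.0), ("55+ mph", "flat", 10.0),
             ("30-55 mph", "uphill", 3.0)]
    print(f"{trip_energy(route):.2f} kWh")  # compare alternative routes/departures
    ```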

  7. Regional Shelter Analysis Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillon, Michael B.; Dennison, Deborah; Kane, Jave

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
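
    A hedged sketch of how building protection factors (PF) can be combined with the distribution of people, in the spirit of the Regional Shelter Analysis; all shelter classes, fractions, and PF values are invented:

    ```python
    # Sketch: population-weighted dose transmission across shelter classes.
    population_fractions = {       # fraction of people in each class (night, assumed)
        "wood-frame house": 0.60,
        "masonry building": 0.30,
        "basement": 0.10,
    }
    protection_factor = {          # outdoor dose / indoor dose (assumed values)
        "wood-frame house": 3.0,
        "masonry building": 10.0,
        "basement": 40.0,
    }

    # Region-average transmission: fraction of the outdoor dose people receive.
    transmission = sum(f / protection_factor[k] for k, f in population_fractions.items())
    print(f"average dose reduction factor ~ {1.0 / transmission:.1f}")
    ```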

  8. Methodology of Estimation of Methane Emissions from Coal Mines in Poland

    NASA Astrophysics Data System (ADS)

    Patyńska, Renata

    2014-03-01

    A literature review concerning methane emissions in Poland showed that the National Greenhouse Inventory 2007 [13] was published in 2009. It was prepared, firstly, to meet Poland's obligations resulting from point 3.1 of Decision No. 280/2004/EC of the European Parliament and of the Council of 11 February 2004, concerning a mechanism for monitoring community greenhouse gas emissions and for implementing the Kyoto Protocol, and, secondly, for the United Nations Framework Convention on Climate Change (UNFCCC) and the Kyoto Protocol. The National Greenhouse Inventory states that there are no detailed data concerning methane emissions from collieries in the Polish mining industry. That is why methane emission in the methane coal mines of Górnośląskie Zagłębie Węglowe - GZW (Upper Silesian Coal Basin - USCB) in Poland was studied and evaluated in detail. The applied methodology for estimating methane emission from the GZW coal mining system covers the four basic sources of emission during the mining and post-mining processes. This approach follows the IPCC guidelines of 2006 [10]. The update of the proposed methods (IPCC 2006) for estimating the methane emissions of hard coal mines (active and abandoned) in Poland assumes that the methane emission factor (EF) is calculated from coal output and actual values of absolute methane content. The result of verifying the method of estimating methane emission during the mining process for Polish coal mines is an equation for the methane emission factor EF.
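
    A minimal sketch of an IPCC-style emission-factor calculation consistent with the approach described above; the EF, output, and recovery values are invented, and the 0.67x10^-6 Gg/m3 conversion is the standard IPCC 2006 density assumption:

    ```python
    # Sketch: CH4 emissions (Gg) = EF x coal production - recovered gas,
    # converted from volume to mass with the IPCC 2006 default density.
    CH4_DENSITY_GG_PER_M3 = 0.67e-6   # ~0.67 kg CH4 per m3 (IPCC 2006 default)

    def mining_emissions_gg(ef_m3_per_tonne: float, coal_output_tonnes: float,
                            recovered_m3: float = 0.0) -> float:
        """Net CH4 emissions (Gg) from an emission factor and coal output."""
        net_m3 = ef_m3_per_tonne * coal_output_tonnes - recovered_m3
        return net_m3 * CH4_DENSITY_GG_PER_M3

    print(mining_emissions_gg(ef_m3_per_tonne=12.0, coal_output_tonnes=70e6,
                              recovered_m3=200e6))  # -> ~428.8 Gg CH4
    ```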

  9. Contemporary health care economics: an overview.

    PubMed

    McLaughlin, Nancy; Ong, Michael K; Tabbush, Victor; Hagigi, Farhad; Martin, Neil A

    2014-11-01

    Economic evaluations provide a decision-making framework in which outcomes (benefits) and costs are assessed for various alternative options. Although the interest in complete and partial economic evaluations has increased over the past 2 decades, the quality of studies has been marginal due to methodological challenges or incomplete cost determination. This paper provides an overview of the main types of complete and partial economic evaluations, reviews key methodological elements to be considered for any economic evaluation, and reviews concepts of cost determination. The goal is to provide the clinician neurosurgeon with the knowledge and tools needed to appraise published economic evaluations and to direct high-quality health economic evaluations.

  10. Abort Trigger False Positive and False Negative Analysis Methodology for Threshold-Based Abort Detection

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon

    2015-01-01

    This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt, so that suspect data do not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS), which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as an extension of control theory, and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that results from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.
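
    To illustrate the kind of calculation involved, here is a hedged sketch of threshold-based FP/FN probabilities for a Gaussian sensor reading; the distributions and the threshold are fictitious (echoing the paper's use of fictitious reliability data), and SDQ effects are omitted:

    ```python
    # Sketch: FP/FN probabilities for a single threshold-based abort trigger.
    from scipy.stats import norm

    threshold = 100.0                       # abort threshold (assumed units)
    nominal = norm(loc=80.0, scale=6.0)     # reading distribution, no abort condition
    failed = norm(loc=120.0, scale=8.0)     # reading distribution, abort condition

    p_fp = nominal.sf(threshold)            # P(trigger fires | no abort condition)
    p_fn = failed.cdf(threshold)            # P(trigger silent | abort condition)
    print(f"P(FP) = {p_fp:.2e}, P(FN) = {p_fn:.2e}")
    ```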

  11. Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study

    NASA Astrophysics Data System (ADS)

    Gao, Yun; Kontoyiannis, Ioannis; Bienenstock, Elie

    2008-06-01

    Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: The plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced; two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word-lengths the empirical distribution is severely undersampled, leading to large biases.
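
    For concreteness, a minimal sketch of the plug-in estimator discussed above, applied to a binary series; the word length k = 4 and the synthetic coin-flip data are assumptions:

    ```python
    # Sketch: plug-in entropy-rate estimate from empirical length-k word frequencies.
    import math
    import random
    from collections import Counter

    def plugin_entropy_rate(bits: str, k: int = 4) -> float:
        """Plug-in estimate (bits/symbol): empirical entropy of k-words / k."""
        words = [bits[i:i + k] for i in range(len(bits) - k + 1)]
        counts = Counter(words)
        n = len(words)
        return -sum(c / n * math.log2(c / n) for c in counts.values()) / k

    random.seed(1)
    sample = "".join(random.choice("01") for _ in range(10_000))
    print(plugin_entropy_rate(sample))  # close to 1.0 bit/symbol for fair coin flips
    ```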

  12. Active placebo control groups of pharmacological interventions were rarely used but merited serious consideration: a methodological overview.

    PubMed

    Jensen, Jakob Solgaard; Bielefeldt, Andreas Ørsted; Hróbjartsson, Asbjørn

    2017-07-01

    Active placebos are control interventions that mimic the side effects of the experimental interventions in randomized trials and are sometimes used to reduce the risk of unblinding. We wanted to assess how often randomized clinical drug trials use active placebo control groups; to provide a catalog, and a characterization, of such trials; and to analyze methodological arguments for and against the use of active placebo. An overview consisting of three thematically linked substudies. In an observational substudy, we assessed the prevalence of active placebo groups based on a random sample of 200 PubMed indexed placebo-controlled randomized drug trials published in October 2013. In a systematic review, we identified and characterized trials with active placebo control groups irrespective of publication time. In a third substudy, we reviewed publications with substantial methodological comments on active placebo groups (searches in PubMed, The Cochrane Library, Google Scholar, and HighWirePress). The prevalence of trials with active placebo groups published in 2013 was 1 out of 200 (95% confidence interval: 0-2), 0.5% (0-1%). We identified and characterized 89 randomized trials (published 1961-2014) using active placebos, for example, antihistamines, anticholinergic drugs, and sedatives. Such trials typically involved a crossover design, the experimental intervention had noticeable side effects, and the outcomes were patient-reported. The use of active placebos was clustered in specific research settings and did not appear to reflect consistently the side effect profile of the experimental intervention, for example, selective serotonin reuptake inhibitors were compared with active placebos in pain trials but not in depression trials. We identified and analyzed 25 methods publications with substantial comments. The main argument for active placebo was to reduce risk of unblinding; the main argument against was the risk of unintended therapeutic effect. Pharmacological

  13. Drought risk assessment under climate change is sensitive to methodological choices for the estimation of evaporative demand

    PubMed Central

    Dewes, Candida F.; Rangwala, Imtiaz; Barsugli, Joseph J.; Hobbins, Michael T.; Kumar, Sanjiv

    2017-01-01

    Several studies have projected increases in drought severity, extent and duration in many parts of the world under climate change. We examine sources of uncertainty arising from the methodological choices for the assessment of future drought risk in the continental US (CONUS). One such uncertainty is in the climate models’ expression of evaporative demand (E0), which is not a direct climate model output but has been traditionally estimated using several different formulations. Here we analyze daily output from two CMIP5 GCMs to evaluate how differences in E0 formulation, treatment of meteorological driving data, choice of GCM, and standardization of time series influence the estimation of E0. These methodological choices yield different assessments of spatio-temporal variability in E0 and different trends in 21st century drought risk. First, we estimate E0 using three widely used E0 formulations: Penman-Monteith; Hargreaves-Samani; and Priestley-Taylor. Our analysis, which primarily focuses on the May-September warm-season period, shows that E0 climatology and its spatial pattern differ substantially between these three formulations. Overall, we find higher magnitudes of E0 and its interannual variability using Penman-Monteith, in particular for regions like the Great Plains and southwestern US where E0 is strongly influenced by variations in wind and relative humidity. When examining projected changes in E0 during the 21st century, there are also large differences among the three formulations, particularly the Penman-Monteith relative to the other two formulations. The 21st century E0 trends, particularly in percent change and standardized anomalies of E0, are found to be sensitive to the long-term mean value and the amplitude of interannual variability, i.e. if the magnitude of E0 and its interannual variability are relatively low for a particular E0 formulation, then the normalized or standardized 21st century trend based on that formulation is amplified

  14. Drought risk assessment under climate change is sensitive to methodological choices for the estimation of evaporative demand.

    PubMed

    Dewes, Candida F; Rangwala, Imtiaz; Barsugli, Joseph J; Hobbins, Michael T; Kumar, Sanjiv

    2017-01-01

    Several studies have projected increases in drought severity, extent and duration in many parts of the world under climate change. We examine sources of uncertainty arising from the methodological choices for the assessment of future drought risk in the continental US (CONUS). One such uncertainty is in the climate models' expression of evaporative demand (E0), which is not a direct climate model output but has been traditionally estimated using several different formulations. Here we analyze daily output from two CMIP5 GCMs to evaluate how differences in E0 formulation, treatment of meteorological driving data, choice of GCM, and standardization of time series influence the estimation of E0. These methodological choices yield different assessments of spatio-temporal variability in E0 and different trends in 21st century drought risk. First, we estimate E0 using three widely used E0 formulations: Penman-Monteith; Hargreaves-Samani; and Priestley-Taylor. Our analysis, which primarily focuses on the May-September warm-season period, shows that E0 climatology and its spatial pattern differ substantially between these three formulations. Overall, we find higher magnitudes of E0 and its interannual variability using Penman-Monteith, in particular for regions like the Great Plains and southwestern US where E0 is strongly influenced by variations in wind and relative humidity. When examining projected changes in E0 during the 21st century, there are also large differences among the three formulations, particularly the Penman-Monteith relative to the other two formulations. The 21st century E0 trends, particularly in percent change and standardized anomalies of E0, are found to be sensitive to the long-term mean value and the amplitude of interannual variability, i.e. if the magnitude of E0 and its interannual variability are relatively low for a particular E0 formulation, then the normalized or standardized 21st century trend based on that formulation is amplified
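
    Of the three E0 formulations named above, Hargreaves-Samani is the simplest to state. In its standard form (quoted here for reference; units follow the usual convention of E0 in mm/day, temperatures in °C, and Ra the extraterrestrial radiation expressed as equivalent evaporation),

    $$ E_0 = 0.0023\, R_a\,\left(T_{avg} + 17.8\right)\sqrt{T_{max} - T_{min}}, $$

    which makes explicit why, unlike Penman-Monteith, this formulation cannot respond to variations in wind or relative humidity.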

  15. Longitudinal versus cross-sectional methodology for estimating the economic burden of breast cancer: a pilot study.

    PubMed

    Mullins, C Daniel; Wang, Junling; Cooke, Jesse L; Blatt, Lisa; Baquet, Claudia R

    2004-01-01

    Projecting future breast cancer treatment expenditure is critical for budgeting purposes, medical decision making and the allocation of resources in order to maximise the overall impact on health-related outcomes of care. Currently, both longitudinal and cross-sectional methodologies are used to project the economic burden of cancer. This pilot study examined the differences in estimates that were obtained using these two methods, focusing on Maryland, US Medicaid reimbursement data for chemotherapy and prescription drugs for the years 1999-2000. Two different methodologies for projecting life cycles of cancer expenditure were considered. The first examined expenditure according to chronological time (calendar quarter) for all cancer patients in the database in a given quarter. The second examined only the most recent quarter and constructed a hypothetical expenditure life cycle by taking into consideration the number of quarters since the respective patient had her first claim. We found different average expenditures using the same data and over the same time period. The longitudinal measurement had less extreme peaks and troughs, and yielded average expenditure in the final period that was 60% higher than that produced using the cross-sectional analysis; however, the longitudinal analysis had intermediate periods with significantly lower estimated expenditure than the cross-sectional data. These disparate results signify that each of the methods has merit. The longitudinal method tracks changes over time while the cross-sectional approach reflects more recent data, e.g. current practice patterns. Thus, this study reiterates the importance of considering the methodology when projecting future cancer expenditure.
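
    A hedged sketch of the contrast between the two measurement approaches; the quarterly claims data are invented, and the cross-sectional step is simplified to one observed quarter per patient:

    ```python
    # Sketch: longitudinal vs. cross-sectional expenditure life cycles.
    import statistics

    # Hypothetical per-patient quarterly costs; index 0 = quarter of first claim.
    claims = {"p1": [9000, 4000, 2500, 2000],
              "p2": [8000, 3500, 3000],
              "p3": [9500, 5000]}

    # Longitudinal: average cost at each quarter-since-first-claim across all
    # patients observed that long.
    max_q = max(len(costs) for costs in claims.values())
    longitudinal = [statistics.mean([c[q] for c in claims.values() if len(c) > q])
                    for q in range(max_q)]

    # Cross-sectional: take only each patient's most recent quarter and place it
    # at that patient's quarters-since-first-claim position.
    cross_sectional = {len(c) - 1: c[-1] for c in claims.values()}

    print("longitudinal:", longitudinal)
    print("cross-sectional:", dict(sorted(cross_sectional.items())))
    ```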

  16. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    NASA Technical Reports Server (NTRS)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  17. Estimation of retired mobile phones generation in China: A comparative study on methodology.

    PubMed

    Li, Bo; Yang, Jianxin; Lu, Bin; Song, Xiaolong

    2015-01-01

    Due to the rapid development of its economy and technology, China has the biggest production and possession of mobile phones in the world. In general, mobile phones have relatively short lifetimes because the majority of users replace their mobile phones frequently. Retired mobile phones represent the most valuable electrical and electronic equipment (EEE) in the main waste stream because of such characteristics as large quantity, high reuse/recovery value, and fast replacement frequency. Consequently, the huge number of retired mobile phones in China calls for a sustainable management system. Generation estimates can provide fundamental information for constructing a sustainable management system for retired mobile phones and other waste electrical and electronic equipment (WEEE). However, reliable estimates are difficult to obtain and verify. The primary aim of this paper is to identify a proper approach for estimating the generation of retired mobile phones in China by comparing relevant methods. The results show that the sales&new method has the highest priority for estimating retired mobile phones. It indicates that 47.92 million mobile phones were retired in 2002, rising to 739.98 million in 2012, a clearly increasing tendency with some fluctuations. Furthermore, methodological issues, such as the selection of an improper approach and errors in the input data, are discussed in order to improve generation estimation of retired mobile phones and other WEEE. Copyright © 2014 Elsevier Ltd. All rights reserved.
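
    A hedged sketch of a sales-and-lifespan calculation in the spirit of the sales&new method; the sales figures and lifespan distribution are invented, and the paper's exact model may differ:

    ```python
    # Sketch: retired units in year t = past sales weighted by lifespan odds.
    sales = {2008: 500, 2009: 600, 2010: 700, 2011: 800, 2012: 900}  # million units
    lifespan_pmf = {1: 0.1, 2: 0.4, 3: 0.35, 4: 0.15}  # P(phone retires after l years)

    def retired(year: int) -> float:
        """Retired units in `year`, summing sales(year - l) * P(lifespan = l)."""
        return sum(sales.get(year - l, 0) * p for l, p in lifespan_pmf.items())

    print(f"{retired(2012):.0f} million retired in 2012")  # -> 645 with these inputs
    ```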

  18. Risk methodology overview. [for carbon fiber release

    NASA Technical Reports Server (NTRS)

    Credeur, K. R.

    1979-01-01

    Some considerations of risk estimation, how risk is measured, and how risk analysis decisions are made are discussed. Specific problems of carbon fiber release are discussed by reviewing the objective, describing the main elements, and giving an example of the risk logic and outputs.

  19. A PDE-based methodology for modeling, parameter estimation and feedback control in structural and structural acoustic systems

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Brown, D. E.; Metcalf, Vern L.; Silcox, R. J.; Smith, Ralph C.; Wang, Yun

    1994-01-01

    A problem of continued interest concerns the control of vibrations in a flexible structure and the related problem of reducing structure-borne noise in structural acoustic systems. In both cases, piezoceramic patches bonded to the structures have been successfully used as control actuators. Through the application of a controlling voltage, the patches can be used to reduce structural vibrations which in turn lead to methods for reducing structure-borne noise. A PDE-based methodology for modeling, estimating physical parameters, and implementing a feedback control scheme for problems of this type is discussed. While the illustrating example is a circular plate, the methodology is sufficiently general so as to be applicable in a variety of structural and structural acoustic systems.

  20. Design Research with a Focus on Learning Processes: An Overview on Achievements and Challenges

    ERIC Educational Resources Information Center

    Prediger, Susanne; Gravemeijer, Koeno; Confrey, Jere

    2015-01-01

    Design research continues to gain prominence as a significant methodology in the mathematics education research community. This overview summarizes the origins and the current state of design research practices focusing on methodological requirements and processes of theorizing. While recognizing the rich variations in the foci and scale of design…

  1. Development of risk-based decision methodology for facility design.

    DOT National Transportation Integrated Search

    2014-06-01

    This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides illustrative examples for the use of the proposed framework. An overview of the current practices and applications to illustrate t...

  2. Risk Estimates and Risk Factors Related to Psychiatric Inpatient Suicide—An Overview

    PubMed Central

    Madsen, Trine; Erlangsen, Annette; Nordentoft, Merete

    2017-01-01

    People with mental illness have an increased risk of suicide. The aim of this paper is to provide an overview of suicide risk estimates among psychiatric inpatients based on the body of evidence found in scientific peer-reviewed literature, primarily focusing on the relative risks, rates, time trends, and socio-demographic and clinical risk factors of suicide in psychiatric inpatients. Psychiatric inpatients have a very high risk of suicide relative to the background population, but it remains challenging for clinicians to identify those patients who are most likely to die from suicide during admission. Most studies have low statistical power, compromising quality and generalisability. The few studies with sufficient statistical power mainly identified non-modifiable risk predictors such as male gender, diagnosis, or recent deliberate self-harm. Also, the predictive value of these predictors is low. It would be of great benefit if future studies were based on large samples while focusing on modifiable predictors over the course of an admission, such as hopelessness, depressive symptoms, and family/social situations. This would improve our chances of developing better risk assessment tools. PMID:28257103

  3. A Study of IR Loss Correction Methodologies for Commercially Available Pyranometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Chuck; Andreas, Afshin; Augustine, John

    2017-03-24

    This presentation provides a high-level overview of a study of IR loss correction methodologies for commercially available pyranometers. The study investigates how various correction methodologies perform for several makes and models of commercially available pyranometers in common use, both when operated in ventilators with DC fans and without ventilators, as they are typically calibrated.

  4. Searching for qualitative research for inclusion in systematic reviews: a structured methodological review.

    PubMed

    Booth, Andrew

    2016-05-04

    Qualitative systematic reviews or qualitative evidence syntheses (QES) are increasingly recognised as a way to enhance the value of systematic reviews (SRs) of clinical trials. They can explain the mechanisms by which interventions, evaluated within trials, might achieve their effect. They can investigate differences in effects between different population groups. They can identify which outcomes are most important to patients, carers, health professionals and other stakeholders. QES can explore the impact of acceptance, feasibility, meaningfulness and implementation-related factors within a real world setting and thus contribute to the design and further refinement of future interventions. To produce valid, reliable and meaningful QES requires systematic identification of relevant qualitative evidence. Although the methodologies of QES, including methods for information retrieval, are well-documented, little empirical evidence exists to inform their conduct and reporting. This structured methodological overview examines papers on searching for qualitative research identified from the Cochrane Qualitative and Implementation Methods Group Methodology Register and from citation searches of 15 key papers. A single reviewer reviewed 1299 references. Papers reporting methodological guidance, use of innovative methodologies or empirical studies of retrieval methods were categorised under eight topical headings: overviews and methodological guidance, sampling, sources, structured questions, search procedures, search strategies and filters, supplementary strategies and standards. This structured overview presents a contemporaneous view of information retrieval for qualitative research and identifies a future research agenda. This review concludes that poor empirical evidence underpins current information practice in information retrieval of qualitative research. A trend towards improved transparency of search methods and further evaluation of key search procedures offers

  5. The Research Diagnostic Criteria for Temporomandibular Disorders. I: overview and methodology for assessment of validity.

    PubMed

    Schiffman, Eric L; Truelove, Edmond L; Ohrbach, Richard; Anderson, Gary C; John, Mike T; List, Thomas; Look, John O

    2010-01-01

    The purpose of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Validation Project was to assess the diagnostic validity of this examination protocol. The aim of this article is to provide an overview of the project's methodology, descriptive statistics, and data for the study participant sample. This article also details the development of reliable methods to establish the reference standards for assessing criterion validity of the Axis I RDC/TMD diagnoses. The Axis I reference standards were based on the consensus of two criterion examiners independently performing a comprehensive history, clinical examination, and evaluation of imaging. Intersite reliability was assessed annually for criterion examiners and radiologists. Criterion examination reliability was also assessed within study sites. Study participant demographics were comparable to those of participants in previous studies using the RDC/TMD. Diagnostic agreement of the criterion examiners with each other and with the consensus-based reference standards was excellent with all kappas ≥ 0.81, except for osteoarthrosis (moderate agreement, k = 0.53). Intrasite criterion examiner agreement with reference standards was excellent (k ≥ 0.95). Intersite reliability of the radiologists for detecting computed tomography-disclosed osteoarthrosis and magnetic resonance imaging-disclosed disc displacement was good to excellent (k = 0.71 and 0.84, respectively). The Validation Project study population was appropriate for assessing the reliability and validity of the RDC/TMD Axis I and II. The reference standards used to assess the validity of Axis I TMD were based on reliable and clinically credible methods.

  6. Writing a Rhizome: An (Im)plausible Methodology

    ERIC Educational Resources Information Center

    Honan, Eileen

    2007-01-01

    In this paper the author provides an overview of a rhizomatic methodology using illustrations from her doctoral thesis, where she used Deleuze and Guattari's (1987) thinking about rhizomes in three different ways. First, using the figuration of a rhizome allowed her to construct her thesis as non-linear with self-conscious attention paid to the…

  7. What guidance is available for researchers conducting overviews of reviews of healthcare interventions? A scoping review and qualitative metasummary.

    PubMed

    Pollock, Michelle; Fernandes, Ricardo M; Becker, Lorne A; Featherstone, Robin; Hartling, Lisa

    2016-11-14

    Overviews of reviews (overviews) compile data from multiple systematic reviews to provide a single synthesis of relevant evidence for decision-making. Despite their increasing popularity, there is limited methodological guidance available for researchers wishing to conduct overviews. The objective of this scoping review is to identify and collate all published and unpublished documents containing guidance for conducting overviews examining the efficacy, effectiveness, and/or safety of healthcare interventions. Our aims were to provide a map of existing guidance documents; identify similarities, differences, and gaps in the guidance contained within these documents; and identify common challenges involved in conducting overviews. We conducted an iterative and extensive search to ensure breadth and comprehensiveness of coverage. The search involved reference tracking, database and web searches (MEDLINE, EMBASE, DARE, Scopus, Cochrane Methods Studies Database, Google Scholar), handsearching of websites and conference proceedings, and contacting overview producers. Relevant guidance statements and challenges encountered were extracted, edited, grouped, abstracted, and presented using a qualitative metasummary approach. We identified 52 guidance documents produced by 19 research groups. Relatively consistent guidance was available for the first stages of the overview process (deciding when and why to conduct an overview, specifying the scope, and searching for and including systematic reviews). In contrast, there was limited or conflicting guidance for the latter stages of the overview process (quality assessment of systematic reviews and their primary studies, collecting and analyzing data, and assessing quality of evidence), and many of the challenges identified were also related to these stages. An additional, overarching challenge identified was that overviews are limited by the methods, reporting, and coverage of their included systematic reviews. This compilation

  8. Methodologies for Evaluating the Impact of Contraceptive Social Marketing Programs.

    ERIC Educational Resources Information Center

    Bertrand, Jane T.; And Others

    1989-01-01

    An overview of the evaluation issues associated with contraceptive social marketing programs is provided. Methodologies covered include survey techniques, cost-effectiveness analyses, retail audits of sales data, time series analysis, nested logit analysis, and discriminant analysis. (TJH)

  9. Validating alternative methodologies to estimate the hydrological regime of temporary streams when flow data are unavailable

    NASA Astrophysics Data System (ADS)

    Llorens, Pilar; Gallart, Francesc; Latron, Jérôme; Cid, Núria; Rieradevall, Maria; Prat, Narcís

    2016-04-01

    Flow permanence metrics were estimated as the proportion of photographs showing stream flow. Results indicate that for streams that are dry more than 25% of the time, interviews systematically underestimated flow permanence, but the qualitative information given by inhabitants was of great interest for understanding river dynamics. The use of aerial photographs, on the other hand, gave a good estimate of flow permanence, but the seasonal coverage was conditioned by the capture dates of the aerial photographs. For these reasons, we recommend using both methodologies together.

  10. Meta-analyses of Adverse Effects Data Derived from Randomised Controlled Trials as Compared to Observational Studies: Methodological Overview

    PubMed Central

    Golder, Su; Loke, Yoon K.; Bland, Martin

    2011-01-01

    Background There is considerable debate as to the relative merits of using randomised controlled trial (RCT) data as opposed to observational data in systematic reviews of adverse effects. This meta-analysis of meta-analyses aimed to assess the level of agreement or disagreement in the estimates of harm derived from meta-analysis of RCTs as compared to meta-analysis of observational studies. Methods and Findings Searches were carried out in ten databases in addition to reference checking, contacting experts, citation searches, and hand-searching key journals, conference proceedings, and Web sites. Studies were included where a pooled relative measure of an adverse effect (odds ratio or risk ratio) from RCTs could be directly compared, using the ratio of odds ratios, with the pooled estimate for the same adverse effect arising from observational studies. Nineteen studies, yielding 58 meta-analyses, were identified for inclusion. The pooled ratio of odds ratios of RCTs compared to observational studies was estimated to be 1.03 (95% confidence interval 0.93–1.15). There was less discrepancy with larger studies. The symmetric funnel plot suggests that there is no consistent difference between risk estimates from meta-analysis of RCT data and those from meta-analysis of observational studies. In almost all instances, the estimates of harm from meta-analyses of the different study designs had 95% confidence intervals that overlapped (54/58, 93%). In terms of statistical significance, in nearly two-thirds (37/58, 64%), the results agreed (both studies showing a significant increase or significant decrease or both showing no significant difference). In only one meta-analysis about one adverse effect was there opposing statistical significance. Conclusions Empirical evidence from this overview indicates that there is no difference on average in the risk estimate of adverse effects of an intervention derived from meta-analyses of RCTs and meta-analyses of observational
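    The comparison statistic described here, the ratio of odds ratios (ROR) with a confidence interval formed on the log scale, can be reproduced from each design's pooled estimate and 95% CI. A minimal sketch, assuming the two meta-analytic estimates are independent; the input numbers below are hypothetical.

```python
import math

def ratio_of_odds_ratios(or_rct, ci_rct, or_obs, ci_obs, z=1.96):
    """ROR = OR_RCT / OR_observational, with a 95% CI on the log scale.
    Standard errors are recovered from each estimate's 95% CI."""
    se_rct = (math.log(ci_rct[1]) - math.log(ci_rct[0])) / (2 * z)
    se_obs = (math.log(ci_obs[1]) - math.log(ci_obs[0])) / (2 * z)
    log_ror = math.log(or_rct) - math.log(or_obs)
    se = math.sqrt(se_rct**2 + se_obs**2)  # assumes independent estimates
    return (math.exp(log_ror),
            math.exp(log_ror - z * se),
            math.exp(log_ror + z * se))

# Hypothetical pooled adverse-effect estimates from the two designs
print(ratio_of_odds_ratios(1.20, (0.95, 1.52), 1.15, (1.00, 1.32)))
```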

  11. Multi-point estimation of total energy expenditure: a comparison between zinc-reduction and platinum-equilibration methodologies.

    PubMed

    Sonko, Bakary J; Miller, Leland V; Jones, Richard H; Donnelly, Joseph E; Jacobsen, Dennis J; Hill, James O; Fennessey, Paul V

    2003-12-15

    Reducing water to hydrogen gas with zinc or uranium metal to determine the D/H ratio is both tedious and time consuming. This has forced most energy metabolism investigators to use the "two-point" technique instead of the "multi-point" technique for estimating total energy expenditure (TEE). Recently, we purchased a new platinum (Pt)-equilibration system that significantly reduces both the time and labor required for D/H ratio determination. In this study, we compared TEE estimated for nine overweight but healthy subjects using the traditional Zn-reduction method with that obtained from the new Pt-equilibration system. Rate constants, pool spaces, and CO2 production rates obtained with the two methodologies were not significantly different. Correlation analysis demonstrated that TEEs estimated using the two methods were significantly correlated (r=0.925, p=0.0001). Sample equilibration time was reduced by 66% compared with similar methods. The data demonstrate that the Zn-reduction method can be replaced by the Pt-equilibration method when TEE is estimated using the "multi-point" technique. Furthermore, D equilibration time was significantly reduced.

  12. Computerized Adaptive Testing: Overview and Introduction.

    ERIC Educational Resources Information Center

    Meijer, Rob R.; Nering, Michael L.

    1999-01-01

    Provides an overview of computerized adaptive testing (CAT) and introduces contributions to this special issue. CAT elements discussed include item selection, estimation of the latent trait, item exposure, measurement precision, and item-bank development. (SLD)

  13. Estimation of the daily global solar radiation based on the Gaussian process regression methodology in the Saharan climate

    NASA Astrophysics Data System (ADS)

    Guermoui, Mawloud; Gairaa, Kacem; Rabehi, Abdelaziz; Djafer, Djelloul; Benkaciali, Said

    2018-06-01

    Accurate estimation of solar radiation is a major concern in renewable energy applications. Over the past few years, many machine learning paradigms have been proposed to improve estimation performance, mostly based on artificial neural networks, fuzzy logic, support vector machines and adaptive neuro-fuzzy inference systems. The aim of this work is the prediction of the daily global solar radiation received on a horizontal surface through the Gaussian process regression (GPR) methodology. A case study of the Ghardaïa region (Algeria) was used to validate the methodology. Several input combinations were tested; it was found that a GPR model based on sunshine duration, minimum air temperature and relative humidity gives the best results in terms of mean absolute bias error (MBE), root mean square error (RMSE), relative root mean square error (rRMSE), and correlation coefficient (r). The obtained values of these indicators are 0.67 MJ/m2, 1.15 MJ/m2, 5.2%, and 98.42%, respectively.
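    A minimal sketch of the GPR approach using the three predictors the abstract identifies (sunshine duration, minimum air temperature, relative humidity). The training values are invented and the kernel choice is an assumption, not the paper's configuration; GPR's appeal here is that it returns a predictive uncertainty alongside the point estimate.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical daily inputs: sunshine duration (h), minimum air
# temperature (degC), relative humidity (%); target: daily global solar
# radiation on a horizontal surface (MJ/m2).
X = np.array([[9.5, 12.0, 35.0], [4.0, 8.0, 70.0],
              [11.0, 15.0, 20.0], [6.5, 10.0, 55.0]])
y = np.array([24.1, 12.7, 27.8, 17.9])

kernel = RBF(length_scale=[1.0, 1.0, 1.0]) + WhiteKernel()
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
mean, std = gpr.predict(np.array([[8.0, 11.0, 40.0]]), return_std=True)
print(mean, std)  # point estimate plus predictive uncertainty
```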

  14. Methodological factors affecting joint moments estimation in clinical gait analysis: a systematic review.

    PubMed

    Camomilla, Valentina; Cereatti, Andrea; Cutti, Andrea Giovanni; Fantozzi, Silvia; Stagni, Rita; Vannozzi, Giuseppe

    2017-08-18

    Quantitative gait analysis can provide a description of joint kinematics and dynamics, and it is recognized as a clinically useful tool for functional assessment, diagnosis and intervention planning. Clinically interpretable parameters are estimated from quantitative measures (e.g. ground reaction forces, skin marker trajectories) through biomechanical modelling. In particular, the estimation of joint moments during motion is grounded on several modelling assumptions: (1) body segmental and joint kinematics are derived from the trajectories of markers and by modelling the human body as a kinematic chain; (2) joint resultant (net) loads are usually derived from force plate measurements through a model of segmental dynamics. Therefore, both measurement errors and modelling assumptions can affect the results, to an extent that also depends on the characteristics of the motor task analysed (e.g. gait speed). Errors affecting the trajectories of joint centres, the orientation of joint functional axes, the joint angular velocities, and the accuracy of inertial parameters and force measurements (concurring to the definition of the dynamic model) can weigh differently in the estimation of clinically interpretable joint moments. Numerous studies have addressed all these methodological aspects separately, but a critical analysis of how these aspects may affect the clinical interpretation of joint dynamics is still missing. This article aims at filling this gap through a systematic review of the literature, conducted on Web of Science, Scopus and PubMed. The final objective is hence to provide clear take-home messages to guide laboratories in the estimation of joint moments for clinical practice.
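    To make the error-propagation point concrete: under a quasi-static simplification that neglects the inertial terms of the segmental model, a sagittal-plane joint moment reduces to the cross product of the joint-centre-to-COP vector and the ground reaction force, so an error in the joint-centre estimate maps directly into the moment. A hypothetical sketch, not the full inverse-dynamics chain the review covers.

```python
import numpy as np

def quasi_static_joint_moment(joint_xy, cop_xy, grf_xy):
    """Sagittal-plane joint moment (N*m) from a force plate, neglecting
    segment inertial terms: M = r x F, with r running from the joint
    centre to the centre of pressure (2-D cross product -> scalar)."""
    r = np.asarray(cop_xy) - np.asarray(joint_xy)
    f = np.asarray(grf_xy)
    return r[0] * f[1] - r[1] * f[0]

# Hypothetical mid-stance values: ankle at (0.05, 0.08) m,
# COP at (0.12, 0.0) m, GRF of (20, 800) N.
m = quasi_static_joint_moment((0.05, 0.08), (0.12, 0.0), (20.0, 800.0))
print(m)  # shifting the joint-centre estimate by 1 cm changes M by ~8 N*m
```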

  15. An overview of research on waverider design methodology

    NASA Astrophysics Data System (ADS)

    Ding, Feng; Liu, Jun; Shen, Chi-bing; Liu, Zhen; Chen, Shao-hua; Fu, Xiang

    2017-11-01

    A waverider is any supersonic or hypersonic lifting body that is characterized by an attached, or nearly attached, bow shock wave along its leading edge. As a waverider can possess a high lift-to-drag ratio as well as an ideal precompression surface of the inlet system, it has become one of the most promising designs for air-breathing hypersonic vehicles. This paper reviews and classifies waverider design methodologies developed by local and foreign scholars up until 2016. The design concept of a waverider can be summarized as follows: modeling of the basic flow field is used to design the waverider in the streamwise direction and the osculating theory is used to design the waverider in the spanwise direction.

  16. A Novel Methodology to Estimate the Treatment Effect in Presence of Highly Variable Placebo Response

    PubMed Central

    Gomeni, Roberto; Goyal, Navin; Bressolle, Françoise; Fava, Maurizio

    2015-01-01

    One of the main reasons for the inefficiency of multicenter randomized clinical trials (RCTs) in depression is the excessively high level of placebo response. The aim of this work was to propose a novel methodology to analyze RCTs based on the assumption that centers with high placebo response are less informative than the other centers for estimating the 'true' treatment effect (TE). A linear mixed-effect modeling approach for repeated measures (MMRM) was used as a reference approach. The new method for estimating TE was based on a nonlinear longitudinal modeling of clinical scores (NLMMRM). NLMMRM estimates TE by associating a weighting factor to the data collected in each center. The weight was defined by the posterior probability of detecting a clinically relevant difference between active treatment and placebo at that center. Data from five RCTs in depression were used to compare the performance of MMRM with NLMMRM. The results of the analyses showed an average improvement of ~15% in the TE estimated with NLMMRM when the center effect was included in the analyses. Opposite results were observed with MMRM: TE estimate was reduced by ~4% when the center effect was considered as covariate in the analysis. The novel NLMMRM approach provides a tool for controlling the confounding effect of high placebo response, to increase signal detection and to provide a more reliable estimate of the 'true' TE by controlling false negative results associated with excessively high placebo response. PMID:25895454
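    The weighting idea can be illustrated outside the full nonlinear longitudinal model: give each centre a weight equal to the (normal-approximation) posterior probability that its drug-placebo difference exceeds a clinically relevant margin, so high-placebo-response centres, whose apparent effect is small, contribute little. This is a simplified stand-in for NLMMRM; the score differences and the margin are invented.

```python
import numpy as np
from scipy.stats import norm

def weighted_treatment_effect(te_center, se_center, delta=2.0):
    """Probability-weighted pooling of centre-level treatment effects.
    Each centre's weight is the normal-approximation probability that
    its drug-placebo difference exceeds the margin `delta`."""
    te = np.asarray(te_center, dtype=float)
    se = np.asarray(se_center, dtype=float)
    w = norm.sf(delta, loc=te, scale=se)   # P(effect > delta) per centre
    return np.sum(w * te) / np.sum(w)

# Hypothetical score differences (drug minus placebo) per centre
print(weighted_treatment_effect([4.1, 0.3, 3.2, -0.5], [1.2, 1.1, 1.3, 1.4]))
```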

  17. A statistical methodology for estimating transport parameters: Theory and applications to one-dimensional advective-dispersive systems

    USGS Publications Warehouse

    Wagner, Brian J.; Gorelick, Steven M.

    1986-01-01

    A simulation nonlinear multiple-regression methodology for estimating parameters that characterize the transport of contaminants is developed and demonstrated. Finite difference contaminant transport simulation is combined with a nonlinear weighted least squares multiple-regression procedure. The technique provides optimal parameter estimates and gives statistics for assessing the reliability of these estimates under certain general assumptions about the distributions of the random measurement errors. Monte Carlo analysis is used to estimate parameter reliability for a hypothetical homogeneous soil column for which concentration data contain large random measurement errors. The value of data collected spatially versus data collected temporally was investigated for estimation of velocity, dispersion coefficient, effective porosity, first-order decay rate, and zero-order production. The use of spatial data gave estimates that were 2–3 times more reliable than estimates based on temporal data for all parameters except velocity. Comparison of estimated linear and nonlinear confidence intervals based upon Monte Carlo analysis showed that the linear approximation is poor for dispersion coefficient and zero-order production coefficient when data are collected over time. In addition, examples demonstrate transport parameter estimation for two real one-dimensional systems. First, the longitudinal dispersivity and effective porosity of an unsaturated soil are estimated using laboratory column data. We compare the reliability of estimates based upon data from individual laboratory experiments versus estimates based upon pooled data from several experiments. Second, the simulation nonlinear regression procedure is extended to include an additional governing equation that describes delayed storage during contaminant transport. The model is applied to analyze the trends, variability, and interrelationship of parameters in a mountain stream in northern California.
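    A minimal sketch of the simulation-regression idea: estimate velocity and dispersion coefficient by nonlinear least squares against breakthrough-curve data. For brevity it substitutes the leading term of the analytical Ogata-Banks solution for the paper's finite difference simulator (the neglected second term is small at high Peclet number), and all data values are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

def ade_breakthrough(t, v, D, x=1.0, c0=1.0):
    """Leading term of the Ogata-Banks solution for 1-D advective-
    dispersive transport with a continuous source at concentration c0."""
    return 0.5 * c0 * erfc((x - v * t) / (2.0 * np.sqrt(D * t)))

rng = np.random.default_rng(1)
t = np.linspace(0.05, 4.0, 40)                       # hypothetical times
c_obs = ade_breakthrough(t, 0.8, 0.05) + rng.normal(0, 0.02, t.size)

(v_hat, D_hat), cov = curve_fit(ade_breakthrough, t, c_obs, p0=[0.5, 0.1])
print(v_hat, D_hat, np.sqrt(np.diag(cov)))  # estimates and linearized SEs,
# recalling the paper's caution that linear intervals can be poor for D
```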

  18. A GIS-based methodology for the estimation of potential volcanic damage and its application to Tenerife Island, Spain

    NASA Astrophysics Data System (ADS)

    Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.

    2014-05-01

    This paper presents a GIS-based methodology to estimate damages produced by volcanic eruptions. The methodology comprises four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment and estimation of expected damages. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex, and simulated through the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies on the built environment and complemented with the analysis of transportation and urban infrastructures. Damage assessment is performed by associating a qualitative damage rating with each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each exposed element. The methodology is then automated into a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (north of Tenerife Island), which is likely to be affected by volcanic phenomena in case of eruption from both the Teide-Pico Viejo volcanic complex and the North-West basaltic rift. Results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.
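    The hazard-vulnerability combination step can be expressed as a raster lookup: each cell's damage class is read from a matrix indexed by the cell's hazard intensity and vulnerability class. A sketch with an invented damage matrix; the study's actual ratings and classes differ.

```python
import numpy as np

# Qualitative damage rating for each (hazard intensity, vulnerability
# class) combination; rows: hazard 0-3, columns: vulnerability 0-3.
# The matrix values are illustrative, not those of the cited study.
DAMAGE_MATRIX = np.array([[0, 0, 1, 1],
                          [0, 1, 2, 2],
                          [1, 2, 2, 3],
                          [2, 3, 3, 3]])

def damage_map(hazard_grid, vulnerability_grid):
    """Raster overlay: look up the damage class for every cell."""
    return DAMAGE_MATRIX[hazard_grid, vulnerability_grid]

hazard = np.array([[0, 2], [3, 1]])   # e.g. simulated hazard intensity
vuln = np.array([[1, 3], [2, 0]])     # e.g. building class per cell
print(damage_map(hazard, vuln))       # -> [[0 3] [3 0]]
```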

  19. Methodological approach for the collection and simultaneous estimation of greenhouse gases emission from aquaculture ponds.

    PubMed

    Vasanth, Muthuraman; Muralidhar, Moturi; Saraswathy, Ramamoorthy; Nagavel, Arunachalam; Dayal, Jagabattula Syama; Jayanthi, Marappan; Lalitha, Natarajan; Kumararaja, Periyamuthu; Vijayan, Koyadan Kizhakkedath

    2016-12-01

    Global warming/climate change is the greatest environmental threat of our time. The rapidly developing aquaculture sector is an anthropogenic activity whose contribution to global warming is little understood, and estimation of greenhouse gas (GHG) emissions from aquaculture ponds is a key practice in predicting the impact of aquaculture on global warming. A comprehensive methodology was developed for sampling and simultaneous analysis of the GHGs carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) from aquaculture ponds. The GHG fluxes were collected using a cylindrical acrylic chamber, air pump, and Tedlar bags. A cylindrical acrylic floating chamber was fabricated to collect the GHGs emanating from the surface of aquaculture ponds. The sampling methodology was standardized and in-house method validation was established by achieving linearity, accuracy, precision, and specificity. GHG flux was found to be stable at 10 ± 2 °C of storage for 3 days. The developed methodology was used to quantify GHGs in Pacific white shrimp Penaeus vannamei and black tiger shrimp Penaeus monodon culture ponds for a period of 4 months. The rate of emission of carbon dioxide was found to be much greater than that of the other two GHGs. Average GHG emissions in g ha-1 day-1 during the culture period were comparatively high in P. vannamei culture ponds.
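    The flux computation behind a closed floating-chamber measurement is the slope of headspace concentration over time, scaled by the chamber's volume-to-area ratio and converted to mass units with the ideal gas law. A sketch with invented CH4 readings and chamber dimensions; the paper's chamber specifications are not given in this abstract.

```python
import numpy as np

R = 8.314  # J mol-1 K-1

def chamber_flux(conc_ppm, minutes, molar_mass, volume_m3, area_m2,
                 temp_k=303.0, pressure_pa=101325.0):
    """Closed-chamber flux: slope of headspace concentration vs. time,
    converted from ppm to mass units via the ideal gas law.
    Returns flux in mg m-2 h-1."""
    slope_ppm_min = np.polyfit(minutes, conc_ppm, 1)[0]
    mol_per_m3 = pressure_pa / (R * temp_k)          # air molar density
    # ppm -> mol of gas per m3 of headspace, then to mass per pond area
    flux_mg_m2_min = (slope_ppm_min * 1e-6 * mol_per_m3 * molar_mass
                      * 1e3 * volume_m3 / area_m2)
    return flux_mg_m2_min * 60.0

# Hypothetical CH4 readings from a floating chamber over a shrimp pond
t = np.array([0.0, 10.0, 20.0, 30.0])        # minutes
ch4 = np.array([1.90, 1.97, 2.05, 2.12])     # ppm
print(chamber_flux(ch4, t, molar_mass=16.04, volume_m3=0.025, area_m2=0.07))
```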

  20. Assessment of the Validity of the Research Diagnostic Criteria for Temporomandibular Disorders: Overview and Methodology

    PubMed Central

    Schiffman, Eric L.; Truelove, Edmond L.; Ohrbach, Richard; Anderson, Gary C.; John, Mike T.; List, Thomas; Look, John O.

    2011-01-01

    AIMS The purpose of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Validation Project was to assess the diagnostic validity of this examination protocol. An overview is presented, including Axis I and II methodology and descriptive statistics for the study participant sample. This paper details the development of reliable methods to establish the reference standards for assessing criterion validity of the Axis I RDC/TMD diagnoses. Validity testing for the Axis II biobehavioral instruments was based on previously validated reference standards. METHODS The Axis I reference standards were based on the consensus of 2 criterion examiners independently performing a comprehensive history, clinical examination, and evaluation of imaging. Intersite reliability was assessed annually for criterion examiners and radiologists. Criterion exam reliability was also assessed within study sites. RESULTS Study participant demographics were comparable to those of participants in previous studies using the RDC/TMD. Diagnostic agreement of the criterion examiners with each other and with the consensus-based reference standards was excellent with all kappas ≥ 0.81, except for osteoarthrosis (moderate agreement, k = 0.53). Intrasite criterion exam agreement with reference standards was excellent (k ≥ 0.95). Intersite reliability of the radiologists for detecting computed tomography-disclosed osteoarthrosis and magnetic resonance imaging-disclosed disc displacement was good to excellent (k = 0.71 and 0.84, respectively). CONCLUSION The Validation Project study population was appropriate for assessing the reliability and validity of the RDC/TMD Axis I and II. The reference standards used to assess the validity of Axis I TMD were based on reliable and clinically credible methods. PMID:20213028

  1. K-Means Subject Matter Expert Refined Topic Model Methodology

    DTIC Science & Technology

    2017-01-01

    Report documentation fragments (SF-298 cover material): K-means Subject Matter Expert Refined Topic Model Methodology (Topic Model Estimation via K-Means), U.S. Army TRADOC Analysis Center-Monterey, 700 Dyer Road, January 2017, by Theodore T. Allen, Ph.D., et al.; contract number W9124N-15-P-0022.
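    Only the title survives in this record, but the technique it names can be sketched: cluster TF-IDF document vectors with K-means and read each centroid's top-weighted terms as a topic. This shows the generic idea only; the report's subject-matter-expert refinement step is not shown, and the corpus below is invented.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = ["troops move supply convoy north",       # hypothetical corpus
        "convoy logistics fuel resupply",
        "radar signal detection range",
        "sensor radar tracking signal"]

vec = TfidfVectorizer()
X = vec.fit_transform(docs)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Treat each cluster centroid as a "topic": its highest-weight terms
terms = np.array(vec.get_feature_names_out())
for k, centroid in enumerate(km.cluster_centers_):
    print(f"topic {k}:", terms[np.argsort(centroid)[::-1][:3]])
```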

  2. Evaluation Methodologies for Estimating the Likelihood of Program Implementation Failure

    ERIC Educational Resources Information Center

    Durand, Roger; Decker, Phillip J.; Kirkman, Dorothy M.

    2014-01-01

    Despite our best efforts as evaluators, program implementation failures abound. A wide variety of valuable methodologies have been adopted to explain and evaluate the "why" of these failures. Yet, typically these methodologies have been employed concurrently (e.g., project monitoring) or to the post-hoc assessment of program activities.…

  3. SU-F-T-687: Comparison of SPECT/CT-Based Methodologies for Estimating Lung Dose from Y-90 Radioembolization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kost, S; Yu, N; Lin, S

    2016-06-15

    Purpose: To compare mean lung dose (MLD) estimates from 99mTc macroaggregated albumin (MAA) SPECT/CT using two published methodologies for patients treated with 90Y radioembolization for liver cancer. Methods: MLD was estimated retrospectively using two methodologies for 40 patients from SPECT/CT images of 99mTc-MAA administered prior to radioembolization. In these two methods, lung shunt fractions (LSFs) were calculated as the ratio of scanned lung activity to the activity in the entire scan volume, or to the sum of activity in the lung and liver, respectively. Misregistration of liver activity into the lungs during SPECT acquisition was overcome by excluding lung counts within either 2 or 1.5 cm of the diaphragm apex, respectively. Patient lung density was assumed to be 0.3 g/cm3 or derived from CT densitovolumetry, respectively. Results from both approaches were compared to MLD determined by planar scintigraphy (PS). The effect of patient size on the difference between MLD from PS and SPECT/CT was also investigated. Results: Lung density from CT densitovolumetry is not different from the reference density (p = 0.68). The second method resulted in lung doses on average 1.5 times larger than those from the first method; however, the difference between the means of the two estimates was not significant (p = 0.07). Lung doses from both methods were statistically different from those estimated from 2D PS (p < 0.001). There was no correlation between patient size and the difference between MLD from PS and either SPECT/CT method (r < 0.22, p > 0.17). Conclusion: There is no statistically significant difference between MLD estimated from the two techniques. Both methods are statistically different from conventional PS, with PS overestimating dose by a factor of three or larger. The difference between lung doses estimated from 2D planar or 3D SPECT/CT is not dependent on patient size.
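    The two quantities at stake can be written down directly: an LSF from image counts (the second methodology's definition is shown) and a MIRD-style mean lung dose. The dose coefficient of roughly 50 Gy·kg/GBq for fully absorbed Y-90 beta energy is a commonly cited value, not one taken from this abstract, and the counts, activity and lung mass below are hypothetical.

```python
def lung_shunt_fraction(lung_counts, liver_counts):
    """LSF per the second methodology described: lung counts over the
    sum of lung and liver counts from the 99mTc-MAA SPECT/CT."""
    return lung_counts / (lung_counts + liver_counts)

def mean_lung_dose_gy(activity_gbq, lsf, lung_mass_kg, dose_coeff=49.7):
    """MIRD-style mean lung dose for Y-90. dose_coeff (Gy*kg/GBq) is a
    commonly cited value for fully absorbed Y-90 beta energy; treat it
    as an assumption, not a value taken from this abstract."""
    return dose_coeff * activity_gbq * lsf / lung_mass_kg

lsf = lung_shunt_fraction(lung_counts=4.2e5, liver_counts=3.9e6)
print(lsf, mean_lung_dose_gy(activity_gbq=2.0, lsf=lsf, lung_mass_kg=1.0))
```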

  4. New methodology for estimating biofuel consumption for cooking: Atmospheric emissions of black carbon and sulfur dioxide from India

    NASA Astrophysics Data System (ADS)

    Habib, Gazala; Venkataraman, Chandra; Shrivastava, Manish; Banerjee, Rangan; Stehr, J. W.; Dickerson, Russell R.

    2004-09-01

    The dominance of biofuel combustion emissions in the Indian region, and the inherently large uncertainty in biofuel use estimates based on cooking energy surveys, prompted the current work, which develops a new methodology for estimating biofuel consumption for cooking. This is based on food consumption statistics, and the specific energy for food cooking. Estimated biofuel consumption in India was 379 (247-584) Tg yr-1. New information on the user population of different biofuels was compiled at a state level, to derive the biofuel mix, which varied regionally and was 74:16:10%, respectively, of fuelwood, dung cake and crop waste, at a national level. Importantly, the uncertainty in biofuel use from quantitative error assessment using the new methodology is around 50%, giving a narrower bound than in previous works. From this new activity data and currently used black carbon emission factors, the black carbon (BC) emissions from biofuel combustion were estimated as 220 (65-760) Gg yr-1. The largest BC emissions were from fuelwood (75%), with lower contributions from dung cake (16%) and crop waste (9%). The uncertainty of 245% in the BC emissions estimate is now governed by the large spread in BC emission factors from biofuel combustion (122%), implying the need for reducing this uncertainty through measurements. Emission factors of SO2 from combustion of biofuels widely used in India were measured, and ranged 0.03-0.08 g kg-1 from combustion of two wood species, 0.05-0.20 g kg-1 from 10 crop waste types, and 0.88 g kg-1 from dung cake, significantly lower than currently used emission factors for wood and crop waste. Estimated SO2 emissions from biofuels of 75 (36-160) Gg yr-1 were about a factor of 3 lower than that in recent studies, with a large contribution from dung cake (73%), followed by fuelwood (21%) and crop waste (6%).
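    The bottom-up arithmetic behind these estimates is consumption times emission factor, summed over the fuel mix. The sketch below reuses the national total, fuel shares and measured SO2 emission-factor ranges reported in this abstract (taking range midpoints) and reproduces roughly the 75 Gg yr-1 figure; treat it as an illustration of the accounting, not the paper's full uncertainty analysis.

```python
# Bottom-up emission estimate: fuel consumption (Tg/yr) times a fuel-
# specific emission factor (g/kg); note Tg * (g/kg) = Gg.
biofuel_tg = 379.0                                  # national total
mix = {"fuelwood": 0.74, "dung cake": 0.16, "crop waste": 0.10}
ef_so2_g_per_kg = {"fuelwood": 0.055,               # midpoint of 0.03-0.08
                   "dung cake": 0.88,               # single reported value
                   "crop waste": 0.125}             # midpoint of 0.05-0.20

so2_gg = sum(biofuel_tg * share * ef_so2_g_per_kg[fuel]
             for fuel, share in mix.items())
print(round(so2_gg, 1), "Gg SO2 per year")          # ~73.5, close to 75
```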

  5. An overview of systematic review.

    PubMed

    Baker, Kathy A; Weeks, Susan Mace

    2014-12-01

    Systematic review is an invaluable tool for the practicing clinician. A well-designed systematic review represents the latest and most complete information available on a particular topic or intervention. This article highlights the key elements of systematic review, what it is and is not, and provides an overview of several reputable organizations supporting the methodological development and conduct of systematic review. Important aspects for evaluating the quality of a systematic review are also included. Copyright © 2014 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.

  6. Methodology for Estimation of Flood Magnitude and Frequency for New Jersey Streams

    USGS Publications Warehouse

    Watson, Kara M.; Schopp, Robert D.

    2009-01-01

    Methodologies were developed for estimating flood magnitudes at the 2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals for unregulated or slightly regulated streams in New Jersey. Regression equations that incorporate basin characteristics were developed to estimate flood magnitude and frequency for streams throughout the State by use of a generalized least squares regression analysis. Relations between flood-frequency estimates based on streamflow-gaging-station discharge and basin characteristics were determined by multiple regression analysis, and weighted by effective years of record. The State was divided into five hydrologically similar regions to refine the regression equations. The regression analysis indicated that flood discharge, as determined by the streamflow-gaging-station annual peak flows, is related to the drainage area, main channel slope, percentage of lake and wetland areas in the basin, population density, and the flood-frequency region, at the 95-percent confidence level. The standard errors of estimate for the various recurrence-interval floods ranged from 48.1 to 62.7 percent. Annual-maximum peak flows observed at streamflow-gaging stations through water year 2007 and basin characteristics determined using geographic information system techniques for 254 streamflow-gaging stations were used for the regression analysis. Drainage areas of the streamflow-gaging stations range from 0.18 to 779 mi2. Peak-flow data and basin characteristics for 191 streamflow-gaging stations located in New Jersey were used, along with peak-flow data for stations located in adjoining States, including 25 stations in Pennsylvania, 17 stations in New York, 16 stations in Delaware, and 5 stations in Maryland. Streamflow records for selected stations outside of New Jersey were included in the present study because hydrologic, physiographic, and geologic boundaries commonly extend beyond political boundaries. The StreamStats web application was developed
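    The regression step can be illustrated in a few lines: regress log-transformed peak discharges on log-transformed basin characteristics and apply the fitted equation to an ungauged site. The study used generalized least squares weighted by effective record length across five regions; ordinary least squares and the station values below are a simplified, invented stand-in.

```python
import numpy as np

# Hypothetical station data: drainage area (mi2), main channel slope
# (ft/mi), and the 100-year peak flow (ft3/s) from station frequency
# analysis.
area = np.array([12.0, 55.0, 140.0, 320.0, 779.0])
slope = np.array([45.0, 22.0, 15.0, 9.0, 5.0])
q100 = np.array([1800.0, 5200.0, 9800.0, 17000.0, 30000.0])

X = np.column_stack([np.ones(area.size), np.log10(area), np.log10(slope)])
beta, *_ = np.linalg.lstsq(X, np.log10(q100), rcond=None)

# Regression equation: log10(Q100) = b0 + b1*log10(A) + b2*log10(S)
a_new, s_new = 200.0, 12.0
x_new = np.array([1.0, np.log10(a_new), np.log10(s_new)])
print(10 ** (beta @ x_new))   # estimated 100-year peak at an ungauged site
```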

  7. Overview of Ongoing NRMRL GI Research

    EPA Science Inventory

    This presentation is an overview of ongoing NRMRL Green Infrastructure research and addresses the question: What do we need to know to present a cogent estimate of the value of Green Infrastructure? Discussions included are: stormwater well study, rain gardens and permeable su...

  8. Population Education: A Source Book on Content and Methodology.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and Oceania.

    A collection of 12 essays provides an overview of population education in Asia and Oceania with regard to concepts, status, approaches in curriculum and materials development, methodologies, and research and evaluation. The collection is presented in five sections. Section I explores general definitions of population education; its role as part of…

  9. Estimation of the laser cutting operating cost by support vector regression methodology

    NASA Astrophysics Data System (ADS)

    Jović, Srđan; Radović, Aleksandar; Šarkoćević, Živče; Petković, Dalibor; Alizamir, Meysam

    2016-09-01

    Laser cutting is a popular manufacturing process utilized to cut various types of materials economically. The operating cost is affected by laser power, cutting speed, assist gas pressure, nozzle diameter and focus point position, as well as the workpiece material. In this article, the process factors investigated were laser power, cutting speed, air pressure and focal point position. The aim of this work is to relate the operating cost to the process parameters mentioned above. CO2 laser cutting of stainless steel of medical grade AISI316L has been investigated. The main goal was to analyze the operating cost through the laser power, cutting speed, air pressure, focal point position and material thickness. Since estimating the laser operating cost is a complex, non-linear task, soft computing optimization algorithms can be used. The intelligent soft computing scheme support vector regression (SVR) was implemented. The performance of the proposed estimator was confirmed with simulation results. The SVR results were then compared with artificial neural networks and genetic programming. According to the results, a greater improvement in estimation accuracy can be achieved through SVR compared to the other soft computing methodologies. The new optimization methods benefit from the soft computing capabilities of global optimization and multiobjective optimization rather than choosing a starting point by trial and error and combining multiple criteria into a single criterion.
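    A minimal sketch of an SVR cost estimator over the kind of process factors the abstract lists; the training records, units and hyperparameters are invented, and feature scaling is added because RBF-kernel SVR is scale-sensitive.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical cutting records: laser power (kW), cutting speed (m/min),
# assist gas pressure (bar), focal position (mm), thickness (mm)
# -> operating cost per hour. All values are illustrative.
X = np.array([[2.0, 1.8, 10.0, -1.0, 3.0],
              [3.0, 2.5, 12.0, -0.5, 4.0],
              [4.0, 3.0, 14.0,  0.0, 5.0],
              [2.5, 2.0, 11.0, -1.5, 3.0],
              [3.5, 2.8, 13.0, -0.5, 4.0]])
y = np.array([28.0, 35.0, 44.0, 30.0, 39.0])

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, y)
print(model.predict([[3.0, 2.2, 12.0, -1.0, 4.0]]))  # estimated cost
```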

  10. Oil and Gas 101: An Overview of Oil and Gas Upstream Activities and Using EPA's Nonpoint Oil and Gas Emission Estimation Tool for the 2014 NEI (2015 EIC)

    EPA Pesticide Factsheets

    provide a general overview of the upstream oil and gas exploration and production processes and emissions covered by the tool; a discussion of EPA’s plans for the 2014 NEI pertaining to oil and gas; use of the tool to compile emissions estimates

  11. 2011 Workplace and Equal Opportunity Survey of Reserve Component Members: Overview Report

    DTIC Science & Technology

    2014-02-06

    Overview report for the 2011 Workplace and Equal Opportunity Survey of Reserve component members (Note No. 2013-003; available from DTIC as ADA602626). The report discusses findings from the survey regarding harassment and discrimination within the Reserve components.

  12. An overview of the impact of rare disease characteristics on research methodology.

    PubMed

    Whicher, Danielle; Philbin, Sarah; Aronson, Naomi

    2018-01-19

    About 30 million individuals in the United States are living with a rare disease, defined as a disease with a prevalence of 200,000 or fewer cases in the United States (National Organization for Rare Disorders, About NORD, 2016). Disease heterogeneity and geographic dispersion add to the difficulty of completing robust studies in small populations. Improving the ability to conduct research on rare diseases would have a significant impact on population health. The purpose of this paper is to raise awareness of methodological approaches that can address the challenges of conducting robust research on rare diseases. We conducted a landscape review of available methodological and analytic approaches to address the challenges of rare disease research. Our objectives were to: 1. identify algorithms for matching study design to rare disease attributes and the methodological approaches applicable to these algorithms; 2. draw inferences on how research communities and infrastructure can contribute to the efficiency of research on rare diseases; and 3. describe methodological approaches in the rare disease portfolio of the Patient-Centered Outcomes Research Institute (PCORI), a funder promoting both rare disease research and research infrastructure. We identified three algorithms for matching study design to rare disease or intervention characteristics (Gagne et al., BMJ 349:g6802, 2014; Gupta et al., J Clin Epidemiol 64:1085-1094, 2011; Cornu et al., Orphanet J Rare Dis 8:48, 2012) and summarized the applicable methodological and analytic approaches. From this literature we were also able to draw inferences on how an effective research infrastructure can set an agenda, prioritize studies, accelerate accrual, catalyze patient engagement and terminate poorly performing studies. Of the 24 rare disease projects in the PCORI portfolio, 11 are randomized controlled trials (RCTs) using standard designs. Thirteen are observational studies using case-control, prospective

  13. Methodology for estimating dietary data from the semi-quantitative food frequency questionnaire of the Mexican National Health and Nutrition Survey 2012.

    PubMed

    Ramírez-Silva, Ivonne; Jiménez-Aguilar, Alejandra; Valenzuela-Bravo, Danae; Martinez-Tapia, Brenda; Rodríguez-Ramírez, Sonia; Gaona-Pineda, Elsa Berenice; Angulo-Estrada, Salomón; Shamah-Levy, Teresa

    2016-01-01

    To describe the methodology used to clean and estimate dietary intake (DI) data from the semi-quantitative food frequency questionnaire (SFFQ) of the Mexican National Health and Nutrition Survey 2012. DI was collected through a short-term SFFQ covering 140 foods (from October 2011 to May 2012). Energy and nutrient intake was calculated according to a nutrient database constructed specifically for the SFFQ. A total of 133 nutrients, including energy and fiber, were generated from the SFFQ data. Between 4.8 and 9.6% of the survey sample was excluded as a result of the cleaning process. Valid DI data were obtained regarding energy and nutrients consumed by 1 212 pre-school children, 1 323 school children, 1 961 adolescents, 2 027 adults and 526 older adults. We documented the methodology used to clean and estimate DI from the SFFQ used in national dietary assessments in Mexico.
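    The intake calculation an SFFQ nutrient database supports is a weighted sum: frequency of consumption times portion size times nutrient content per gram, summed over foods. A toy sketch; the foods, portions and composition values below are invented, not taken from the survey's database.

```python
# Minimal SFFQ intake calculation. NUTRIENT_DB values are per gram of
# food (energy in kcal, fiber in g) and are invented for illustration.
NUTRIENT_DB = {
    "tortilla": {"energy": 2.18, "fiber": 0.063},
    "beans":    {"energy": 1.27, "fiber": 0.085},
}

def daily_intake(responses):
    """responses: food -> (frequency per day, portion in grams)."""
    totals = {"energy": 0.0, "fiber": 0.0}
    for food, (times_per_day, portion_g) in responses.items():
        for nutrient, per_g in NUTRIENT_DB[food].items():
            totals[nutrient] += times_per_day * portion_g * per_g
    return totals

print(daily_intake({"tortilla": (3.0, 30.0), "beans": (1.5, 100.0)}))
```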

  14. A proposed standard methodology for estimating the wounding capacity of small calibre projectiles or other missiles.

    PubMed

    Berlin, R H; Janzon, B; Rybeck, B; Schantz, B; Seeman, T

    1982-01-01

    A standard methodology for estimating the energy transfer characteristics of small calibre bullets and other fast missiles is proposed, consisting of firings against targets made of soft soap. The target is evaluated by measuring the size of the permanent cavity remaining in it after the shot. The method is very simple to use and does not require access to any sophisticated measuring equipment. It can be applied under all circumstances, even under field conditions. Adequate methods of calibration to ensure good accuracy are suggested. The precision and limitations of the method are discussed.

  15. Assessing the recent estimates of the global burden of disease for ambient air pollution: Methodological changes and implications for low- and middle-income countries.

    PubMed

    Ostro, Bart; Spadaro, Joseph V; Gumy, Sophie; Mudu, Pierpaolo; Awe, Yewande; Forastiere, Francesco; Peters, Annette

    2018-06-04

    The Global Burden of Disease (GBD) is a comparative assessment of the health impact of the major and well-established risk factors, including ambient air pollution (AAP) assessed by concentrations of PM2.5 (particles less than 2.5 µm) and ozone. Over the last two decades, major improvements have emerged for two important inputs in the methodology for estimating the impacts of PM2.5: the assessment of global exposure to PM2.5 and the development of integrated exposure risk models (IERs) that relate the entire range of global exposures of PM2.5 to cause-specific mortality. As a result, the estimated annual mortality attributed to AAP increased from less than 1 million in 2000 to roughly 3 million for GBD in years 2010 and 2013, to 4.2 million for GBD 2015. However, the magnitude of the recent change and uncertainty regarding its rationale have resulted, in some cases, in skepticism and reduced confidence in the overall estimates. To understand the underlying reasons for the change in mortality, we examined the estimates for the years 2013 and 2015 to determine the quantitative implications of alternative model input assumptions. We calculated that the year 2013 estimates increased by 8% after applying the updated exposure data used in GBD 2015, and increased by 23% with the application of the updated IERs from GBD 2015. The application of both upgraded methodologies together increased the GBD 2013 estimates by 35%, or about one million deaths. We also quantified the impact of the changes in demographics and the assumed threshold level. Since the global estimates of air pollution-related deaths will continue to change over time, clear documentation of the modifications in the methodology and their impacts is necessary. In addition, there is a need for additional monitoring and epidemiological studies to reduce uncertainties in the estimates for low- and middle-income countries, which contribute about one-half of the mortality. Copyright © 2018.
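    One formulaic step in such burden estimates is converting a relative risk at the population's exposure into attributable deaths via the population attributable fraction. A minimal sketch; the RR and baseline deaths are hypothetical, and in the GBD the RR comes from the IER curve evaluated per cause and age group above a counterfactual exposure level.

```python
def attributable_deaths(rr_at_exposure, baseline_deaths):
    """Population attributable fraction, PAF = (RR - 1) / RR, applied
    to baseline cause-specific deaths. In the GBD framework, RR is read
    from the integrated exposure-risk (IER) curve at the population's
    PM2.5 level relative to a counterfactual threshold."""
    paf = (rr_at_exposure - 1.0) / rr_at_exposure
    return paf * baseline_deaths

# Hypothetical: RR of 1.25 for one cause of death at the observed
# PM2.5 concentration, 400,000 baseline deaths from that cause
print(attributable_deaths(1.25, 400_000))   # -> 80,000 attributable
```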

  16. Stochastic capture zone analysis of an arsenic-contaminated well using the generalized likelihood uncertainty estimator (GLUE) methodology

    NASA Astrophysics Data System (ADS)

    Morse, Brad S.; Pohll, Greg; Huntington, Justin; Rodriguez Castillo, Ramiro

    2003-06-01

    In 1992, Mexican researchers discovered concentrations of arsenic in excess of World Health Organization (WHO) standards in several municipal wells in the Zimapan Valley of Mexico. This study describes a method to delineate a capture zone for one of the most highly contaminated wells to aid in future well siting. A stochastic approach was used to model the capture zone because of the high level of uncertainty in several input parameters. Two stochastic techniques were performed and compared: "standard" Monte Carlo analysis and the generalized likelihood uncertainty estimator (GLUE) methodology. The GLUE procedure differs from standard Monte Carlo analysis in that it incorporates a goodness of fit (termed a likelihood measure) in evaluating the model. This allows for more information (in this case, head data) to be used in the uncertainty analysis, resulting in smaller prediction uncertainty. Two likelihood measures were tested in this study to determine which produced models in better agreement with the observed heads. While the standard Monte Carlo approach does not aid in parameter estimation, the GLUE methodology indicates best-fit models when hydraulic conductivity is approximately 10^-6.5 m/s, with vertically isotropic conditions and large quantities of interbasin flow entering the basin. Probabilistic isochrones (capture zone boundaries) are then presented, and as predicted, the GLUE-derived capture zones are significantly smaller in area than those from the standard Monte Carlo approach.
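    The GLUE recipe the abstract contrasts with plain Monte Carlo can be sketched in a few lines: sample parameters from a prior, score each forward run against observed heads with a likelihood measure, keep the behavioural runs, and form likelihood-weighted estimates. Everything below (the stand-in model, the likelihood form, the thresholds) is an invented illustration, since the choice of likelihood measure is precisely what the study compares.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_heads(log10_k, n_obs=8):
    """Stand-in for the groundwater model: maps hydraulic conductivity
    (log10 m/s) to predicted heads. Purely illustrative."""
    return 100.0 - 3.0 * (log10_k + 6.5) + rng.normal(0, 0.01, n_obs)

observed = 100.0 + rng.normal(0, 0.3, 8)       # hypothetical head data

# GLUE: sample the prior, score each run with a likelihood measure,
# discard "non-behavioural" runs, weight predictions by likelihood.
samples = rng.uniform(-8.0, -5.0, 5000)        # prior on log10 K
sse = np.array([np.sum((simulate_heads(s) - observed) ** 2)
                for s in samples])
likelihood = np.exp(-sse / sse.min())          # one possible measure
behavioural = likelihood > 0.1 * likelihood.max()
weights = likelihood[behavioural] / likelihood[behavioural].sum()
print(np.sum(weights * samples[behavioural]))  # weighted estimate of log10 K
```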

  17. Military Jet Engine Acquisition: Technology Basics and Cost-Estimating Methodology

    DTIC Science & Technology

    2002-01-01

    Extraction fragments from the report: body text on engines having a nozzle downstream of the low-pressure turbine (like the turbofan or turbojet), turbine blade cooling (Figure 2.5), and the steady, rapid increase in RIT for turbojets and turbofans (Figure 2.6); table-of-contents entries for Appendix B, "An Overview of Military Jet Engine History," and Appendix C, "Aircraft Turbine Engine Development."

  18. Integrating Shamanic Methodology into the Spirituality of Addictions Recovery Work

    ERIC Educational Resources Information Center

    Rich, Marcia L.

    2012-01-01

    Responding to an increased recognition of the importance of spirituality in the aetiology and treatment of addictions, this article provides an overview of the potential contributions of both transpersonal psychology and shamanic methodology for the addictions field. A case study is provided to illustrate the integration of conventional,…

  19. Tobacco documents research methodology

    PubMed Central

    McCandless, Phyra M; Klausner, Kim; Taketa, Rachel; Yerger, Valerie B

    2011-01-01

    Tobacco documents research has developed into a thriving academic enterprise since its inception in 1995. The technology supporting tobacco documents archiving, searching and retrieval has improved greatly since that time, and consequently tobacco documents researchers have considerably more access to resources than was the case when researchers had to travel to physical archives and/or electronically search poorly and incompletely indexed documents. The authors of the papers presented in this supplement all followed the same basic research methodology. Rather than leave the reader of the supplement to read the same discussion of methods in each individual paper, presented here is an overview of the methods all authors followed. In the individual articles that follow in this supplement, the authors present the additional methodological information specific to their topics. This brief discussion also highlights technological capabilities in the Legacy Tobacco Documents Library and updates methods for organising internal tobacco documents data and findings. PMID:21504933

  20. Effectiveness of mHealth interventions for patients with diabetes: An overview of systematic reviews.

    PubMed

    Kitsiou, Spyros; Paré, Guy; Jaana, Mirou; Gerber, Ben

    2017-01-01

    Diabetes is a common chronic disease that places an unprecedented strain on health care systems worldwide. Mobile health technologies such as smartphones, mobile applications, and wearable devices, known as mHealth, offer significant and innovative opportunities for improving patient-to-provider communication and self-management of diabetes. The purpose of this overview is to critically appraise and consolidate evidence from multiple systematic reviews on the effectiveness of mHealth interventions for patients with diabetes to inform policy makers, practitioners, and researchers. A comprehensive search on multiple databases was performed to identify relevant systematic reviews published between January 1996 and December 2015. Two authors independently selected reviews, extracted data, and assessed the methodological quality of included reviews using AMSTAR. Fifteen systematic reviews published between 2008 and 2014 were eligible for inclusion. The quality of the reviews varied considerably and most of them had important methodological limitations. Focusing on systematic reviews that offered the most direct evidence, this overview demonstrates that, on average, mHealth interventions improve glycemic control (HbA1c) compared to standard care or other non-mHealth approaches by as much as 0.8% for patients with type 2 diabetes and 0.3% for patients with type 1 diabetes, at least in the short term (≤12 months). However, limitations in the overall quality of evidence suggest that further research will likely have an important impact on these estimates of effect. Findings are consistent with clinically relevant improvements, particularly with respect to patients with type 2 diabetes. Similar to home telemonitoring, mHealth interventions represent a promising approach for self-management of diabetes.

  1. Traumatic brain injury: methodological approaches to estimate health and economic outcomes.

    PubMed

    Lu, Juan; Roe, Cecilie; Aas, Eline; Lapane, Kate L; Niemeier, Janet; Arango-Lasprilla, Juan Carlos; Andelic, Nada

    2013-12-01

    The effort to standardize the methodology and adherence to recommended principles for all economic evaluations has been emphasized in the medical literature. The objective of this review is to examine whether economic evaluations in traumatic brain injury (TBI) research have been compliant with existing guidelines. A Medline search was performed covering January 1, 1995 through August 11, 2012. All original TBI-related full economic evaluations were included in the study. Two authors independently rated each study's methodology and data presentation to determine compliance with the 10 methodological principles recommended by Blackmore et al. Descriptive analysis was used to summarize the data. Inter-rater reliability was assessed with kappa statistics. A total of 28 studies met the inclusion criteria. Eighteen of these studies described cost-effectiveness analyses, seven cost-benefit analyses, and three cost-utility analyses. The results showed a rapid growth in the number of published articles on the economic impact of TBI since 2000 and an improvement in their methodological quality. However, overall compliance with recommended methodological principles of TBI-related economic evaluation has been deficient. On average, about six of the 10 criteria were followed in these publications, and only two articles met all 10 criteria. These findings call for an increased awareness of the methodological standards that should be followed by investigators both in the performance of economic evaluations and in reviews of evaluation reports prior to publication. The results also suggest that all economic evaluations should be conducted by following the guidelines within a conceptual framework, in order to facilitate evidence-based practices in the field of TBI.

  2. The Use of Radar to Improve Rainfall Estimation over the Tennessee and San Joaquin River Valleys

    NASA Technical Reports Server (NTRS)

    Petersen, Walter A.; Gatlin, Patrick N.; Felix, Mariana; Carey, Lawrence D.

    2010-01-01

    This slide presentation provides an overview of the collaborative radar rainfall project between the Tennessee Valley Authority (TVA), the Von Braun Center for Science & Innovation (VCSI), NASA MSFC and UAHuntsville. Two systems were used in this project: the Advanced Radar for Meteorological & Operational Research (ARMOR) Rainfall Estimation Processing System (AREPS), a demonstration project of real-time radar rainfall using a research radar, and the NEXRAD Rainfall Estimation Processing System (NREPS). The objectives, methodology, some results and validation, operational experience and lessons learned are reviewed. Another project using radar to improve rainfall estimation is in California, specifically the San Joaquin River Valley. It is part of an overall project to develop an integrated tool to assist water management within the San Joaquin River Valley. This involves integrating several components: (1) radar precipitation estimates, (2) a distributed hydrologic model, and (3) snowfall measurements along with surface temperature and moisture measurements. NREPS was selected to provide the precipitation component.
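    The presentation does not give the estimation algorithm, but the conventional core of radar rainfall estimation is a Z-R power law converting reflectivity to rain rate. A sketch using the classic Marshall-Palmer coefficients, which are an assumption here; operational systems such as those described tune the coefficients by regime and apply gauge correction on top.

```python
def rain_rate_from_reflectivity(dbz, a=200.0, b=1.6):
    """Convert radar reflectivity (dBZ) to rain rate R (mm/h) through a
    Z = a * R**b power law. a and b are the classic Marshall-Palmer
    coefficients here; operational systems tune them by rain regime."""
    z_linear = 10.0 ** (dbz / 10.0)            # mm^6 m^-3
    return (z_linear / a) ** (1.0 / b)

for dbz in (20.0, 35.0, 50.0):                 # light to heavy rain
    print(dbz, round(rain_rate_from_reflectivity(dbz), 2))
```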

  3. Military Participants at U.S. Atmospheric Nuclear Weapons Testing— Methodology for Estimating Dose and Uncertainty

    PubMed Central

    Till, John E.; Beck, Harold L.; Aanenson, Jill W.; Grogan, Helen A.; Mohler, H. Justin; Mohler, S. Shawn; Voillequé, Paul G.

    2014-01-01

    Methods were developed to calculate individual estimates of exposure and dose with associated uncertainties for a sub-cohort (1,857) of 115,329 military veterans who participated in at least one of seven series of atmospheric nuclear weapons tests or the TRINITY shot carried out by the United States. The tests were conducted at the Pacific Proving Grounds and the Nevada Test Site. Dose estimates to specific organs will be used in an epidemiological study to investigate leukemia and male breast cancer. Previous doses had been estimated for the purpose of compensation and were generally high-sided to favor the veteran's claim for compensation in accordance with public law. Recent efforts by the U.S. Department of Defense (DOD) to digitize the historical records supporting the veterans’ compensation assessments make it possible to calculate doses and associated uncertainties. Our approach builds upon available film badge dosimetry and other measurement data recorded at the time of the tests and incorporates detailed scenarios of exposure for each veteran based on personal, unit, and other available historical records. Film badge results were available for approximately 25% of the individuals, and these results assisted greatly in reconstructing doses to unbadged persons and in developing distributions of dose among military units. This article presents the methodology developed to estimate doses for selected cancer cases and a 1% random sample of the total cohort of veterans under study. PMID:24758578

  4. Methodology discourses as boundary work in the construction of engineering education.

    PubMed

    Beddoes, Kacey

    2014-04-01

    Engineering education research is a new field that emerged in the social sciences over the past 10 years. This analysis of engineering education research demonstrates that methodology discourses have played a central role in the construction and development of the field of engineering education, and that they have done so primarily through boundary work. This article thus contributes to science and technology studies literature by examining the role of methodology discourses in an emerging social science field. I begin with an overview of engineering education research before situating the case within relevant bodies of literature on methodology discourses and boundary work. I then identify two methodology discourses--rigor and methodological diversity--and discuss how they contribute to the construction and development of engineering education research. The article concludes with a discussion of how the findings relate to prior research on methodology discourses and boundary work and implications for future research.

  5. The Freight Analysis Framework Verson 4 (FAF4) - Building the FAF4 Regional Database: Data Sources and Estimation Methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hwang, Ho-Ling; Hargrove, Stephanie; Chin, Shih-Miao

    The Freight Analysis Framework (FAF) integrates data from a variety of sources to create a comprehensive national picture of freight movements among states and major metropolitan areas by all modes of transportation. It provides a national picture of current freight flows to, from, and within the United States, assigns the flows to the transportation network, and projects freight flow patterns into the future. The FAF4 is the fourth database of its kind: FAF1 provided estimates for truck, rail, and water tonnage for calendar year 1998; FAF2 provided a more complete picture based on the 2002 Commodity Flow Survey (CFS); and FAF3 made further improvements building on the 2007 CFS. Since the first FAF effort, a number of changes in both data sources and products have taken place. The FAF4 flow matrix described in this report is used as the base-year data to forecast future freight activities, projecting shipment weights and values from year 2020 to 2045 in five-year intervals. It also provides the basis for annual estimates to the FAF4 flow matrix, aiming to provide users with the timeliest data. Furthermore, FAF4 truck freight is routed on the national highway network to produce the FAF4 network database and flow assignments for truck. This report details the data sources and methodologies applied to develop the base year 2012 FAF4 database. An overview of the FAF4 components is briefly discussed in Section 2. Effects on FAF4 from the changes in the 2012 CFS are highlighted in Section 3. Section 4 provides a general discussion on the process used in filling data gaps within the domestic CFS matrix, specifically on the estimation of CFS suppressed/unpublished cells. Over a dozen CFS out-of-scope (OOS) components of FAF4 are then addressed in Section 5 through Section 11 of this report. This includes discussions of farm-based agricultural shipments in Section 5, shipments from fishery and logging sectors in Section 6. Shipments of municipal solid wastes and debris from

  6. An overview of the dynamic calibration of piezoelectric pressure transducers

    NASA Astrophysics Data System (ADS)

    Theodoro, F. R. F.; Reis, M. L. C. C.; d’Souto, C.

    2018-03-01

    Dynamic calibration is a research area that is still under development and is of great interest to the aerospace and automotive industries. This study discusses some concepts regarding dynamic measurement of pressure quantities and presents an overview of the dynamic calibration of pressure transducers. Studies conducted by the Institute of Aeronautics and Space on piezoelectric pressure transducer calibration in a shock tube are presented. The methodology employed the Guide to the Expression of Uncertainty in Measurement (GUM) and a Monte Carlo method. The results show that both the device and the methodology employed are adequate for calibrating the piezoelectric sensor.
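    To make the uncertainty treatment concrete, the following is a minimal sketch of a GUM Supplement 1-style Monte Carlo evaluation for a transducer sensitivity S = U/p, where U is the measured voltage step and p the reference step pressure generated in the shock tube. All numerical values are illustrative assumptions, not the paper's data.

```python
# GUM Supplement 1-style Monte Carlo evaluation of a transducer
# sensitivity S = U / p. A conceptual sketch: distributions and values
# are assumptions, not the paper's measurement data.
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

U = rng.normal(1.50, 0.01, N)   # voltage step, V (assumed uncertainty)
p = rng.normal(120.0, 1.5, N)   # reference step pressure, kPa (assumed)

S = U / p                       # sensitivity samples, V/kPa
mean, u = S.mean(), S.std(ddof=1)
lo, hi = np.percentile(S, [2.5, 97.5])
print(f"S = {mean * 1e3:.3f} mV/kPa, u(S) = {u * 1e3:.3f} mV/kPa, "
      f"95% interval ({lo * 1e3:.3f}, {hi * 1e3:.3f}) mV/kPa")
```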

  7. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  8. [Artificial intelligence in psychiatry-an overview].

    PubMed

    Meyer-Lindenberg, A

    2018-06-18

    Artificial intelligence and the underlying methods of machine learning and neural networks (NN) have made dramatic progress in recent years and have allowed computers to reach superhuman performance in domains that used to be thought of as uniquely human. In this overview, the underlying methodological developments that made this possible are briefly delineated, and the applications to psychiatry are then discussed in three domains: precision medicine and biomarkers, natural language processing, and artificial intelligence-based psychotherapeutic interventions. In conclusion, some of the risks of this new technology are mentioned.

  9. People and Buildings--a Brief Overview of Research. Exchange Bibliography No. 301.

    ERIC Educational Resources Information Center

    Canter, David

    This bibliography and paper provide a brief overview of empirical studies that attempt to apply the insights of modern psychology to architecture. Items are discussed under headings of scientific research, methodology, perceptual studies, the use of space, interpersonal distance, personality development, learning, decision-related research, and…

  10. Methodological Overview of an African American Couple-Based HIV/STD Prevention Trial

    PubMed Central

    2010-01-01

    Objective To provide an overview of the NIMH Multisite HIV/STD Prevention Trial for African American Couples conducted in four urban areas: Atlanta, Los Angeles, New York, and Philadelphia. The rationale, study design methods, proposed data analyses, and study management are described. Design This is a two-arm randomized trial, implementing a modified randomized block design, to evaluate the efficacy of a couples-based intervention designed for HIV-serodiscordant African American couples. Methods The study phases consisted of formative work, pilot studies, and a randomized clinical trial. The sample is 535 HIV-serodiscordant heterosexual African American couples. There are two theoretically derived behavioral interventions with eight group and individual sessions: the Eban HIV/STD Risk Reduction Intervention (treatment) versus the Eban Health Promotion Intervention (control). The treatment intervention was couples-based and focused on HIV/STD risk reduction, while the control was individual-based and focused on health promotion. The two study conditions were structurally similar in length and types of activities. At baseline, participants completed an Audio Computer-Assisted Self-Interview (ACASI) as well as an interviewer-administered questionnaire, and provided biological specimens to assess for STDs. Similar follow-up assessments were conducted immediately after the intervention, at 6 months, and at 12 months. Results The Trial results will be analyzed across the four sites by randomization assignment. Generalized estimating equations (GEE) and mixed effects modeling (MEM) are planned to test: (1) the effects of the intervention on STD incidence and condom use as well as on mediator variables of these outcomes, and (2) whether the effects of the intervention differ depending on key moderator variables (e.g., gender of the HIV-seropositive partner, length of relationship, psychological distress, sexual abuse history, and substance abuse history).
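    As a rough illustration of the planned GEE analysis, the sketch below fits a logistic GEE with an exchangeable working correlation to simulated repeated binary condom-use outcomes clustered within couples. The data, effect sizes, and variable names are invented for illustration only.

```python
# Hedged sketch of a GEE analysis of the kind the trial proposes: binary
# condom use modeled by study arm and visit, clustered within couples.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_couples, n_visits = 535, 3
couple = np.repeat(np.arange(n_couples), n_visits)
arm = np.repeat(rng.integers(0, 2, n_couples), n_visits)  # 0=control, 1=treatment
visit = np.tile(np.arange(n_visits), n_couples)

# Simulated outcome with a modest intervention effect and a couple-level shift.
logit = -0.2 + 0.5 * arm + 0.1 * visit + np.repeat(rng.normal(0, 0.5, n_couples), n_visits)
condom_use = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

df = pd.DataFrame(dict(couple=couple, arm=arm, visit=visit, condom_use=condom_use))
model = sm.GEE.from_formula("condom_use ~ arm + visit", groups="couple",
                            data=df, family=sm.families.Binomial(),
                            cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```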

  11. CPR methodology with new steady-state criterion and more accurate statistical treatment of channel bow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumgartner, S.; Bieli, R.; Bergmann, U. C.

    2012-07-01

    An overview is given of existing CPR design criteria and the methods used in BWR reload analysis to evaluate the impact of channel bow on CPR margins. Potential weaknesses in today's methodologies are discussed. Westinghouse, in collaboration with KKL and Axpo - operator and owner of the Leibstadt NPP - has developed an optimized CPR methodology based on a new criterion to protect against dryout during normal operation and with a more rigorous treatment of channel bow. The new steady-state criterion is expressed in terms of an upper limit of 0.01 for the dryout failure probability per year. This is considered a meaningful and appropriate criterion that can be directly related to the probabilistic criteria set up for the analyses of Anticipated Operational Occurrences (AOOs) and accidents. In the Monte Carlo approach, a statistical modeling of channel bow and an accurate evaluation of CPR response functions allow the associated CPR penalties to be included directly in the plant SLMCPR and OLMCPR in a best-estimate manner. In this way, the treatment of channel bow is equivalent to all other uncertainties affecting CPR. Emphasis is put on quantifying the statistical distribution of channel bow throughout the core using measurement data. The optimized CPR methodology has been implemented in the Westinghouse Monte Carlo code, McSLAP. The methodology improves the quality of dryout safety assessments by supplying more valuable information and better control of conservatisms in establishing operational limits for CPR. The methodology is demonstrated with application examples from the introduction at KKL. (authors)
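    The core of such an approach can be illustrated with a toy Monte Carlo calculation: sample the uncertain inputs, including a statistical channel-bow distribution, propagate them through a CPR response function, and estimate the dryout probability directly. The distributions and the linear bow-penalty response below are invented stand-ins for the McSLAP models.

```python
# Toy Monte Carlo estimate of a dryout probability: a conceptual sketch,
# not the Westinghouse/McSLAP methodology. All distributions are assumed.
import numpy as np

rng = np.random.default_rng(11)
N = 1_000_000

nominal_cpr = 1.30
model_unc = rng.normal(0.0, 0.10, N)   # assumed CPR model/plant uncertainty
bow_mm = rng.normal(0.0, 0.8, N)       # assumed channel-bow distribution, mm
bow_penalty = 0.02 * np.abs(bow_mm)    # assumed CPR response to bow

cpr = nominal_cpr + model_unc - bow_penalty
p_dryout = np.mean(cpr < 1.0)          # dryout taken as CPR below 1.0
print(f"estimated dryout probability per evaluation: {p_dryout:.2e}")
```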

  12. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population

    PubMed Central

    CARVALHO, Suzana Papile Maciel; BRITO, Liz Magalhães; de PAIVA, Luiz Airton Saavedra; BICUDO, Lucilene Arilho Ribeiro; CROSATO, Edgard Michel; de OLIVEIRA, Rogério Nogueira

    2013-01-01

    Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. Objective This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. Material and Methods The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. Results The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. Conclusion It was concluded that methods involving physical anthropology present high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological

  13. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population.

    PubMed

    Carvalho, Suzana Papile Maciel; Brito, Liz Magalhães; Paiva, Luiz Airton Saavedra de; Bicudo, Lucilene Arilho Ribeiro; Crosato, Edgard Michel; Oliveira, Rogério Nogueira de

    2013-01-01

    Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. It was concluded that methods involving physical anthropology present high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological adjustment is recommended as demonstrated in this
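    The statistical core of the validation in the two records above, a two-variable logistic discriminant refit to the local population, can be sketched as follows. The mandibular measurements are simulated with assumed sex-specific means; only the modeling steps, not the numbers, mirror the study.

```python
# Sketch of deriving a two-variable discriminant function for sex
# estimation from bigonial breadth and ramus height via logistic
# regression. Simulated data; assumed measurement means in mm.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 66
sex = rng.integers(0, 2, n)              # 0 = female, 1 = male
bigonial = rng.normal(93 + 7 * sex, 4)   # bigonial distance, mm (assumed)
ramus_h = rng.normal(55 + 6 * sex, 3.5)  # mandibular ramus height, mm (assumed)

X = np.column_stack([bigonial, ramus_h])
clf = LogisticRegression(max_iter=1000).fit(X, sex)
print("coefficients:", clf.coef_, "intercept:", clf.intercept_)
print(f"apparent accuracy: {clf.score(X, sex):.2%}")
```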

  14. FCS Technology Investigation Overview

    NASA Technical Reports Server (NTRS)

    Budinger, James; Gilbert, Tricia

    2007-01-01

    This working paper provides an overview of the Future Communication Study (FCS) technology investigation progress. It includes a description of the methodology applied to technology evaluation, the evaluation criteria, and the technology screening (down-select) results. A comparison of screening results with other similar technology screening activities is provided. Also included is a description of the in-depth studies conducted to support the technology evaluations, including characterization of the L-band aeronautical channel, an L-band deployment cost assessment, and performance assessments of candidate technologies in the applicable aeronautical channel. The paper concludes with a description of on-going activities leading to the conclusion of the technology investigation and the development of technology recommendations.

  15. Comparison of regression coefficient and GIS-based methodologies for regional estimates of forest soil carbon stocks.

    PubMed

    Campbell, J Elliott; Moen, Jeremie C; Ney, Richard A; Schnoor, Jerald L

    2008-03-01

    Estimates of forest soil organic carbon (SOC) have applications in carbon science, soil quality studies, carbon sequestration technologies, and carbon trading. Forest SOC has been modeled using a regression coefficient methodology that applies mean SOC densities (mass/area) to broad forest regions. A higher-resolution model is based on an approach that employs a geographic information system (GIS) with soil databases and satellite-derived landcover images. Despite this advancement, the regression approach remains the basis of current state and federal level greenhouse gas inventories. Both approaches are analyzed in detail for Wisconsin forest soils from 1983 to 2001, applying rigorous error-fixing algorithms to soil databases. Resulting SOC stock estimates are 20% larger when determined using the GIS method rather than the regression approach. Average annual rates of increase in SOC stocks are 3.6 and 1.0 million metric tons of carbon per year for the GIS and regression approaches, respectively.
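    The difference between the two strategies reduces to how the stock sum is formed. The sketch below contrasts them on invented data: the regression-coefficient approach multiplies a single regional mean SOC density by the total area, while the GIS approach sums polygon-level area-density products, so spatial variation in density contributes only to the latter.

```python
# Contrast of regression-coefficient vs GIS-style SOC stock estimation.
# Polygon areas and SOC densities are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
n_polygons = 1_000
area_ha = rng.uniform(50, 500, n_polygons)                 # mapped forest polygons
soc_density = rng.lognormal(np.log(90), 0.4, n_polygons)   # Mg C/ha per soil map unit

gis_stock = np.sum(area_ha * soc_density)   # polygon-by-polygon sum
regression_stock = area_ha.sum() * 90.0     # one regional mean density (assumed)

print(f"GIS estimate:        {gis_stock / 1e6:.2f} Tg C")
print(f"Regression estimate: {regression_stock / 1e6:.2f} Tg C")
```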

  16. A methodology to quantify the release of spent nuclear fuel from dry casks during security-related scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durbin, Samuel G.; Luna, Robert Earl

    Assessing the risk to the public and the environment from a release of radioactive material produced by accidental or purposeful forces/environments is an important aspect of the regulatory process in many facets of the nuclear industry. The transport and storage of radioactive materials is of particular concern to the public, especially with regard to potential sabotage acts that might be undertaken by terror groups to cause injuries, panic, and/or economic consequences to a nation. For many such postulated attacks, no breach in the robust cask or storage module containment is expected to occur. However, there exists evidence that some hypothetical attack modes can penetrate and cause a release of radioactive material. This report is intended as an unclassified overview of the methodology for release estimation as well as a guide to useful resource data from unclassified sources and relevant analysis methods for the estimation process.

  17. Health effects of electric and magnetic fields: Overview of research recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savitz, D.A.

    We developed a series of articles concerning epidemiologic research on potential health effects of electric and magnetic fields. Our goal was to identify methodological issues that have arisen through past studies of cancer, reproduction, and neurobehavioral outcomes in order to suggest strategies to extend knowledge. Following an overview of relevant physics and engineering principles, cancer epidemiology of electric and magnetic fields is discussed separately with a focus on epidemiologic methods and cancer biology, respectively. Reproductive health studies, many of which focus on exposure from video display terminals, are then summarized, followed by an evaluation of the limited literature on neurobehavioral outcomes, including suicide and depression. Methodological issues in exposure assessment are discussed, focusing on the challenges in residential exposure assessment and interpretation of wire configuration codes. An overview offers recommendations for priorities across these topic areas, emphasizing the importance of resolving the question of wire codes and childhood cancer. Collectively, these articles provide an array of observations and suggestions regarding the epidemiologic literature, recognizing the potential benefits to science and public policy. 10 refs.

  18. Epidemiology and reporting characteristics of overviews of reviews of healthcare interventions published 2012-2016: protocol for a systematic review.

    PubMed

    Pieper, Dawid; Pollock, Michelle; Fernandes, Ricardo M; Büchter, Roland Brian; Hartling, Lisa

    2017-04-07

    Overviews of systematic reviews (overviews) attempt to systematically retrieve and summarize the results of multiple systematic reviews (SRs) for a given condition or public health problem. Two prior descriptive analyses of overviews found substantial variation in the methodological approaches used in overviews, and deficiencies in reporting of key methodological steps. Since then, new methods have been developed so it is timely to update the prior descriptive analyses. The objectives are to: (1) investigate the epidemiological, descriptive, and reporting characteristics of a random sample of 100 overviews published from 2012 to 2016 and (2) compare these recently published overviews (2012-2016) to those published prior to 2012 (based on the prior descriptive analyses). Medline, EMBASE, and CDSR will be searched for overviews published 2012-2016, using a validated search filter for overviews. Only overviews written in English will be included. All titles and abstracts will be screened by one review author; those deemed not relevant will be verified by a second person for exclusion. Full-texts will be assessed for inclusion by two reviewers independently. Of those deemed relevant, a random sample of 100 overviews will be selected for inclusion. Data extraction will be either performed by one reviewer with verification by a second reviewer or by one reviewer only depending on the complexity of the item. Discrepancies at any stage will be resolved by consensus or consulting a third person. Data will be extracted on the epidemiological, descriptive, and reporting characteristics of each overview. Data will be analyzed descriptively. When data are available for both time points (up to 2011 vs. 2012-2016), we will compare characteristics by calculating risk ratios or applying the Mann-Whitney test. Overviews are becoming increasingly valuable evidence syntheses, and the number of published overviews is increasing. However, former analyses found limitations in the conduct

  19. Dry powder inhalers: An overview of the in vitro dissolution methodologies and their correlation with the biopharmaceutical aspects of the drug products.

    PubMed

    Velaga, Sitaram P; Djuris, Jelena; Cvijic, Sandra; Rozou, Stavroula; Russo, Paola; Colombo, Gaia; Rossi, Alessandra

    2018-02-15

    In vitro dissolution testing is routinely used in the development of pharmaceutical products. Whilst dissolution testing methods are well established and standardized for oral dosage forms, i.e. tablets and capsules, there are no pharmacopoeial methods or regulatory requirements for testing the dissolution of orally inhaled powders. Despite this, a wide variety of dissolution testing methods for orally inhaled powders has been developed and their bio-relevance has been evaluated. This review provides an overview of the in vitro dissolution methodologies for dry inhalation products, with particular emphasis on dry powder inhalers, where the dissolution behavior of the respirable particles can play a role in the duration and absorption of the drug. Dissolution mechanisms of respirable particles as well as kinetic models are presented. More recent biorelevant dissolution set-ups and media for studying inhalation biopharmaceutics are also reviewed. In addition, factors affecting the interplay between dissolution and absorption of deposited particles are examined in the context of biopharmaceutical considerations for inhalation products. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Developing methodologies for estimation of manure across livestock systems using agricultural census data

    NASA Astrophysics Data System (ADS)

    Khalil, Mohammad I.; Muldowney, John; Osborne, Bruce

    2017-04-01

    Emissions of greenhouse gases (GHGs) from livestock production and management, which comprise 18% of total global anthropogenic emissions, together with air pollutants have major atmospheric and ecosystem-related impacts. Identification of categorical/sub-categorical hotspots associated with these emissions and the estimation of emission factors (EFs), including the use of the Intergovernmental Panel on Climate Change defaults (Tier 1), are key objectives in the preparation of reasonable and transparent national reporting inventories (Tier 2). They also provide a basis for the assessment of technological/management approaches for emissions reduction. For this, data on manure (solid/FYM and slurry/liquid) production across livestock categories, housing types and periods, storage types, and application methodologies are required. However, the relevant agricultural activity data are not sufficient to quantify the proportion and timing of the amounts of manure applied to major land use types and for different seasons. We have used the recent Census of Agriculture survey data 2010, collected by the Central Statistics Office, Ireland. Based on the compiled datasheets, several steps have been taken to generate missing information (e.g., number of individual livestock categories/subcategories) and to develop methodologies for calculating the proportion of slurry and manure production and application across farm categories. Among livestock categories, the proportion (%) of slurry over solids was higher for pigs (99:1) than for cattle (61:39). Solid manure production from other livestock systems derived mostly from loose-bedded houses. There were large differences between the proportions estimated using the number of farms and the livestock population. A major proportion of the slurry was applied to grassland (97 vs. 73), and the amounts applied in spring and summer were similar (40-42 vs. 36-39) but significantly higher than the autumn application (18 vs. 24

  1. Overview of causes and costs of injuries in Massachusetts: a methodology for analysis of state data.

    PubMed Central

    Schuster, M; Cohen, B B; Rodgers, C G; Walker, D K; Friedman, D J; Ozonoff, V V

    1995-01-01

    Massachusetts has developed the first State profile of the causes and costs of injury based on the national study, "Cost of Injury in the United States: A Report to Congress." Incidence of fatal injuries is based on Massachusetts data; nonfatal hospitalized injuries, on Massachusetts age and sex rates and U.S. cause data; and nonhospitalized injuries, on U.S. rates applied to Massachusetts census data. Lifetime costs per injured person are based on national data adjusted for higher personal health care expenditures and for higher mean annual earnings in Massachusetts. The estimated total lifetime cost for the 1.4 million injuries that occurred in 1989 is $4.4 billion--$1.7 billion for health care and $2.7 billion for lost earnings. Injuries attributed to motor vehicles and falls account for more than half of the total cost. The other cause categories are poisonings, fire-burns, firearms, drownings/near drownings, and other. For every person who dies from an injury, 17 people are hospitalized, and an estimated 535 people require outpatient treatment, consultation, or restricted activity. Development of a State-based cost report can be useful in monitoring the contribution of injuries to health status and in planning effective injury prevention strategies in a community-based health care system. The methodology described in this paper can be replicated by other States through accessing their State-specific mortality and hospital discharge databases. PMID:7610211
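    The construction of such a profile is essentially rate-times-population-times-unit-cost arithmetic. The toy sketch below applies invented national cause-specific rates to a state population and multiplies by assumed per-case lifetime costs with a state price adjustment; all figures are placeholders, not the Massachusetts estimates.

```python
# Toy state cost-of-injury profile: national rates applied to a state
# population, scaled by assumed per-case costs and a price adjustment.
causes = {
    "motor vehicle": (120.0, 18_000.0),  # (cases per 100k, $ per case), assumed
    "falls":         (310.0,  9_000.0),
    "poisoning":     ( 60.0,  7_000.0),
}
population = 6_000_000        # assumed state population
cost_adjustment = 1.12        # assumed state-to-US price/earnings ratio

total = sum(rate / 1e5 * population * unit_cost * cost_adjustment
            for rate, unit_cost in causes.values())
print(f"estimated lifetime cost: ${total / 1e9:.2f} billion")
```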

  2. Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis

    NASA Technical Reports Server (NTRS)

    Babcock, P.; Schor, A.; Rosch, G.

    1998-01-01

    This document is an adjunct to the final report, An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.

  3. Prospective CO2 saline resource estimation methodology: Refinement of existing US-DOE-NETL methods based on data availability

    DOE PAGES

    Goodman, Angela; Sanguinito, Sean; Levine, Jonathan S.

    2016-09-28

    Carbon storage resource estimation in subsurface saline formations plays an important role in establishing the scale of carbon capture and storage activities for governmental policy and commercial project decision-making. Prospective CO2 resource estimation of large regions or subregions, such as a basin, occurs at the initial screening stages of a project using only limited publicly available geophysical data, i.e. prior to project-specific site selection data generation. As the scale of investigation is narrowed and selected areas and formations are identified, prospective CO2 resource estimation can be refined and uncertainty narrowed when site-specific geophysical data are available. Here, we refine the United States Department of Energy – National Energy Technology Laboratory (US-DOE-NETL) methodology as the scale of investigation is narrowed from very large regional assessments down to selected areas and formations that may be developed for commercial storage. In addition, we present a new notation that explicitly identifies differences between data availability and data sources used for geologic parameters and efficiency factors as the scale of investigation is narrowed. This CO2 resource estimation method is available for screening formations in a tool called CO2-SCREEN.

  4. Prospective CO2 saline resource estimation methodology: Refinement of existing US-DOE-NETL methods based on data availability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodman, Angela; Sanguinito, Sean; Levine, Jonathan S.

    Carbon storage resource estimation in subsurface saline formations plays an important role in establishing the scale of carbon capture and storage activities for governmental policy and commercial project decision-making. Prospective CO2 resource estimation of large regions or subregions, such as a basin, occurs at the initial screening stages of a project using only limited publicly available geophysical data, i.e. prior to project-specific site selection data generation. As the scale of investigation is narrowed and selected areas and formations are identified, prospective CO2 resource estimation can be refined and uncertainty narrowed when site-specific geophysical data are available. Here, we refine the United States Department of Energy – National Energy Technology Laboratory (US-DOE-NETL) methodology as the scale of investigation is narrowed from very large regional assessments down to selected areas and formations that may be developed for commercial storage. In addition, we present a new notation that explicitly identifies differences between data availability and data sources used for geologic parameters and efficiency factors as the scale of investigation is narrowed. This CO2 resource estimation method is available for screening formations in a tool called CO2-SCREEN.
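    At the screening stage, the NETL-style estimate is volumetric: prospective mass storage is the product of formation area, net thickness, porosity, in situ CO2 density, and a storage-efficiency factor. The sketch below encodes that relation with illustrative parameter values; it is a conceptual stand-in, not the CO2-SCREEN tool itself.

```python
# Volumetric screening relation in the spirit of the US-DOE-NETL saline
# methodology: G = A * h * phi * rho * E. All inputs are assumed values.
def co2_storage_resource(area_m2, thickness_m, porosity, rho_co2_kg_m3, efficiency):
    """Prospective CO2 storage resource in tonnes."""
    return area_m2 * thickness_m * porosity * rho_co2_kg_m3 * efficiency / 1000.0

G = co2_storage_resource(
    area_m2=5e9,           # 5,000 km^2 formation footprint (assumed)
    thickness_m=80.0,      # net formation thickness (assumed)
    porosity=0.15,         # total porosity (assumed)
    rho_co2_kg_m3=650.0,   # supercritical CO2 density (assumed)
    efficiency=0.02,       # low-end saline storage-efficiency factor (assumed)
)
print(f"prospective resource: {G / 1e9:.2f} Gt CO2")
```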

  5. A novel methodology to estimate the evolution of construction waste in construction sites.

    PubMed

    Katz, Amnon; Baum, Hadassa

    2011-02-01

    This paper focuses on the accumulation of construction waste generated throughout the erection of new residential buildings. A special methodology was developed in order to provide a model that will predict the flow of construction waste. The amount of waste and its constituents produced on 10 relatively large construction sites (7,000-32,000 m² of built area) was monitored periodically for a limited time. A model that predicts the accumulation of construction waste was developed based on these field observations. According to the model, waste accumulates in an exponential manner, i.e. smaller amounts are generated during the early stages of construction and increasing amounts are generated towards the end of the project. The total amount of waste from these sites was estimated at 0.2 m³ per 1 m² of floor area. A good correlation was found between the model predictions and actual data from the field survey. Copyright © 2010 Elsevier Ltd. All rights reserved.
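    An exponential accumulation model of this kind can be fitted to periodic site surveys with ordinary nonlinear least squares. The sketch below assumes the form W(t) = W_total * (e^(kt) - 1) / (e^k - 1) for normalized project time t in [0, 1]; both the functional form and the simulated monitoring data are illustrative assumptions, not the paper's model.

```python
# Fitting an assumed exponential waste-accumulation curve to simulated
# periodic site surveys; a sketch, not the published model.
import numpy as np
from scipy.optimize import curve_fit

def accumulation(t, w_total, k):
    """Cumulative waste at normalized project time t in [0, 1]."""
    return w_total * (np.exp(k * t) - 1.0) / (np.exp(k) - 1.0)

rng = np.random.default_rng(5)
t_obs = np.linspace(0.1, 1.0, 10)                  # survey times (fraction of project)
true = accumulation(t_obs, w_total=2000.0, k=3.0)  # m^3, assumed totals
w_obs = true * rng.normal(1.0, 0.08, t_obs.size)   # noisy field observations

(w_total, k), _ = curve_fit(accumulation, t_obs, w_obs, p0=(1500.0, 2.0))
print(f"fitted total waste: {w_total:.0f} m^3, growth constant k = {k:.2f}")
```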

  6. Methodologies for Adaptive Flight Envelope Estimation and Protection

    NASA Technical Reports Server (NTRS)

    Tang, Liang; Roemer, Michael; Ge, Jianhua; Crassidis, Agamemnon; Prasad, J. V. R.; Belcastro, Christine

    2009-01-01

    This paper reports the latest development of several techniques for an adaptive flight envelope estimation and protection system for aircraft under damage upset conditions. Through the integration of advanced fault detection algorithms, real-time system identification of the damaged/faulted aircraft, and flight envelope estimation, real-time decision support can be executed autonomously for improving damage tolerance and flight recoverability. In particular, a bank of adaptive nonlinear fault detection and isolation estimators was developed for flight control actuator faults; a real-time system identification method was developed for assessing the dynamics and performance limitations of the impaired aircraft; and online learning neural networks were used to approximate selected aircraft dynamics, which were then inverted to estimate command margins. As off-line training of network weights is not required, the method has the advantage of adapting to varying flight conditions and different vehicle configurations. The key benefit of the envelope estimation and protection system is that it allows the aircraft to fly close to its limit boundary by constantly updating the controller command limits during flight. The developed techniques were demonstrated in NASA's Generic Transport Model (GTM) simulation environment with simulated actuator faults. Simulation results and remarks on future work are presented.

  7. Overview of the production of sintered SiC optics and optical sub-assemblies

    NASA Astrophysics Data System (ADS)

    Williams, S.; Deny, P.

    2005-08-01

    The following is an overview of sintered silicon carbide (SSiC) material properties and processing requirements for the manufacturing of components for advanced technology optical systems. The overview compares SSiC material properties to typical materials used for optics and optical structures. In addition, it reviews the manufacturing processes required to produce optical components in detail, step by step. The process overview illustrates the current manufacturing process and concepts for expanding the process size capability. The overview includes information on the substantial capital equipment employed in the manufacturing of SSiC. This paper also reviews common in-process inspection methodology and design rules. The design rules are used to improve production yield, minimize cost, and maximize the inherent benefits of SSiC for optical systems. Optimizing optical system designs for an SSiC manufacturing process will allow systems designers to utilize SSiC as a low-risk, cost-competitive, and fast-cycle-time technology for next generation optical systems.

  8. Discussion of band selection and methodologies for the estimation of precipitable water vapour from AVIRIS data

    NASA Technical Reports Server (NTRS)

    Schanzer, Dena; Staenz, Karl

    1992-01-01

    An Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data set acquired over Canal Flats, B.C., on 14 Aug. 1990, was used for the purpose of developing methodologies for surface reflectance retrieval using the 5S atmospheric code. A scene of Rogers Dry Lake, California (23 Jul. 1990), acquired within three weeks of the Canal Flats scene, was used as a potential reference for radiometric calibration purposes and for comparison with other studies using primarily LOWTRAN7. Previous attempts at surface reflectance retrieval indicated that reflectance values in the gaseous absorption bands had the poorest accuracy. Modifications to 5S to use 1 nm step size, in order to make fuller use of the 20 cm⁻¹ resolution of the gaseous absorption data, resulted in some improvement in the accuracy of the retrieved surface reflectance. Estimates of precipitable water vapor using non-linear least squares regression and simple ratioing techniques such as the CIBR (Continuum Interpolated Band Ratio) technique or the narrow/wide technique, which relate ratios of combinations of bands to precipitable water vapor through calibration curves, were found to vary widely. The estimates depended on the bands used for the estimation; none provided entirely satisfactory surface reflectance curves.
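    The CIBR itself is simple to compute: the radiance in a water-vapour absorption band is ratioed against a linear interpolation of two continuum bands on either side. The sketch below uses assumed AVIRIS-like band positions and radiances; the calibration curve that converts the ratio to precipitable water vapour must come from a radiative transfer code such as 5S and is not reproduced here.

```python
# Continuum Interpolated Band Ratio (CIBR): in-band radiance divided by
# the continuum interpolated at the absorption-band wavelength.
def cibr(l_abs, l_left, l_right, w_abs, w_left, w_right):
    """Ratio of in-band radiance to the interpolated continuum at w_abs."""
    frac = (w_abs - w_left) / (w_right - w_left)
    continuum = (1.0 - frac) * l_left + frac * l_right
    return l_abs / continuum

# Assumed AVIRIS-like bands: 940 nm absorption, 865/1040 nm continuum;
# radiance values are invented for illustration.
ratio = cibr(l_abs=4.2, l_left=6.1, l_right=5.3,
             w_abs=940.0, w_left=865.0, w_right=1040.0)
print(f"CIBR = {ratio:.3f} (mapped to PWV via a model-derived calibration curve)")
```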

  9. A methodology to ensure and improve accuracy of Ki67 labelling index estimation by automated digital image analysis in breast cancer tissue.

    PubMed

    Laurinavicius, Arvydas; Plancoulaine, Benoit; Laurinaviciene, Aida; Herlin, Paulette; Meskauskas, Raimundas; Baltrusaityte, Indra; Besusparis, Justinas; Dasevicius, Darius; Elie, Nicolas; Iqbal, Yasir; Bor, Catherine

    2014-01-01

    Immunohistochemical Ki67 labelling index (Ki67 LI) reflects proliferative activity and is a potential prognostic/predictive marker of breast cancer. However, its clinical utility is hindered by the lack of standardized measurement methodologies. Besides tissue heterogeneity aspects, the key element of methodology remains accurate estimation of Ki67-stained/counterstained tumour cell profiles. We aimed to develop a methodology to ensure and improve accuracy of the digital image analysis (DIA) approach. Tissue microarrays (one 1-mm spot per patient, n = 164) from invasive ductal breast carcinoma were stained for Ki67 and scanned. Criterion standard (Ki67-Count) was obtained by counting positive and negative tumour cell profiles using a stereology grid overlaid on a spot image. DIA was performed with Aperio Genie/Nuclear algorithms. A bias was estimated by ANOVA, correlation and regression analyses. Calibration steps of the DIA by adjusting the algorithm settings were performed: first, by subjective DIA quality assessment (DIA-1), and second, to compensate the bias established (DIA-2). Visual estimate (Ki67-VE) on the same images was performed by five pathologists independently. ANOVA revealed significant underestimation bias (P < 0.05) for DIA-0, DIA-1 and two pathologists' VE, while DIA-2, VE-median and three other VEs were within the same range. Regression analyses revealed best accuracy for the DIA-2 (R-square = 0.90) exceeding that of VE-median, individual VEs and other DIA settings. Bidirectional bias for the DIA-2 with overestimation at low, and underestimation at high ends of the scale was detected. Measurement error correction by inverse regression was applied to improve DIA-2-based prediction of the Ki67-Count, in particular for the clinically relevant interval of Ki67-Count < 40%. Potential clinical impact of the prediction was tested by dichotomising the cases at the cut-off values of 10, 15, and 20%. Misclassification rate of 5-7% was achieved, compared to

  10. A methodology to ensure and improve accuracy of Ki67 labelling index estimation by automated digital image analysis in breast cancer tissue

    PubMed Central

    2014-01-01

    Introduction Immunohistochemical Ki67 labelling index (Ki67 LI) reflects proliferative activity and is a potential prognostic/predictive marker of breast cancer. However, its clinical utility is hindered by the lack of standardized measurement methodologies. Besides tissue heterogeneity aspects, the key element of methodology remains accurate estimation of Ki67-stained/counterstained tumour cell profiles. We aimed to develop a methodology to ensure and improve accuracy of the digital image analysis (DIA) approach. Methods Tissue microarrays (one 1-mm spot per patient, n = 164) from invasive ductal breast carcinoma were stained for Ki67 and scanned. Criterion standard (Ki67-Count) was obtained by counting positive and negative tumour cell profiles using a stereology grid overlaid on a spot image. DIA was performed with Aperio Genie/Nuclear algorithms. A bias was estimated by ANOVA, correlation and regression analyses. Calibration steps of the DIA by adjusting the algorithm settings were performed: first, by subjective DIA quality assessment (DIA-1), and second, to compensate the bias established (DIA-2). Visual estimate (Ki67-VE) on the same images was performed by five pathologists independently. Results ANOVA revealed significant underestimation bias (P < 0.05) for DIA-0, DIA-1 and two pathologists’ VE, while DIA-2, VE-median and three other VEs were within the same range. Regression analyses revealed best accuracy for the DIA-2 (R-square = 0.90) exceeding that of VE-median, individual VEs and other DIA settings. Bidirectional bias for the DIA-2 with overestimation at low, and underestimation at high ends of the scale was detected. Measurement error correction by inverse regression was applied to improve DIA-2-based prediction of the Ki67-Count, in particular for the clinically relevant interval of Ki67-Count < 40%. Potential clinical impact of the prediction was tested by dichotomising the cases at the cut-off values of 10, 15, and 20
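    The inverse-regression correction described in the two records above can be illustrated in a few lines: regress the DIA output on the criterion-standard count, then invert the fitted line so that new DIA readings map back to predicted counts. The simulated data below build in a bias of the kind reported (overestimation at the low end, underestimation at the high end, approximated here by a slope below one); none of the numbers are from the study.

```python
# Inverse-regression measurement-error correction: a sketch on simulated
# data, not the study's calibration. Bias parameters are assumed.
import numpy as np

rng = np.random.default_rng(9)
ki67_count = rng.uniform(1, 60, 164)                    # criterion standard, %
dia = 5.0 + 0.8 * ki67_count + rng.normal(0, 2.5, 164)  # biased DIA reading

# Fit DIA = a + b * Count, then invert: Count_hat = (DIA - a) / b
b, a = np.polyfit(ki67_count, dia, 1)
new_dia = np.array([8.0, 15.0, 30.0])
predicted_count = (new_dia - a) / b
print(f"fit: DIA = {a:.2f} + {b:.2f} * Count")
print("corrected Ki67 estimates:", np.round(predicted_count, 1))
```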

  11. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  12. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  13. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  14. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  15. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  16. An overview and methodological assessment of systematic reviews and meta-analyses of enhanced recovery programmes in colorectal surgery

    PubMed Central

    Chambers, Duncan; Paton, Fiona; Wilson, Paul; Eastwood, Alison; Craig, Dawn; Fox, Dave; Jayne, David; McGinnes, Erika

    2014-01-01

    Objectives To identify and critically assess the extent to which systematic reviews of enhanced recovery programmes for patients undergoing colorectal surgery differ in their methodology and reported estimates of effect. Design Review of published systematic reviews. We searched the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE) and Health Technology Assessment (HTA) Database from 1990 to March 2013. Systematic reviews of enhanced recovery programmes for patients undergoing colorectal surgery were eligible for inclusion. Primary and secondary outcome measures The primary outcome was length of hospital stay. We assessed changes in pooled estimates of treatment effect over time and how these might have been influenced by decisions taken by researchers as well as by the availability of new trials. The quality of systematic reviews was assessed using the Centre for Reviews and Dissemination (CRD) DARE critical appraisal process. Results 10 systematic reviews were included. Systematic reviews of randomised controlled trials have consistently shown a reduction in length of hospital stay with enhanced recovery compared with traditional care. The estimated effect tended to increase from 2006 to 2010 as more trials were published but has not altered significantly in the most recent review, despite the inclusion of several unique trials. The best estimate appears to be an average reduction of around 2.5 days in primary postoperative length of stay. Differences between reviews reflected differences in interpretation of inclusion criteria, searching and analytical methods or software. Conclusions Systematic reviews of enhanced recovery programmes show a high level of research waste, with multiple reviews covering identical or very similar groups of trials. Where multiple reviews exist on a topic, interpretation may require careful attention to apparently minor differences between reviews. Researchers can help readers by

  17. Juvenile body mass estimation: A methodological evaluation.

    PubMed

    Cowgill, Libby

    2018-02-01

    Two attempts have been made to develop body mass prediction formulae specifically for immature remains: Ruff (Ruff, C.C., 2007. Body size prediction from juvenile skeletal remains. American Journal of Physical Anthropology 133, 698-716) and Robbins et al. (Robbins, G., Sciulli, P.W., Blatt, S.H., 2010. Estimating body mass in subadult human skeletons. American Journal of Physical Anthropology 143, 146-150). While both were developed from the same reference population, they differ in their independent variable selection: Ruff (2007) used measures of metaphyseal and articular surface size to predict body mass in immature remains, whereas Robbins et al. (2010) relied on cross-sectional properties. Both methods perform well on independent testing samples; however, differences exist between the two methods in the predicted values. This research evaluates the differences in the body mass estimates from these two methods in seven geographically diverse skeletal samples under the age of 18 (n = 461). The purpose of this analysis is not to assess which method performs with greater accuracy or precision; instead, differences between the two methods are used as a heuristic device to focus attention on the unique challenges affecting the prediction of immature body mass estimates in particular. The two methods differ by population only in some cases, which may be a reflection of activity variation or nutritional status. In addition, cross-sectional properties almost always produce higher estimates than metaphyseal surface size across all age categories. This highlights the difficulty in teasing apart information related to body mass from that relevant to loading, particularly when the original reference population is urban/industrial. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Minority Elderly Adaptation to Life-Threatening Events: An Overview with Methodological Consideration.

    ERIC Educational Resources Information Center

    Trimble, Joseph E.; And Others

    A review of pertinent research on the adaptation of ethnic minority elderly to life-threatening events (personal, man-made, or natural) exposes voids in the research, presents methodological considerations, and indicates that ethnic minority elderly are disproportionately victimized by life-threatening events. Unusually high numbers of…

  19. Impact of ambient temperature on morbidity and mortality: An overview of reviews.

    PubMed

    Song, Xuping; Wang, Shigong; Hu, Yuling; Yue, Man; Zhang, Tingting; Liu, Yu; Tian, Jinhui; Shang, Kezheng

    2017-05-15

    The objectives were (i) to conduct an overview of systematic reviews to summarize evidence from and evaluate the methodological quality of systematic reviews assessing the impact of ambient temperature on morbidity and mortality; and (ii) to reanalyse meta-analyses of cold-induced cardiovascular morbidity in different age groups. The registration number is PROSPERO-CRD42016047179. PubMed, Embase, the Cochrane Library, Web of Science, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), and Global Health were systematically searched to identify systematic reviews. Two reviewers independently selected studies for inclusion, extracted data, and assessed quality. The Assessment of Multiple Systematic Reviews (AMSTAR) checklist was used to assess the methodological quality of included systematic reviews. Estimates of morbidity and mortality risk in association with heat exposure, cold exposure, heatwaves, cold spells and diurnal temperature ranges (DTRs) were the primary outcomes. Twenty-eight systematic reviews were included in the overview of systematic reviews. (i) The median (interquartile range) AMSTAR scores were 7 (1.75) for quantitative reviews and 3.5 (1.75) for qualitative reviews. (ii) Heat exposure was identified to be associated with increased risk of cardiovascular, cerebrovascular and respiratory mortality, but was not found to have an impact on cardiovascular or cerebrovascular morbidity. (iii) Reanalysis of the meta-analyses indicated that cold-induced cardiovascular morbidity increased in youth and middle-age (RR=1.009, 95% CI: 1.004-1.015) as well as the elderly (RR=1.013, 95% CI: 1.007-1.018). (iv) The definitions of temperature exposure adopted by different studies included various temperature indicators and thresholds. In conclusion, heat exposure seemed to have an adverse effect on mortality and cold-induced cardiovascular morbidity increased in the elderly. Developing definitions of temperature exposure at the regional level may
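    For the reanalysis component, study-level relative risks are typically pooled on the log scale with inverse-variance weights. The sketch below shows a minimal fixed-effect pooling of that kind; the RRs and confidence intervals are invented examples, not the review's data.

```python
# Minimal inverse-variance pooling of relative risks on the log scale;
# illustrative inputs only, not data from the overview.
import numpy as np

rr = np.array([1.008, 1.012, 1.015])
ci_lo = np.array([1.002, 1.005, 1.006])
ci_hi = np.array([1.014, 1.019, 1.024])

log_rr = np.log(rr)
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)  # SE from 95% CI width
w = 1.0 / se**2                                     # fixed-effect weights

pooled = np.sum(w * log_rr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
print(f"pooled RR = {np.exp(pooled):.3f} "
      f"(95% CI {np.exp(pooled - 1.96 * pooled_se):.3f}"
      f"-{np.exp(pooled + 1.96 * pooled_se):.3f})")
```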

  20. Methodologies for launcher-payload coupled dynamic analysis

    NASA Astrophysics Data System (ADS)

    Fransen, S. H. J. A.

    2012-06-01

    An important step in the design and verification process of spacecraft structures is the coupled dynamic analysis with the launch vehicle in the low-frequency domain, also referred to as coupled loads analysis (CLA). The objective of such analyses is the computation of the dynamic environment of the spacecraft (payload) in terms of interface accelerations, interface forces, center of gravity (CoG) accelerations as well as the internal state of stress. In order to perform an efficient, fast and accurate launcher-payload coupled dynamic analysis, various methodologies have been applied and developed. The methods are related to substructuring techniques, data recovery techniques, the effects of prestress and fluids and time integration problems. The aim of this paper was to give an overview of these methodologies and to show why, how and where these techniques can be used in the process of launcher-payload coupled dynamic analysis. In addition, it will be shown how these methodologies fit together in a library of procedures which can be used with the MSC.Nastran™ solution sequences.

  1. The State of Agricultural Extension: An Overview and New Caveats for the Future

    ERIC Educational Resources Information Center

    Benson, Amanda; Jafry, Tahseen

    2013-01-01

    Purpose: This review paper presents an overview of changes in agricultural extension on a global scale and helps to characterise on-going developments in extension practice. Design/methodology/approach: Through a critique and synthesis of literature the paper focuses on global political changes which have led to widespread changes from production-…

  2. Systematic review adherence to methodological or reporting quality.

    PubMed

    Pussegoda, Kusala; Turner, Lucy; Garritty, Chantelle; Mayhew, Alain; Skidmore, Becky; Stevens, Adrienne; Boutron, Isabelle; Sarkis-Onofre, Rafael; Bjerre, Lise M; Hróbjartsson, Asbjørn; Altman, Douglas G; Moher, David

    2017-07-19

    Guidelines for assessing the methodological and reporting quality of systematic reviews (SRs) were developed to contribute to implementing evidence-based health care and to the reduction of research waste. As SRs assessing a cohort of SRs are becoming more prevalent in the literature, and with the increased uptake of SR evidence for decision-making, the methodological quality and standard of reporting of SRs are of interest. The objective of this study is to evaluate SR adherence to the Quality of Reporting of Meta-analyses (QUOROM) and PRISMA reporting guidelines and to A Measurement Tool to Assess Systematic Reviews (AMSTAR) and the Overview Quality Assessment Questionnaire (OQAQ) quality assessment tools, as evaluated in methodological overviews. The Cochrane Library, MEDLINE®, and EMBASE® databases were searched from January 1990 to October 2014. Title and abstract screening and full-text screening were conducted independently by two reviewers. Reports assessing the quality or reporting of a cohort of SRs of interventions using PRISMA, QUOROM, OQAQ, or AMSTAR were included. All results are reported as frequencies and percentages of reports and SRs, respectively. Of the 20,765 independent records retrieved from electronic searching, 1189 reports were reviewed for eligibility at full text, of which 56 reports (5371 SRs in total) evaluating the PRISMA, QUOROM, AMSTAR, and/or OQAQ tools were included. Notable items include the following: of the SRs using PRISMA, over 85% (1532/1741) provided a rationale for the review and less than 6% (102/1741) provided protocol information. For reports using QUOROM, only 9% (40/449) of SRs provided a trial flow diagram; however, 90% (402/449) described the explicit clinical problem and review rationale in the introduction section. Of reports using AMSTAR, 30% (534/1794) used duplicate study selection and data extraction. Conversely, 80% (1439/1794) of SRs provided study characteristics of included studies. In terms of OQAQ, 37% (499/1367) of the

  3. Binational arsenic exposure survey: methodology and estimated arsenic intake from drinking water and urinary arsenic concentrations.

    PubMed

    Roberge, Jason; O'Rourke, Mary Kay; Meza-Montenegro, Maria Mercedes; Gutiérrez-Millán, Luis Enrique; Burgess, Jefferey L; Harris, Robin B

    2012-04-01

    The Binational Arsenic Exposure Survey (BAsES) was designed to evaluate probable arsenic exposures in selected areas of southern Arizona and northern Mexico, two regions with known elevated levels of arsenic in groundwater reserves. This paper describes the methodology of BAsES and the relationship between estimated arsenic intake from beverages and arsenic output in urine. Households from eight communities were selected for their varying groundwater arsenic concentrations in Arizona, USA and Sonora, Mexico. Adults responded to questionnaires and provided dietary information. A first morning urine void and water from all household drinking sources were collected. Associations between urinary arsenic concentration (total, organic, inorganic) and estimated level of arsenic consumed from water and other beverages were evaluated through crude associations and by random effects models. Median estimated total arsenic intake from beverages among participants from Arizona communities ranged from 1.7 to 14.1 µg/day compared to 0.6 to 3.4 µg/day among those from Mexico communities. In contrast, median urinary inorganic arsenic concentrations were greatest among participants from Hermosillo, Mexico (6.2 µg/L) whereas a high of 2.0 µg/L was found among participants from Ajo, Arizona. Estimated arsenic intake from drinking water was associated with urinary total arsenic concentration (p < 0.001), urinary inorganic arsenic concentration (p < 0.001), and urinary sum of species (p < 0.001). Urinary arsenic concentrations increased between 7% and 12% for each one percent increase in arsenic consumed from drinking water. Variability in arsenic intake from beverages and urinary arsenic output yielded counterintuitive results. Estimated intake of arsenic from all beverages was greatest among Arizonans yet participants in Mexico had higher urinary total and inorganic arsenic concentrations. Other contributors to urinary arsenic concentrations should be evaluated.

  4. Binational Arsenic Exposure Survey: Methodology and Estimated Arsenic Intake from Drinking Water and Urinary Arsenic Concentrations

    PubMed Central

    Roberge, Jason; O’Rourke, Mary Kay; Meza-Montenegro, Maria Mercedes; Gutiérrez-Millán, Luis Enrique; Burgess, Jefferey L.; Harris, Robin B.

    2012-01-01

    The Binational Arsenic Exposure Survey (BAsES) was designed to evaluate probable arsenic exposures in selected areas of southern Arizona and northern Mexico, two regions with known elevated levels of arsenic in groundwater reserves. This paper describes the methodology of BAsES and the relationship between estimated arsenic intake from beverages and arsenic output in urine. Households from eight communities were selected for their varying groundwater arsenic concentrations in Arizona, USA and Sonora, Mexico. Adults responded to questionnaires and provided dietary information. A first morning urine void and water from all household drinking sources were collected. Associations between urinary arsenic concentration (total, organic, inorganic) and estimated level of arsenic consumed from water and other beverages were evaluated through crude associations and by random effects models. Median estimated total arsenic intake from beverages among participants from Arizona communities ranged from 1.7 to 14.1 µg/day compared to 0.6 to 3.4 µg/day among those from Mexico communities. In contrast, median urinary inorganic arsenic concentrations were greatest among participants from Hermosillo, Mexico (6.2 µg/L) whereas a high of 2.0 µg/L was found among participants from Ajo, Arizona. Estimated arsenic intake from drinking water was associated with urinary total arsenic concentration (p < 0.001), urinary inorganic arsenic concentration (p < 0.001), and urinary sum of species (p < 0.001). Urinary arsenic concentrations increased between 7% and 12% for each one percent increase in arsenic consumed from drinking water. Variability in arsenic intake from beverages and urinary arsenic output yielded counterintuitive results. Estimated intake of arsenic from all beverages was greatest among Arizonans yet participants in Mexico had higher urinary total and inorganic arsenic concentrations. Other contributors to urinary arsenic concentrations should be evaluated. PMID:22690182
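    A random-effects model of the kind used here can be sketched as a log-log mixed model with a random intercept per community, where a slope of roughly 0.07-0.12 corresponds to the reported 7-12% rise in urinary arsenic per 1% increase in intake. Everything below, including the variable names, is simulated for illustration.

```python
# Random-intercept (per community) log-log model of urinary arsenic vs
# estimated intake; a sketch on simulated data, not the BAsES analysis.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n, n_comm = 400, 8
community = rng.integers(0, n_comm, n)
log_intake = rng.normal(1.0, 0.8, n)                  # log µg/day, assumed
comm_effect = rng.normal(0, 0.3, n_comm)[community]   # community-level shift
log_urine = 0.5 + 0.10 * log_intake + comm_effect + rng.normal(0, 0.4, n)

df = pd.DataFrame(dict(log_urine=log_urine, log_intake=log_intake,
                       community=community))
fit = sm.MixedLM.from_formula("log_urine ~ log_intake",
                              groups="community", data=df).fit()
print(fit.summary())
```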

  5. Marine pollution: an overview

    NASA Astrophysics Data System (ADS)

    Valentukevičienė, Marina; Brannvall, Evelina

    2008-01-01

    This overview of marine pollution follows the methodology proposed below. Firstly, well-known databases (Science Direct, GeoRef, SpringerLINK, etc.) on technological research were studied. All collected references were divided into 27 sections following the key words associated with marine pollution, oil spills, alien species migration, etc. The most commercially promising research and development (R & D) activities seem to be the market-oriented sections: detection of oil spills at sea, containment and recovery of floating oil at sea, detection of oil spills on land, disposal of oil and debris on land, alien species migration prevention from ballast water and underwater hull cleaning in water, NOx and SOx emissions, pollution from ship-building and repair, and biogeochemical modelling. Great market demand for commercially patented innovations is very attractive for initiating new R & D projects.

  6. School Psychology as a Relational Enterprise: The Role and Process of Qualitative Methodology

    ERIC Educational Resources Information Center

    Newman, Daniel S.; Clare, Mary M.

    2016-01-01

    The purpose of this article is to explore the application of qualitative research to establishing a more complete understanding of relational processes inherent in school psychology practice. We identify the building blocks of rigorous qualitative research design through a conceptual overview of qualitative paradigms, methodologies, methods (i.e.,…

  7. RNA Structural Dynamics As Captured by Molecular Simulations: A Comprehensive Overview.

    PubMed

    Šponer, Jiří; Bussi, Giovanni; Krepl, Miroslav; Banáš, Pavel; Bottaro, Sandro; Cunha, Richard A; Gil-Ley, Alejandro; Pinamonti, Giovanni; Poblete, Simón; Jurečka, Petr; Walter, Nils G; Otyepka, Michal

    2018-04-25

    With both catalytic and genetic functions, ribonucleic acid (RNA) is perhaps the most pluripotent chemical species in molecular biology, and its functions are intimately linked to its structure and dynamics. Computer simulations, and in particular atomistic molecular dynamics (MD), allow structural dynamics of biomolecular systems to be investigated with unprecedented temporal and spatial resolution. We here provide a comprehensive overview of the fast-developing field of MD simulations of RNA molecules. We begin with an in-depth, evaluatory coverage of the most fundamental methodological challenges that set the basis for the future development of the field, in particular, the current developments and inherent physical limitations of the atomistic force fields and the recent advances in a broad spectrum of enhanced sampling methods. We also survey the closely related field of coarse-grained modeling of RNA systems. After dealing with the methodological aspects, we provide an exhaustive overview of the available RNA simulation literature, ranging from studies of the smallest RNA oligonucleotides to investigations of the entire ribosome. Our review encompasses tetranucleotides, tetraloops, a number of small RNA motifs, A-helix RNA, kissing-loop complexes, the TAR RNA element, the decoding center and other important regions of the ribosome, as well as assorted other systems. Extended sections are devoted to RNA-ion interactions, ribozymes, riboswitches, and protein/RNA complexes. Our overview is written for as broad of an audience as possible, aiming to provide a much-needed interdisciplinary bridge between computation and experiment, together with a perspective on the future of the field.

  8. RNA Structural Dynamics As Captured by Molecular Simulations: A Comprehensive Overview

    PubMed Central

    2018-01-01

    With both catalytic and genetic functions, ribonucleic acid (RNA) is perhaps the most pluripotent chemical species in molecular biology, and its functions are intimately linked to its structure and dynamics. Computer simulations, and in particular atomistic molecular dynamics (MD), allow structural dynamics of biomolecular systems to be investigated with unprecedented temporal and spatial resolution. We here provide a comprehensive overview of the fast-developing field of MD simulations of RNA molecules. We begin with an in-depth, evaluatory coverage of the most fundamental methodological challenges that set the basis for the future development of the field, in particular, the current developments and inherent physical limitations of the atomistic force fields and the recent advances in a broad spectrum of enhanced sampling methods. We also survey the closely related field of coarse-grained modeling of RNA systems. After dealing with the methodological aspects, we provide an exhaustive overview of the available RNA simulation literature, ranging from studies of the smallest RNA oligonucleotides to investigations of the entire ribosome. Our review encompasses tetranucleotides, tetraloops, a number of small RNA motifs, A-helix RNA, kissing-loop complexes, the TAR RNA element, the decoding center and other important regions of the ribosome, as well as assorted other systems. Extended sections are devoted to RNA–ion interactions, ribozymes, riboswitches, and protein/RNA complexes. Our overview is written for as broad of an audience as possible, aiming to provide a much-needed interdisciplinary bridge between computation and experiment, together with a perspective on the future of the field. PMID:29297679

  9. A novel methodology for estimating upper limits of major cost drivers for profitable conceptual launch system architectures

    NASA Astrophysics Data System (ADS)

    Rhodes, Russel E.; Byrd, Raymond J.

    1998-01-01

    This paper presents a "back of the envelope" technique for fast, timely, on-the-spot assessment of the affordability (profitability) of commercial space transportation architectural concepts. The tool presented here is not intended to replace conventional, detailed costing methodology. The process described enables "quick look" estimations and assumptions to effectively determine whether an initial concept (with its attendant cost estimating line items) provides focus for major leapfrog improvement. The Cost Charts Users Guide provides a generic sample tutorial, building an approximate understanding of the basic launch system cost factors and their representative magnitudes. This process will enable the user to develop a net "cost (and price) per payload-mass unit to orbit" incorporating a variety of significant cost drivers, supplemental to basic vehicle cost estimates. If acquisition cost and recurring cost factors (as a function of cost per payload-mass unit to orbit) do not meet the predetermined system-profitability goal, the concept in question will be clearly seen as non-competitive. Because the tool has inherent flexibility, multiple analytical approaches and a variety of interrelated assumptions can be examined in a quick, on-the-spot cost approximation analysis. The technique will allow determination of concept conformance to system objectives.
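
    A minimal sketch of the kind of "quick look" arithmetic such cost charts support; every figure below is a hypothetical placeholder, not a value from the paper.

```python
# Illustrative back-of-the-envelope cost per payload-mass unit to orbit,
# combining amortized acquisition cost with recurring per-flight cost.
# All numbers are invented for demonstration.
acquisition_cost = 800e6        # vehicle acquisition, $ (hypothetical)
flights = 100                   # flights over which acquisition amortizes
recurring_per_flight = 5e6      # ops, propellant, refurbishment, $ (hypothetical)
payload_kg = 10_000             # payload mass delivered per flight

cost_per_flight = acquisition_cost / flights + recurring_per_flight
print(f"cost per kg to orbit: ${cost_per_flight / payload_kg:,.0f}")
```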

  10. Getting quality in qualitative research: a short introduction to feminist methodology and methods.

    PubMed

    Landman, Maeve

    2006-11-01

    The present paper reflects a practical activity undertaken by the Nutrition Society's qualitative research network in October 2005. It reflects the structure of that exercise. First, there is an introduction to feminist methodology and methods. The informing premise is that feminist methodology is of particular interest to practitioners (professional and/or academic) engaged in occupations numerically dominated by women, such as nutritionists. A critical argument is made for a place for feminist methodology in related areas of social research. The discussion points to the differences that exist between various feminist commentators, although the central aims of feminist research are broadly shared. The paper comprises an overview of organizing concepts, discussion and questions posed to stimulate discussion on the design and process of research informed by feminist methodology. Issues arising from that discussion are summarized.

  11. The 2014 Survey on Living with Chronic Diseases in Canada on Mood and Anxiety Disorders: a methodological overview.

    PubMed

    O'Donnell, S; Cheung, R; Bennett, K; Lagacé, C

    2016-12-01

    There is a paucity of information about the impact of mood and anxiety disorders on Canadians and the approaches used to manage them. To address this gap, the 2014 Survey on Living with Chronic Diseases in Canada-Mood and Anxiety Disorders Component (SLCDC-MA) was developed. The purpose of this paper is to describe the methodology of the 2014 SLCDC-MA and examine the sociodemographic characteristics of the final sample. The 2014 SLCDC-MA is a cross-sectional follow-up survey that includes Canadians from the 10 provinces aged 18 years and older with mood and/or anxiety disorders diagnosed by a health professional that are expected to last, or have already lasted, six months or more. The survey was developed by the Public Health Agency of Canada (PHAC) through an iterative, consultative process with Statistics Canada and external experts. Statistics Canada performed content testing, designed the sampling frame and strategies and collected and processed the data. PHAC used descriptive analyses to describe the respondents' sociodemographic characteristics, produced nationally representative estimates using survey weights provided by Statistics Canada, and generated variance estimates using bootstrap methodology. The final 2014 SLCDC-MA sample consists of a total of 3361 respondents (68.9% response rate). Among Canadian adults with mood and/or anxiety disorders, close to two-thirds (64%) were female, over half (56%) were married/in a common-law relationship and 60% obtained a post-secondary education. Most were young or middle-aged (85%), Canadian born (88%), of non-Aboriginal status (95%), and resided in an urban setting (82%). Household income was fairly evenly distributed between the adequacy quintiles; however, individuals were more likely to report a household income adequacy within the lowest (23%) versus highest (17%) quintile. Forty-five percent reported having a mood disorder only, 24% an anxiety disorder only and 31% both kinds of disorder. The 2014 SLCDC-MA is

  12. APPROACH FOR ESTIMATING GLOBAL LANDFILL METHANE EMISSIONS

    EPA Science Inventory

    The report is an overview of available country-specific data and modeling approaches for estimating global landfill methane. Current estimates of global landfill methane indicate that landfills account for between 4 and 15% of the global methane budget. The report describes an ap...

  13. Methodology to estimate variations in solar radiation reaching densely forested slopes in mountainous terrain.

    PubMed

    Sypka, Przemysław; Starzak, Rafał; Owsiak, Krzysztof

    2016-12-01

    Solar radiation reaching densely forested slopes is one of the main factors influencing the water balance between the atmosphere, tree stands and the soil. It also has a major impact on site productivity, spatial arrangement of vegetation structure as well as forest succession. This paper presents a methodology to estimate variations in solar radiation reaching tree stands in a small mountain valley. Measurements taken in three inter-forest meadows unambiguously showed the relationship between the amount of solar insolation and the shading effect caused mainly by the contour of surrounding tree stands. Therefore, appropriate knowledge of elevation, aspect and tilt angles of the analysed planes had to be taken into consideration during modelling. At critical times, especially in winter, the diffuse and reflected components of solar radiation only reached some of the sites studied as the beam component of solar radiation was totally blocked by the densely forested mountain slopes in the neighbourhood. The cross-section contours and elevation angles of all obstructions are estimated from a digital surface model including both digital elevation model and the height of tree stands. All the parameters in a simplified, empirical model of the solar insolation reaching a given horizontal surface within the research valley are dependent on the sky view factor (SVF). The presented simplified, empirical model and its parameterisation scheme should be easily adaptable to different complex terrains or mountain valleys characterised by diverse geometry or spatial orientation. The model was developed and validated (R² = 0.92, σ = 0.54) based on measurements taken at research sites located in the Silesian Beskid Mountain Range. A thorough understanding of the factors determining the amount of solar radiation reaching woodlands ought to considerably expand the knowledge of the water exchange balance within forest complexes as well as the estimation of site
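
    A minimal sketch of the shading test implied by this methodology: the beam component reaches a site only when solar elevation exceeds the obstruction (horizon) angle toward the sun's azimuth. The horizon table and angles below are illustrative assumptions, not values from the paper's digital surface model.

```python
# Hedged sketch: beam-radiation visibility test against a horizon profile
# derived (in the paper) from a digital surface model plus tree-stand heights.
import numpy as np

# Hypothetical horizon profile: azimuth (deg) -> obstruction elevation angle (deg).
horizon = {0: 25.0, 90: 35.0, 180: 15.0, 270: 30.0}

def beam_reaches_site(sun_azimuth_deg: float, sun_elevation_deg: float) -> bool:
    """True if direct (beam) radiation clears the surrounding obstructions."""
    azimuths = np.array(sorted(horizon))
    # Nearest tabulated azimuth (wraparound at 360 deg ignored for brevity).
    nearest = azimuths[np.argmin(np.abs(azimuths - sun_azimuth_deg % 360.0))]
    return sun_elevation_deg > horizon[nearest]

print(beam_reaches_site(180.0, 20.0))  # True: the sun clears the southern ridge
print(beam_reaches_site(90.0, 20.0))   # False: blocked by the forested slope east
```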

  14. Introducing a methodology for estimating duration of surgery in health services research.

    PubMed

    Redelmeier, Donald A; Thiruchelvam, Deva; Daneman, Nick

    2008-09-01

    The duration of surgery is an indicator for the quality, risks, and efficiency of surgical procedures. We introduce a new methodology for assessing the duration of surgery based on anesthesiology billing records, along with reviewing its fundamental logic and limitations. The validity of the methodology was assessed through a population-based cohort of patients (n=480,986) undergoing elective operations in 246 Ontario hospitals with 1,084 anesthesiologists between April 1, 1992 and March 31, 2002 (10 years). The weaknesses of the methodology relate to missing data, self-serving exaggerations by providers, imprecisions from clinical diversity, upper limits due to accounting regulations, fluctuations from updates over the years, national differences in reimbursement schedules, and the general failings of claims base analyses. The strengths of the methodology are in providing data that match clinical experiences, correspond to chart review, are consistent over time, can detect differences where differences would be anticipated, and might have implications for examining patient outcomes after long surgical times. We suggest that an understanding and application of large studies of surgical duration may help scientists explore selected questions concerning postoperative complications.

  15. An Overview of Markov Chain Methods for the Study of Stage-Sequential Developmental Processes

    ERIC Educational Resources Information Center

    Kaplan, David

    2008-01-01

    This article presents an overview of quantitative methodologies for the study of stage-sequential development based on extensions of Markov chain modeling. Four methods are presented that exemplify the flexibility of this approach: the manifest Markov model, the latent Markov model, latent transition analysis, and the mixture latent Markov model.…

  16. Estimating abundance

    USGS Publications Warehouse

    Sutherland, Chris; Royle, Andy

    2016-01-01

    This chapter provides a non-technical overview of ‘closed population capture–recapture’ models, a class of well-established models that are widely applied in ecology, such as removal sampling, covariate models, and distance sampling. These methods are regularly adopted for studies of reptiles, in order to estimate abundance from counts of marked individuals while accounting for imperfect detection. Thus, the chapter describes some classic closed population models for estimating abundance, with considerations for some recent extensions that provide a spatial context for the estimation of abundance, and therefore density. Finally, the chapter suggests some software for use in data analysis, such as the Windows-based program MARK, and provides an example of estimating abundance and density of reptiles using an artificial cover object survey of Slow Worms (Anguis fragilis).
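
    As one concrete instance of the closed-population models the chapter surveys, a bias-corrected Lincoln-Petersen (Chapman) estimate can be computed in a few lines; the counts below are invented for illustration.

```python
# Minimal sketch of Chapman's bias-corrected Lincoln-Petersen estimator, a
# classic closed-population abundance estimate from a two-occasion
# mark-recapture study. Counts are illustrative.
def chapman_estimate(n1: int, n2: int, m2: int) -> float:
    """n1: marked on first occasion; n2: caught on second; m2: recaptures."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

print(chapman_estimate(n1=40, n2=35, m2=12))  # ~112 individuals
```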

  17. A new methodology for automating acoustic emission detection of metallic fatigue fractures in highly demanding aerospace environments: An overview

    NASA Astrophysics Data System (ADS)

    Holford, Karen M.; Eaton, Mark J.; Hensman, James J.; Pullin, Rhys; Evans, Sam L.; Dervilis, Nikolaos; Worden, Keith

    2017-04-01

    The acoustic emission (AE) phenomenon has many attributes that make it desirable as a structural health monitoring or non-destructive testing technique, including the capability to continuously and globally monitor large structures using a sparse sensor array and with no dependency on defect size. However, AE monitoring is yet to fulfil its true potential, due mainly to limitations in location accuracy and signal characterisation that often arise in complex structures with high levels of background noise. Furthermore, the technique has been criticised for a lack of quantitative results and the large amount of operator interpretation required during data analysis. This paper begins by introducing the challenges faced in developing an AE based structural health monitoring system and then gives a review of previous progress made in addressing these challenges. Subsequently an overview of a novel methodology for automatic detection of fatigue fractures in complex geometries and noisy environments is presented, which combines a number of signal processing techniques to address the current limitations of AE monitoring. The technique was developed for monitoring metallic landing gear components during pre-flight certification testing and results are presented from a full-scale steel landing gear component undergoing fatigue loading. Fracture onset was successfully identified automatically at 49,000 fatigue cycles prior to final failure (validated by the use of dye penetrant inspection) and the fracture position was located to within 10 mm of the actual location.

  18. Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach

    ERIC Educational Resources Information Center

    Stevenson, Glenn A.

    2012-01-01

    For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…

  19. Statistical Methodologies to Integrate Experimental and Computational Research

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.

  20. Methodology for rendering household budget and individual nutrition surveys comparable, at the level of the dietary information collected.

    PubMed

    Naska, A; Paterakis, S; Eeckman, H; Remaut, A M; Trygg, K

    2001-10-01

    To describe the methodology applied in order to render comparable, at the level of the dietary information collected, the household budget survey (HBS) and individual nutrition survey (INS) data from four European countries (Belgium, Greece, Norway and the United Kingdom). In Belgium, data from the HBS of 1987-88 were compared with data from the Belgian Interuniversity Research on Nutrition and Health collected from 1980 to 1985. In Greece, data from the HBS undertaken in 1993-94 in the greater Athens area were compared with data collected around 1994 in the same region, in the context of the Greek segment of the European Prospective Investigation on Cancer and Nutrition study. In Norway, data from the HBS carried out in 1992, 1993 and 1994 were compared with the NORKOST study conducted in 1993-94. In the United Kingdom, data from four HBSs carried out in 1985, 1986, 1987 and 1988 were compared with the National Dietary and Nutritional Survey of British adults conducted in 1987-88. INS-generated data were converted into 'HBS-like' estimates with the application of yield factors for weight changes during cooking, recipe-based calculations and edible proportion coefficients taking into account weight changes during the food preparation. The 'HBS-like' estimates thus obtained were compared with the original HBS values, after applying an adjustment factor for food spoiled or given to pets. The methodological considerations overviewed in the present paper indicate that a number of issues need to be taken into account before a proper comparison of the dietary data collected through surveys implemented with varied methodologies is carried out.

  1. Tunnel and Station Cost Methodology : Mined Tunnels

    DOT National Transportation Integrated Search

    1983-01-01

    The main objective of this study was to develop a model for estimating the cost of subway station and tunnel construction. This report describes a cost estimating methodology for subway tunnels that can be used by planners, designers, owners, and gov...

  2. Creating and evaluating a new clicker methodology

    NASA Astrophysics Data System (ADS)

    Li, Pengfei

    "Clickers", an in-class polling system, has been used by many instructors to add active learning and formative assessment to previously passive traditional lectures. While considerable research has been conducted on clicker increasing student interaction in class, less research has been reported on the effectiveness of using clicker to help students understand concepts. This thesis reported a systemic project by the OSU Physics Education group to develop and test a new clicker methodology. Clickers question sequences based on a constructivist model of learning were used to improve classroom dynamics and student learning. They also helped students and lecturers understand in real time whether a concept had been assimilated or more effort was required. Chapter 1 provided an introduction to the clicker project. Chapter 2 summarized widely-accepted teaching principles that have arisen from a long history of research and practice in psychology, cognitive science and physics education. The OSU clicker methodology described in this thesis originated partly from our years of teaching experience, but mostly was based on these teaching principles. Chapter 3 provided an overview of the history of clicker technology and different types of clickers. Also, OSU's use of clickers was summarized together with a list of common problems and corresponding solutions. These technical details may be useful for those who want to use clickers. Chapter 4 discussed examples of the type and use of question sequences based on the new clicker methodology. In several years of research, we developed a base of clicker materials for calculus-based introductory physics courses at OSU. As discussed in chapter 5, a year-long controlled quantitative study was conducted to determine whether using clickers helps students learn, how using clickers helps students learn and whether students perceive that clicker has a positive effect on their own learning process. The strategy for this test was based on

  3. Lifetime prediction and reliability estimation methodology for Stirling-type pulse tube refrigerators by gaseous contamination accelerated degradation testing

    NASA Astrophysics Data System (ADS)

    Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng

    2017-12-01

    Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters provides a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs related to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, the performance degradation model based on the mechanism of contamination failure and material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively, and the estimated lifetimes of the SPTRs under normal conditions were obtained through an acceleration model (the Arrhenius model). The results show good fitting of the degradation model with the experimental data. Finally, we obtained the reliability estimation of SPTRs by using the Weibull distribution. The proposed novel methodology enables us to estimate, in less than one year, the reliability of SPTRs designed to last more than 10 years.
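
    The extrapolation step rests on the Arrhenius acceleration factor between stress and use temperatures. A minimal sketch follows; the activation energy and stress-condition lifetime are placeholders, not the paper's fitted values.

```python
# Hedged sketch of the Arrhenius extrapolation step in accelerated degradation
# testing: project a time-to-threshold observed at an elevated temperature down
# to the use temperature. Numbers are illustrative assumptions.
import numpy as np

K_B = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(ea_ev: float, t_use_k: float, t_stress_k: float) -> float:
    """Arrhenius acceleration factor between stress and use temperatures."""
    return np.exp(ea_ev / K_B * (1.0 / t_use_k - 1.0 / t_stress_k))

# Example: Ea = 0.5 eV (assumed), use at 20 degC, stress at 60 degC.
af = acceleration_factor(0.5, 293.15, 333.15)
life_at_stress_h = 8000.0  # hypothetical time-to-threshold at 60 degC
print(f"AF = {af:.1f}, projected life at use = {af * life_at_stress_h:,.0f} h")
```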

  4. An Effective Methodology for Processing and Analyzing Large, Complex Spacecraft Data Streams

    ERIC Educational Resources Information Center

    Teymourlouei, Haydar

    2013-01-01

    The emerging large datasets have made efficient data processing a much more difficult task for the traditional methodologies. Invariably, datasets continue to increase rapidly in size with time. The purpose of this research is to give an overview of some of the tools and techniques that can be utilized to manage and analyze large datasets. We…

  5. Collecting and validating experiential expertise is doable but poses methodological challenges.

    PubMed

    Burda, Marika H F; van den Akker, Marjan; van der Horst, Frans; Lemmens, Paul; Knottnerus, J André

    2016-04-01

    To give an overview of important methodological challenges in collecting, validating, and further processing experiential expertise and how to address these challenges. Based on our own experiences in studying the concept, operationalization, and contents of experiential expertise, we have formulated methodological issues regarding the inventory and application of experiential expertise. The methodological challenges can be categorized in six developmental research stages, comprising the conceptualization of experiential expertise, methods to harvest experiential expertise, the validation of experiential expertise, evaluation of the effectiveness, how to translate experiential expertise into acceptable guidelines, and how to implement these. The description of methodological challenges and ways to handle those are illustrated using diabetes mellitus as an example. Experiential expertise can be defined and operationalized in terms of successful illness-related behaviors and translated into recommendations regarding life domains. Pathways have been identified to bridge the gaps between the world of patients' daily lives and the medical world. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. A Probabilistic and Observation Based Methodology to Estimate Small Craft Harbor Vulnerability to Tsunami Events

    NASA Astrophysics Data System (ADS)

    Keen, A. S.; Lynett, P. J.; Ayca, A.

    2016-12-01

    Because of the damage resulting from the 2010 Chile and 2011 Japanese tele-tsunamis, the tsunami risk to the small craft marinas in California has become an important concern. The talk will outline an assessment tool which can be used to assess the tsunami hazard to small craft harbors. The methodology is based on the demand and structural capacity of the floating dock system, composed of floating docks/fingers and moored vessels. The structural demand is determined using a Monte Carlo methodology, a probabilistic computational tool for cases where the governing equations might be well known, but the independent input variables (demand) as well as the resisting structural components (capacity) may not be completely known. The Monte Carlo approach draws each variable from its distribution to generate a single computation, then repeats the process hundreds or thousands of times. The numerical model "Method of Splitting Tsunamis" (MOST) has been used to determine the inputs for the small craft harbors within California. Hydrodynamic model results of current speed, direction and surface elevation were incorporated via the drag equations to provide the basis of the demand term. To determine the capacities, an inspection program was developed to identify common features of structural components. A total of six harbors have been inspected, ranging from Crescent City in Northern California to Oceanside Harbor in Southern California. Results from the inspection program were used to develop component capacity tables which incorporated the basic specifications of each component (e.g. bolt size and configuration) and a reduction factor (which accounts for the component's reduction in capacity with age) to estimate in situ capacities. Like the demand term, these capacities are added probabilistically into the model. To date the model has been applied to Santa Cruz Harbor as well as Noyo River. Once
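
    A minimal sketch of the demand-versus-capacity Monte Carlo loop described above, with the demand term from a drag-type relation; all distributions and parameter values are illustrative assumptions, not the study's calibrated inputs.

```python
# Hedged sketch: Monte Carlo comparison of hydrodynamic demand against
# component capacity for a mooring element. Every distribution below is an
# illustrative stand-in.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Demand: drag force F = 0.5 * rho * Cd * A * U^2 with uncertain current
# speed U (median ~2 m/s here) and drag coefficient Cd.
rho = 1025.0                                              # seawater, kg/m^3
U = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=N)    # current speed, m/s
Cd = rng.uniform(0.8, 1.2, size=N)
A = 10.0                                                  # projected area, m^2
demand = 0.5 * rho * Cd * A * U**2                        # N

# Capacity: nominal strength times an age-related reduction factor.
capacity = rng.normal(60_000.0, 8_000.0, size=N) * rng.uniform(0.6, 1.0, size=N)

print("P(demand > capacity) ~", np.mean(demand > capacity))
```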

  7. Benzene exposure in the petroleum distribution industry associated with leukemia in the United Kingdom: overview of the methodology of a case-control study.

    PubMed Central

    Rushton, L

    1996-01-01

    This paper describes basic principles underlying the methodology for obtaining quantitative estimates of benzene exposure in the petroleum marketing and distribution industry. Work histories for 91 cases of leukemia and 364 matched controls (4 per case) identified for a cohort of oil distribution workers up to the end of 1992 were obtained, primarily from personnel records. Information on the distribution sites, more than 90% of which were closed at the time of data collection, was obtained from site visits and archive material. Industrial hygiene measurements measured under known conditions were assembled for different tasks. These were adjusted for conditions where measured data were not available using variables known to influence exposure, such as temperature, technology, percentage of benzene in fuel handled, products handled, number of loads, and job activity. A quantitative estimate of dermal contact and peak exposure was also made. PMID:9118922

  8. Combining tracer flux ratio methodology with low-flying aircraft measurements to estimate dairy farm CH4 emissions

    NASA Astrophysics Data System (ADS)

    Daube, C.; Conley, S.; Faloona, I. C.; Yacovitch, T. I.; Roscioli, J. R.; Morris, M.; Curry, J.; Arndt, C.; Herndon, S. C.

    2017-12-01

    Livestock activity, enteric fermentation of feed and anaerobic digestion of waste, contributes significantly to the methane budget of the United States (EPA, 2016). Studies question the reported magnitude of these methane sources (Miller et al., 2013), calling for more detailed research of agricultural animals (Hristov, 2014). Tracer flux ratio is an attractive experimental method to bring to this problem because it does not rely on estimates of atmospheric dispersion. Collection of data occurred during one week at two dairy farms in central California (June 2016). Each farm varied in size, layout, head count, and general operation. The tracer flux ratio method involves releasing ethane on-site with a known flow rate to serve as a tracer gas. Downwind mixed enhancements in ethane (from the tracer) and methane (from the dairy) were measured, and their ratio used to infer the unknown methane emission rate from the farm. An instrumented van drove transects downwind of each farm on public roads while tracer gases were released on-site, employing the tracer flux ratio methodology to assess simultaneous methane and tracer gas plumes. Flying circles around each farm, a small instrumented aircraft made measurements to perform a mass balance evaluation of methane gas. In the course of these two different methane quantification techniques, we were able to validate yet a third method: tracer flux ratio measured via aircraft. Ground-based tracer release rates were applied to the aircraft-observed methane-to-ethane ratios, yielding whole-site methane emission rates. Never before has the tracer flux ratio method been executed with aircraft measurements. Estimates from this new application closely resemble results from the standard ground-based technique to within their respective uncertainties. Incorporating this new dimension to the tracer flux ratio methodology provides additional context for local plume dynamics and validation of both ground and flight-based data.
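
    The core tracer flux ratio computation is simple: scale the known ethane release rate by the observed downwind enhancement ratio, converting moles to mass. The function name and the numbers below are illustrative, not measurements from the campaign.

```python
# Hedged sketch of the tracer flux ratio calculation: the unknown methane rate
# equals the known ethane release rate times the downwind mole-enhancement
# ratio, scaled by the molar-mass ratio (CH4 16.04 vs C2H6 30.07 g/mol).
def methane_emission_rate(q_tracer_kg_h: float,
                          delta_ch4_ppb: float,
                          delta_c2h6_ppb: float) -> float:
    """Whole-site CH4 emission rate (kg/h) from a tracer release."""
    mole_ratio = delta_ch4_ppb / delta_c2h6_ppb
    return q_tracer_kg_h * mole_ratio * (16.04 / 30.07)

# Illustrative: 2 kg/h ethane release, 150 ppb CH4 vs 10 ppb C2H6 enhancements.
print(methane_emission_rate(2.0, 150.0, 10.0), "kg CH4 / h")
```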

  9. Standard Area Diagrams for Aiding Severity Estimation: Scientometrics, Pathosystems, and Methodological Trends in the Last 25 Years.

    PubMed

    Del Ponte, Emerson M; Pethybridge, Sarah J; Bock, Clive H; Michereff, Sami J; Machado, Franklin J; Spolti, Piérri

    2017-10-01

    Standard area diagrams (SAD) have long been used as a tool to aid the estimation of plant disease severity, an essential variable in phytopathometry. Formal validation of SAD was not considered prior to the early 1990s, when considerable effort began to be invested developing SAD and assessing their value for improving accuracy of estimates of disease severity in many pathosystems. Peer-reviewed literature post-1990 was identified, selected, and cataloged in bibliographic software for further scrutiny and extraction of scientometric, pathosystem-related, and methodological-related data. In total, 105 studies (127 SAD) were found and authored by 327 researchers from 10 countries, mainly from Brazil. The six most prolific authors published at least seven studies. The scientific impact of a SAD article, based on annual citations after publication year, was affected by disease significance, the journal's impact factor, and methodological innovation. The reviewed SAD encompassed 48 crops and 103 unique diseases across a range of plant organs. Severity was quantified largely by image analysis software such as QUANT, APS-Assess, or a LI-COR leaf area meter. The most typical SAD comprised five to eight black-and-white drawings of leaf diagrams, with severity increasing nonlinearly. However, there was a trend toward using true-color photographs or stylized representations in a range of color combinations and more linear (equally spaced) increments of severity. A two-step SAD validation approach was used in 78 of 105 studies, for which linear regression was the preferred method, but a trend toward using Lin's concordance correlation analysis and hypothesis tests to detect the effect of SAD on accuracy was apparent. Reliability measures, when obtained, mainly considered variation among rather than within raters. The implications of the findings and knowledge gaps are discussed. A list of best practices for designing and implementing SAD and a website called SADBank for hosting

  10. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 1: Executive overview

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Davis, John S.

    1989-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.

  11. Estimation of lactose interference in vaccines and a proposal of methodological adjustment of total protein determination by the lowry method.

    PubMed

    Kusunoki, Hideki; Okuma, Kazu; Hamaguchi, Isao

    2012-01-01

    For national regulatory testing in Japan, the Lowry method is used for the determination of total protein content in vaccines. However, many substances are known to interfere with the Lowry method, rendering accurate estimation of protein content difficult. To accurately determine the total protein content in vaccines, it is necessary to identify the major interfering substances and improve the methodology for removing such substances. This study examined the effects of high levels of lactose with low levels of protein in freeze-dried, cell culture-derived Japanese encephalitis vaccine (inactivated). Lactose was selected because it is a reducing sugar that is expected to interfere with the Lowry method. Our results revealed that concentrations of ≥ 0.1 mg/mL lactose interfered with the Lowry assays and resulted in overestimation of the protein content in a lactose concentration-dependent manner. On the other hand, our results demonstrated that it is important for the residual volume to be ≤ 0.05 mL after trichloroacetic acid precipitation in order to avoid the effects of lactose. Thus, the method presented here is useful for accurate protein determination by the Lowry method, even when it is used for determining low levels of protein in vaccines containing interfering substances. In this study, we have reported a methodological adjustment that allows accurate estimation of protein content for national regulatory testing, when the vaccine contains interfering substances.

  12. Statistical methodology for estimating the mean difference in a meta-analysis without study-specific variance information.

    PubMed

    Sangnawakij, Patarawan; Böhning, Dankmar; Adams, Stephen; Stanton, Michael; Holling, Heinz

    2017-04-30

    Statistical inference for analyzing the results from several independent studies on the same quantity of interest has been investigated frequently in recent decades. Typically, any meta-analytic inference requires that the quantity of interest is available from each study together with an estimate of its variability. The current work is motivated by a meta-analysis on comparing two treatments (thoracoscopic and open) of congenital lung malformations in young children. Quantities of interest include continuous end-points such as length of operation or number of chest tube days. As studies only report mean values (and no standard errors or confidence intervals), the question arises how meta-analytic inference can be developed. We suggest two methods to estimate study-specific variances in such a meta-analysis, where only sample means and sample sizes are available in the treatment arms. A general likelihood ratio test is derived for testing equality of variances in two groups. By means of simulation studies, the bias and estimated standard error of the overall mean difference from both methodologies are evaluated and compared with two existing approaches: complete study analysis only and partial variance information. The performance of the test is evaluated in terms of type I error. Additionally, we illustrate these methods in the meta-analysis on comparing thoracoscopic and open surgery for congenital lung malformations and in a meta-analysis on the change in renal function after kidney donation. Copyright © 2017 John Wiley & Sons, Ltd.
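
    Once study-specific variances have been estimated or imputed, they feed a standard inverse-variance pooling of mean differences. The sketch below shows only that generic pooling step under invented inputs; it does not reproduce the paper's variance estimators or its likelihood ratio test.

```python
# Hedged sketch of fixed-effect inverse-variance pooling of mean differences,
# the step any imputed study-specific variances would feed into. Inputs are
# illustrative.
import numpy as np

def pooled_mean_difference(md, var):
    """Return the pooled mean difference and its standard error."""
    md, var = np.asarray(md, float), np.asarray(var, float)
    w = 1.0 / var                      # inverse-variance weights
    pooled = np.sum(w * md) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, se

# Hypothetical mean differences in chest-tube days, with imputed variances.
print(pooled_mean_difference([1.2, 0.8, 1.5], [0.30, 0.22, 0.41]))
```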

  13. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver L.

    2016-01-01

    Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CER) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation through this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach utilized within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and offer increased transparency and fidelity by providing two preliminary perspectives on mission cost. This work outlines the existing cost model, details assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in the exploration of the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed, and as such, quantitative assessment of multiple spacecraft missions has many remaining areas of uncertainty. By incorporating well-established CERs with preliminary approaches to these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth

  14. Methodological considerations in cost of illness studies on Alzheimer disease

    PubMed Central

    2012-01-01

    Cost-of-illness studies (COI) can identify and measure all the costs of a particular disease, including the direct, indirect and intangible dimensions. They are intended to provide estimates about the economic impact of costly diseases. Alzheimer disease (AD) is a relevant example to review cost of illness studies because of its costliness. The aim of this study was to review relevant published cost studies of AD to analyze the methods used and to identify which dimension had to be improved from a methodological perspective. First, we described the key points of cost study methodology. Secondly, cost studies relating to AD were systematically reviewed, focussing on an analysis of the different methods used. The methodological choices of the studies were analysed using an analytical grid which contains the main methodological items of COI studies. Seventeen articles were retained. Depending on the studies, annual total costs per patient vary from $2,935 to $52,954. The methods, data sources, and estimated cost categories in each study varied widely. The review showed that cost studies adopted different approaches to estimate costs of AD, reflecting a lack of consensus on the methodology of cost studies. To increase its credibility, closer agreement among researchers on the methodological principles of cost studies would be desirable. PMID:22963680

  15. USGS Methodology for Assessing Continuous Petroleum Resources

    USGS Publications Warehouse

    Charpentier, Ronald R.; Cook, Troy A.

    2011-01-01

    The U.S. Geological Survey (USGS) has developed a new quantitative methodology for assessing resources in continuous (unconventional) petroleum deposits. Continuous petroleum resources include shale gas, coalbed gas, and other oil and gas deposits in low-permeability ("tight") reservoirs. The methodology is based on an approach combining geologic understanding with well productivities. The methodology is probabilistic, with both input and output variables as probability distributions, and uses Monte Carlo simulation to calculate the estimates. The new methodology is an improvement of previous USGS methodologies in that it better accommodates the uncertainties in undrilled or minimally drilled deposits that must be assessed using analogs. The publication is a collection of PowerPoint slides with accompanying comments.
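
    A toy illustration of the probabilistic aggregation idea: Monte Carlo multiplication of input distributions yields fractiles of the total resource. The distribution shapes and numbers below are invented and are not the USGS input forms.

```python
# Hedged sketch of a Monte Carlo continuous-resource aggregation:
# resource = untested area x well density x estimated ultimate recovery (EUR)
# per well, each drawn from an assumed probability distribution.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

area_acres = rng.triangular(100_000, 250_000, 500_000, size=N)  # assessed area
wells_per_acre = rng.triangular(0.01, 0.02, 0.04, size=N)       # well density
eur_bcf = rng.lognormal(mean=np.log(1.0), sigma=0.6, size=N)    # per-well EUR

total_bcf = area_acres * wells_per_acre * eur_bcf
# Report fractiles of the output distribution (F95/F50/F5-style summary).
print(np.percentile(total_bcf, [5, 50, 95]))
```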

  16. A cost effective and operational methodology for wall to wall Above Ground Biomass (AGB) and carbon stocks estimation and mapping: Nepal REDD+

    NASA Astrophysics Data System (ADS)

    Gilani, H., Sr.; Ganguly, S.; Zhang, G.; Koju, U. A.; Murthy, M. S. R.; Nemani, R. R.; Manandhar, U.; Thapa, G. J.

    2015-12-01

    Nepal is a landlocked country with 39% forest cover of the total land area (147,181 km²). Under the Forest Carbon Partnership Facility (FCPF) implemented by the World Bank (WB), Nepal was chosen as one of four countries best suited for a results-based payment system under the Reducing Emissions from Deforestation and Forest Degradation (REDD and REDD+) scheme. At the national level, Landsat-based analysis shows that from 1990 to 2000 the forest area declined by 2%, i.e. by 1,467 km², whereas from 2000 to 2010 it declined by only 0.12%, i.e. 176 km². A cost-effective monitoring and evaluation system for REDD+ requires a balanced approach of remote sensing and ground measurements. This paper provides, for Nepal, a cost-effective and operational 30 m Above Ground Biomass (AGB) estimation and mapping methodology using freely available satellite data integrated with field inventory. Leaf Area Index (LAI) was generated following the methodology proposed by Ganguly et al. (2012) using cloud-free Landsat-8 OLI images. To generate a tree canopy height map, a density scatter graph between the maximum height estimated by the Geoscience Laser Altimeter System (GLAS) on the Ice, Cloud, and Land Elevation Satellite (ICESat) and the Landsat LAI nearest to the center coordinates of the GLAS shots shows a moderate but significant exponential correlation (Hmax = 31.211·LAI^0.4593, R² = 0.33, RMSE = 13.25 m). In the field, 1,124 well-distributed circular plots (750 m² and 500 m²; a 0.001% sample of the forest cover) were measured and used to estimate AGB (t/ha) with the equations proposed by Sharma et al. (1990) for all tree species of Nepal. A satisfactory linear relationship (AGB = 8.7018·Hmax − 101.24, R² = 0.67, RMSE = 7.2 t/ha) was achieved between maximum canopy height (Hmax) and AGB (t/ha). This cost-effective and operational methodology is replicable over 5-10 years with minimal ground samples through integration of satellite images. The developed AGB was used to produce optimum fuel wood scenarios using population and road
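
    Chaining the two reported regressions gives a per-pixel AGB estimate. The equations are taken from the abstract; applying them as a pipeline like this is our illustration, and the LAI values are invented.

```python
# Sketch chaining the abstract's two fitted relations:
# Landsat LAI -> maximum canopy height (GLAS fit) -> AGB.
import numpy as np

def canopy_height_m(lai: np.ndarray) -> np.ndarray:
    """Hmax = 31.211 * LAI^0.4593 (R^2 = 0.33, RMSE = 13.25 m)."""
    return 31.211 * lai**0.4593

def agb_ton_ha(h_max_m: np.ndarray) -> np.ndarray:
    """AGB = 8.7018 * Hmax - 101.24 (R^2 = 0.67, RMSE = 7.2 t/ha)."""
    return 8.7018 * h_max_m - 101.24

lai = np.array([1.5, 2.5, 3.5])  # example LAI pixels (hypothetical)
print(agb_ton_ha(canopy_height_m(lai)))
```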

  17. Methodology and implications of maximum paleodischarge estimates for mountain channels, upper Animas River basin, Colorado, U.S.A.

    USGS Publications Warehouse

    Pruess, J.; Wohl, E.E.; Jarrett, R.D.

    1998-01-01

    Historical and geologic records may be used to enhance magnitude estimates for extreme floods along mountain channels, as demonstrated in this study from the San Juan Mountains of Colorado. Historical photographs and local newspaper accounts from the October 1911 flood indicate the likely extent of flooding and damage. A checklist designed to organize and numerically score evidence of flooding was used in 15 field reconnaissance surveys in the upper Animas River valley of southwestern Colorado. Step-backwater flow modeling estimated the discharges necessary to create longitudinal flood bars observed at 6 additional field sites. According to these analyses, maximum unit discharge peaks at approximately 1.3 m³s⁻¹km⁻² around 2200 m elevation, with decreased unit discharges at both higher and lower elevations. These results (1) are consistent with Jarrett's (1987, 1990, 1993) maximum 2300-m elevation limit for flash-flooding in the Colorado Rocky Mountains, and (2) suggest that current Probable Maximum Flood (PMF) estimates based on a 24-h rainfall of 30 cm at elevations above 2700 m are unrealistically large. The methodology used for this study should be readily applicable to other mountain regions where systematic streamflow records are of short duration or nonexistent.

  18. Performance-based methodology for assessing seismic vulnerability and capacity of buildings

    NASA Astrophysics Data System (ADS)

    Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li

    2010-06-01

    This paper presents a performance-based methodology for the assessment of seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and then a new vulnerability function is expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess seismic damage of a large number of building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrate the relationship between the seismic capacity of buildings and seismic action. The estimated result is compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.

  19. Reliable estimates of predictive uncertainty for an Alpine catchment using a non-parametric methodology

    NASA Astrophysics Data System (ADS)

    Matos, José P.; Schaefli, Bettina; Schleiss, Anton J.

    2017-04-01

    Uncertainty affects hydrological modelling efforts from the very measurements (or forecasts) that serve as inputs to the more or less inaccurate predictions that are produced. Uncertainty is truly inescapable in hydrology and yet, due to the theoretical and technical hurdles associated with its quantification, it is at times still neglected or estimated only qualitatively. In recent years the scientific community has made a significant effort towards quantifying this hydrologic prediction uncertainty. Despite this, most of the developed methodologies can be computationally demanding, are complex from a theoretical point of view, require substantial expertise to be employed, and are constrained by a number of assumptions about the model error distribution. These assumptions limit the reliability of many methods in case of errors that show particular cases of non-normality, heteroscedasticity, or autocorrelation. The present contribution builds on a non-parametric data-driven approach that was developed for uncertainty quantification in operational (real-time) forecasting settings. The approach is based on the concept of Pareto optimality and can be used as a standalone forecasting tool or as a postprocessor. By virtue of its non-parametric nature and a general operating principle, it can be applied directly and with ease to predictions of streamflow, water stage, or even accumulated runoff. Also, it is a methodology capable of coping with high heteroscedasticity and seasonal hydrological regimes (e.g. snowmelt and rainfall driven events in the same catchment). Finally, the training and operation of the model are very fast, making it a tool particularly adapted to operational use. To illustrate its practical use, the uncertainty quantification method is coupled with a process-based hydrological model to produce statistically reliable forecasts for an Alpine catchment located in Switzerland. Results are presented and discussed in terms of their reliability and

  20. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    NASA Astrophysics Data System (ADS)

    Rivera, Diego; Rivas, Yessica; Godoy, Alex

    2015-02-01

    Hydrological models are simplified representations of natural processes and subject to errors. Uncertainty bounds are a commonly used way to assess the impact of an input or model architecture uncertainty in model outputs. Different sets of parameters could have equally robust goodness-of-fit indicators, which is known as Equifinality. We assessed the outputs from a lumped conceptual hydrological model to an agricultural watershed in central Chile under strong interannual variability (coefficient of variability of 25%) by using the Equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. Then, we analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range for the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m³s⁻¹ after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite the criticisms against the GLUE methodology, such as the lack of statistical formality, it is identified as a useful tool assisting the modeller with the identification of critical parameters.
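
    A compact sketch of the GLUE screening loop: sample parameter sets, score each simulation with a likelihood measure (Nash-Sutcliffe efficiency here), keep the behavioural sets above a threshold, and read uncertainty bounds off their predictions. The two-parameter seasonal model and synthetic observations below are stand-ins, not the Chillan River water balance model.

```python
# Hedged sketch of GLUE (Generalized Likelihood Uncertainty Estimation) with a
# toy two-parameter seasonal runoff model and synthetic observations.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(96)  # 8 years of monthly steps

def toy_model(a: float, b: float) -> np.ndarray:
    """Stand-in seasonal runoff model with two free parameters."""
    return a * (1.0 + 0.5 * np.sin(2.0 * np.pi * t / 12.0)) + b

# Synthetic "observations": the toy model at known parameters plus noise.
obs = toy_model(35.0, 5.0) + rng.normal(0.0, 5.0, size=t.size)

def nse(sim: np.ndarray) -> float:
    """Nash-Sutcliffe efficiency, a common GLUE likelihood measure."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Uniform Monte Carlo sampling of the two parameters.
samples = rng.uniform([10.0, -20.0], [60.0, 20.0], size=(5000, 2))
sims = np.array([toy_model(a, b) for a, b in samples])
scores = np.array([nse(s) for s in sims])

behavioural = sims[scores > 0.5]  # the threshold is a subjective GLUE choice
lower, upper = np.percentile(behavioural, [5, 95], axis=0)
print(len(behavioural), "behavioural sets; mean 5-95% bound width:",
      (upper - lower).mean())
```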

  1. Estimating survival of radio-tagged birds

    USGS Publications Warehouse

    Bunck, C.M.; Pollock, K.H.; Lebreton, J.-D.; North, P.M.

    1993-01-01

    Parametric and nonparametric methods for estimating survival of radio-tagged birds are described. The general assumptions of these methods are reviewed. An estimate based on the assumption of constant survival throughout the period is emphasized in the overview of parametric methods. Two nonparametric methods, the Kaplan-Meier estimate of the survival function and the log rank test, are explained in detail. The link between these nonparametric methods and traditional capture-recapture models is discussed along with considerations in designing studies that use telemetry techniques to estimate survival.
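
    A minimal Kaplan-Meier implementation for telemetry data, treating transmitter failure or emigration as right-censoring; the times and censoring flags below are invented for illustration.

```python
# Hedged sketch of the Kaplan-Meier product-limit estimator for radio-tagged
# animals. events: 1 = confirmed death, 0 = right-censored (e.g. transmitter
# failure or emigration from the study area).
import numpy as np

def kaplan_meier(times, events):
    """Return [(time, survival)] at each observed death time."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    s, curve = 1.0, []
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                    # still monitored at t
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk                     # product-limit update
        curve.append((float(t), s))
    return curve

print(kaplan_meier([12, 30, 30, 45, 60, 60, 75], [1, 1, 0, 1, 0, 1, 0]))
```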

  2. Establishing equivalence: methodological progress in group-matching design and analysis.

    PubMed

    Kover, Sara T; Atwood, Amy K

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and language in neurodevelopmental disorders, including autism spectrum disorder, Fragile X syndrome, Down syndrome, and Williams syndrome. The limitations of relying on p values to establish group equivalence are discussed in the context of other existing methods: equivalence tests, propensity scores, and regression-based analyses. Our primary recommendation for advancing research on intellectual and developmental disabilities is the use of descriptive indices of adequate group matching: effect sizes (i.e., standardized mean differences) and variance ratios.

  3. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    PubMed Central

    Kover, Sara T.; Atwood, Amy K.

    2017-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs utilized in behavioral research on cognition and language in neurodevelopmental disorders, including autism spectrum disorder, fragile X syndrome, Down syndrome, and Williams syndrome. The limitations of relying on p-values to establish group equivalence are discussed in the context of other existing methods: equivalence tests, propensity scores, and regression-based analyses. Our primary recommendation for advancing research on intellectual and developmental disabilities is the use of descriptive indices of adequate group matching: effect sizes (i.e., standardized mean differences) and variance ratios. PMID:23301899

  4. Cost benefits of advanced software: A review of methodology used at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla N.

    1993-01-01

    To assist rational investments in advanced software, a formal, explicit, and multi-perspective cost-benefit analysis methodology is proposed. The methodology can be implemented through a six-stage process, which is described and explained. The current practice of cost-benefit analysis at KSC is reviewed in the light of this methodology. The review finds that a vicious circle is operating: unsound methods lead to unreliable cost-benefit estimates; unreliable estimates convince management that cost-benefit studies should not be taken seriously; then, given external demands for cost-benefit estimates, management encourages software engineers to somehow come up with the numbers for their projects; lacking the expertise needed to do a proper study, courageous software engineers with vested interests use ad hoc and unsound methods to generate some estimates; in turn, these estimates are unreliable, and the cycle continues. The proposed methodology should help KSC break out of this cycle.

  5. Overview of methods in economic analyses of behavioral interventions to promote oral health

    PubMed Central

    O’Connell, Joan M.; Griffin, Susan

    2016-01-01

    Background Broad adoption of interventions that prove effective in randomized clinical trials or comparative effectiveness research may depend to a great extent on their costs and cost-effectiveness (CE). Many studies of behavioral health interventions for oral health promotion and disease prevention lack robust economic assessments of costs and CE. Objective To describe methodologies employed to assess intervention costs, potential savings, net costs, CE, and the financial sustainability of behavioral health interventions to promote oral health. Methods We provide an overview of terminology and strategies for conducting economic evaluations of behavioral interventions to improve oral health based on the recommendations of the Panel on Cost-Effectiveness in Health and Medicine. To illustrate these approaches, we summarize methodologies and findings from a limited number of published studies. The strategies include methods for assessing intervention costs, potential savings, net costs, CE, and financial sustainability from various perspectives (e.g., health-care provider, health system, health payer, employer, society). Statistical methods for estimating short-term and long-term economic outcomes and for examining the sensitivity of economic outcomes to cost parameters are described. Discussion Through the use of established protocols for evaluating costs and savings, it is possible to assess and compare intervention costs, net costs, CE, and financial sustainability. The addition of economic outcomes to outcomes reflecting effectiveness, appropriateness, acceptability, and organizational sustainability strengthens evaluations of oral health interventions and increases the potential that those found to be successful in research settings will be disseminated more broadly. PMID:21656966

  6. Overview of methods in economic analyses of behavioral interventions to promote oral health.

    PubMed

    O'Connell, Joan M; Griffin, Susan

    2011-01-01

    Broad adoption of interventions that prove effective in randomized clinical trials or comparative effectiveness research may depend to a great extent on their costs and cost-effectiveness (CE). Many studies of behavioral health interventions for oral health promotion and disease prevention lack robust economic assessments of costs and CE. To describe methodologies employed to assess intervention costs, potential savings, net costs, CE, and the financial sustainability of behavioral health interventions to promote oral health. We provide an overview of terminology and strategies for conducting economic evaluations of behavioral interventions to improve oral health based on the recommendations of the Panel on Cost-Effectiveness in Health and Medicine. To illustrate these approaches, we summarize methodologies and findings from a limited number of published studies. The strategies include methods for assessing intervention costs, potential savings, net costs, CE, and financial sustainability from various perspectives (e.g., health-care provider, health system, health payer, employer, society). Statistical methods for estimating short-term and long-term economic outcomes and for examining the sensitivity of economic outcomes to cost parameters are described. Through the use of established protocols for evaluating costs and savings, it is possible to assess and compare intervention costs, net costs, CE, and financial sustainability. The addition of economic outcomes to outcomes reflecting effectiveness, appropriateness, acceptability, and organizational sustainability strengthens evaluations of oral health interventions and increases the potential that those found to be successful in research settings will be disseminated more broadly.
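    One building block of the CE analyses described in both versions of this record is the incremental cost-effectiveness ratio. The sketch below computes it for a hypothetical behavioral oral-health program; all values are invented for illustration.

```python
def icer(cost_new, effect_new, cost_comparator, effect_comparator):
    """Incremental cost-effectiveness ratio: additional cost per
    additional unit of health effect (e.g., cavity-free year gained)."""
    d_cost = cost_new - cost_comparator
    d_effect = effect_new - effect_comparator
    if d_effect <= 0:
        raise ValueError("new intervention is no more effective; ICER undefined")
    return d_cost / d_effect

# Hypothetical per-child costs and effects
print(icer(cost_new=180.0, effect_new=1.9,
           cost_comparator=120.0, effect_comparator=1.5))  # 150.0 per unit gained
```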

  7. Tunnel and Station Cost Methodology Volume II: Stations

    DOT National Transportation Integrated Search

    1981-01-01

    The main objective of this study was to develop a model for estimating the cost of subway station and tunnel construction. This report describes a cost estimating methodology for subway tunnels that can be used by planners, designers, owners, and gov...

  8. Bilingualism and Cognitive Reserve: A Critical Overview and a Plea for Methodological Innovations.

    PubMed

    Calvo, Noelia; García, Adolfo M; Manoiloff, Laura; Ibáñez, Agustín

    2015-01-01

    The decline of cognitive skills throughout healthy or pathological aging can be slowed down by experiences which foster cognitive reserve (CR). Recently, some studies on Alzheimer's disease have suggested that CR may be enhanced by life-long bilingualism. However, the evidence is inconsistent and largely based on retrospective approaches featuring several methodological weaknesses. Some studies demonstrated at least 4 years of delay in dementia symptoms, while others did not find such an effect. Moreover, various methodological aspects vary from study to study. The present paper addresses contradictory findings, identifies possible lurking variables, and outlines methodological alternatives thereof. First, we characterize possible confounding factors that may have influenced extant results. Our focus is on the criteria to establish bilingualism, differences in sample design, the instruments used to examine cognitive skills, and the role of variables known to modulate life-long cognition. Second, we propose that these limitations could be largely circumvented through experimental approaches. Proficiency in the non-native language can be successfully assessed by combining subjective and objective measures; confounding variables which have been distinctively associated with certain bilingual groups (e.g., alcoholism, sleep disorders) can be targeted through relevant instruments; and cognitive status might be better tapped via robust cognitive screenings and executive batteries. Moreover, future research should incorporate tasks yielding predictable patterns of contrastive performance between bilinguals and monolinguals. Crucially, these include instruments which reveal bilingual disadvantages in vocabulary, null effects in working memory, and advantages in inhibitory control and other executive functions. Finally, paradigms tapping proactive interference (which assess the disruptive effect of long-term memory on newly learned information) could also offer useful data

  9. A review of the current state-of-the-art methodology for handling bias and uncertainty in performing criticality safety evaluations. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Disney, R.K.

    1994-10-01

    The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a "site" perception to a more uniform or "national" perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.

  10. IMPAC: An Integrated Methodology for Propulsion and Airframe Control

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Ouzts, Peter J.; Lorenzo, Carl F.; Mattern, Duane L.

    1991-01-01

    The National Aeronautics and Space Administration is actively involved in the development of enabling technologies that will lead towards aircraft with new/enhanced maneuver capabilities such as Short Take-Off Vertical Landing (STOVL) and high angle of attack performance. Because of the high degree of dynamic coupling between the airframe and propulsion systems of these types of aircraft, one key technology is the integration of the flight and propulsion control. The NASA Lewis Research Center approach to developing Integrated Flight Propulsion Control (IFPC) technologies is an in-house research program referred to as IMPAC (Integrated Methodology for Propulsion and Airframe Control). The goals of IMPAC are to develop a viable alternative to the existing integrated control design methodologies that will allow for improved system performance and simplicity of control law synthesis and implementation, and to demonstrate the applicability of the methodology to a supersonic STOVL fighter aircraft. Based on some preliminary control design studies that included evaluation of the existing methodologies, the IFPC design methodology emerging at the Lewis Research Center consists of considering the airframe and propulsion system as one integrated system for an initial centralized controller design, and then partitioning the centralized controller into separate airframe and propulsion system subcontrollers to ease implementation and to set meaningful design requirements for detailed subsystem control design and evaluation. An overview of IMPAC is provided, and the various important design and evaluation steps in the methodology are discussed in detail.

  11. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities

  12. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities

  13. An approach to software cost estimation

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.

    1984-01-01

    A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of resource estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment, developed using this methodology, are presented.
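    As a flavor of the parametric cost models the abstract refers to, here is a generic effort model of the COCOMO form effort = a * KLOC**b. The constants shown are the classic organic-mode values, used purely for illustration; this is not the SEL flight-dynamics procedure described in the report.

```python
def effort_person_months(kloc, a=2.4, b=1.05):
    """Generic parametric software effort model: effort = a * KLOC**b.
    Constants are classic COCOMO organic-mode values (illustrative)."""
    return a * kloc ** b

for size_kloc in (10, 50, 100):
    print(f"{size_kloc} KLOC -> {effort_person_months(size_kloc):.1f} person-months")
```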

  14. Data on the descriptive overview and the quality assessment details of 12 qualitative research papers.

    PubMed

    Barnabishvili, Maia; Ulrichs, Timo; Waldherr, Ruth

    2016-09-01

    This data article presents the supplementary material for the review paper "Role of acceptability barriers in delayed diagnosis of Tuberculosis: Literature review from high burden countries" (Barnabishvili et al., in press) [1]. A general overview of the 12 qualitative papers, including details about authors, years of publication, data source locations, study objectives, methods, study population characteristics, and the intervention and outcome parameters of the papers, is summarized in the first two tables of the article. The quality assessment of the methodological strength of the 12 papers and the results of the critical appraisal are described and summarized in the second part of the article.

  15. Multimodal intraoperative monitoring: an overview and proposal of methodology based on 1,017 cases

    PubMed Central

    Eggspuehler, Andreas; Muller, Alfred; Dvorak, Jiri

    2007-01-01

    To describe the different currently available tests of multimodal intraoperative monitoring (MIOM) used in spine and spinal cord surgery, indicating the technical parameters, application and interpretation, as an easily understood systematic overview to help implementation of MIOM and improve communication between neurophysiologists and spine surgeons. This article aims to give an overview and proposal of the different MIOM techniques as used daily in spine and spinal cord surgery at our institution. Intensive research in neurophysiology over the past decades has led to a profound understanding of the spinal cord, nerve functions and their intraoperative functional evaluation in anaesthetised patients. At present, spine surgeons and neurophysiologists are faced with 1,883 publications in PubMed on spinal cord monitoring. The value and the limitations of single monitoring methods are well documented. The diagnostic power of the multimodal approach in a larger study population in spine surgery, as measured with sensitivity and specificity, is dealt with elsewhere in this supplement (Sutter et al. in Eur Spine J Suppl, 2007). This paper aims to give a detailed description of the different modalities used in this study: monitoring techniques of the descending and ascending spinal cord and nerve root pathways by motor evoked potentials of the spinal cord and muscles elicited after transcranial electrical motor cortex, spinal cord, cauda equina and nerve root stimulation, continuous EMG, and sensory cortical and spinal evoked potentials, as well as direct spinal cord evoked potentials, applied in 1,017 patients. The method of MIOM, continuously adapted according to the site, stage of surgery and potential danger to nerve tissues, proved to be applicable with online results, reliable and, furthermore, teachable. PMID:17653777

  16. DPP-4 inhibitors for the treatment of type 2 diabetes: a methodology overview of systematic reviews.

    PubMed

    Ling, Juan; Ge, Long; Zhang, Ding-Hua; Wang, Yong-Feng; Xie, Zhuo-Lin; Tian, Jin-Hui; Xiao, Xiao-Hui; Yang, Ke-Hu

    2018-06-01

    To evaluate the methodological quality of systematic reviews (SRs) and summarize evidence on important outcomes of dipeptidyl peptidase-4 inhibitors (DPP4-I) in treating type 2 diabetes mellitus (T2DM). We included SRs of DPP4-I for the treatment of T2DM published until January 2018, identified by searching the Cochrane Library, PubMed, EMBASE and three Chinese databases. We evaluated methodological quality with the AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool and the GRADE (Grading of Recommendations Assessment, Development and Evaluation) approach. Sixty-three SRs (with a total of 2,603,140 participants receiving DPP4-I for the treatment of T2DM) were included. The AMSTAR results showed that the worst-scoring item was "providing a list of studies (included and excluded)", satisfied by only one (1.6%) SR, followed by "providing an a priori design", satisfied by only four (6.3%) SRs, and "using the status of publication (gray literature) as an inclusion criterion", satisfied by only 18 (28.9%) SRs. Only seven (11.1%) SRs scored more than nine points on AMSTAR, indicating high methodological quality. For GRADE, of the 128 outcomes, high quality evidence was provided in only 28 (21.9%), moderate in 70 (54.7%), low in 27 (21.1%), and very low in three (2.3%). The methodological quality of SRs of DPP4-I for type 2 diabetes mellitus is not high and there are common areas for improvement. Furthermore, the quality of the evidence is at best moderate and more high-quality evidence is needed.

  17. Eddy Covariance Method: Overview of General Guidelines and Conventional Workflow

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Anderson, D. J.; Amen, J. L.

    2007-12-01

    Atmospheric flux measurements are widely used to estimate water, heat, carbon dioxide and trace gas exchange between the ecosystem and the atmosphere. The Eddy Covariance method is one of the most direct, defensible ways to measure and calculate turbulent fluxes within the atmospheric boundary layer. However, the method is mathematically complex and requires significant care to set up and process data, which may be why it is currently used predominantly by micrometeorologists. Modern instruments and software can potentially expand the use of this method beyond micrometeorology and prove valuable for plant physiology, hydrology, biology, ecology, entomology, and other non-micrometeorological areas of research. The main challenge of the method for a non-expert is the complexity of system design, implementation, and processing of the large volume of data. In the past several years, efforts of the flux networks (e.g., FluxNet, Ameriflux, CarboEurope, Fluxnet-Canada, Asiaflux, etc.) have led to noticeable progress in unification of the terminology and general standardization of processing steps. The methodology itself, however, is difficult to unify, because various experimental sites and different purposes of studies dictate different treatments, and site-, measurement- and purpose-specific approaches. Here we present an overview of the theory and typical workflow of the Eddy Covariance method in a format specifically designed to (i) familiarize a non-expert with general principles, requirements, applications, and processing steps of the conventional Eddy Covariance technique, (ii) assist in further understanding the method through more advanced references such as textbooks, network guidelines and journal papers, (iii) help technicians, students and new researchers in the field deployment of the Eddy Covariance method, and (iv) assist in its use beyond micrometeorology. The overview is based, to a large degree, on the frequently asked questions
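    At its core, the flux computation the overview describes is the covariance of vertical wind speed and a scalar after Reynolds decomposition. A minimal sketch with synthetic 10 Hz data follows; real processing adds despiking, coordinate rotation, detrending, and spectral and density (WPL) corrections.

```python
import numpy as np

def eddy_flux(w, c):
    """Kinematic eddy flux F = mean(w'c'): the covariance of vertical
    wind speed w (m/s) and scalar concentration c over an averaging
    period (typically 30 min)."""
    return np.mean((w - w.mean()) * (c - c.mean()))

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.3, 18_000)                     # 10 Hz for 30 minutes
c = 400.0 + 0.5 * w + rng.normal(0.0, 0.2, 18_000)   # scalar correlated with w
print(eddy_flux(w, c))
```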

  18. A new time-series methodology for estimating relationships between elderly frailty, remaining life expectancy, and ambient air quality.

    PubMed

    Murray, Christian J; Lipfert, Frederick W

    2012-01-01

    Many publications estimate short-term air pollution-mortality risks, but few estimate the associated changes in life-expectancies. We present a new methodology for analyzing time series of health effects, in which prior frailty is assumed to precede short-term elderly nontraumatic mortality. The model is based on a subpopulation of frail individuals whose entries and exits (deaths) are functions of daily and lagged environmental conditions: ambient temperature/season, airborne particles, and ozone. This frail susceptible population is unknown; its fluctuations cannot be observed but are estimated using maximum-likelihood methods with the Kalman filter. We used an existing 14-y set of daily data to illustrate the model and then tested the assumption of prior frailty with a new generalized model that estimates the portion of the daily death count allocated to nonfrail individuals. In this demonstration dataset, new entries into the high-risk pool are associated with lower ambient temperatures and higher concentrations of particulate matter and ozone. Accounting for these effects on antecedent frailty reduces this at-risk population, yielding frail life expectancies of 5-7 days. Associations between environmental factors and entries to the at-risk pool are about twice as strong as for mortality. Nonfrail elderly deaths are seen to make only small contributions. This new model predicts a small short-lived frail population-at-risk that is stable over a wide range of environmental conditions. The predicted effects of pollution on new entries and deaths are robust and consistent with conventional morbidity/mortality time-series studies. We recommend model verification using other suitable datasets.
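    The estimation machinery named above is the Kalman filter; the paper's frail-population model adds environmental covariates and maximum-likelihood parameter fitting, but the underlying recursion is the standard predict/update cycle sketched below for a scalar latent state with assumed noise variances.

```python
import numpy as np

def scalar_kalman(y, f=1.0, q=0.25, r=4.0, x0=0.0, p0=100.0):
    """Minimal scalar Kalman filter for x_t = f*x_{t-1} + w_t,
    y_t = x_t + v_t, with w ~ N(0, q) and v ~ N(0, r)."""
    x, p, out = x0, p0, []
    for obs in y:
        x, p = f * x, f * p * f + q              # predict
        k = p / (p + r)                          # Kalman gain
        x, p = x + k * (obs - x), (1 - k) * p    # update
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(7)
truth = 10.0 + np.cumsum(rng.normal(0.0, 0.5, 200))  # latent random walk
y = truth + rng.normal(0.0, 2.0, 200)                # noisy daily observations
print(scalar_kalman(y)[-3:], truth[-3:])
```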

  19. SMOS+RAINFALL: Evaluating the ability of different methodologies to improve rainfall estimations using soil moisture data from SMOS

    NASA Astrophysics Data System (ADS)

    Pellarin, Thierry; Brocca, Luca; Crow, Wade; Kerr, Yann; Massari, Christian; Román-Cascón, Carlos; Fernández, Diego

    2017-04-01

    Recent studies have demonstrated the usefulness of soil moisture retrieved from satellites for improving the rainfall estimates of satellite-based precipitation products (SBPP). The real-time versions of these products are known to be biased with respect to the real precipitation observed at the ground. The information contained in soil moisture can therefore be used to correct the inaccuracy and uncertainty of these products, since the value and behavior of this soil variable preserve the information of a rain event, even for several days. In this work, we take advantage of soil moisture data from the Soil Moisture and Ocean Salinity (SMOS) satellite, which provides information at a temporal and spatial resolution quite appropriate for correcting rainfall events. Specifically, we test and compare the ability of three different methodologies for this aim: (1) SM2RAIN, which directly relates changes in soil moisture to rainfall quantities; (2) the LMAA methodology, which is based on the assimilation of soil moisture into two models of different complexity (see EGU2017-5324 in this same session); (3) the SMART method, based on the assimilation of soil moisture into a simple hydrological model with a different assimilation/modelling technique. The results are tested over 6 years at 10 sites around the world with different features (land surface, rainfall climatology, orographic complexity, etc.). These preliminary and promising results are shown here for the first time to the scientific community, as are the observed limitations of the different methodologies. Specific remarks on the technical configurations, filtering/smoothing of SMOS soil moisture and re-scaling techniques are also provided based on the results of different sensitivity experiments.
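    Of the three methodologies, SM2RAIN is the most compact to sketch: it inverts the soil-water balance so that rainfall is recovered from the soil moisture signal, roughly p(t) = Z*dS/dt + losses. The parameter names and values below are illustrative assumptions; the published algorithm calibrates them per site.

```python
import numpy as np

def sm2rain_like(sm, z=80.0, a=15.0, b=2.0, dt=1.0):
    """SM2RAIN-style inversion: rainfall ~ Z*dS/dt plus a drainage-type
    loss term a*S**b. Z, a, b are site-calibrated in the real algorithm;
    the values here are placeholders."""
    sm = np.asarray(sm, float)
    rain = z * np.gradient(sm, dt) + a * sm ** b
    return np.clip(rain, 0.0, None)  # rainfall cannot be negative

sm_series = [0.20, 0.22, 0.30, 0.28, 0.26, 0.25]  # relative saturation
print(sm2rain_like(sm_series))
```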

  20. Population biology of the Florida manatee: An overview

    USGS Publications Warehouse

    O'Shea, Thomas J.; Ackerman, B.B.; O'Shea, Thomas J.; Ackerman, B.B.; Percival, H. Franklin

    1995-01-01

    In the following overview we discuss progress toward meeting the three objectives of the 1992 workshop: to provide a synthesis of existing information about manatee population biology; to evaluate the strengths and weaknesses of current data sets and approaches to research on manatee population biology; and to provide recommendations for research. We discuss progress in six topics that were assigned to working groups at the workshop: aerial survey and estimation of population size, reproduction, age structure, mortality, photoidentification and estimation of survival, and integration and modeling of population data. The overview includes recommendations by working group participants (O'Shea et al. 1992). This workshop on manatee population biology was the most recent conference on the topic since 1978 (Brownell and Ralls 1981). Partly as a result of recommendations made at the 1978 workshop, several long-term population-related research projects were established. Therefore, we also measure progress in relation to knowledge available at the time of the earlier workshop. Finally, we provide a brief synopsis of pertinent new information on manatee population biology that became available between the 1992 workshop and publication of the proceedings and our conclusions about the status of the Florida manatee.

  1. An overview: modern techniques for railway vehicle on-board health monitoring systems

    NASA Astrophysics Data System (ADS)

    Li, Chunsheng; Luo, Shihui; Cole, Colin; Spiryagin, Maksym

    2017-07-01

    Health monitoring systems with low-cost sensor networks and smart algorithms are always needed in both passenger trains and heavy haul trains due to the increasing need for reliability and safety in the railway industry. This paper focuses on an overview of existing approaches applied for railway vehicle on-board health monitoring systems. The approaches applied in the data measurement systems and the data analysis systems in railway on-board health monitoring systems are presented in this paper, including methodologies, theories and applications. The pros and cons of the various approaches are analysed to determine appropriate benchmarks for an effective and efficient railway vehicle on-board health monitoring system. According to this review, inertial sensors are the most popular due to their advantages of low cost, robustness and low power consumption. Linearisation methods are required for the model-based methods which would inevitably introduce error to the estimation results, and it is time-consuming to include all possible conditions in the pre-built database required for signal-based methods. Based on this review, future development trends in the design of new low-cost health monitoring systems for railway vehicles are discussed.

  2. Child mortality estimation 2013: an overview of updates in estimation methods by the United Nations Inter-agency Group for Child Mortality Estimation.

    PubMed

    Alkema, Leontine; New, Jin Rou; Pedersen, Jon; You, Danzhen

    2014-01-01

    In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published an update of the estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared to the UN IGME estimates published in 2012, updated data inputs and a new method for estimating the U5MR were used. We summarize the new U5MR estimation method, which is a Bayesian B-spline Bias-reduction model, and highlight differences with the previously used method. Differences in UN IGME U5MR estimates as published in 2012 and those published in 2013 are presented and decomposed into differences due to the updated database and differences due to the new estimation method to explain and motivate changes in estimates. Compared to the previously used method, the new UN IGME estimation method is based on a different trend fitting method that can track (recent) changes in U5MR more closely. The new method provides U5MR estimates that account for data quality issues. Resulting differences in U5MR point estimates between the UN IGME 2012 and 2013 publications are small for the majority of countries but greater than 10 deaths per 1,000 live births for 33 countries in 2011 and 19 countries in 1990. These differences can be explained by the updated database used, the curve fitting method as well as accounting for data quality issues. Changes in the number of deaths were less than 10% on the global level and for the majority of MDG regions. The 2013 UN IGME estimates provide the most recent assessment of levels and trends in U5MR based on all available data and an improved estimation method that allows for closer-to-real-time monitoring of changes in the U5MR and takes account of data quality issues.

  3. In silico gene expression analysis – an overview

    PubMed Central

    Murray, David; Doran, Peter; MacMathuna, Padraic; Moss, Alan C

    2007-01-01

    Efforts aimed at deciphering the molecular basis of complex disease are underpinned by the availability of high throughput strategies for the identification of biomolecules that drive the disease process. The completion of the human genome-sequencing project, coupled to major technological developments, has afforded investigators myriad opportunities for multidimensional analysis of biological systems. Nowhere has this research explosion been more evident than in the field of transcriptomics. Affordable access and availability to the technology that supports such investigations has led to a significant increase in the amount of data generated. As most biological distinctions are now observed at a genomic level, a large amount of expression information is now openly available via public databases. Furthermore, numerous computational based methods have been developed to harness the power of these data. In this review we provide a brief overview of in silico methodologies for the analysis of differential gene expression such as Serial Analysis of Gene Expression and Digital Differential Display. The performance of these strategies, at both an operational and result/output level is assessed and compared. The key considerations that must be made when completing an in silico expression analysis are also presented as a roadmap to facilitate biologists. Furthermore, to highlight the importance of these in silico methodologies in contemporary biomedical research, examples of current studies using these approaches are discussed. The overriding goal of this review is to present the scientific community with a critical overview of these strategies, so that they can be effectively added to the tool box of biomedical researchers focused on identifying the molecular mechanisms of disease. PMID:17683638

  4. Ab initio quantum chemistry: methodology and applications.

    PubMed

    Friesner, Richard A

    2005-05-10

    This Perspective provides an overview of state-of-the-art ab initio quantum chemical methodology and applications. The methods that are discussed include coupled cluster theory, localized second-order Møller-Plesset perturbation theory, multireference perturbation approaches, and density functional theory. The accuracy of each approach for key chemical properties is summarized, and the computational performance is analyzed, emphasizing significant advances in algorithms and implementation over the past decade. Incorporation of a condensed-phase environment by means of mixed quantum mechanical/molecular mechanics or self-consistent reaction field techniques is presented. A wide range of illustrative applications, focusing on materials science and biology, is discussed briefly.

  5. Genome-wide scans of genetic variants for psychophysiological endophenotypes: a methodological overview.

    PubMed

    Iacono, William G; Malone, Stephen M; Vaidyanathan, Uma; Vrieze, Scott I

    2014-12-01

    This article provides an introductory overview of the investigative strategy employed to evaluate the genetic basis of 17 endophenotypes examined as part of a 20-year data collection effort from the Minnesota Center for Twin and Family Research. Included are characterization of the study samples, descriptive statistics for key properties of the psychophysiological measures, and rationale behind the steps taken in the molecular genetic study design. The statistical approach included (a) biometric analysis of twin and family data, (b) heritability analysis using 527,829 single nucleotide polymorphisms (SNPs), (c) genome-wide association analysis of these SNPs and 17,601 autosomal genes, (d) follow-up analyses of candidate SNPs and genes hypothesized to have an association with each endophenotype, (e) rare variant analysis of nonsynonymous SNPs in the exome, and (f) whole genome sequencing association analysis using 27 million genetic variants. These methods were used in the accompanying empirical articles comprising this special issue, Genome-Wide Scans of Genetic Variants for Psychophysiological Endophenotypes. Copyright © 2014 Society for Psychophysiological Research.

  6. Bilingualism and Cognitive Reserve: A Critical Overview and a Plea for Methodological Innovations

    PubMed Central

    Calvo, Noelia; García, Adolfo M.; Manoiloff, Laura; Ibáñez, Agustín

    2016-01-01

    The decline of cognitive skills throughout healthy or pathological aging can be slowed down by experiences which foster cognitive reserve (CR). Recently, some studies on Alzheimer's disease have suggested that CR may be enhanced by life-long bilingualism. However, the evidence is inconsistent and largely based on retrospective approaches featuring several methodological weaknesses. Some studies demonstrated at least 4 years of delay in dementia symptoms, while others did not find such an effect. Moreover, various methodological aspects vary from study to study. The present paper addresses contradictory findings, identifies possible lurking variables, and outlines methodological alternatives thereof. First, we characterize possible confounding factors that may have influenced extant results. Our focus is on the criteria to establish bilingualism, differences in sample design, the instruments used to examine cognitive skills, and the role of variables known to modulate life-long cognition. Second, we propose that these limitations could be largely circumvented through experimental approaches. Proficiency in the non-native language can be successfully assessed by combining subjective and objective measures; confounding variables which have been distinctively associated with certain bilingual groups (e.g., alcoholism, sleep disorders) can be targeted through relevant instruments; and cognitive status might be better tapped via robust cognitive screenings and executive batteries. Moreover, future research should incorporate tasks yielding predictable patterns of contrastive performance between bilinguals and monolinguals. Crucially, these include instruments which reveal bilingual disadvantages in vocabulary, null effects in working memory, and advantages in inhibitory control and other executive functions. Finally, paradigms tapping proactive interference (which assess the disruptive effect of long-term memory on newly learned information) could also offer useful data

  7. JEDI Methodology | Jobs and Economic Development Impact Models | NREL

    Science.gov Websites

    The intent of the Jobs and Economic Development Impact (JEDI) models is to demonstrate, from project cost data, the employment and economic impacts that will likely result from specific scenarios, providing an estimate of overall economic impacts. Please see Limitations of JEDI Models for further details.

  8. Methodologies on estimating the energy requirements for maintenance and determining the net energy contents of feed ingredients in swine: a review of recent work.

    PubMed

    Li, Zhongchao; Liu, Hu; Li, Yakui; Lv, Zhiqian; Liu, Ling; Lai, Changhua; Wang, Junjun; Wang, Fenglai; Li, Defa; Zhang, Shuai

    2018-01-01

    In the past two decades, a considerable amount of research has focused on determining the digestible energy (DE) and metabolizable energy (ME) contents of feed ingredients fed to swine. Compared with the DE and ME systems, the net energy (NE) system is assumed to provide the most accurate estimate of the energy actually available to the animal. However, published data on the measured NE content of ingredients fed to growing pigs are limited. Therefore, the Feed Data Group at the Ministry of Agriculture Feed Industry Centre (MAFIC) located at China Agricultural University has evaluated the NE content of many ingredients using indirect calorimetry. The present review summarizes the NE research conducted at MAFIC and compares these results with those from other research groups on methodological aspects. These research projects mainly focus on estimating the energy requirements for maintenance and their impact on the determination, prediction, and validation of the NE content of several ingredients fed to swine. The estimation of maintenance energy is affected by methodology, growth stage, and previous feeding level. The fasting heat production method and the curvilinear regression method were used at MAFIC to estimate the NE requirement for maintenance. The NE contents of different feedstuffs were determined using indirect calorimetry following a standard experimental procedure at MAFIC. Previously generated NE equations can also be used to predict NE in situations where calorimeters are not available. Although popular, caloric efficiency is not a generally accepted method for validating the energy content of individual feedstuffs. In the future, more accurate and dynamic NE prediction equations aimed at specific ingredients should be established, and more practical validation approaches need to be developed.
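    One common way to anchor the maintenance requirement mentioned above is to regress heat production on ME intake and extrapolate to zero intake, giving fasting heat production. The sketch below uses an exponential regression; the data and units are hypothetical, and MAFIC's actual protocols (fasting calorimetry, curvilinear regression) differ in detail.

```python
import numpy as np

def fasting_heat_production(me_intake, heat_production):
    """Estimate fasting heat production (FHP) by fitting
    ln(HP) = ln(FHP) + b * MEI and extrapolating to zero intake."""
    slope, ln_fhp = np.polyfit(np.asarray(me_intake, float),
                               np.log(np.asarray(heat_production, float)), 1)
    return np.exp(ln_fhp)

# Hypothetical heat production at graded feeding levels, kJ/(kg BW^0.6 * d)
mei = [800, 1200, 1600, 2000]
hp = [950, 1150, 1400, 1700]
print(round(fasting_heat_production(mei, hp), 1))
```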

  9. Experimental design methodologies in the optimization of chiral CE or CEC separations: an overview.

    PubMed

    Dejaegher, Bieke; Mangelings, Debby; Vander Heyden, Yvan

    2013-01-01

    In this chapter, an overview of experimental designs for developing chiral capillary electrophoresis (CE) and capillary electrochromatographic (CEC) methods is presented. Method development is generally divided into technique selection, method optimization, and method validation. Within method optimization, two phases can often be distinguished: a screening phase and an optimization phase. In method validation, the method is evaluated on its fitness for purpose. Robustness testing is a validation item that also applies experimental designs. In the screening phase and in robustness testing, screening designs are applied; during the optimization phase, response surface designs are used. The different design types and their application steps are discussed in this chapter and illustrated by examples of chiral CE and CEC methods.
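    The screening designs mentioned above are built from factor levels; the simplest starting point is the two-level full factorial, from which fractional designs are derived. A small sketch follows, with hypothetical chiral-CE factors.

```python
from itertools import product

def two_level_full_factorial(factors):
    """All combinations of the low (-1) and high (+1) level of each
    factor. Screening designs (fractional factorial, Plackett-Burman)
    keep a subset of these runs; response-surface designs add levels."""
    return [dict(zip(factors, levels))
            for levels in product((-1, +1), repeat=len(factors))]

# Hypothetical factors for a chiral CE method
for run in two_level_full_factorial(["buffer pH", "selector conc.", "voltage"]):
    print(run)
```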

  10. U.S. DOE NETL methodology for estimating the prospective CO2 storage resource of shales at the national and regional scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levine, Jonathan S.; Fukai, Isis; Soeder, Daniel J.

    While the majority of shale formations will serve as reservoir seals for stored anthropogenic carbon dioxide (CO2), hydrocarbon-bearing shale formations may be potential geologic sinks after depletion through primary production. In this paper we present the United States Department of Energy-National Energy Technology Laboratory (US-DOE-NETL) methodology for screening-level assessment of prospective CO2 storage resources in shale using a volumetric equation. Volumetric resource estimates are produced from the bulk volume, porosity, and sorptivity of the shale and storage efficiency factors based on formation-scale properties and petrophysical limitations on fluid transport. Prospective shale formations require: (1) prior hydrocarbon production using horizontal drilling and stimulation via staged, high-volume hydraulic fracturing, (2) depths sufficient to maintain CO2 in a supercritical state, generally >800 m, and (3) an overlying seal. The US-DOE-NETL methodology accounts for storage of CO2 in shale as a free fluid phase within fractures and matrix pores and as a sorbed phase on organic matter and clays. Uncertainties include but are not limited to poorly constrained geologic variability in formation thickness, porosity, existing fluid content, organic richness, and mineralogy. Knowledge of how these parameters may be linked to depositional environments, facies, and diagenetic history of the shale will improve the understanding of pore-to-reservoir scale behavior and provide improved estimates of prospective CO2 storage.

  11. U.S. DOE NETL methodology for estimating the prospective CO2 storage resource of shales at the national and regional scale

    DOE PAGES

    Levine, Jonathan S.; Fukai, Isis; Soeder, Daniel J.; ...

    2016-05-31

    While the majority of shale formations will serve as reservoir seals for stored anthropogenic carbon dioxide (CO2), hydrocarbon-bearing shale formations may be potential geologic sinks after depletion through primary production. In this paper we present the United States Department of Energy-National Energy Technology Laboratory (US-DOE-NETL) methodology for screening-level assessment of prospective CO2 storage resources in shale using a volumetric equation. Volumetric resource estimates are produced from the bulk volume, porosity, and sorptivity of the shale and storage efficiency factors based on formation-scale properties and petrophysical limitations on fluid transport. Prospective shale formations require: (1) prior hydrocarbon production using horizontal drilling and stimulation via staged, high-volume hydraulic fracturing, (2) depths sufficient to maintain CO2 in a supercritical state, generally >800 m, and (3) an overlying seal. The US-DOE-NETL methodology accounts for storage of CO2 in shale as a free fluid phase within fractures and matrix pores and as a sorbed phase on organic matter and clays. Uncertainties include but are not limited to poorly constrained geologic variability in formation thickness, porosity, existing fluid content, organic richness, and mineralogy. Knowledge of how these parameters may be linked to depositional environments, facies, and diagenetic history of the shale will improve the understanding of pore-to-reservoir scale behavior and provide improved estimates of prospective CO2 storage.
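    The volumetric equation at the core of the methodology multiplies a pore volume by a CO2 density and an efficiency factor. The sketch below shows only the free-phase term with invented formation values; the full US-DOE-NETL method adds the sorbed-phase term and formation-specific efficiency factors.

```python
def co2_storage_mass_kg(area_m2, thickness_m, porosity, rho_co2_kg_m3, efficiency):
    """Screening-level volumetric estimate, free phase only:
    G = A * h * phi * rho_CO2 * E."""
    return area_m2 * thickness_m * porosity * rho_co2_kg_m3 * efficiency

# Hypothetical shale play: 1,000 km^2, 30 m net thickness, 6% porosity,
# supercritical CO2 at ~600 kg/m^3, 5% storage efficiency
mass = co2_storage_mass_kg(1.0e9, 30.0, 0.06, 600.0, 0.05)
print(mass / 1.0e9, "Mt CO2")   # 1 Mt = 1e9 kg
```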

  12. Child Mortality Estimation 2013: An Overview of Updates in Estimation Methods by the United Nations Inter-Agency Group for Child Mortality Estimation

    PubMed Central

    Alkema, Leontine; New, Jin Rou; Pedersen, Jon; You, Danzhen

    2014-01-01

    Background In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published an update of the estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared to the UN IGME estimates published in 2012, updated data inputs and a new method for estimating the U5MR were used. Methods We summarize the new U5MR estimation method, which is a Bayesian B-spline Bias-reduction model, and highlight differences with the previously used method. Differences in UN IGME U5MR estimates as published in 2012 and those published in 2013 are presented and decomposed into differences due to the updated database and differences due to the new estimation method to explain and motivate changes in estimates. Findings Compared to the previously used method, the new UN IGME estimation method is based on a different trend fitting method that can track (recent) changes in U5MR more closely. The new method provides U5MR estimates that account for data quality issues. Resulting differences in U5MR point estimates between the UN IGME 2012 and 2013 publications are small for the majority of countries but greater than 10 deaths per 1,000 live births for 33 countries in 2011 and 19 countries in 1990. These differences can be explained by the updated database used, the curve fitting method as well as accounting for data quality issues. Changes in the number of deaths were less than 10% on the global level and for the majority of MDG regions. Conclusions The 2013 UN IGME estimates provide the most recent assessment of levels and trends in U5MR based on all available data and an improved estimation method that allows for closer-to-real-time monitoring of changes in the U5MR and takes account of data quality issues. PMID:25013954

  13. An overview of sensor calibration inter-comparison and applications

    USGS Publications Warehouse

    Xiong, Xiaoxiong; Cao, Changyong; Chander, Gyanesh

    2010-01-01

    Long-term climate data records (CDR) are often constructed using observations made by multiple Earth observing sensors over a broad range of spectra and a large scale in both time and space. These sensors can be of the same or different types operated on the same or different platforms. They can be developed and built with different technologies and are likely operated over different time spans. It has been known that the uncertainty of climate models and data records depends not only on the calibration quality (accuracy and stability) of individual sensors, but also on their calibration consistency across instruments and platforms. Therefore, sensor calibration inter-comparison and validation have become increasingly demanding and will continue to play an important role for a better understanding of the science product quality. This paper provides an overview of different methodologies, which have been successfully applied for sensor calibration inter-comparison. Specific examples using different sensors, including MODIS, AVHRR, and ETM+, are presented to illustrate the implementation of these methodologies.

  14. Dental Evidence in Forensic Identification - An Overview, Methodology and Present Status.

    PubMed

    Krishan, Kewal; Kanchan, Tanuj; Garg, Arun K

    2015-01-01

    Forensic odontology is primarily concerned with the use of teeth and oral structures for identification in a legal context. Various forensic odontology techniques help in the identification of human remains in incidents such as terrorist attacks; airplane, train and road accidents; fires; mass murders; and natural disasters such as tsunamis, earthquakes and floods (Disaster Victim Identification, DVI). Dental structures are the hardest and best protected structures in the body. They resist decomposition and high temperatures and are among the last to disintegrate after death. The principal basis of dental identification lies in the fact that no two oral cavities are alike and the teeth are unique to an individual. The dental evidence of the deceased recovered from the scene of crime/occurrence is compared with ante-mortem records for identification. Dental features such as tooth morphology, variations in shape and size, restorations, pathologies, missing teeth, wear patterns, crowding, colour and position of the teeth, rotations and other peculiar dental anomalies give every individual a unique identity. In the absence of ante-mortem dental records for comparison, the teeth can help in the determination of age, sex, race/ethnicity, habits, occupations, etc., which can give further clues regarding the identity of the individual. This article gives an overview of dental evidence, its use in forensic identification and its limitations.

  15. Optimized tuner selection for engine performance estimation

    NASA Technical Reports Server (NTRS)

    Simon, Donald L. (Inventor); Garg, Sanjay (Inventor)

    2013-01-01

    A methodology for minimizing the error in on-line Kalman filter-based aircraft engine performance estimation applications is presented. This technique specifically addresses the underdetermined estimation problem, where there are more unknown parameters than available sensor measurements. A systematic approach is applied to produce a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. Tuning parameter selection is performed using a multi-variable iterative search routine which seeks to minimize the theoretical mean-squared estimation error. Theoretical Kalman filter estimation error bias and variance values are derived at steady-state operating conditions, and the tuner selection routine is applied to minimize these values. The new methodology yields an improvement in on-line engine performance estimation accuracy.

  16. A normative price for a manufactured product: The SAMICS methodology. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.

    1979-01-01

    This summary of the Solar Array Manufacturing Industry Costing Standards (SAMICS) report contains a discussion of capabilities and limitations, a non-technical overview of the methodology, and a description of the input data that must be collected. It also describes the activities that have been and are being taken to ensure the validity of the results, and contains an up-to-date bibliography of related documents.

  17. The speed-accuracy tradeoff: history, physiology, methodology, and behavior

    PubMed Central

    Heitz, Richard P.

    2014-01-01

    There are few behavioral effects as ubiquitous as the speed-accuracy tradeoff (SAT). From insects to rodents to primates, the tendency for decision speed to covary with decision accuracy seems an inescapable property of choice behavior. Recently, the SAT has received renewed interest, as neuroscience approaches begin to uncover its neural underpinnings and computational models are compelled to incorporate it as a necessary benchmark. The present work provides a comprehensive overview of SAT. First, I trace its history as a tractable behavioral phenomenon and the role it has played in shaping mathematical descriptions of the decision process. Second, I present a “users guide” of SAT methodology, including a critical review of common experimental manipulations and analysis techniques and a treatment of the typical behavioral patterns that emerge when SAT is manipulated directly. Finally, I review applications of this methodology in several domains. PMID:24966810

  18. Current target acquisition methodology in force on force simulations

    NASA Astrophysics Data System (ADS)

    Hixson, Jonathan G.; Miller, Brian; Mazz, John P.

    2017-05-01

    The U.S. Army RDECOM CERDEC NVESD MSD's target acquisition models have been used for many years by the military community in force on force simulations for training, testing, and analysis. There have been significant improvements to these models over the past few years. The significant improvements are the transition of ACQUIRE TTP-TAS (ACQUIRE Targeting Task Performance Target Angular Size) methodology for all imaging sensors and the development of new discrimination criteria for urban environments and humans. This paper is intended to provide an overview of the current target acquisition modeling approach and provide data for the new discrimination tasks. This paper will discuss advances and changes to the models and methodologies used to: (1) design and compare sensors' performance, (2) predict expected target acquisition performance in the field, (3) predict target acquisition performance for combat simulations, and (4) how to conduct model data validation for combat simulations.

  19. Demonstration of line transect methodologies to estimate urban gray squirrel density

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hein, E.W.

    1997-11-01

    Because studies estimating density of gray squirrels (Sciurus carolinensis) have been labor intensive and costly, I demonstrate the use of line transect surveys to estimate gray squirrel density and determine the costs of conducting surveys to achieve precise estimates. Density estimates are based on four transects that were surveyed five times from 30 June to 9 July 1994. Using the program DISTANCE, I estimated there were 4.7 (95% CI = 1.86-11.92) gray squirrels/ha on the Clemson University campus. Eleven additional surveys would have decreased the percent coefficient of variation from 30% to 20% and would have cost approximately $114. Estimating urban gray squirrel density using line transect surveys is cost effective and can provide unbiased estimates of density, provided that none of the assumptions of distance sampling theory are violated.
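    Program DISTANCE fits a family of detection functions; the hand-rolled sketch below uses the simplest, a half-normal, for which the effective strip half-width has a closed form. Distances and effort are hypothetical, and the sketch is no substitute for DISTANCE's model selection and variance estimation.

```python
import numpy as np

def halfnormal_density_per_ha(perp_distances_m, line_length_km):
    """Line-transect density with a half-normal detection function
    g(x) = exp(-x^2 / (2 sigma^2)). The MLE of sigma^2 is mean(x^2);
    effective strip half-width mu = sigma * sqrt(pi/2);
    density D = n / (2 * L * mu)."""
    x = np.asarray(perp_distances_m, float)
    sigma = np.sqrt(np.mean(x ** 2))
    mu_m = sigma * np.sqrt(np.pi / 2.0)
    d_per_m2 = len(x) / (2.0 * line_length_km * 1000.0 * mu_m)
    return d_per_m2 * 10_000  # animals per hectare

# Hypothetical perpendicular detection distances (m) along 2 km of transect
print(halfnormal_density_per_ha([2, 5, 1, 8, 3, 6, 4, 2, 7, 3], 2.0))
```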

  20. Base heating methodology improvements, volume 1

    NASA Technical Reports Server (NTRS)

    Bender, Robert L.; Reardon, John E.; Somers, Richard E.; Fulton, Michael S.; Smith, Sheldon D.; Pergament, Harold

    1992-01-01

    This document is the final report for NASA MSFC Contract NAS8-38141. The contracted effort had the broad objective of improving launch vehicle ascent base heating methodology so as to simplify the determination of that environment for Advanced Launch System (ALS) concepts. It was pursued as an Advanced Development Plan (ADP) for the Joint DoD/NASA ALS program office, with project management assigned to NASA/MSFC. The original study was to be completed in 26 months beginning in Sep. 1989. Because of several program changes and emphasis on evolving launch vehicle concepts, the period of performance was extended to the current completion date of Nov. 1992. A computer code incorporating the methodology improvements into a quick prediction tool was developed and is operational for basic configuration and propulsion concepts. The code and its user's guide are also provided as part of the contract documentation. Background information describing the specific objectives, limitations, and goals of the contract is summarized. A brief chronology of the ALS/NLS program history is also presented to provide the reader with an overview of the many variables influencing the development of the code over the past three years.

  1. Detection, emission estimation and risk prediction of forest fires in China using satellite sensors and simulation models in the past three decades--an overview.

    PubMed

    Zhang, Jia-Hua; Yao, Feng-Mei; Liu, Cheng; Yang, Li-Min; Boken, Vijendra K

    2011-08-01

    Forest fires have major impact on ecosystems and greatly impact the amount of greenhouse gases and aerosols in the atmosphere. This paper presents an overview of forest fire detection, emission estimation, and fire risk prediction in China using satellite imagery, climate data, and various simulation models over the past three decades. Since the 1980s, remotely-sensed data acquired by many satellites, such as NOAA/AVHRR, FY-series, MODIS, CBERS, and ENVISAT, have been widely utilized for detecting forest fire hot spots and burned areas in China. Some developed algorithms have been utilized for detecting the forest fire hot spots at a sub-pixel level. With respect to modeling the forest burning emission, a remote sensing data-driven Net Primary productivity (NPP) estimation model was developed for estimating forest biomass and fuel. In order to improve the forest fire risk modeling in China, real-time meteorological data, such as surface temperature, relative humidity, wind speed and direction, have been used as the model input for improving prediction of forest fire occurrence and its behavior. Shortwave infrared (SWIR) and near infrared (NIR) channels of satellite sensors have been employed for detecting live fuel moisture content (FMC), and the Normalized Difference Water Index (NDWI) was used for evaluating the forest vegetation condition and its moisture status.

  2. Gestational age estimation on United States livebirth certificates: a historical overview.

    PubMed

    Wier, Megan L; Pearl, Michelle; Kharrazi, Martin

    2007-09-01

    Gestational age on the birth certificate is the most common source of population-based gestational age data that informs public health policy and practice in the US. Last menstrual period is one of the oldest methods of gestational age estimation and has been on the US Standard Certificate of Live Birth since 1968. The 'clinical estimate of gestation', added to the standard certificate in 1989 to address missing or erroneous last menstrual period data, was replaced by the 'obstetric estimate of gestation' on the 2003 revision, which specifically precludes neonatal assessments. We discuss the strengths and weaknesses of these measures, potential research implications and challenges accompanying the transition to the obstetric estimate.

  3. Manned Mars mission cost estimate

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph; Smith, Keith

    1986-01-01

    The potential costs of several options of a manned Mars mission are examined. A cost estimating methodology based primarily on existing Marshall Space Flight Center (MSFC) parametric cost models is summarized. These models include the MSFC Space Station Cost Model and the MSFC Launch Vehicle Cost Model, as well as other models and techniques. The ground rules and assumptions of the cost estimating methodology are discussed, and cost estimates are presented for six potential mission options which were studied. The estimated manned Mars mission costs are compared to that of the somewhat analogous Apollo Program after normalizing the Apollo cost to the environment and ground rules of the manned Mars missions. It is concluded that a manned Mars mission, as currently defined, could be accomplished for under $30 billion in 1985 dollars, excluding launch vehicle development and mission operations.
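
    The MSFC models referenced above are parametric, i.e. they map element characteristics to cost through fitted cost estimating relationships (CERs). A minimal sketch of that style of estimate is below; the power-law form is a common CER shape, but the coefficients, element masses, and wrap factor are invented for illustration, not taken from the MSFC models.

    ```python
    def cer_cost(dry_mass_kg, a=0.5, b=0.7):
        """Generic power-law CER: cost = a * mass^b, in millions of
        FY1985 dollars. Coefficients are illustrative placeholders."""
        return a * dry_mass_kg ** b

    # Roll up a hypothetical mission from element masses, then apply a
    # wrap factor for program-level costs (integration, management).
    elements = {"habitat": 25_000, "lander": 40_000, "propulsion": 60_000}
    subtotal = sum(cer_cost(m) for m in elements.values())
    print(f"~${subtotal * 1.3:,.0f}M (FY85)")  # 30% wraps -- assumed
    ```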

  4. Estimating abundance: Chapter 27

    USGS Publications Warehouse

    Royle, J. Andrew

    2016-01-01

    This chapter provides a non-technical overview of ‘closed population capture–recapture’ models, a class of well-established models that are widely applied in ecology, such as removal sampling, covariate models, and distance sampling. These methods are regularly adopted for studies of reptiles, in order to estimate abundance from counts of marked individuals while accounting for imperfect detection. Thus, the chapter describes some classic closed population models for estimating abundance, with considerations for some recent extensions that provide a spatial context for the estimation of abundance, and therefore density. Finally, the chapter suggests some software for use in data analysis, such as the Windows-based program MARK, and provides an example of estimating abundance and density of reptiles using an artificial cover object survey of Slow Worms (Anguis fragilis).
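
    For readers new to closed-population estimators, the simplest member of the family is the two-sample capture-recapture (Lincoln-Petersen) estimator; the sketch below uses Chapman's bias-corrected form with invented counts. The chapter's models generalize this idea to multiple occasions, covariates, and spatial designs.

    ```python
    def chapman_estimate(n1, n2, m2):
        """Chapman's bias-corrected Lincoln-Petersen estimator for a
        two-sample closed-population study: n1 marked initially, n2
        captured in the second sample, m2 of them recaptures."""
        n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
        var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)
               / ((m2 + 1) ** 2 * (m2 + 2)))     # Seber's approximation
        return n_hat, var ** 0.5

    # Hypothetical slow-worm survey: 40 marked, 35 captured later, 12 recaptured
    n_hat, se = chapman_estimate(40, 35, 12)
    print(f"N = {n_hat:.0f} +/- {1.96 * se:.0f}")
    ```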

  5. Toxicity Estimation Software Tool (TEST)

    EPA Science Inventory

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...

  6. Assuring Software Cost Estimates: Is it an Oxymoron?

    NASA Technical Reports Server (NTRS)

    Hihn, Jarius; Tregre, Grant

    2013-01-01

    The software industry repeatedly observes cost growth of well over 100% even after decades of cost estimation research and well-known best practices, so "What's the problem?" In this paper we will provide an overview of the current state of software cost estimation best practice. We then explore whether applying some of the methods used in software assurance might improve the quality of software cost estimates. This paper especially focuses on issues associated with model calibration, estimate review, and the development and documentation of estimates as part of an integrated plan.

  7. A methodology for using borehole temperature-depth profiles under ambient, single and cross-borehole pumping conditions to estimate fracture hydraulic properties

    NASA Astrophysics Data System (ADS)

    Klepikova, Maria V.; Le Borgne, Tanguy; Bour, Olivier; Davy, Philippe

    2011-09-01

    Temperature profiles in the subsurface are known to be sensitive to groundwater flow. Here we show that they are also strongly related to vertical flow in the boreholes themselves. Based on a numerical model of flow and heat transfer at the borehole scale, we propose a method to invert temperature measurements to derive borehole flow velocities. This method is applied to an experimental site in fractured crystalline rocks. Vertical flow velocities deduced from the inversion of temperature measurements are compared with direct heat-pulse flowmeter measurements, showing good agreement over two orders of magnitude. Applying this methodology under ambient, single and cross-borehole pumping conditions allows us to estimate fracture hydraulic head and local transmissivity, as well as inter-borehole fracture connectivity. Thus, these results provide new insights on how to include temperature profiles in inverse problems for estimating hydraulic fracture properties.

  8. High School Students' Accuracy in Estimating the Cost of College: A Proposed Methodological Approach and Differences among Racial/Ethnic Groups and College Financial-Related Factors

    ERIC Educational Resources Information Center

    Nienhusser, H. Kenny; Oshio, Toko

    2017-01-01

    High school students' accuracy in estimating the cost of college (AECC) was examined by utilizing a new methodological approach, the absolute-deviation-continuous construct. This study used the High School Longitudinal Study of 2009 (HSLS:09) data and examined 10,530 11th grade students in order to measure their AECC for 4-year public and private…

  9. Recent advances in sortase-catalyzed ligation methodology.

    PubMed

    Antos, John M; Truttmann, Matthias C; Ploegh, Hidde L

    2016-06-01

    The transpeptidation reaction catalyzed by bacterial sortases continues to see increasing use in the construction of novel protein derivatives. In addition to growth in the number of applications that rely on sortase, this field has also seen methodology improvements that enhance reaction performance and scope. In this opinion, we present an overview of key developments in the practice and implementation of sortase-based strategies, including applications relevant to structural biology. Topics include the use of engineered sortases to increase reaction rates, the use of redesigned acyl donors and acceptors to mitigate reaction reversibility, and strategies for expanding the range of substrates that are compatible with a sortase-based approach. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Simple methodologies to estimate the energy amount stored in a tree due to an explosive seed dispersal mechanism

    NASA Astrophysics Data System (ADS)

    do Carmo, Eduardo; Goncalves Hönnicke, Marcelo

    2018-05-01

    There are different forms to introduce/illustrate the energy concepts for the basic physics students. The explosive seed dispersal mechanism found in a variety of trees could be one of them. Sibipiruna trees carry out fruits (pods) who show such an explosive mechanism. During the explosion, the pods throw out seeds several meters away. In this manuscript we show simple methodologies to estimate the energy amount stored in the Sibipiruna tree due to such a process. Two different physics approaches were used to carry out this study: by monitoring indoor and in situ the explosive seed dispersal mechanism and by measuring the elastic constant of the pod shell. An energy of the order of kJ was found to be stored in a single tree due to such an explosive mechanism.
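
    A back-of-the-envelope version of the elastic-constant approach treats each pod shell as a linear spring, so the stored energy is E = kx^2/2, summed over the pods on a tree. The elastic constant, deflection, and pod count below are assumptions chosen only to show how a per-pod energy scales to the kJ order reported for a whole tree; they are not the paper's measured values.

    ```python
    # Elastic energy of one pod shell modeled as a linear spring: E = k x^2 / 2
    k = 500.0      # N/m, assumed elastic constant of the pod shell
    x = 0.05       # m, assumed deflection at the moment the pod snaps open
    e_pod = 0.5 * k * x ** 2
    n_pods = 2000  # assumed number of pods on a single tree
    print(f"per pod: {e_pod:.2f} J; whole tree: {e_pod * n_pods / 1e3:.1f} kJ")
    ```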

  11. 2016 Workplace and Gender Relations Survey of Active Duty Members: Overview Report

    DTIC Science & Technology

    2017-05-01

    Excerpts indexed from the report: … Grifka … Chapter 2: Survey Methodology — Ms. Lisa Davis, Mr. Eric Falk, and Mr. Jeff Schneider … Chapter 3: Estimated Sexual Assault … assessing the gender relations environment across the Services. Study Background and Methodology: The Defense Research, Surveys, and … gender discrimination. Chapter 1 provides additional information on the construction of these metrics. Survey Methodology: OPA conducts DoD cross …

  12. The Pathogen- and Incidence-Based DALY Approach: An Appropriate Methodology for Estimating the Burden of Infectious Diseases

    PubMed Central

    Mangen, Marie-Josée J.; Plass, Dietrich; Havelaar, Arie H.; Gibbons, Cheryl L.; Cassini, Alessandro; Mühlberger, Nikolai; van Lier, Alies; Haagsma, Juanita A.; Brooke, R. John; Lai, Taavi; de Waure, Chiara; Kramarz, Piotr; Kretzschmar, Mirjam E. E.

    2013-01-01

    In 2009, the European Centre for Disease Prevention and Control initiated the ‘Burden of Communicable Diseases in Europe (BCoDE)’ project to generate evidence-based and comparable burden-of-disease estimates of infectious diseases in Europe. The burden-of-disease metric used was the Disability-Adjusted Life Year (DALY), composed of years of life lost due to premature death (YLL) and due to disability (YLD). To better represent infectious diseases, a pathogen-based approach was used, linking incident cases to sequelae through outcome trees. Health outcomes were included if an evidence-based causal relationship between infection and outcome was established. Life expectancy and disability weights were taken from the Global Burden of Disease Study and alternative studies. Disease progression parameters were based on literature. Country-specific incidence was based on surveillance data corrected for underestimation. Non-typhoidal Salmonella spp. and Campylobacter spp. were used for illustration. Using the incidence- and pathogen-based DALY approach, the total burden for Salmonella spp. and Campylobacter spp. was estimated at 730 DALYs and at 1,780 DALYs per year in the Netherlands (average of 2005-2007). Sequelae accounted for 56% and 82% of the total burden of Salmonella spp. and Campylobacter spp., respectively. The incidence- and pathogen-based DALY methodology allows, in the case of infectious diseases, a more comprehensive calculation of the disease burden, as subsequent sequelae are fully taken into account. Not considering subsequent sequelae would strongly underestimate the burden of infectious diseases. Estimates can be used to support prioritisation and comparison of infectious diseases and other health conditions, both within a country and between countries. PMID:24278167
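
    The arithmetic of the approach is compact: each outcome node of the tree contributes YLD = cases x probability x duration x disability weight, and fatal cases contribute YLL = deaths x residual life expectancy. A minimal sketch with an invented two-outcome tree (all numbers illustrative, not the BCoDE inputs):

    ```python
    incident_cases = 50_000  # assumed annual incidence after underestimation correction

    # Outcome tree: (name, probability given infection, duration in years,
    # disability weight). The second entry is a sequela.
    outcomes = [
        ("acute gastroenteritis", 1.00, 0.02, 0.10),
        ("reactive arthritis",    0.02, 0.60, 0.15),
    ]
    case_fatality = 0.0005
    residual_life_expectancy = 35.0  # years, from a standard life table

    yld = sum(incident_cases * p * dur * dw for _, p, dur, dw in outcomes)
    yll = incident_cases * case_fatality * residual_life_expectancy
    print(f"YLD={yld:.0f}  YLL={yll:.0f}  DALY={yld + yll:.0f}")
    ```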

  13. Cardiac rehabilitation for people with heart disease: an overview of Cochrane systematic reviews.

    PubMed

    Anderson, L J; Taylor, R S

    2014-12-15

    Overviews are a new approach to summarising evidence and synthesising results from related systematic reviews. To conduct an overview of Cochrane systematic reviews to provide a contemporary review of the evidence for cardiac rehabilitation (CR), identify opportunities for merging or splitting existing Cochrane reviews, and identify current evidence gaps to inform new review titles. The Cochrane Database of Systematic Reviews was searched to identify reviews that address the objectives of this overview. Data presentation is descriptive with tabular presentations of review- and trial-level characteristics and results. The six included Cochrane systematic reviews were of high methodological quality and included 148 randomised controlled trials in 97,486 participants. Compared to usual care alone, exercise-based CR reduces hospital admissions and improves patient health related quality of life (HRQL) in low to moderate risk heart failure and coronary heart disease (CHD) patients. At 12 months or more follow-up, there was evidence of some reduction in mortality in patients with CHD. Psychological- and education-based interventions appear to have little impact on mortality or morbidity but may improve HRQL. Home- and centre-based programmes are equally effective in improving HRQL at similar costs. Selected interventions can increase the uptake of CR programmes but evidence to support interventions that improve adherence is weak. This overview confirms that exercise-based CR is effective and safe in the management of clinically stable heart failure and post-MI and PCI patients. We discuss the implications of this overview on the future direction of the Cochrane CR reviews portfolio. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. Technology advancements for the U.S. manned Space Station - An overview

    NASA Technical Reports Server (NTRS)

    Simon, William E.

    1987-01-01

    The structure and methodology of the Johnson Space Center (JSC) advanced development program is described. An overview of the program is given, and the technology transfer process to other disciplines is described. The test bed and flight experiment programs are described, as is the technology assessment which was performed at the end of the Phase B program. The technology program within each discipline is summarized, and the coordination and integration of the JSC program with the activities of other NASA centers and with work package contractors are discussed.

  15. ADVISORY ON UPDATED METHODOLOGY FOR ...

    EPA Pesticide Factsheets

    The National Academy of Sciences (NAS) published the Biological Effects of Ionizing Radiation (BEIR) committee's report (BEIR VII) on risks from ionizing radiation exposures in 2006. The Committee analyzed the most recent epidemiology from the important exposed cohorts and factored in changes resulting from the updated analysis of dosimetry for the Japanese atomic bomb survivors. To the extent practical, the Committee also considered relevant radiobiological data, including that from the Department of Energy's low dose effects research program. Based on the review of this information, the Committee proposed a set of models for estimating risks from low-dose ionizing radiation. ORIA then prepared a white paper revising the Agency's methodology for estimating cancer risks from exposure to ionizing radiation in light of this report and other relevant information. This is the first product to be developed as a result of the BEIR VII report. We requested that the SAB conduct an advisory during the development of this methodology. The second product to be prepared will be a revised version of the document,

  16. Developing a methodology for the inverse estimation of root architectural parameters from field based sampling schemes

    NASA Astrophysics Data System (ADS)

    Morandage, Shehan; Schnepf, Andrea; Vanderborght, Jan; Javaux, Mathieu; Leitner, Daniel; Laloy, Eric; Vereecken, Harry

    2017-04-01

    Root traits are increasingly important in breeding of new crop varieties. For example, longer and fewer lateral roots are suggested to improve drought resistance of wheat. Thus, detailed root architectural parameters are important. However, classical field sampling of roots only provides more aggregated information such as root length density (coring), root counts per area (trenches), or root arrival curves at certain depths (rhizotubes). We investigate the possibility of obtaining information about the root system architecture of plants from classical field-based root sampling schemes, based on sensitivity analysis and inverse parameter estimation. This methodology was developed in a virtual experiment where a root architectural model, parameterized for winter wheat, was used to simulate root system development in a field. This information provided the ground truth, which is normally unknown in a real field experiment. The three sampling schemes (coring, trenching, and rhizotubes) were virtually applied and the aggregated information computed. The Morris OAT global sensitivity analysis method was then performed to determine the most sensitive parameters of the root architecture model for the three different sampling methods; a sketch of the screening idea follows this record. The estimated means and standard deviations of the elementary effects of a total of 37 parameters were evaluated. Upper and lower bounds of the parameters were obtained from literature and published data on winter wheat root architectural parameters. Root length density profiles from coring, arrival curve characteristics observed in rhizotubes, and root counts in grids of the trench profile method were evaluated statistically to investigate the influence of each parameter using five different error functions. Number of branches, insertion angle, inter-nodal distance, and elongation rates are the most sensitive parameters, and the parameter sensitivity varies slightly with depth. Most parameters and their interaction with the other parameters show
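
    To make the screening step concrete, the sketch below implements a bare-bones one-at-a-time elementary-effects scheme in the spirit of Morris, using radial perturbations from random base points rather than the full trajectory design of the method proper; the toy "root model" and parameter ranges are invented stand-ins for the 37 architectural parameters.

    ```python
    import numpy as np

    def elementary_effects(model, lower, upper, r=20, delta=0.1, seed=0):
        """Morris-style OAT screening: perturb each parameter by delta (in
        normalized units) from r random base points; return the mean of
        |EE| and the std of EE per parameter."""
        rng = np.random.default_rng(seed)
        lower, upper = np.asarray(lower, float), np.asarray(upper, float)
        ee = np.empty((r, lower.size))
        for i in range(r):
            z = rng.uniform(0.0, 1.0 - delta, size=lower.size)
            y0 = model(lower + z * (upper - lower))
            for j in range(lower.size):
                z2 = z.copy()
                z2[j] += delta
                ee[i, j] = (model(lower + z2 * (upper - lower)) - y0) / delta
        return np.abs(ee).mean(axis=0), ee.std(axis=0)

    # Toy model: total root length from elongation rate, branch count,
    # and insertion angle (stand-ins for the real parameters).
    f = lambda p: p[0] * p[1] * np.cos(np.radians(p[2]))
    mu_star, sigma = elementary_effects(f, [0.5, 2.0, 10.0], [3.0, 8.0, 60.0])
    print(mu_star, sigma)
    ```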

  17. Distant Healing Intention Therapies: An Overview of the Scientific Evidence

    PubMed Central

    Schlitz, Marilyn; Baur, Christopher

    2015-01-01

    This article provides a broad overview of “distant healing intention” (DHI) therapies, ie, intentional healing modalities claimed to transcend the usual constraints of distance through space or time. We provide a summary of previous reviews and meta-analyses that have explored a diverse array of DHI modalities, outcome measures, and experimental protocols. While some significant experimental effects have been observed, the evidence to date does not yet provide confidence in its clinical efficacy. The purported “nonlocal” nature of DHI raises significant methodological and theoretical challenges. We recommend several avenues for improving future research. PMID:26665044

  18. Spacecraft alignment estimation. [for onboard sensors

    NASA Technical Reports Server (NTRS)

    Shuster, Malcolm D.; Bierman, Gerald J.

    1988-01-01

    A numerically well-behaved factorized methodology is developed for estimating spacecraft sensor alignments from prelaunch and inflight data without the need to compute the spacecraft attitude or angular velocity. Such a methodology permits the estimation of sensor alignments (or other biases) in a framework free of unknown dynamical variables. In actual mission implementation such an algorithm is usually better behaved than one that must compute sensor alignments simultaneously with the spacecraft attitude, for example by means of a Kalman filter. In particular, such a methodology is less sensitive to data dropouts of long duration, and the derived measurement used in the attitude-independent algorithm usually makes data checking and editing of outliers much simpler than would be the case in the filter.

  19. Interventions for postoperative pain in children: An overview of systematic reviews.

    PubMed

    Boric, Krste; Dosenovic, Svjetlana; Jelicic Kadic, Antonia; Batinic, Marijan; Cavar, Marija; Urlic, Marjan; Markovina, Nikolina; Puljak, Livia

    2017-09-01

    The aim of this study was to conduct an overview of systematic reviews that summarizes the results about efficacy and safety from randomized controlled trials involving the various strategies used for postoperative pain management in children. We searched the Cochrane Database of Systematic Reviews, CINAHL, Database of Reviews of Effect, Embase, MEDLINE, and PsycINFO from the earliest date to January 24, 2016. This overview included 45 systematic reviews that evaluated interventions for postoperative pain in children. Out of 45 systematic reviews that investigated various interventions for postoperative pain in children, 19 systematic reviews (42%) presented conclusive evidence of efficacy. Positive conclusive evidence was reported in 18 systematic reviews (40%) for the efficacy of diclofenac, ketamine, caudal analgesia, dexmedetomidine, music therapy, corticosteroid, epidural analgesia, paracetamol, and/or nonsteroidal anti-inflammatory drugs and transversus abdominis plane block. Only one systematic review reported conclusive evidence of equal efficacy that involved a comparison of dexmedetomidine vs morphine and fentanyl. Safety of interventions was reported as conclusive in 14 systematic reviews (31%), with positive conclusive evidence for dexmedetomidine, corticosteroid, epidural analgesia, transversus abdominis plane block, and clonidine. Seven systematic reviews reported equal conclusive safety for epidural infusion, diclofenac intravenous vs ketamine added to opioid analgesia, bupivacaine, ketamine, paracetamol, and dexmedetomidine vs intravenous infusions of various opioid analgesics, oral suspension and suppository of diclofenac, only opioid, normal saline, no treatment, placebo, and midazolam. Negative conclusive statement for safety was reported in one systematic review for caudal analgesia vs noncaudal regional analgesia. More than half of systematic reviews included in this overview were rated as having medium methodological quality. Of 45 included

  20. Estimating Agricultural Nitrous Oxide Emissions

    USDA-ARS?s Scientific Manuscript database

    Nitrous oxide emissions are highly variable in space and time and different methodologies have not agreed closely, especially at small scales. However, as scale increases, so does the agreement between estimates based on soil surface measurements (bottom up approach) and estimates derived from chang...

  1. Overview of the ACT program

    NASA Technical Reports Server (NTRS)

    Davis, John G., Jr.

    1992-01-01

    NASA's Advanced Composites Technology (ACT) program was initiated in 1988. A National Research Announcement was issued to solicit innovative ideas that could significantly contribute to the development and demonstration of an integrated technology data base and a confidence level that permits cost-effective use of composite primary structures in transport aircraft. Fifteen contracts were awarded by the spring of 1989, and the participants include commercial and military airframe manufacturers, materials developers and suppliers, universities, and government laboratories. The program approach is to develop materials, structural mechanics methodology, design concepts, and fabrication procedures that offer the potential to make composite structures cost-effective compared to aluminum structure. Goals for the ACT program included 30-50 percent weight reduction, 20-25 percent acquisition cost reduction, and providing the scientific basis for predicting materials and structures performance. This paper provides an overview of the ACT program status, plans, and selected technical accomplishments. Sixteen additional papers, which provide more detailed information on the research and development accomplishments, are contained in this publication.

  2. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer-assisted software engineering (CASE) tools, drawing on actual installation of and experimentation with some specific tools.

  3. Developing a methodological framework for estimating water productivity indicators in water scarce regions

    NASA Astrophysics Data System (ADS)

    Mubako, S. T.; Fullerton, T. M.; Walke, A.; Collins, T.; Mubako, G.; Walker, W. S.

    2014-12-01

    Water productivity is an area of growing interest in assessing the impact of human economic activities on water resources, especially in arid regions. Indicators of water productivity can assist water users in evaluating sectoral water use efficiency, identifying sources of pressure on water resources, and supporting water allocation rationale under scarcity conditions. This case study for the water-scarce Middle Rio Grande River Basin aims to develop an environmental-economic accounting approach for water use in arid river basins through a methodological framework that relates water use to human economic activities impacting regional water resources. Water uses are coupled to economic transactions, and the complex but mutual relations between various water-using sectors are estimated. A comparison is made between the calculated water productivity indicators and the representative cost/price per unit volume of water for the main water use sectors. Preliminary results confirm that irrigation, although it contributes very little to regional economic output, is among the sectors with the largest direct water use intensities. High economic value and low water use intensity sectors in the study region include Manufacturing, Mining, and Steam Electric Power. Water accounting challenges revealed by the study include differences in water management regimes between jurisdictions, and limited understanding of the impact of major economic activities on the interaction between surface and groundwater systems in this region. A more comprehensive assessment would require incorporating environmental and social sustainability indicators into the calculated water productivity indicators.

  4. 1. OVERVIEW LOOKING EAST: overview taken from south end BLDG. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. OVERVIEW LOOKING EAST: overview taken from south end BLDG. 1. - Fafnir Bearing Plant, Bounded on North side by Myrtle Street, on South side by Orange Street, on East side by Booth Street & on West side by Grove Street, New Britain, Hartford County, CT

  5. Methodological challenges for the evaluation of clinical effectiveness in the context of accelerated regulatory approval: an overview.

    PubMed

    Woolacott, Nerys; Corbett, Mark; Jones-Diette, Julie; Hodgson, Robert

    2017-10-01

    Regulatory authorities are approving innovative therapies with limited evidence. Although this level of data is sufficient for the regulator to establish an acceptable risk-benefit balance, it is problematic for downstream health technology assessment, where assessment of cost-effectiveness requires reliable estimates of effectiveness relative to existing clinical practice. Key issues associated with a limited evidence base include using data from nonrandomized studies, from small single-arm trials, or from single-center trials, and using surrogate end points. We examined these methodological challenges through a pragmatic review of the available literature. Methods to adjust nonrandomized studies for confounding are imperfect. The relative treatment effect generated from single-arm trials is uncertain and may be optimistic. Single-center trial results may not be generalizable. Surrogate end points, on average, overestimate treatment effects. Current methods for analyzing such data are limited, and effectiveness claims based on these suboptimal forms of evidence are likely to be subject to significant uncertainty. Assessments of cost-effectiveness, based on the modeling of such data, are likely to be subject to considerable uncertainty. This uncertainty must not be underestimated by decision makers: methods for its quantification are required and schemes to protect payers from the cost of uncertainty should be implemented. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  6. Detection, Emission Estimation and Risk Prediction of Forest Fires in China Using Satellite Sensors and Simulation Models in the Past Three Decades—An Overview

    PubMed Central

    Zhang, Jia-Hua; Yao, Feng-Mei; Liu, Cheng; Yang, Li-Min; Boken, Vijendra K.

    2011-01-01

    Forest fires have major impact on ecosystems and greatly impact the amount of greenhouse gases and aerosols in the atmosphere. This paper presents an overview of forest fire detection, emission estimation, and fire risk prediction in China using satellite imagery, climate data, and various simulation models over the past three decades. Since the 1980s, remotely-sensed data acquired by many satellites, such as NOAA/AVHRR, FY-series, MODIS, CBERS, and ENVISAT, have been widely utilized for detecting forest fire hot spots and burned areas in China. Some developed algorithms have been utilized for detecting the forest fire hot spots at a sub-pixel level. With respect to modeling the forest burning emission, a remote sensing data-driven Net Primary productivity (NPP) estimation model was developed for estimating forest biomass and fuel. In order to improve the forest fire risk modeling in China, real-time meteorological data, such as surface temperature, relative humidity, wind speed and direction, have been used as the model input for improving prediction of forest fire occurrence and its behavior. Shortwave infrared (SWIR) and near infrared (NIR) channels of satellite sensors have been employed for detecting live fuel moisture content (FMC), and the Normalized Difference Water Index (NDWI) was used for evaluating the forest vegetation condition and its moisture status. PMID:21909297

  7. Development of regional stump-to-mill logging cost estimators

    Treesearch

    Chris B. LeDoux; John E. Baumgras

    1989-01-01

    Planning logging operations requires estimating the logging costs for the sale or tract being harvested. Decisions need to be made on equipment selection and its application to terrain. In this paper a methodology is described that has been developed and implemented to solve the problem of accurately estimating logging costs by region. The methodology blends field time...

  8. Eigenspace perturbations for structural uncertainty estimation of turbulence closure models

    NASA Astrophysics Data System (ADS)

    Jofre, Lluis; Mishra, Aashwin; Iaccarino, Gianluca

    2017-11-01

    With the present state of computational resources, a purely numerical resolution of turbulent flows encountered in engineering applications is not viable. Consequently, investigations into turbulence rely on various degrees of modeling. Archetypal amongst these variable resolution approaches would be RANS models in two-equation closures, and subgrid-scale models in LES. However, owing to the simplifications introduced during model formulation, the fidelity of all such models is limited, and therefore the explicit quantification of the predictive uncertainty is essential. In such a scenario, the ideal uncertainty estimation procedure must be agnostic to modeling resolution, methodology, and the nature or level of the model filter. The procedure should be able to give reliable prediction intervals for different Quantities of Interest, over varied flows and flow conditions, and at diametric levels of modeling resolution. In this talk, we present and substantiate the Eigenspace perturbation framework as an uncertainty estimation paradigm that meets these criteria. Commencing from a broad overview, we outline the details of this framework at different modeling resolutions. Then, using benchmark flows, along with engineering problems, the efficacy of this procedure is established. This research was partially supported by NNSA under the Predictive Science Academic Alliance Program (PSAAP) II, and by DARPA under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo).

  9. Novel methodology for pharmaceutical expenditure forecast.

    PubMed

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    The value appreciation of new drugs across countries today features a disruption that is making the historical data that are used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings of generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time

  10. Novel methodology for pharmaceutical expenditure forecast

    PubMed Central

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    Background and objective The value appreciation of new drugs across countries today features a disruption that is making the historical data that are used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the ‘EU Pharmaceutical expenditure forecast’; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). Methods 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings of generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. Results This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and

  11. On the methodology of Engineering Geodesy

    NASA Astrophysics Data System (ADS)

    Brunner, Fritz K.

    2007-09-01

    Textbooks on geodetic surveying usually describe a very small number of principles which should provide the foundation of geodetic surveying. Here, the author argues that an applied field, such as engineering geodesy, has a methodology as foundation rather than a few principles. Ten methodological elements (ME) are identified: (1) Point discretisation of natural surfaces and objects, (2) distinction between coordinate and observation domain, (3) definition of reference systems, (4) specification of unknown parameters and desired precisions, (5) geodetic network and observation design, (6) quality control of equipment, (7) quality control of measurements, (8) establishment of measurement models, (9) establishment of parameter estimation models, (10) quality control of results. Each ME consists of a suite of theoretical developments, geodetic techniques and calculation procedures, which will be discussed. This paper is to be considered a first attempt at identifying the specific elements of the methodology of engineering geodesy. A better understanding of this methodology could lead to an increased objectivity, to a transformation of subjective practical experiences into objective working methods, and consequently to a new structure for teaching this rather diverse subject.

  12. Megastudies, crowdsourcing, and large datasets in psycholinguistics: An overview of recent developments.

    PubMed

    Keuleers, Emmanuel; Balota, David A

    2015-01-01

    This paper introduces and summarizes the special issue on megastudies, crowdsourcing, and large datasets in psycholinguistics. We provide a brief historical overview and show how the papers in this issue have extended the field by compiling new databases and making important theoretical contributions. In addition, we discuss several studies that use text corpora to build distributional semantic models to tackle various interesting problems in psycholinguistics. Finally, as is the case across the papers, we highlight some methodological issues that are brought forth via the analyses of such datasets.

  13. Select Methodology for Validating Advanced Satellite Measurement Systems

    NASA Technical Reports Server (NTRS)

    Larar, Allen M.; Zhou, Daniel K.; Liu, Xi; Smith, William L.

    2008-01-01

    Advanced satellite sensors are tasked with improving global measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring capability, and environmental change detection. Measurement system validation is crucial to achieving this goal and maximizing research and operational utility of resultant data. Field campaigns including satellite under-flights with well calibrated FTS sensors aboard high-altitude aircraft are an essential part of the validation task. This presentation focuses on an overview of validation methodology developed for assessment of high spectral resolution infrared systems, and includes results of preliminary studies performed to investigate the performance of the Infrared Atmospheric Sounding Interferometer (IASI) instrument aboard the MetOp-A satellite.

  14. Overview of the carbon fiber problem

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Carbon fiber (CF) composite structures are increasingly being utilized as alternatives to metals in both civilian and military applications. They are valued for their light weight and high strength, as well as for the ease of designing structures with specific shapes and sizes. However, a problem may exist due to the high conductivity of CF. CF are manufactured from a precursor material which is subjected to great stress and heat treatment, causing a change in the physical and electrical properties. The fibers are bound together by a matrix of epoxy. In the event of fire (e.g., an aircraft accident), the epoxy would burn away, releasing these fibers into the atmosphere. When these fibers come in contact with electronic equipment, they might cause damage by settling on electrical junctions. An overview is given of the objectives for a study, and the approach and methodology developed for determination of risk profiles.

  15. Youth Attitude Tracking Study II Wave 17 -- Fall 1986.

    DTIC Science & Technology

    1987-06-01

    Table-of-contents excerpts: Preface … Segmentation Analyses … 3. Methodology of YATS II — A. Sampling Design Overview … Appendix A: Sampling Design, Estimation Procedures and Estimated Sampling Errors … Appendix B: Data Collection Procedures

  16. Dental Evidence in Forensic Identification – An Overview, Methodology and Present Status

    PubMed Central

    Krishan, Kewal; Kanchan, Tanuj; Garg, Arun K

    2015-01-01

    Forensic odontology is primarily concerned with the use of teeth and oral structures for identification in a legal context. Various forensic odontology techniques help in the identification of human remains in incidents such as terrorist attacks; airplane, train and road accidents; fires; mass murders; and natural disasters such as tsunamis, earthquakes and floods (Disaster Victim Identification, DVI). Dental structures are the hardest and best protected structures in the body. They resist decomposition and high temperatures and are among the last to disintegrate after death. The principal basis of dental identification lies in the fact that no two oral cavities are alike and the teeth are unique to an individual. The dental evidence of the deceased recovered from the scene of crime/occurrence is compared with ante-mortem records for identification. Dental features such as tooth morphology, variations in shape and size, restorations, pathologies, missing teeth, wear patterns, crowding of the teeth, colour and position of the teeth, rotations, and other peculiar dental anomalies give every individual a unique identity. In the absence of ante-mortem dental records for comparison, the teeth can help in the determination of age, sex, race/ethnicity, habits, occupations, etc., which can give further clues regarding the identity of the individual. This piece of writing gives an overview of dental evidence, its use in forensic identification, and its limitations. PMID:26312096

  17. Reference interval estimation: Methodological comparison using extensive simulations and empirical data.

    PubMed

    Daly, Caitlin H; Higgins, Victoria; Adeli, Khosrow; Grey, Vijay L; Hamid, Jemila S

    2017-12-01

    To statistically compare and evaluate commonly used methods of estimating reference intervals and to determine which method is best based on characteristics of the distribution of various data sets. Three approaches for estimating reference intervals, i.e. parametric, non-parametric, and robust, were compared with simulated Gaussian and non-Gaussian data. The hierarchy of the performances of each method was examined based on bias and measures of precision. The findings of the simulation study were illustrated through real data sets. In all Gaussian scenarios, the parametric approach provided the least biased and most precise estimates. In non-Gaussian scenarios, no single method provided the least biased and most precise estimates for both limits of a reference interval across all sample sizes, although the non-parametric approach performed the best for most scenarios. The hierarchy of the performances of the three methods was only impacted by sample size and skewness. Differences between reference interval estimates established by the three methods were inflated by variability. Whenever possible, laboratories should attempt to transform data to a Gaussian distribution and use the parametric approach to obtain the most optimal reference intervals. When this is not possible, laboratories should consider sample size and skewness as factors in their choice of reference interval estimation method. The consequences of false positives or false negatives may also serve as factors in this decision. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
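
    The parametric/non-parametric contrast is easy to reproduce. A sketch with simulated skewed data, including the transform-then-parametric route the authors recommend (the robust method is omitted for brevity):

    ```python
    import numpy as np

    def reference_interval(values, parametric=True):
        """Central 95% reference interval by the two main approaches."""
        x = np.asarray(values, dtype=float)
        if parametric:                         # assumes Gaussian data
            m, s = x.mean(), x.std(ddof=1)
            return m - 1.96 * s, m + 1.96 * s
        return tuple(np.percentile(x, [2.5, 97.5]))  # non-parametric

    rng = np.random.default_rng(7)
    skewed = rng.lognormal(mean=1.0, sigma=0.5, size=400)
    print("parametric:    ", reference_interval(skewed))
    print("non-parametric:", reference_interval(skewed, parametric=False))
    lo, hi = reference_interval(np.log(skewed))  # transform toward Gaussian
    print("log-parametric:", (np.exp(lo), np.exp(hi)))
    ```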

  18. An inventory of nitrous oxide emissions from agriculture in the UK using the IPCC methodology: emission estimate, uncertainty and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Brown, L.; Armstrong Brown, S.; Jarvis, S. C.; Syed, B.; Goulding, K. W. T.; Phillips, V. R.; Sneath, R. W.; Pain, B. F.

    Nitrous oxide emission from UK agriculture was estimated, using the IPCC default values of all emission factors and parameters, to be 87 Gg N2O-N in both 1990 and 1995. This estimate was shown, however, to have an overall uncertainty of 62%. The largest component of the emission (54%) was from the direct (soil) sector. Two of the three emission factors applied within the soil sector, EF1 (direct emission from soil) and EF3-PRP (emission from pasture, range and paddock), were amongst the most influential on the total estimate, producing a ±31% and a +11% to -17% change in emissions, respectively, when varied through the IPCC range from the default value. The indirect sector (from leached N and deposited ammonia) contributed 29% of the total emission, and had the largest uncertainty (126%). The factors determining the fraction of N leached (FracLEACH) and emissions from it (EF5) were the two most influential. These parameters are poorly specified and there is great potential to improve the emission estimate for this component. Use of mathematical models (NCYCLE and SUNDIAL) to predict FracLEACH suggested that the IPCC default value for this parameter may be too high for most situations in the UK. Comparison with other UK-derived inventories suggests that the IPCC methodology may overestimate emission. Although the IPCC approach includes additional components relative to the other inventories (most notably emission from indirect sources), the estimates for the common components (i.e. fertiliser and animals), and the emission factors used, are higher than those of other inventories. Whilst it is recognised that the IPCC approach is generalised in order to allow widespread applicability, sufficient data are available to specify at least two of the most influential parameters, i.e. EF1 and FracLEACH, more accurately, and so provide an improved estimate of nitrous oxide emissions from UK agriculture.
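
    In a Tier 1-style inventory of this vintage, each component is an activity total multiplied by an emission factor, so the influence of EF1 can be shown in a few lines. The activity figure below is invented, and the default and range for EF1 (1.25%, 0.25-2.25%) are quoted from 1996-era IPCC guidance from memory rather than from the paper; treat both as assumptions.

    ```python
    # Direct soil N2O from applied N: emission = N_applied * EF1
    n_applied_gg = 1500.0                    # Gg N applied (invented)
    for label, ef1 in [("low", 0.0025), ("default", 0.0125), ("high", 0.0225)]:
        print(f"EF1 {label} ({ef1:.2%}): {n_applied_gg * ef1:.1f} Gg N2O-N")
    ```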

  19. Radiological Characterization Methodology of INEEL Stored RH-TRU Waste from ANL-E

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajiv N. Bhatt

    2003-02-01

    An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH-TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK for the composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves estimates of the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor physics principles are used to achieve these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-based characterization methodology offers a superior alternative to traditional waste assay techniques. Using this methodology, it is shown that the radiological parameters of a test batch of ANL-E waste are well within the proposed WIPP Waste Acceptance Criteria limits.

  20. Computer-Aided Diagnosis Systems for Lung Cancer: Challenges and Methodologies

    PubMed Central

    El-Baz, Ayman; Beache, Garth M.; Gimel'farb, Georgy; Suzuki, Kenji; Okada, Kazunori; Elnakib, Ahmed; Soliman, Ahmed; Abdollahi, Behnoush

    2013-01-01

    This paper overviews one of the most important, interesting, and challenging problems in oncology, the problem of lung cancer diagnosis. Developing an effective computer-aided diagnosis (CAD) system for lung cancer is of great clinical importance and can increase the patient's chance of survival. For this reason, CAD systems for lung cancer have been investigated in a huge number of research studies. A typical CAD system for lung cancer diagnosis is composed of four main processing steps: segmentation of the lung fields, detection of nodules inside the lung fields, segmentation of the detected nodules, and diagnosis of the nodules as benign or malignant. This paper overviews the current state-of-the-art techniques that have been developed to implement each of these CAD processing steps. For each technique, various aspects of technical issues, implemented methodologies, training and testing databases, and validation methods, as well as achieved performances, are described. In addition, the paper addresses several challenges that researchers face in each implementation step and outlines the strengths and drawbacks of the existing approaches for lung cancer CAD systems. PMID:23431282

  1. Security Events and Vulnerability Data for Cybersecurity Risk Estimation.

    PubMed

    Allodi, Luca; Massacci, Fabio

    2017-08-01

    Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in finance). This article presents a model and methodology to leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.
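
    A toy version of the two-stage structure described above: an untargeted attack must first breach the perimeter and then find a weaponized local vulnerability inside. The probabilities and counts are illustrative; the article derives the corresponding quantities from security-operation-center data.

    ```python
    def p_compromise(p_perimeter, internal_vulns, weaponized, attempts):
        """Probability that at least one of `attempts` untargeted attacks
        breaches the perimeter AND escalates via one of the attacker's
        weaponized local vulnerabilities."""
        p_escalate = weaponized / internal_vulns
        p_single = p_perimeter * p_escalate
        return 1.0 - (1.0 - p_single) ** attempts

    # Invented figures: 2% perimeter breach rate, 15 of 200 internal
    # vulnerabilities weaponized, 500 attack attempts per year
    print(f"{p_compromise(0.02, 200, 15, attempts=500):.1%}")
    ```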

  2. UW Inventory of Freight Emissions (WIFE3) heavy duty diesel vehicle web calculator methodology.

    DOT National Transportation Integrated Search

    2013-09-01

    This document serves as an overview and technical documentation for the University of Wisconsin Inventory of : Freight Emissions (WIFE3) calculator. The WIFE3 web calculator rapidly estimates future heavy duty diesel : vehicle (HDDV) roadway emission...

  3. Methodological Rigor in Preclinical Cardiovascular Studies

    PubMed Central

    Ramirez, F. Daniel; Motazedian, Pouya; Jung, Richard G.; Di Santo, Pietro; MacDonald, Zachary D.; Moreland, Robert; Simard, Trevor; Clancy, Aisling A.; Russo, Juan J.; Welch, Vivian A.; Wells, George A.

    2017-01-01

    Rationale: Methodological sources of bias and suboptimal reporting contribute to irreproducibility in preclinical science and may negatively affect research translation. Randomization, blinding, sample size estimation, and considering sex as a biological variable are deemed crucial study design elements to maximize the quality and predictive value of preclinical experiments. Objective: To examine the prevalence and temporal patterns of recommended study design element implementation in preclinical cardiovascular research. Methods and Results: All articles published over a 10-year period in 5 leading cardiovascular journals were reviewed. Reports of in vivo experiments in nonhuman mammals describing pathophysiology, genetics, or therapeutic interventions relevant to specific cardiovascular disorders were identified. Data on study design and animal model use were collected. Citations at 60 months were additionally examined as a surrogate measure of research impact in a prespecified subset of studies, stratified by individual and cumulative study design elements. Of 28,636 articles screened, 3,396 met inclusion criteria. Randomization was reported in 21.8%, blinding in 32.7%, and sample size estimation in 2.3%. Temporal and disease-specific analyses show that the implementation of these study design elements has overall not appreciably increased over the past decade, except in preclinical stroke research, which has uniquely demonstrated significant improvements in methodological rigor. In a subset of 1,681 preclinical studies, randomization, blinding, sample size estimation, and inclusion of both sexes were not associated with increased citations at 60 months. Conclusions: Methodological shortcomings are prevalent in preclinical cardiovascular research, have not substantially improved over the past 10 years, and may be overlooked when subsequent studies are based on them. Resultant risks of bias and threats to study validity have the potential to hinder progress in

  4. Comparison of least squares and exponential sine sweep methods for Parallel Hammerstein Models estimation

    NASA Astrophysics Data System (ADS)

    Rebillat, Marc; Schoukens, Maarten

    2018-05-01

    Linearity is a common assumption for many real-life systems, but in many cases the nonlinear behavior of systems cannot be ignored and must be modeled and estimated. Among the various existing classes of nonlinear models, Parallel Hammerstein Models (PHM) are interesting as they are easy both to interpret and to estimate. One way to estimate PHM relies on the fact that the estimation problem is linear in the parameters, so classical least squares (LS) estimation algorithms can be used. In that area, this article introduces a regularized LS estimation algorithm inspired by some of the recently developed regularized impulse response estimation techniques. Another way to estimate PHM is to use parametric or non-parametric exponential sine sweep (ESS) based methods. These methods (LS and ESS) are founded on radically different mathematical backgrounds but are expected to tackle the same issue. A methodology is proposed here to compare them with respect to (i) their accuracy, (ii) their computational cost, and (iii) their robustness to noise. Tests are performed on simulated systems for several values of the methods' respective parameters and of the signal-to-noise ratio. Results show that, for a given set of data points, the ESS method is less demanding in computational resources than the LS method but is also less accurate. Furthermore, the LS method needs parameters to be set in advance, whereas the ESS method is not subject to conditioning issues and can be fully non-parametric. In summary, for a given set of data points, the ESS method can provide a first, automatic, and quick overview of a nonlinear system that can then guide more computationally demanding and precise methods, such as the regularized LS one proposed here.
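
    Because PHM estimation is linear in the parameters, the LS branch of the comparison can be illustrated with a short sketch. The following is a minimal, hypothetical implementation assuming a polynomial static nonlinearity and FIR branch filters, with plain ridge regularization standing in for the impulse-response kernels the article builds on; all names and hyperparameters are illustrative.

```python
import numpy as np

def fit_phm_ridge(u, y, order=3, n_taps=20, lam=1e-3):
    """Ridge-regularized LS fit of a Parallel Hammerstein Model:
    y(t) = sum_p (g_p * u^p)(t), one FIR branch per polynomial order."""
    N = len(u)
    cols = []
    for p in range(1, order + 1):
        up = u ** p
        for k in range(n_taps):          # delayed copies of u^p
            col = np.zeros(N)
            col[k:] = up[:N - k]
            cols.append(col)
    Phi = np.column_stack(cols)          # regressor matrix, linear in theta
    theta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]),
                            Phi.T @ y)
    return theta.reshape(order, n_taps)  # one impulse response per branch

# demo on a simulated second-order PHM
rng = np.random.default_rng(0)
u = rng.standard_normal(2000)
g1, g2 = np.array([1.0, 0.5, 0.25]), np.array([0.3, -0.2, 0.1])
y = (np.convolve(u, g1)[:2000] + np.convolve(u ** 2, g2)[:2000]
     + 0.01 * rng.standard_normal(2000))
print(np.round(fit_phm_ridge(u, y, order=2, n_taps=3), 3))  # ~g1 and ~g2
```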

  5. An Overview of the Geological and Geotechnical Aspects of the New Railway Line in the Lower Inn Valley

    NASA Astrophysics Data System (ADS)

    Eder, Stefan; Poscher, Gerhard; Sedlacek, Christoph

    The new railway line in the lower Inn valley is part of the Brenner railway axis from Munich to Verona (feeder north). The first section between the villages of Kundl and Radfeld, west of Wörgl, and the village of Baumkirchen, east of Innsbruck, will become one of the biggest infrastructure projects ever built in Austria, with a length of approx. 43 km and an underground portion of approx. 80%. The article gives an overview of the various geologic formations - hard rock sections in the valley slopes, different water-saturated gravel and sand formations in the valley floor, and geotechnically difficult conditions in sediments of Quaternary terraces. It also describes the methodology of the soil reconnaissance, which used groundwater models for hydrogeologic estimation and core drillings to evaluate geologic models, and relates the experience gained from the five approx. 7.5 km long reconnaissance tunnels used for geotechnical and hydrogeological testing. The results of the soil reconnaissance were used to plan different construction methods, such as excavation in soft rock under a jet grouting roof and under compressed air, as well as mechanised shield driving with fluid support.

  6. Cost-of-illness studies of atrial fibrillation: methodological considerations.

    PubMed

    Becker, Christian

    2014-10-01

    Atrial fibrillation (AF) is the most common cardiac arrhythmia and has considerable economic consequences. This study aims to identify current cost-of-illness estimates of AF, with a focus on describing the studies' methodology. A literature review was conducted. Twenty-eight cost-of-illness studies were identified. Cost-of-illness estimates exist for health insurance members and for hospital and primary care populations. In addition, the cost of stroke in AF patients and the costs of post-operative AF were calculated. The methods used were heterogeneous; most studies calculated excess costs. The identified annual excess costs varied, even among studies from the USA (∼US$1900 to ∼US$19,000). While pointing toward considerable costs, the relevance of cost-of-illness studies could be improved by focusing on subpopulations and treatment mixes. Subsequent economic studies should take the methodology of existing cost-of-illness studies into account, using methods that allow stakeholders to find suitable studies and validate estimates.

  7. Application of ANFIS to Phase Estimation for Multiple Phase Shift Keying

    NASA Technical Reports Server (NTRS)

    Drake, Jeffrey T.; Prasad, Nadipuram R.

    2000-01-01

    The paper discusses a novel use of Adaptive Neuro-Fuzzy Inference Systems (ANFIS) for estimating phase in Multiple Phase Shift Keying (M-PSK) modulation. A brief overview of communications phase estimation is provided. The modeling of both general open-loop and closed-loop phase estimation schemes for M-PSK symbols with unknown structure is discussed. Preliminary performance results from simulation of the above schemes are presented.
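
    As a point of reference for the open-loop schemes mentioned above, the classical non-data-aided M-th power phase estimator can be sketched in a few lines. This is a standard baseline, not the paper's ANFIS estimator; the QPSK setup below is illustrative.

```python
import numpy as np

def mth_power_phase_estimate(r, M):
    """Classic open-loop M-PSK phase estimate: raising symbols to the
    M-th power strips the modulation, leaving M times the phase offset
    (so the estimate is unique only modulo 2*pi/M)."""
    return np.angle(np.mean(r ** M)) / M

rng = np.random.default_rng(1)
M, theta = 4, 0.2                              # QPSK, 0.2 rad true offset
s = np.exp(2j * np.pi * rng.integers(0, M, 1000) / M)
noise = 0.05 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
r = s * np.exp(1j * theta) + noise
print(round(mth_power_phase_estimate(r, M), 3))   # ~0.2
```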

  8. Impacts of Different Assimilation Methodologies on Crop Yield Estimates Using Active and Passive Microwave Dataset at L-Band

    NASA Astrophysics Data System (ADS)

    Liu, P.; Bongiovanni, T. E.; Monsivais-Huertero, A.; Bindlish, R.; Judge, J.

    2013-12-01

    ...estimates. An Ensemble Kalman Filter-based methodology is implemented to incorporate σ0 and TB from Aquarius and SMOS in the DSSAT-A-P model to improve crop yield estimates for two growing seasons of soybean - a normal season and a drought-affected season - in the rain-fed region of the Brazilian La Plata Basin, South America. Different scenarios of assimilation, including active only, passive only, and combined AP observations, were considered. The elements of the state vector included both model states and parameters related to soil and vegetation. The number of elements included in the state vector changed depending upon the scenario of assimilation and upon the growth stage. Crop yield estimates were compared for the different scenarios during the two seasons. A synthetic experiment conducted previously showed an improvement in the RMSD of crop estimates of 90 kg/ha using combined AP compared to the open-loop and active-only assimilation over the region.

  9. LEVELS OF SYNTHETIC MUSK COMPOUNDS IN MUNICIPAL WASTEWATER FOR ESTIMATION OF BIOTA EXPOSURE IN RECEIVING WATERS

    EPA Science Inventory

    To be presented is an overview of the chemistry, the monitoring methodology, and the statistical evaluation of concentrations obtained from the analysis of a suite of compounds (e.g., Galaxolide®, musk xylene, and amino musk xylene) in an aquatic ecological site.

  10. Crew Module Overview

    NASA Technical Reports Server (NTRS)

    Redifer, Matthew E.

    2011-01-01

    This presentation gives an overview of the Crew Module development for the Pad Abort 1 flight test. It describes the integration activity from the initial delivery of the primary structure through the installation of vehicle subsystems to the flight test. A brief overview of flight test results is given.

  11. Economic Overview, 1996.

    ERIC Educational Resources Information Center

    Saskatchewan Inst. of Applied Science and Technology, Saskatoon.

    This report provides an overview of economic trends and their effect on labor market training needs in Saskatchewan. Following a brief introduction, part 2 provides an overview of international economic trends, including data on world demographics, while part 3 examines the Canadian economy, focusing on job stability and the employment of…

  12. High-Penetration Photovoltaic Planning Methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian

    The main objective of this report is to provide an overview of select U.S. utility methodologies for performing high-penetration photovoltaic (HPPV) system planning and impact studies. This report covers the Federal Energy Regulatory Commission's orders related to photovoltaic (PV) power system interconnection, particularly the interconnection processes for the Large Generation Interconnection Procedures and Small Generation Interconnection Procedures. In addition, it includes U.S. state interconnection standards and procedures. The procedures used by these regulatory bodies consider the impacts of HPPV power plants on the networks. Technical interconnection requirements for HPPV voltage regulation include aspects of power monitoring, grounding, synchronization, connection to the overall distribution system, back-feeds, disconnecting means, abnormal operating conditions, and power quality. This report provides a summary of mitigation strategies to minimize the impact of HPPV. Recommendations and revisions to the standards may take place as the penetration level of renewables on the grid increases and new technologies develop in future years.

  13. Accuracy or precision: Implications of sample design and methodology on abundance estimation

    USGS Publications Warehouse

    Kowalewski, Lucas K.; Chizinski, Christopher J.; Powell, Larkin A.; Pope, Kevin L.; Pegg, Mark A.

    2015-01-01

    Sampling by spatially replicated counts (point counts) is an increasingly popular method of estimating the population size of organisms. Challenges exist when sampling by the point-count method: it is often impractical to sample the entire area of interest and impossible to detect every individual present. Ecologists encounter logistical limitations that force them to sample either a few large sample units or many small sample units, introducing biases to sample counts. We generated a computer environment and simulated sampling scenarios to test the role of the number of samples, sample unit area, number of organisms, and distribution of organisms in the estimation of population sizes using N-mixture models (sketched below). Many sample units of small area provided estimates that were consistently closer to true abundance than scenarios with few sample units of large area. However, scenarios with few sample units of large area provided more precise abundance estimates than those with many sample units of small area. It is important to consider the accuracy and precision of abundance estimates during the sample design process, with study goals and objectives fully recognized; in practice, however, accuracy and precision are often an afterthought addressed only during data analysis.
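
    For readers unfamiliar with the N-mixture models used in these simulations, here is a minimal sketch of the likelihood: repeated counts y_it at site i are Binomial(N_i, p) with latent abundance N_i ~ Poisson(lambda), and the latent N_i is summed out numerically. The constant lambda and p, and all parameter values, are simplifying assumptions for illustration, not the study's scenarios.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom, poisson

rng = np.random.default_rng(3)
R, T, LAM, P = 50, 4, 6.0, 0.4                # sites, repeat visits, truth
N = rng.poisson(LAM, R)                       # latent site abundances
y = rng.binomial(N[:, None], P, size=(R, T))  # simulated point counts

def negloglik(params, y, n_max=60):
    lam = np.exp(params[0])                   # keep lambda positive
    p = 1.0 / (1.0 + np.exp(-params[1]))      # keep p in (0, 1)
    ns = np.arange(n_max + 1)
    ll = 0.0
    for yi in y:                              # sum out latent N at each site
        terms = poisson.pmf(ns, lam)
        for yit in yi:
            terms = terms * binom.pmf(yit, ns, p)
        ll += np.log(terms.sum())
    return -ll

res = minimize(negloglik, x0=[np.log(5.0), 0.0], args=(y,),
               method="Nelder-Mead")
print(round(np.exp(res.x[0]), 2),             # ~6.0
      round(1.0 / (1.0 + np.exp(-res.x[1])), 2))  # ~0.4
```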

  14. Utility of Capture-Recapture Methodology to Estimate Prevalence of Congenital Heart Defects Among Adolescents in 11 New York State Counties: 2008 to 2010.

    PubMed

    Akkaya-Hocagil, Tugba; Hsu, Wan-Hsiang; Sommerhalter, Kristin; McGarry, Claire; Van Zutphen, Alissa

    2017-11-01

    Congenital heart defects (CHDs) are the most common birth defects in the United States, and the population of individuals living with CHDs is growing. Though CHD prevalence in infancy has been well characterized, better prevalence estimates among children and adolescents in the United States are still needed. We used capture-recapture methods to estimate CHD prevalence among adolescents residing in 11 New York counties. The three data sources used for analysis included Statewide Planning and Research Cooperative System (SPARCS) hospital inpatient records, SPARCS outpatient records, and medical records provided by seven pediatric congenital cardiac clinics from 2008 to 2010. Bayesian log-linear models were fit using the R package Conting to account for dataset dependencies and heterogeneous catchability. A total of 2537 adolescent CHD cases were captured in our three data sources. Forty-four cases were identified in all data sources, 283 cases were identified in two of three data sources, and 2210 cases were identified in a single data source. The final model yielded an estimated total adolescent CHD population of 3845, indicating that 66% of the cases in the catchment area were identified in the case-identifying data sources. Based on 2010 Census estimates, we estimated adolescent CHD prevalence as 6.4 CHD cases per 1000 adolescents (95% confidence interval: 6.2-6.6). We used capture-recapture methodology with a population-based surveillance system in New York to estimate CHD prevalence among adolescents. Future research incorporating additional data sources may improve prevalence estimates in this population. Birth Defects Research 109:1423-1429, 2017. © 2017 Wiley Periodicals, Inc.
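
    The study fits Bayesian log-linear models (via the R package Conting) to handle three dependent sources; the basic capture-recapture idea is easiest to see in the two-source case. Below is a sketch of Chapman's bias-corrected two-source estimator with made-up counts; it assumes independent sources, which is exactly the assumption the log-linear modeling relaxes.

```python
def chapman_estimate(n1, n2, m):
    """Chapman's bias-corrected two-source capture-recapture estimator:
    n1 and n2 are case counts in each source, m the cases found in both."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# hypothetical counts for two overlapping case-finding sources
print(round(chapman_estimate(n1=1200, n2=900, m=300)))  # ~3594 total cases
```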

  15. Stream habitat analysis using the instream flow incremental methodology

    USGS Publications Warehouse

    Bovee, Ken D.; Lamb, Berton L.; Bartholow, John M.; Stalnaker, Clair B.; Taylor, Jonathan; Henriksen, Jim

    1998-01-01

    This document describes the Instream Flow Incremental Methodology (IFIM) in its entirety. It is intended to serve as a comprehensive introductory textbook on IFIM for training courses, as it contains the most complete description of IFIM in existence today. It should also serve as an official published guide to IFIM, counteracting the misconceptions about the methodology that have pervaded the professional literature since the mid-1980s, since it describes IFIM as envisioned by its developers. The document is aimed at the decisionmakers responsible for the management and allocation of natural resources, to provide them an overview, and at those who design and implement studies to inform the decisionmakers. There should be enough background on model concepts, data requirements, calibration techniques, and quality assurance to help the technical user design and implement a cost-effective application of IFIM that will provide policy-relevant information. Individual chapters deal with the basic organization of IFIM and the procedural sequence of applying it, starting with problem identification, then study planning and implementation, and finally problem resolution.

  16. Josephson 4 K-bit cache memory design for a prototype signal processor. I - General overview

    NASA Astrophysics Data System (ADS)

    Henkels, W. H.; Geppert, L. M.; Kadlec, J.; Epperlein, P. W.; Beha, H.

    1985-09-01

    In the early stages of the Josephson computer project conducted at an American computer company, it was recognized that a very fast cache memory was needed to complement Josephson logic. A subnanosecond access time memory was implemented experimentally on the basis of a 2.5-micron Pb-alloy technology. It was then decided to switch over to a Nb-base-electrode technology, with the objective of alleviating problems with the long-term reliability and aging of Pb-based junctions. The present paper provides a general overview of the status of a 4 x 1 K-bit Josephson cache design employing a 2.5-micron Nb-edge-junction technology. Attention is given to the fabrication process and its implications, aspects of circuit design methodology, an overview of system environment and chip components, design changes and status, and various difficulties and uncertainties.

  17. Estimating Radiogenic Cancer Risks

    EPA Pesticide Factsheets

    This document presents a revised methodology for EPA's estimation of cancer risks due to low-LET radiation exposures developed in light of information that has become available, especially new information on the Japanese atomic bomb survivors.

  18. Respondent-Driven Sampling: An Assessment of Current Methodology.

    PubMed

    Gile, Krista J; Handcock, Mark S

    2010-08-01

    Respondent-Driven Sampling (RDS) employs a variant of a link-tracing network sampling strategy to collect data from hard-to-reach populations. By tracing the links in the underlying social network, the process exploits the social structure to expand the sample and reduce its dependence on the initial (convenience) sample. The current estimators of population averages make strong assumptions in order to treat the data as a probability sample. We evaluate three critical sensitivities of the estimators: to bias induced by the initial sample, to uncontrollable features of respondent behavior, and to the without-replacement structure of sampling. Our analysis indicates: (1) that the convenience sample of seeds can induce bias, and the number of sample waves typically used in RDS is likely insufficient for the type of nodal mixing required to obtain the reputed asymptotic unbiasedness; (2) that preferential referral behavior by respondents leads to bias; (3) that when a substantial fraction of the target population is sampled the current estimators can have substantial bias. This paper sounds a cautionary note for the users of RDS. While current RDS methodology is powerful and clever, the favorable statistical properties claimed for the current estimates are shown to be heavily dependent on often unrealistic assumptions. We recommend ways to improve the methodology.
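
    One of the estimators under scrutiny is the inverse-degree-weighted RDS-II (Volz-Heckathorn) estimator, whose validity the paper shows to depend on the sampling assumptions listed above. A minimal sketch with hypothetical respondent data:

```python
import numpy as np

def rds_ii_estimate(trait, degree):
    """RDS-II estimator: weight each respondent by the inverse of their
    reported network size, approximating unequal inclusion probabilities."""
    trait = np.asarray(trait, dtype=float)
    w = 1.0 / np.asarray(degree, dtype=float)
    return np.sum(w * trait) / np.sum(w)

# hypothetical data: binary trait and reported degrees for 8 respondents
trait = [1, 0, 1, 1, 0, 0, 1, 0]
degree = [10, 2, 8, 15, 3, 4, 20, 5]
print(round(rds_ii_estimate(trait, degree), 3))  # ~0.21 vs naive mean 0.5
```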

  19. Simulation methods to estimate design power: an overview for applied research.

    PubMed

    Arnold, Benjamin F; Hogan, Daniel R; Colford, John M; Hubbard, Alan E

    2011-06-20

    Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research.
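
    The authors provide R and Stata code; the core recipe (simulate data under an assumed effect, run the planned test, count rejections) is compact enough to sketch in Python as well. The two-arm t-test example below is hypothetical, chosen so the result can be checked against the classical power formula (roughly 80% power at n = 64 per arm for a 0.5 SD effect).

```python
import numpy as np
from scipy import stats

def simulated_power(n_per_arm, effect, sd, alpha=0.05, n_sim=2000, seed=0):
    """Estimate power of a two-arm trial by repeated simulation:
    power = fraction of simulated datasets where the t-test rejects H0."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sim):
        control = rng.normal(0.0, sd, n_per_arm)
        treated = rng.normal(effect, sd, n_per_arm)
        _, p = stats.ttest_ind(treated, control)
        rejections += (p < alpha)
    return rejections / n_sim

print(simulated_power(64, effect=0.5, sd=1.0))   # ~0.80
```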

  20. Simulation methods to estimate design power: an overview for applied research

    PubMed Central

    2011-01-01

    Background Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. Methods We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. Results We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Conclusions Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research. PMID:21689447

  1. Support vector regression methodology for estimating global solar radiation in Algeria

    NASA Astrophysics Data System (ADS)

    Guermoui, Mawloud; Rabehi, Abdelaziz; Gairaa, Kacem; Benkaciali, Said

    2018-01-01

    Accurate estimation of Daily Global Solar Radiation (DGSR) has been a major goal for solar energy applications. In this paper we show the possibility of developing a simple model based on Support Vector Regression (SVM-R) that can estimate DGSR on a horizontal surface in Algeria using only the sunshine ratio as input. The SVM model was developed and tested using a data set recorded over three years (2005-2007), collected at the Applied Research Unit for Renewable Energies (URAER) in Ghardaïa. The data collected during 2005-2006 are used to train the model, while the 2007 data are used to test its performance. During the testing phase, the measured and estimated values of DGSR were compared statistically using the Root Mean Square Error (RMSE), relative Root Mean Square Error (rRMSE), and correlation coefficient (r2), which amount to 1.59 MJ/m2, 8.46 and 97.4%, respectively. The obtained results show that SVM-R is well qualified for DGSR estimation using only the sunshine ratio.
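
    A minimal sketch of this kind of model using scikit-learn's SVR, with synthetic data standing in for the Ghardaïa measurements; the linear-plus-noise relation, train/test split, and hyperparameters are all illustrative assumptions, not the paper's tuned model.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(4)

# synthetic stand-in: DGSR (MJ/m^2) as a noisy function of sunshine ratio
sunshine = rng.uniform(0.2, 1.0, 300).reshape(-1, 1)
dgsr = 5 + 25 * sunshine.ravel() + rng.normal(0, 1.5, 300)

train, test = slice(0, 200), slice(200, 300)   # "two years" train, "one" test
model = SVR(kernel="rbf", C=10.0, epsilon=0.5)
model.fit(sunshine[train], dgsr[train])

pred = model.predict(sunshine[test])
rmse = np.sqrt(mean_squared_error(dgsr[test], pred))
r2 = np.corrcoef(dgsr[test], pred)[0, 1] ** 2
print(f"RMSE = {rmse:.2f} MJ/m^2, r2 = {r2:.3f}")
```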

  2. 75 FR 8649 - Request for Comments on Methodology for Conducting an Independent Study of the Burden of Patent...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-25

    ...] Request for Comments on Methodology for Conducting an Independent Study of the Burden of Patent-Related... methodologies for performing such a study (Methodology Report). ICF has now provided the USPTO with its Methodology Report, in which ICF recommends methodologies for addressing various topics about estimating the...

  3. A Methodology for the Estimation of the Wind Generator Economic Efficiency

    NASA Astrophysics Data System (ADS)

    Zaleskis, G.

    2017-12-01

    Integration of renewable energy sources and the improvement of the technological base may not only reduce the consumption of fossil fuel and environmental load, but also ensure the power supply in regions with difficult fuel delivery or power failures. The main goal of the research is to develop the methodology of evaluation of the wind turbine economic efficiency. The research has demonstrated that the electricity produced from renewable sources may be much more expensive than the electricity purchased from the conventional grid.

  4. Improved population estimates through the use of auxiliary information

    USGS Publications Warehouse

    Johnson, D.H.; Ralph, C.J.; Scott, J.M.

    1981-01-01

    When estimating the size of a population of birds, the investigator may have, in addition to an estimator based on a statistical sample, information on one of several auxiliary variables, such as: (1) estimates of the population made on previous occasions, (2) measures of habitat variables associated with the size of the population, and (3) estimates of the population sizes of other species that correlate with the species of interest. Although many studies have described the relationships between each of these kinds of data and the population size to be estimated, very little work has been done to improve the estimator by incorporating such auxiliary information. A statistical methodology termed 'empirical Bayes' seems to be appropriate to these situations. The potential that empirical Bayes methodology has for improved estimation of the population size of the Mallard (Anas platyrhynchos) is explored. In the example considered, three empirical Bayes estimators were found to reduce the error by one-fourth to one-half of that of the usual estimator.
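
    A sketch of the empirical Bayes idea in a Fay-Herriot-like form: shrink each direct estimate toward a regression prediction built from the auxiliary variable, with the shrinkage weight set by the estimated between-area and sampling variances. All data and variance values below are simulated for illustration; this is not the paper's Mallard analysis.

```python
import numpy as np

rng = np.random.default_rng(5)
K = 40                                    # survey strata / areas
truth = rng.normal(100, 15, K)            # true population sizes
habitat = truth + rng.normal(0, 10, K)    # auxiliary habitat index
v = 400.0                                 # sampling variance of direct counts
direct = truth + rng.normal(0, np.sqrt(v), K)

# regression of the direct estimates on the auxiliary variable
b1, b0 = np.polyfit(habitat, direct, 1)
synthetic = b0 + b1 * habitat

# moment estimate of the between-area variance, then shrink
A = max(np.var(direct - synthetic, ddof=2) - v, 0.0)
w = A / (A + v)
eb = w * direct + (1 - w) * synthetic     # empirical Bayes compromise

for name, est in [("direct", direct), ("empirical Bayes", eb)]:
    print(name, "RMSE:", round(np.sqrt(np.mean((est - truth) ** 2)), 1))
```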

  5. The evaluation of the National Long Term Care Demonstration. 2. Estimation methodology.

    PubMed Central

    Brown, R S

    1988-01-01

    Channeling effects were estimated by comparing the post-application experience of the treatment and control groups using multiple regression. A variety of potential threats to the validity of the results, including sample composition issues, data issues, and estimation issues, were identified and assessed. Of all the potential problems examined, the only one determined to be likely to cause widespread distortion of program impact estimates was noncomparability of the baseline data. To avoid this distortion, baseline variables judged to be noncomparably measured were excluded from use as control variables in the regression equation. (Where they existed, screen counterparts to these noncomparable baseline variables were used as substitutes.) All of the other potential problems with the sample, data, or regression estimation approach were found to have little or no actual effect on impact estimates or their interpretation. Broad implementation of special procedures, therefore, was not necessary. The study did find that, because of the frequent use of proxy respondents, the estimated effects of channeling on clients' well-being actually may reflect impacts on the well-being of the informal caregiver rather than the client. This and other isolated cases in which there was some evidence of a potential problem for specific outcome variables were identified and examined in detail in technical reports dealing with those outcomes. Where appropriate, alternative estimates were presented. PMID:3130329

  6. A description of the methodology used in an overview of reviews to evaluate evidence on the treatment, harms, diagnosis/classification, prognosis and outcomes used in the management of neck pain.

    PubMed

    Santaguida, P Lina; Keshavarz, Homa; Carlesso, Lisa C; Lomotan, Margaret; Gross, Anita; Macdermid, Joy C; Walton, David M

    2013-01-01

    Neck pain (NP) is a common musculoskeletal disorder, and the literature provides conflicting evidence about its management. Our aim is to describe the methodology used to conduct an overview of reviews (OvR) and to characterize the distribution and risk-of-bias profiles across the evidence for all areas of NP management. Standard systematic review (SR) methodology was employed. MEDLINE, CINAHL, EMBASE, ILC, Cochrane CENTRAL, and LILACS were searched from 2000 to March 2012; narrative reviews, SRs, and clinical practice guidelines (CPGs) evaluating the efficacy of treatment (benefits and harms), diagnosis/classification, prognosis, and outcomes were eligible. For treatment, articles were limited to SRs from 2005 onward. Risk of bias of SRs was assessed with the AMSTAR; the AGREE II was used to critically appraise the CPGs. From 2476 articles, 508 were eligible for full-text screening. A total of 341 articles were included. Treatment (n=117) had the greatest yield; other clinical areas had less literature (diagnosis=54, prognosis=16, outcomes=27, harms=16). There were no SRs for classification, and narrative reviews were problematic for this topic. There was great overlap across different databases within each clinical area, except for outcome measures. Risk-of-bias assessment of the eligible SRs using the AMSTAR showed a similar trend across the different clinical areas. A summary of the methods used to review the literature in five clinical areas of NP management has been presented. The challenges of selecting and synthesizing eligible articles in an OvR required customized solutions across the different areas of clinical focus.

  7. Collisions involving antiprotons and antihydrogen: an overview

    NASA Astrophysics Data System (ADS)

    Jonsell, S.

    2018-03-01

    I give an overview of experimental and theoretical results for antiproton and antihydrogen scattering with atoms and molecules (in particular H and He). At low energies (≲1 keV) there are practically no experimental data available. Instead I compare the results from different theoretical calculations, of various degrees of sophistication. At energies up to a few tens of eV, I focus on simple approximations that give reasonably accurate results, as these allow quick estimates of collision rates without embarking on a research project. This article is part of the Theo Murphy meeting issue "Antiproton physics in the ELENA era".

  8. A methodology for using borehole temperature-depth profiles under ambient, single and cross-borehole pumping conditions to estimate fracture hydraulic properties

    NASA Astrophysics Data System (ADS)

    Klepikova, M.; Le Borgne, T.; Bour, O.; Lavenant, N.

    2011-12-01

    In fractured aquifers, flow generally takes place in a few fractured zones. The identification of these main flow paths is critical, as it controls the transfer of fluids in the subsurface. Realistic flow modeling requires knowledge of the spatial variability of hydraulic properties. Inverse problems based on hydraulic head data are generally strongly underconstrained. A possible way of reducing the uncertainty is to combine different types of data, such as flow measurements, temperature profiles, or tracer test data. Here, we focus on the use of temperature, which can be seen as a natural tracer of groundwater flow. Previous studies used temperature anomalies to quantify vertical or horizontal regional groundwater flow velocities. Most of these studies assume that water in the borehole is stagnant and, thus, that the temperature profile in the well is representative of the temperature in the aquifer. In fractured media, differences in hydraulic head between flow paths connected to a borehole generally create ambient vertical flow within the borehole. These differences in hydraulic head are in general due to regional flow conditions. Estimation of borehole vertical flow is of interest, as it can be used to derive large-scale hydraulic connections. In a single-borehole configuration, the estimation of vertical flow can be used to estimate the local transmissivities and the hydraulic head differences driving the flow through the borehole. In a cross-borehole setup, it can be used to characterize hydraulic connections and estimate their hydraulic properties. Using a flow and heat transfer numerical model, we find that the slope of the temperature profile is related directly to the vertical borehole flow velocity. Thus, we propose a method to invert temperature measurements to derive borehole flow velocities and subsequently the fracture zone hydraulic and connectivity properties. The advantage of temperature measurements compared to flowmeter...

  9. U.S. Comparative and International Graduate Programs: An Overview of Programmatic Size, Relevance, Philosophy, and Methodology

    ERIC Educational Resources Information Center

    Drake, Timothy A.

    2011-01-01

    Previous work has concentrated on the epistemological foundation of comparative and international education (CIE) graduate programs. This study focuses on programmatic size, philosophy, methodology, and pedagogy. It begins by reviewing previous studies. It then provides a theoretical framework and describes the size, relevance, content, and…

  10. State estimation for advanced control of wave energy converters

    DOE Data Explorer

    Coe, Ryan; Bacelli, Giorgio

    2017-04-25

    A report on state estimation for advanced control of wave energy converters (WECs), with supporting data models and slides from the overview presentation. The methods discussed are intended for use to enable real-time closed loop control of WECs.

  11. Revised estimates for direct-effect recreational jobs in the interior Columbia River basin.

    Treesearch

    Lisa K. Crone; Richard W. Haynes

    1999-01-01

    This paper reviews the methodology used to derive the original estimates for direct employment associated with recreation on Federal lands in the interior Columbia River basin (the basin), and details the changes in methodology and data used to derive new estimates. The new analysis resulted in an estimate of 77,655 direct-effect jobs associated with recreational...

  12. Coal resources available for development; a methodology and pilot study

    USGS Publications Warehouse

    Eggleston, Jane R.; Carter, M. Devereux; Cobb, James C.

    1990-01-01

    Coal accounts for a major portion of our Nation's energy supply in projections for the future. A demonstrated reserve base of more than 475 billion short tons, as the Department of Energy currently estimates, indicates that, on the basis of today's rate of consumption, the United States has enough coal to meet projected energy needs for almost 200 years. However, the traditional procedures used for estimating the demonstrated reserve base do not account for many environmental and technological restrictions placed on coal mining. A new methodology has been developed to determine the quantity of coal that might actually be available for mining under current and foreseeable conditions. This methodology is unique in its approach, because it applies restrictions to the coal resource before it is mined. Previous methodologies incorporated restrictions into the recovery factor (a percentage), which was then globally applied to the reserve (minable coal) tonnage to derive a recoverable coal tonnage. None of the previous methodologies define the restrictions and their area and amount of impact specifically. Because these restrictions and their impacts are defined in this new methodology, it is possible to achieve more accurate and specific assessments of available resources. This methodology has been tested in a cooperative project between the U.S. Geological Survey and the Kentucky Geological Survey on the Matewan 7.5-minute quadrangle in eastern Kentucky. Pertinent geologic, mining, land-use, and technological data were collected, assimilated, and plotted. The National Coal Resources Data System was used as the repository for data, and its geographic information system software was applied to these data to eliminate restricted coal and quantify that which is available for mining. This methodology does not consider recovery factors or the economic factors that would be considered by a company before mining. Results of the pilot study indicate that, of the estimated...

  13. Using flow cytometry to estimate pollen DNA content: improved methodology and applications

    PubMed Central

    Kron, Paul; Husband, Brian C.

    2012-01-01

    Background and Aims Flow cytometry has been used to measure nuclear DNA content in pollen, mostly to understand pollen development and detect unreduced gametes. Published data have not always met the high-quality standards required for some applications, in part due to difficulties inherent in the extraction of nuclei. Here we describe a simple and relatively novel method for extracting pollen nuclei, involving the bursting of pollen through a nylon mesh, compare it with other methods and demonstrate its broad applicability and utility. Methods The method was tested across 80 species, 64 genera and 33 families, and the data were evaluated using established criteria for estimating genome size and analysing cell cycle. Filter bursting was directly compared with chopping in five species, yields were compared with published values for sonicated samples, and the method was applied by comparing genome size estimates for leaf and pollen nuclei in six species. Key Results Data quality met generally applied standards for estimating genome size in 81 % of species and the higher best practice standards for cell cycle analysis in 51 %. In 41 % of species we met the most stringent criterion of screening 10 000 pollen grains per sample. In direct comparison with two chopping techniques, our method produced better quality histograms with consistently higher nuclei yields, and yields were higher than previously published results for sonication. In three binucleate and three trinucleate species we found that pollen-based genome size estimates differed from leaf tissue estimates by 1.5 % or less when 1C pollen nuclei were used, while estimates from 2C generative nuclei differed from leaf estimates by up to 2.5 %. Conclusions The high success rate, ease of use and wide applicability of the filter bursting method show that this method can facilitate the use of pollen for estimating genome size and dramatically improve unreduced pollen production estimation with flow cytometry. PMID
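
    The underlying genome size arithmetic is a single fluorescence ratio against an internal standard, sketched below with hypothetical peak positions and standard size:

```python
def genome_size_pg(sample_fluor, standard_fluor, standard_pg):
    """Genome size by flow cytometry: the sample's DNA content equals the
    internal standard's content scaled by the fluorescence-peak ratio."""
    return standard_pg * (sample_fluor / standard_fluor)

# hypothetical peak positions: 1C pollen nuclei vs. a 2.50 pg standard
print(round(genome_size_pg(sample_fluor=120.0,
                           standard_fluor=200.0,
                           standard_pg=2.50), 3))  # 1.5 pg
```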

  14. Fracture mechanics approach to estimate rail wear limits

    DOT National Transportation Integrated Search

    2009-10-01

    This paper describes a systematic methodology to estimate allowable limits for rail head wear in terms of vertical head-height loss, gage-face side wear, and/or the combination of the two. This methodology is based on the principles of engineering fr...

  15. Using State Estimation Residuals to Detect Abnormal SCADA Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Jian; Chen, Yousu; Huang, Zhenyu

    2010-06-14

    Detection of manipulated supervisory control and data acquisition (SCADA) data is critically important for the safe and secure operation of modern power systems. In this paper, a methodology for detecting manipulated SCADA data based on state estimation residuals is presented. A framework of the proposed methodology is described. Instead of using the original SCADA measurements as the bad data sources, the residuals calculated from the results of the state estimator are used as the input for the outlier detection process. The BACON algorithm is applied to detect outliers in the state estimation residuals. The IEEE 118-bus system is used as a test case to evaluate the effectiveness of the proposed methodology. The accuracy of the BACON method is compared with that of the 3-σ method for the simulated SCADA measurements and residuals.
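
    A simplified univariate version of the BACON idea applied to residuals is sketched below: grow a "clean" basal subset outward from the median and flag whatever never joins it. The full BACON algorithm is multivariate and uses Mahalanobis distances; the version here, and the simulated residuals, are illustrative only.

```python
import numpy as np
from scipy import stats

def bacon_outliers(x, alpha=0.05, m0=None, max_iter=50):
    """Simplified univariate BACON: iteratively enlarge a clean subset,
    then flag points too far from the subset mean (chi2-based cutoff)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m0 = m0 or max(5, n // 10)
    subset = np.sort(np.argsort(np.abs(x - np.median(x)))[:m0])
    cutoff = np.sqrt(stats.chi2.ppf(1.0 - alpha / n, df=1))
    for _ in range(max_iter):
        mu, sd = x[subset].mean(), x[subset].std(ddof=1)
        new = np.where(np.abs(x - mu) <= cutoff * sd)[0]
        if np.array_equal(new, subset):
            break
        subset = new
    return np.setdiff1d(np.arange(n), subset)  # indices flagged as outliers

rng = np.random.default_rng(6)
residuals = rng.normal(0.0, 1.0, 200)
residuals[[10, 50]] += 8.0          # two manipulated measurements
print(bacon_outliers(residuals))    # -> [10 50]
```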

  16. Comparative study of methodologies for pulse wave velocity estimation.

    PubMed

    Salvi, P; Magnani, E; Valbusa, F; Agnoletti, D; Alecu, C; Joly, L; Benetos, A

    2008-10-01

    Arterial stiffness, estimated by pulse wave velocity (PWV), is an independent predictor of cardiovascular mortality and morbidity. However, the clinical applicability of these measurements and the elaboration of reference PWV values are difficult due to differences between the various devices used. In a population of 50 subjects aged 20-84 years, we compared PWV measurements with three frequently used devices: the Complior and the PulsePen, both of which determine aortic PWV as the delay between the carotid and femoral pressure waves, and the PulseTrace, which estimates the Stiffness Index (SI) by analyzing photoplethysmographic waves acquired on the fingertip. PWV was measured twice by each device. The coefficient of variation of PWV was 12.3, 12.4 and 14.5% for PulsePen, Complior and PulseTrace, respectively. These measurements were compared with the reference method, that is, a simultaneous acquisition of pressure waves using two tonometers. High correlation coefficients with the reference method were observed for PulsePen (r = 0.99) and Complior (r = 0.83), whereas for PulseTrace the correlation with the reference method was much lower (r = 0.55). Upon Bland-Altman analysis, mean differences of values +/- 2 s.d. versus the reference method were -0.15 +/- 0.62 m/s, 2.09 +/- 2.68 m/s and -1.12 +/- 4.92 m/s, for PulsePen, Complior and PulseTrace, respectively. This study confirms the reliability of the Complior and PulsePen devices in estimating PWV, while the SI determined by the PulseTrace device was found to be inappropriate as a surrogate of PWV. The present results indicate the urgent need for evaluation and comparison of the different devices to standardize PWV measurements and establish reference values.
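
    The quantity being compared is simple: PWV is the carotid-femoral path length divided by the transit-time delay between the two pressure-wave feet. A toy computation with hypothetical numbers:

```python
def pulse_wave_velocity(path_length_m, transit_time_s):
    """Carotid-femoral PWV = travel distance / pulse transit time."""
    return path_length_m / transit_time_s

# hypothetical measurement: 0.55 m path, 65 ms carotid-femoral delay
print(round(pulse_wave_velocity(0.55, 0.065), 2), "m/s")  # 8.46 m/s
```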

  17. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews.

    PubMed

    Shea, Beverley J; Grimshaw, Jeremy M; Wells, George A; Boers, Maarten; Andersson, Neil; Hamel, Candyce; Porter, Ashley C; Tugwell, Peter; Moher, David; Bouter, Lex M

    2007-02-15

    Our objective was to develop an instrument to assess the methodological quality of systematic reviews, building upon previous tools, empirical evidence and expert consensus. A 37-item assessment tool was formed by combining 1) the enhanced Overview Quality Assessment Questionnaire (OQAQ), 2) a checklist created by Sacks, and 3) three additional items recently judged to be of methodological importance. This tool was applied to 99 paper-based and 52 electronic systematic reviews. Exploratory factor analysis was used to identify underlying components. The results were considered by methodological experts using a nominal group technique aimed at item reduction and design of an assessment tool with face and content validity. The factor analysis identified 11 components. From each component, one item was selected by the nominal group. The resulting instrument was judged to have face and content validity. A measurement tool for the 'assessment of multiple systematic reviews' (AMSTAR) was developed. The tool consists of 11 items and has good face and content validity for measuring the methodological quality of systematic reviews. Additional studies are needed with a focus on the reproducibility and construct validity of AMSTAR, before strong recommendations can be made on its use.

  18. Turbofan Engine Core Compartment Vent Aerodynamic Configuration Development Methodology

    NASA Technical Reports Server (NTRS)

    Hebert, Leonard J.

    2006-01-01

    This paper presents an overview of the design methodology used in the development of the aerodynamic configuration of the nacelle core compartment vent for a typical Boeing commercial airplane together with design challenges for future design efforts. Core compartment vents exhaust engine subsystem flows from the space contained between the engine case and the nacelle of an airplane propulsion system. These subsystem flows typically consist of precooler, oil cooler, turbine case cooling, compartment cooling and nacelle leakage air. The design of core compartment vents is challenging due to stringent design requirements, mass flow sensitivity of the system to small changes in vent exit pressure ratio, and the need to maximize overall exhaust system performance at cruise conditions.

  19. Ecological Momentary Assessment is a Neglected Methodology in Suicidology.

    PubMed

    Davidson, Collin L; Anestis, Michael D; Gutierrez, Peter M

    2017-01-02

    Ecological momentary assessment (EMA) is a group of research methods that collect data frequently, in many contexts, and in real-world settings. EMA has been fairly neglected in suicidology. The current article provides an overview of EMA for suicidologists including definitions, data collection considerations, and different sampling strategies. Next, the benefits of EMA in suicidology (i.e., reduced recall bias, accurate tracking of fluctuating variables, testing assumptions of theories, use in interventions), participant safety considerations, and examples of published research that investigate self-directed violence variables using EMA are discussed. The article concludes with a summary and suggested directions for EMA research in suicidology with the particular aim to spur the increased use of this methodology among suicidologists.

  20. Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.

    PubMed

    Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen

    2015-11-01

    Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented a variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is embedded in the EUROCAT Central Database for annual application to centrally held registry data. The methodology was incrementally adapted to improve its utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating date of conception as the unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting of multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies.
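
    The case-defined scan window can be sketched with a bare-bones Monte Carlo version: find the shortest time span containing w consecutive cases and compare it with spans simulated under constant risk. EUROCAT's production version adds the adaptations listed above (conception-date timing, lookup tables, overlap handling); the data below are simulated.

```python
import numpy as np

rng = np.random.default_rng(7)

def min_span(times, w):
    """Shortest period containing w consecutive cases."""
    t = np.sort(times)
    return np.min(t[w - 1:] - t[:len(t) - w + 1])

def scan_p_value(times, w, t_total, n_sim=999):
    """Monte Carlo p-value: is the tightest run of w cases shorter than
    expected if cases occurred uniformly over the study period?"""
    obs = min_span(times, w)
    sims = [min_span(rng.uniform(0.0, t_total, len(times)), w)
            for _ in range(n_sim)]
    return (1 + sum(s <= obs for s in sims)) / (n_sim + 1)

# 60 cases over 120 months, 8 of them packed into a single month
times = np.concatenate([rng.uniform(0, 120, 52), rng.uniform(60, 61, 8)])
print(scan_p_value(times, w=8, t_total=120.0))  # small p => temporal cluster
```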

  1. Event-scale power law recession analysis: quantifying methodological uncertainty

    NASA Astrophysics Data System (ADS)

    Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.

    2017-01-01

    The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship...
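
    The model whose parameters are being estimated is the power law recession -dQ/dt = aQ^b. One of the simpler fitting choices of the kind the paper examines (ordinary least squares in log space on a single event) can be sketched as follows, with a synthetic event standing in for the California/Oregon gauge data:

```python
import numpy as np

def fit_recession(q):
    """Fit -dQ/dt = a * Q^b to one recession event by OLS in log space,
    one of several fitting techniques whose sensitivity can be tested."""
    dq = -np.diff(q)                      # daily recession rates
    qm = 0.5 * (q[:-1] + q[1:])           # midpoint discharges
    keep = dq > 0                         # keep strictly receding steps
    b, log_a = np.polyfit(np.log(qm[keep]), np.log(dq[keep]), 1)
    return np.exp(log_a), b

# synthetic event generated from dQ/dt = -0.05 * Q^1.5, daily steps
q = [20.0]
for _ in range(30):
    q.append(q[-1] - 0.05 * q[-1] ** 1.5)
a, b = fit_recession(np.array(q))
print(round(a, 3), round(b, 2))   # close to the true (0.05, 1.5)
```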

  2. Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Cancer.gov

    These model-based estimates use two surveys, the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS). The two surveys are combined using novel statistical methodology.

  3. Small-scale temporal and spatial variability in the abundance of plastic pellets on sandy beaches: Methodological considerations for estimating the input of microplastics.

    PubMed

    Moreira, Fabiana Tavares; Prantoni, Alessandro Lívio; Martini, Bruno; de Abreu, Michelle Alves; Stoiev, Sérgio Biato; Turra, Alexander

    2016-01-15

    Microplastics such as pellets have been reported for many years on sandy beaches around the globe. Nevertheless, high variability is observed in their estimates, and distribution patterns across the beach environment are still to be unravelled. Here, we investigate the small-scale temporal and spatial variability in the abundance of pellets in the intertidal zone of a sandy beach and evaluate factors that can increase the variability in data sets. The abundance of pellets was estimated during twelve consecutive tidal cycles, identifying the position of the high tide between cycles and sampling drift-lines across the intertidal zone. We demonstrate that beach dynamic processes such as the overlap of strandlines and artefacts of the methods can increase the small-scale variability. The results obtained are discussed in terms of the methodological considerations needed to understand the distribution of pellets in the beach environment, with special implications for studies focused on patterns of input. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Proposed methodology for estimating the impact of highway improvements on urban air pollution.

    DOT National Transportation Integrated Search

    1971-01-01

    The aim of this methodology is to indicate the expected change in ambient air quality in the vicinity of a highway improvement and in the total background level of urban air pollution resulting from the highway improvement. Both the jurisdiction in w...

  5. Methodology for the Model-based Small Area Estimates of Cancer-Related Knowledge - Small Area Estimates

    Cancer.gov

    The HINTS is designed to produce reliable estimates at the national and regional levels. GIS maps using HINTS data have been used to provide a visual representation of possible geographic relationships in HINTS cancer-related variables.

  6. A soft-computing methodology for noninvasive time-spatial temperature estimation.

    PubMed

    Teixeira, César A; Ruano, Maria Graça; Ruano, António E; Pereira, Wagner C A

    2008-02-01

    The safe and effective application of thermal therapies is restricted by the lack of reliable noninvasive temperature estimators. In this paper, the temporal echo-shifts of backscattered ultrasound signals, collected from a gel-based phantom, were tracked and, together with past temperature values, used as input information for radial basis function neural networks. The phantom was heated using a piston-like therapeutic ultrasound transducer. The neural models were used to estimate the temperature at different intensities and at points arranged along the therapeutic transducer's radial line (60 mm from the transducer face). Model inputs, as well as the number of neurons, were selected using a multiobjective genetic algorithm (MOGA). The best attained models present, on average, a maximum absolute error of less than 0.5 degrees C, which is regarded as the borderline between a reliable and an unreliable estimator in hyperthermia/diathermia. To test spatial generalization capacity, the best models were evaluated at spatial points not previously assessed; some of them presented a maximum absolute error below 0.5 degrees C and were selected as the best models. It should also be stressed that these best models have low implementation complexity, as desired for real-time applications.

  7. A correction in the CDM methodological tool for estimating methane emissions from solid waste disposal sites.

    PubMed

    Santos, M M O; van Elk, A G P; Romanel, C

    2015-12-01

    Solid waste disposal sites (SWDS) - especially landfills - are a significant source of methane, a greenhouse gas. Although it has the potential to be captured and used as a fuel, most of the methane formed in SWDS is emitted to the atmosphere, mainly in developing countries. Methane emissions have to be estimated in national inventories, and to help this task the Intergovernmental Panel on Climate Change (IPCC) has published three sets of guidelines. In addition, the Kyoto Protocol established the Clean Development Mechanism (CDM) to assist developed countries to offset their own greenhouse gas emissions by assisting other countries to achieve sustainable development while reducing emissions. Based on methodologies provided by the IPCC regarding SWDS, the CDM Executive Board has issued a tool to be used by project developers for estimating baseline methane emissions in their project activities - on burning biogas from landfills, or on preventing biomass from being landfilled and so avoiding methane emissions. Some inconsistencies in the first two IPCC guidelines have already been pointed out in an annex of the latest IPCC edition, although with hidden details. The CDM tool uses a model for methane estimation that takes on board parameters, factors and assumptions provided in the latest IPCC guidelines, while using as its core equation the one from the second IPCC edition, with its shortcoming, as well as allowing a misunderstanding of the time variable. Consequences of wrong ex-ante estimation of baseline emissions in CDM project activities can be economic or environmental. An example of the first type is the overestimation of 18% in an actual project on biogas from landfill in Brazil, which harms its developers; of the second type, the overestimation of 35% in a project preventing municipal solid waste from being landfilled in China, which harms the environment, not through the project per se but through the unduly generated carbon credits. In a simulated landfill - the same...
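
    The core equation at issue is the IPCC first-order decay (FOD) model: decomposable carbon deposited in year x decays exponentially, so the mass decomposing in a later year t is proportional to e^{-k(t-x-1)} - e^{-k(t-x)}. A stripped-down sketch (ignoring correction factors such as MCF, oxidation, and gas recovery, and using illustrative inputs):

```python
import numpy as np

def fod_methane(ddocm_per_year, k, n_years):
    """IPCC-style first-order decay: methane generated each year from
    past deposits of decomposable degradable organic carbon (DDOCm)."""
    ch4 = np.zeros(n_years)
    for x, ddocm in enumerate(ddocm_per_year):
        for t in range(x + 1, n_years):
            decomposed = ddocm * (np.exp(-k * (t - x - 1))
                                  - np.exp(-k * (t - x)))
            ch4[t] += decomposed * 0.5 * 16.0 / 12.0  # CH4 fraction, C->CH4
    return ch4

deposits = [1000.0] * 10 + [0.0] * 10   # tonnes DDOCm/yr, then site closes
print(np.round(fod_methane(deposits, k=0.09, n_years=20), 1))
```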

  8. TEST (Toxicity Estimation Software Tool) Ver 4.1

    EPA Science Inventory

    The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to allow users to easily estimate toxicity and physical properties using a variety of QSAR methodologies. T.E.S.T. allows a user to estimate toxicity without requiring any external programs. Users can input a chem...

  9. Modeling of plant in vitro cultures: overview and estimation of biotechnological processes.

    PubMed

    Maschke, Rüdiger W; Geipel, Katja; Bley, Thomas

    2015-01-01

    Plant cell and tissue cultivations are of growing interest for the production of structurally complex and expensive plant-derived products, especially in pharmaceutical production. Problems with up-scaling, low yields, and high-priced process conditions result in an increased demand for models to provide comprehension, simulation, and optimization of production processes. In the last 25 years, many models have evolved in plant biotechnology; the majority of them are specialized models for a few selected products or nutritional conditions. In this article we review, delineate, and discuss the concepts and characteristics of the most commonly used models. Therefore, the authors focus on models for plant suspension and submerged hairy root cultures. The article includes a short overview of modeling and mathematics and integrated parameters, as well as the application scope for each model. The review is meant to help researchers better understand and utilize the numerous models published for plant cultures, and to select the most suitable model for their purposes. © 2014 Wiley Periodicals, Inc.

  10. Organizational Change Efforts: Methodologies for Assessing Organizational Effectiveness and Program Costs versus Benefits.

    ERIC Educational Resources Information Center

    Macy, Barry A.; Mirvis, Philip H.

    1982-01-01

    A standardized methodology for identifying, defining, and measuring work behavior and performance rather than production, and a methodology that estimates the costs and benefits of work innovation are presented for assessing organizational effectiveness and program costs versus benefits in organizational change programs. Factors in a cost-benefit…

  11. Evaluating Payments for Environmental Services: Methodological Challenges

    PubMed Central

    2016-01-01

    Over the last fifteen years, Payments for Environmental Services (PES) schemes have become very popular environmental policy instruments, but the academic literature has begun to question their additionality. The literature attempts to estimate the causal effect of these programs by applying impact evaluation (IE) techniques. However, PES programs are complex instruments and IE methods cannot be directly applied without adjustments. Based on a systematic review of the literature, this article proposes a framework for the methodological process of designing an IE for PES schemes. It revises and discusses the methodological choices at each step of the process and proposes guidelines for practitioners. PMID:26910850

  12. Comparing risk estimates following diagnostic CT radiation exposures employing different methodological approaches.

    PubMed

    Kashcheev, Valery V; Pryakhin, Evgeny A; Menyaylo, Alexander N; Chekin, Sergey Yu; Ivanov, Viktor K

    2014-06-01

    The current study has two aims: the first is to quantify the difference between radiation risks estimated with the use of organ or effective doses, particularly when planning pediatric and adult computed tomography (CT) examinations. The second aim is to determine the method of calculating organ doses and cancer risk using dose-length product (DLP) for typical routine CT examinations. In both cases, the radiation-induced cancer risks from medical CT examinations were evaluated as a function of gender and age. Lifetime attributable risk values from CT scanning were estimated with the use of ICRP (Publication 103) risk models and Russian national medical statistics data. For populations under the age of 50 y, the risk estimates based on organ doses usually are 30% higher than estimates based on effective doses. In older populations, the difference can be up to a factor of 2.5. The typical distributions of organ doses were defined for Chest Routine, Abdominal Routine, and Head Routine examinations. The distributions of organ doses were dependent on the anatomical region of scanning. The most exposed organs/tissues were thyroid, breast, esophagus, and lungs in cases of Chest Routine examination; liver, stomach, colon, ovaries, and bladder in cases of Abdominal Routine examination; and brain for Head Routine examinations. The conversion factors for calculation of typical organ doses or tissues at risk using DLP were determined. Lifetime attributable risk of cancer estimated with organ doses calculated from DLP was compared with the risk estimated on the basis of organ doses measured with the use of silicon photodiode dosimeters. The estimated difference in LAR is less than 29%.
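
    As a rough illustration of the DLP-based arithmetic described above, organ doses are obtained by scaling the dose-length product with organ- and protocol-specific conversion factors, and LAR is accumulated over organs. All coefficients below are placeholders, not the study's values.

        def organ_dose_mgy(dlp_mgy_cm, conversion_factor):
            # organ dose estimated as k * DLP for a given protocol and organ
            return conversion_factor * dlp_mgy_cm

        def lifetime_attributable_risk(organ_doses_mgy, risk_per_mgy):
            # sum over organs of dose x age/sex-specific risk coefficient
            return sum(organ_doses_mgy[o] * risk_per_mgy[o] for o in organ_doses_mgy)

        doses = {"lung": organ_dose_mgy(400.0, 0.035), "breast": organ_dose_mgy(400.0, 0.030)}
        lar = lifetime_attributable_risk(doses, {"lung": 3e-5, "breast": 2e-5})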

  13. Work participation and arthritis: a systematic overview of challenges, adaptations and opportunities for interventions.

    PubMed

    Hoving, Jan L; van Zwieten, Myra C B; van der Meer, Marrit; Sluiter, Judith K; Frings-Dresen, Monique H W

    2013-07-01

    Understanding the factors that play a role in maintaining people with inflammatory arthritis in the workforce may aid the design of interventions to support work participation. The objective of this systematic overview is to summarize qualitative studies that explore experiences of patients with inflammatory arthritis to remain employed or return to work. Bibliographic databases including MEDLINE, EMBASE and PsycInfo were searched until December 2011 to identify any qualitative studies that focused on experiences, challenges or adaptations of patients with inflammatory arthritis to remain employed. Thematic analyses were used to identify any first or higher order themes for which all data were entered into MAXQDA software. In addition, methodological quality was assessed using an eight-item checklist. Of 6338 citations, 10 studies were included. RA was the condition in eight studies. Individual interviews (six studies) were used more frequently than group interviews (four studies). Methodological quality varied from 2 to 8 points and had no effect on the number of themes identified. Thematic analyses showed seven key concepts important to patients, including disease symptoms, management of the disease, socioeconomic issues, work conditions and adaptations, emotional challenges, interpersonal issues affecting work and family life and meaning of work. By including studies from different countries and settings, we show a comprehensive overview of themes considered important by patients and strengthen our belief that these factors should be considered in interventions that aim to improve work participation for patients with inflammatory arthritis.

  14. Gastric Cancer Incidence Estimation in a Resource-Limited Nation: Use of Endoscopy Registry Methodology

    PubMed Central

    Dominguez, Ricardo L.; Crockett, Seth D.; Lund, Jennifer L.; Suazo, Lia P.; Heidt-Davis, Paris; Martin, Christopher; Morgan, Douglas R.

    2013-01-01

    performed for each new diagnosis of gastric cancer. The ASIRs for the decade were 30.8 for males and 13.9 for females. Clinically, 60.3% of gastric cancers were Borrmann type 3 (ulcerated mass), and evidence of advanced disease with pyloric obstruction was common (35.2%). Subtypes by the Lauren classification were distributed among diffuse (56%), intestinal (34%) and indeterminate (9.9%), in subjects with available pathology (526/670). Conclusions The endoscopy procedure registry may serve as a complementary data resource for gastric cancer incidence estimation in resource-limited nation settings wherein pathology services and cancer registries are absent. The results remain an underestimation in this setting due to the challenges of access to care and related factors. The methodology helps to more fully characterize the high incidence of gastric cancer in western Honduras and this region of Central America, and demonstrates the need for additional epidemiology research and interventions focused on prevention and treatment. PMID:23263776
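
    The ASIR figures above come from direct age standardization: age-specific rates weighted by a standard population. A minimal sketch, with hypothetical age bands and weights:

        def asir_per_100k(cases, person_years, std_weights):
            """Direct standardization: weighted mean of age-specific rates."""
            total_w = sum(std_weights.values())
            rate = sum((cases[a] / person_years[a]) * std_weights[a] for a in cases)
            return rate / total_w * 100_000

        asir = asir_per_100k(
            cases={"40-54": 12, "55-69": 41, "70+": 55},
            person_years={"40-54": 90_000, "55-69": 60_000, "70+": 25_000},
            std_weights={"40-54": 0.55, "55-69": 0.30, "70+": 0.15},
        )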

  15. Systematic reviews, overviews of reviews and comparative effectiveness reviews: a discussion of approaches to knowledge synthesis.

    PubMed

    Hartling, Lisa; Vandermeer, Ben; Fernandes, Ricardo M

    2014-06-01

    The Cochrane Collaboration has been at the forefront of developing methods for knowledge synthesis internationally. We discuss three approaches to synthesize evidence for healthcare interventions: systematic reviews (SRs), overviews of reviews and comparative effectiveness reviews. We illustrate these approaches with examples from knowledge syntheses on interventions for bronchiolitis, a common acute paediatric condition. Some of the differences among these approaches are subtle and methods are not necessarily mutually exclusive to a single review type. Systematic reviews bring together evidence from multiple studies in a rigorous fashion for a single intervention or group of interventions. Systematic reviews, as they have developed within healthcare, often focus on single or select interventions and direct pairwise comparisons; therefore, end-users may need to read several individual SRs to inform decision making. Overviews of reviews compile information from multiple SRs relevant to a single health problem. Overviews provide the end-user with a quick overview of the available evidence; however, overviews are dependent on the methods and decisions employed at the SR level. Furthermore, overviews do not often integrate evidence from different SRs quantitatively. Comparative effectiveness reviews, as we define them here, synthesize relevant evidence from individual studies to describe the relative benefits (or harms) of a range of interventions. Comparative effectiveness reviews may use statistical methods (network meta-analysis) to incorporate direct and indirect evidence; therefore, they can provide stronger inferences about the relative effectiveness (or safety) of interventions. While potentially more expensive and time-consuming to produce, a comparative effectiveness review provides a synthesis of a range of interventions for a given condition and the relative efficacy across interventions using consistent and standardized methodology.

  16. Methodology for building confidence measures

    NASA Astrophysics Data System (ADS)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
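
    The abstract does not give the exact propagation rules, but a common way to realize step (i) for independent sources is noisy-OR fusion, with an importance-weighted aggregate for step (iii). A minimal sketch under those assumptions:

        def combine_sources(reliabilities):
            """Noisy-OR: probability that at least one independent source is right."""
            p_all_wrong = 1.0
            for r in reliabilities:
                p_all_wrong *= 1.0 - r
            return 1.0 - p_all_wrong

        def overall_confidence(element_probs, importance):
            """Importance-weighted aggregate over elements (step iii above)."""
            total = sum(importance.values())
            return sum(element_probs[e] * importance[e] for e in element_probs) / total

        p_elem = combine_sources([0.7, 0.6, 0.5])   # three sources report one element
        conf = overall_confidence({"who": p_elem, "where": 0.8}, {"who": 2.0, "where": 1.0})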

  17. [Statistical (Poisson) motor unit number estimation. Methodological aspects and normal results in the extensor digitorum brevis muscle of healthy subjects].

    PubMed

    Murga Oporto, L; Menéndez-de León, C; Bauzano Poley, E; Núñez-Castaín, M J

    Among the different techniques for motor unit number estimation (MUNE) is the statistical (Poisson) one, in which the activation of motor units is carried out by electrical stimulation and the estimation is performed by means of a statistical analysis based on the Poisson distribution. The study was undertaken in order to give an approachable account of the Poisson MUNE technique, showing a comprehensible view of its methodology, and also to obtain normal results in the extensor digitorum brevis muscle (EDB) from a healthy population. One hundred fourteen normal volunteers with ages ranging from 10 to 88 years were studied using the MUNE software contained in a Viking IV system. The normal subjects were divided into two age groups (10-59 and 60-88 years). The EDB MUNE for all of them was 184 ± 49. Both the MUNE and the amplitude of the compound muscle action potential (CMAP) were significantly lower in the older age group (p < 0.0001), with MUNE showing a better correlation with age than CMAP amplitude (-0.5002 and -0.4142, respectively; p < 0.0001). The statistical MUNE method is an important tool for assessing the physiology of the motor unit. The value of MUNE correlates better with the neuromuscular aging process than CMAP amplitude does.
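
    In textbook form, the statistical method rests on the Poisson scaling property: if the response amplitude is A = s*N with N ~ Poisson, then Var(A)/E(A) estimates the single-unit amplitude s, and MUNE follows by dividing the maximal CMAP by s. A minimal sketch of that relationship (not the Viking IV implementation):

        import statistics

        def poisson_mune(scan_amplitudes_mv, cmap_max_mv):
            mean_a = statistics.mean(scan_amplitudes_mv)
            var_a = statistics.pvariance(scan_amplitudes_mv)
            unit_size = var_a / mean_a   # estimated single motor unit amplitude
            return cmap_max_mv / unit_size

        mune = poisson_mune([2.1, 2.4, 1.9, 2.6, 2.2], cmap_max_mv=9.8)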

  18. Area estimation using multiyear designs and partial crop identification

    NASA Technical Reports Server (NTRS)

    Sielken, R. L., Jr.

    1983-01-01

    Progress is reported for the following areas: (1) estimating the stratum's crop acreage proportion using the multiyear area estimation model; (2) assessment of multiyear sampling designs; and (3) development of statistical methodology for incorporating partially identified sample segments into crop area estimation.

  19. Atmospheric Turbulence Estimates from a Pulsed Lidar

    NASA Technical Reports Server (NTRS)

    Pruis, Matthew J.; Delisi, Donald P.; Ahmad, Nash'at N.; Proctor, Fred H.

    2013-01-01

    Estimates of the eddy dissipation rate (EDR) were obtained from measurements made by a coherent pulsed lidar and compared with estimates from mesoscale model simulations and measurements from an in situ sonic anemometer at the Denver International Airport and with EDR estimates from the last observation time of the trailing vortex pair. The estimates of EDR from the lidar were obtained using two different methodologies. The two methodologies show consistent estimates of the vertical profiles. Comparison of EDR derived from the Weather Research and Forecast (WRF) mesoscale model with the in situ lidar estimates show good agreement during the daytime convective boundary layer, but the WRF simulations tend to overestimate EDR during the nighttime. The EDR estimates from a sonic anemometer located at 7.3 meters above ground level are approximately one order of magnitude greater than both the WRF and lidar estimates - which are from greater heights - during the daytime convective boundary layer and substantially greater during the nighttime stable boundary layer. The consistency of the EDR estimates from different methods suggests a reasonable ability to predict the temporal evolution of a spatially averaged vertical profile of EDR in an airport terminal area using a mesoscale model during the daytime convective boundary layer. In the stable nighttime boundary layer, there may be added value to EDR estimates provided by in situ lidar measurements.

  20. A methodology for overall consequence modeling in chemical industry.

    PubMed

    Arunraj, N S; Maiti, J

    2009-09-30

    Risk assessment in the chemical process industry is very important for safeguarding humans and the ecosystem from damage. Consequence assessment is an integral part of risk assessment. However, the commonly used consequence estimation methods involve either time-consuming, complex mathematical models or a simple summation of losses that does not consider all the consequence factors, which degrades the quality of the estimated risk value. Consequence modeling therefore has to be performed in detail, considering all major losses within a reasonable time, to improve the decision value of the risk estimate. The losses can be broadly categorized into production loss, asset loss, human health and safety loss, and environmental loss. In this paper, a conceptual framework is first developed to assess the overall consequence considering all the important components of these major losses. Second, a methodology is developed for the calculation of all the major losses, which are normalized to yield the overall consequence. Finally, as an illustration, the proposed methodology is applied to a case study plant involving benzene extraction. The case study result using the proposed consequence assessment scheme is compared with those from existing methodologies.
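
    The normalization-and-aggregation step described above can be illustrated with a minimal sketch; the bounds and equal weights are assumptions for illustration, not the paper's scheme.

        def overall_consequence(losses, bounds, weights=None):
            """Scale each loss category to [0, 1] and form a weighted aggregate."""
            weights = weights or {k: 1.0 for k in losses}
            total_w = sum(weights.values())
            score = 0.0
            for k, value in losses.items():
                lo, hi = bounds[k]
                norm = min(max((value - lo) / (hi - lo), 0.0), 1.0)
                score += weights[k] * norm
            return score / total_w

        losses = {"production": 2.1e6, "assets": 0.8e6, "health": 3.5e6, "environment": 1.2e6}
        bounds = {k: (0.0, 5.0e6) for k in losses}   # hypothetical min/max per category
        c = overall_consequence(losses, bounds)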

  1. Sampling based State of Health estimation methodology for Li-ion batteries

    NASA Astrophysics Data System (ADS)

    Camci, Fatih; Ozkurt, Celil; Toker, Onur; Atamuradov, Vepa

    2015-03-01

    Storage and management of energy is an increasingly important problem, especially for electric and hybrid vehicle applications. The Li-ion battery is one of the most important technological alternatives for high-capacity energy storage and related industrial applications. State of Health (SoH) of Li-ion batteries plays a critical role in their deployment from economic, safety, and availability perspectives. Most, if not all, studies related to SoH estimation focus on the measurement of a new parameter or physical phenomenon related to SoH, or on the development of new statistical/computational methods using several parameters. This paper presents a new approach for SoH estimation for Li-ion battery systems with multiple battery cells: the main idea is a new circuit topology that enables separation of battery cells into two groups, main and test batteries, whenever a SoH-related measurement is to be conducted. All battery cells are connected to the main battery during the normal mode of operation. When a measurement is needed for SoH estimation, some of the cells are separated from the main battery, and SoH estimation-related measurements are performed on these units. Compared to classical SoH measurement methods, which deal with the whole battery system, the proposed method estimates the SoH of the system by separating a small but representative set of cells. While SoH measurements are conducted on these isolated cells, the remaining cells in the main battery continue to function in normal mode, albeit at slightly reduced performance levels. Preliminary experimental results are quite promising and validate the feasibility of the proposed approach. Technical details of the proposed circuit architecture are also summarized in the paper.

  2. Tribology and Mechanical Components Branch Overview

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.

    2010-01-01

    An overview of NASA Glenn Research Center's Tribology & Mechanical Components Branch is provided. Work in space mechanisms, seals, oil-free turbomachinery, and mechanical components is presented, together with current research in these technology areas.

  3. The microwave-assisted ionic-liquid method: a promising methodology in nanomaterials.

    PubMed

    Ma, Ming-Guo; Zhu, Jie-Fang; Zhu, Ying-Jie; Sun, Run-Cang

    2014-09-01

    In recent years, the microwave-assisted ionic-liquid method has been accepted as a promising methodology for the preparation of nanomaterials and cellulose-based nanocomposites. Applications of this method in the preparation of cellulose-based nanocomposites comply with the major principles of green chemistry, that is, they use an environmentally friendly method in environmentally preferable solvents to make use of renewable materials. This minireview focuses on the recent development of the synthesis of nanomaterials and cellulose-based nanocomposites by means of the microwave-assisted ionic-liquid method. We first discuss the preparation of nanomaterials including noble metals, metal oxides, complex metal oxides, metal sulfides, and other nanomaterials by means of this method. Then we provide an overview of the synthesis of cellulose-based nanocomposites by using this method. The emphasis is on the synthesis, microstructure, and properties of nanostructured materials obtained through this methodology. Our recent research on nanomaterials and cellulose-based nanocomposites by this rapid method is summarized. In addition, the formation mechanisms involved in the microwave-assisted ionic-liquid synthesis of nanostructured materials are discussed briefly. Finally, the future perspectives of this methodology in the synthesis of nanostructured materials are proposed. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Methodology for earthquake rupture rate estimates of fault networks: example for the western Corinth rift, Greece

    NASA Astrophysics Data System (ADS)

    Chartier, Thomas; Scotti, Oona; Lyon-Caen, Hélène; Boiselet, Aurélien

    2017-10-01

    Modeling the seismic potential of active faults is a fundamental step of probabilistic seismic hazard assessment (PSHA). An accurate estimation of the rate of earthquakes on the faults is necessary in order to obtain the probability of exceedance of a given ground motion. Most PSHA studies consider faults as independent structures and neglect the possibility of multiple faults or fault segments rupturing simultaneously (fault-to-fault, FtF, ruptures). The Uniform California Earthquake Rupture Forecast version 3 (UCERF-3) model takes this possibility into account by considering a system-level approach rather than an individual-fault-level approach, using geological, seismological and geodetic information to invert the earthquake rates. In many places of the world, seismological and geodetic information along fault networks is not well constrained. There is therefore a need for a methodology relying on geological information alone to compute earthquake rates of the faults in the network. In the proposed methodology, a simple distance criterion is used to define FtF ruptures, and single-fault or FtF ruptures are treated as an aleatory uncertainty, similarly to UCERF-3. Rates of earthquakes on faults are then computed under two constraints: the magnitude frequency distribution (MFD) of earthquakes in the fault system as a whole must follow an a priori chosen shape, and the rate of earthquakes on each fault is determined by the slip rate of each segment depending on the possible FtF ruptures. The modeled earthquake rates are then compared to the available independent data (geodetic, seismological and paleoseismological) in order to weight the different hypotheses explored in a logic tree. The methodology is tested on the western Corinth rift (WCR), Greece, where recent advances have been made in understanding the geological slip rates of the complex network of normal faults that accommodate the ~15 mm yr-1 north...
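
    The budget such an inversion distributes is the geologic moment rate implied by each segment's slip rate. A minimal sketch of that conversion, assuming a typical crustal shear modulus and, purely for illustration, spending the whole budget on one characteristic magnitude:

        MU = 3.0e10  # crustal shear modulus, Pa (typical assumption)

        def moment_rate(area_m2, slip_rate_m_yr):
            return MU * area_m2 * slip_rate_m_yr   # N*m per year

        def moment_of(mw):
            return 10 ** (1.5 * mw + 9.1)          # Hanks-Kanamori relation, N*m

        # a 40 km x 15 km fault slipping 3 mm/yr -> annual rate of Mw 6.5 events
        rate = moment_rate(40e3 * 15e3, 3e-3) / moment_of(6.5)

        # In the full methodology this rate is spread over single-fault and FtF
        # ruptures so that the system-wide MFD follows the chosen a priori shape.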

  5. Introduction to the special collection of papers on the San Luis Basin Sustainability Metrics Project: a methodology for evaluating regional sustainability.

    PubMed

    Heberling, Matthew T; Hopton, Matthew E

    2012-11-30

    This paper introduces a collection of four articles describing the San Luis Basin Sustainability Metrics Project. The Project developed a methodology for evaluating regional sustainability. This introduction provides the necessary background information for the project, description of the region, overview of the methods, and summary of the results. Although there are a multitude of scientifically based sustainability metrics, many are data intensive, difficult to calculate, and fail to capture all aspects of a system. We wanted to see if we could develop an approach that decision-makers could use to understand if their system was moving toward or away from sustainability. The goal was to produce a scientifically defensible, but straightforward and inexpensive methodology to measure and monitor environmental quality within a regional system. We initiated an interdisciplinary pilot project in the San Luis Basin, south-central Colorado, to test the methodology. The objectives were: 1) determine the applicability of using existing datasets to estimate metrics of sustainability at a regional scale; 2) calculate metrics through time from 1980 to 2005; and 3) compare and contrast the results to determine if the system was moving toward or away from sustainability. The sustainability metrics, chosen to represent major components of the system, were: 1) Ecological Footprint to capture the impact and human burden on the system; 2) Green Net Regional Product to represent economic welfare; 3) Emergy to capture the quality-normalized flow of energy through the system; and 4) Fisher information to capture the overall dynamic order and to look for possible regime changes. The methodology, data, and results of each metric are presented in the remaining four papers of the special collection. Based on the results of each metric and our criteria for understanding the sustainability trends, we find that the San Luis Basin is moving away from sustainability. Although we understand

  6. Population forecasts for Bangladesh, using a Bayesian methodology.

    PubMed

    Mahsin, Md; Hossain, Syed Shahadat

    2012-12-01

    Population projection for many developing countries can be quite a challenging task for demographers, mostly due to the lack of sufficiently reliable data. The objective of this paper is to present an overview of the existing methods for population forecasting and to propose an alternative based on Bayesian statistics, bringing the formality of statistical inference to the task. The analysis has been made using the Markov Chain Monte Carlo (MCMC) technique for Bayesian methodology available with the software WinBUGS. Convergence diagnostic techniques available with the WinBUGS software have been applied to ensure the convergence of the chains necessary for the implementation of MCMC. The Bayesian approach allows for the use of observed data and expert judgements by means of appropriate priors, and more realistic population forecasts, along with associated uncertainty, have been possible.
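
    The WinBUGS model itself is not reproduced in the abstract; as a generic illustration of the MCMC machinery, here is a small Metropolis sampler for a single growth-rate parameter under an exponential-growth likelihood (prior, step size, and data are all hypothetical):

        import math, random

        def log_post(r, pops, sigma=0.02, mu0=0.02, sd0=0.01):
            lp = -0.5 * ((r - mu0) / sd0) ** 2               # Gaussian prior on r
            for t in range(1, len(pops)):
                resid = math.log(pops[t] / pops[t - 1]) - r  # log-growth residual
                lp += -0.5 * (resid / sigma) ** 2
            return lp

        def metropolis(pops, n_iter=10_000, step=0.005):
            r = 0.02
            lp = log_post(r, pops)
            draws = []
            for _ in range(n_iter):
                cand = r + random.gauss(0.0, step)
                lp_c = log_post(cand, pops)
                if random.random() < math.exp(min(0.0, lp_c - lp)):
                    r, lp = cand, lp_c
                draws.append(r)
            return draws   # posterior sample for r -> forecasts with uncertainty

        draws = metropolis([100.0, 102.1, 104.0, 106.3, 108.2])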

  7. Estimates of alcohol involvement in fatal crashes : new alcohol methodology

    DOT National Transportation Integrated Search

    2002-01-01

    The National Highway Traffic Safety Administration (NHTSA) has adopted a new method to estimate missing blood alcohol concentration (BAC) test result data. This new method, multiple imputation, will be used by NHTSA's National Center for Statis...
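
    The snippet does not describe NHTSA's imputation model, but the general multiple-imputation pattern is: fill in the missing BACs several times, analyze each completed dataset, and pool the results with Rubin's rules. A deliberately simple hot-deck sketch of that pattern:

        import random, statistics

        def multiply_impute(bac, m=5):
            """bac: list with None for missing; returns m completed datasets."""
            observed = [v for v in bac if v is not None]
            return [[v if v is not None else random.choice(observed) for v in bac]
                    for _ in range(m)]

        def pool_means(datasets):
            """Rubin's rules for the pooled mean and its total variance."""
            m = len(datasets)
            means = [statistics.mean(d) for d in datasets]
            within = statistics.mean([statistics.pvariance(d) / len(d) for d in datasets])
            between = statistics.variance(means)
            return statistics.mean(means), within + (1.0 + 1.0 / m) * between

        datasets = multiply_impute([0.00, 0.12, None, 0.21, None, 0.08])
        pooled_mean, total_var = pool_means(datasets)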

  8. Opportunities and methodological challenges in EEG and MEG resting state functional brain network research.

    PubMed

    van Diessen, E; Numan, T; van Dellen, E; van der Kooi, A W; Boersma, M; Hofman, D; van Lutterveld, R; van Dijk, B W; van Straaten, E C W; Hillebrand, A; Stam, C J

    2015-08-01

    Electroencephalogram (EEG) and magnetoencephalogram (MEG) recordings during resting state are increasingly used to study functional connectivity and network topology. Moreover, the number of different analysis approaches is expanding along with the rising interest in this research area. The comparison between studies can therefore be challenging, and discussion is needed to underscore methodological opportunities and pitfalls in functional connectivity and network studies. In this overview we discuss methodological considerations throughout the analysis pipeline of recording and analyzing resting state EEG and MEG data, with a focus on functional connectivity and network analysis. We summarize current common practices with their advantages and disadvantages, provide practical tips, and make suggestions for future research. Finally, we discuss how methodological choices in resting state research can affect the construction of functional networks. By taking advantage of current best practices and avoiding the most obvious pitfalls, functional connectivity and network studies can be improved, enabling more accurate interpretation and comparison between studies. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  9. Methodological quality of systematic reviews analyzing the use of laser therapy in restorative dentistry.

    PubMed

    Salmos, Janaina; Gerbi, Marleny E M M; Braz, Rodivan; Andrade, Emanuel S S; Vasconcelos, Belmiro C E; Bessa-Nogueira, Ricardo V

    2010-01-01

    The purpose of this study was to identify systematic reviews (SRs) that compared laser with other dental restorative procedures and to evaluate their methodological quality. A search strategy was developed and implemented for MEDLINE, the Cochrane Library, LILACS, and the Brazilian Dentistry Bibliography (1966-2007). Inclusion criteria were: the article had to be an SR (± meta-analysis); primary focus was the use of laser in restorative dentistry; published in English, Spanish, Portuguese, Italian, or German. Two investigators independently selected and evaluated the SRs. The overview quality assessment questionnaire (OQAQ) was used to evaluate methodological quality, and the results were averaged. There were 145 references identified, of which seven were SRs that met the inclusion criteria (kappa = 0.81). Of the SRs, 71.4% appraised lasers in dental caries diagnosis. The mean overall OQAQ score was 4.4 [95% confidence interval (CI) 2.4-6.5]. Of the SRs, 57.1% had major flaws, scoring ≤ 4. SR methodological quality is low; therefore, clinicians should critically appraise them prior to considering their recommendations to guide patient care.

  10. Land Remote Sensing Overview

    NASA Technical Reports Server (NTRS)

    Byrnes, Ray

    2007-01-01

    A general overview of the USGS land remote sensing program is presented. The contents include: 1) Brief overview of USGS land remote sensing program; 2) Highlights of JACIE work at USGS; 3) Update on NASA/USGS Landsat Data Continuity Mission; and 4) Notes on alternative data sources.

  11. Seismic Hazard Estimates Using Ill-defined Macroseismic Data at Site

    NASA Astrophysics Data System (ADS)

    Albarello, D.; Mucciarelli, M.

    A new approach is proposed for seismic hazard estimation based on documentary data concerning the local history of seismic effects. The adopted methodology allows for the use of "poor" data, such as macroseismic observations, within a formally coherent approach that overcomes a number of problems connected to forcing the available information into the frame of "standard" methodologies calibrated on instrumental data. The proposed methodology allows full exploitation of all the available information (which for many towns in Italy covers several centuries), making possible a correct use of macroseismic data characterized by different levels of completeness and reliability. As an application of the proposed methodology, seismic hazard estimates are presented for two towns located in Northern Italy: Bologna and Carpi.

  12. The ISO 50001 Impact Estimator Tool (IET 50001 V1.1.4) - User Guide and Introduction to the ISO 50001 Impacts Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Therkelsen, Peter L.; Rao, Prakash; Aghajanzadeh, Arian

    ISO 50001 - Energy management systems - Requirements with guidance for use - is an internationally developed standard that provides organizations with a flexible framework for implementing an energy management system (EnMS) with the goal of continual energy performance improvement. The ISO 50001 standard was first published in 2011 and has since seen growth in the number of certificates issued around the world, primarily in the industrial (agriculture, manufacturing, and mining) and service (commercial) sectors. Policy makers in many regions and countries are looking to or are already using ISO 50001 as a basis for energy efficiency, carbon reduction, and other energy performance improvement schemes. The Impact Estimator Tool 50001 (IET 50001 Tool) is a computational model developed to assist researchers and policy makers determine the potential impact of ISO 50001 implementation in the industrial and service (commercial) sectors for a given region or country. The IET 50001 Tool is based upon a methodology initially developed by the Lawrence Berkeley National Laboratory that has been improved upon and vetted by a group of international researchers. By using a commonly accepted and transparent methodology, users of the IET 50001 Tool can easily and clearly communicate the potential impact of ISO 50001 for a region or country.
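
    The IET 50001's internal assumptions are not spelled out in the abstract; the general shape of such an impact estimate is sector energy coverage multiplied by a compounding performance-improvement rate. A purely illustrative sketch (all parameter names and numbers are hypothetical, not the tool's):

        def cumulative_savings_pj(sector_energy_pj, adoption_share,
                                  annual_improvement, years):
            """Cumulative energy saved if adoption_share of a sector's consumption
            comes under an EnMS improving energy performance by
            annual_improvement per year."""
            covered = sector_energy_pj * adoption_share
            saved, performance = 0.0, 1.0
            for _ in range(years):
                performance *= 1.0 - annual_improvement
                saved += covered * (1.0 - performance)
            return saved

        savings = cumulative_savings_pj(1_000.0, 0.25, 0.02, 10)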

  13. Developing a Methodology for Eliciting Subjective Probability Estimates During Expert Evaluations of Safety Interventions: Application for Bayesian Belief Networks

    NASA Technical Reports Server (NTRS)

    Wiegmann, Douglas A.

    2005-01-01

    The NASA Aviation Safety Program (AvSP) has defined several products that will potentially modify airline and/or ATC operations, enhance aircraft systems, and improve the identification of potentially hazardous situations within the National Airspace System (NAS). Consequently, there is a need to develop methods for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested, using expert judgments to develop Bayesian Belief Networks (BBNs) that model the potential impact that specific interventions may have. Specifically, the present report summarizes methodologies for improving the elicitation of probability estimates during expert evaluations of AvSP products for use in BBNs. The work involved joint efforts between Professor James Luxhoj of Rutgers University and researchers at the University of Illinois. The Rutgers project to develop BBNs received funding from NASA under the title "Probabilistic Decision Support for Evaluating Technology Insertion and Assessing Aviation Safety System Risk." The proposed project was funded separately but supported the existing Rutgers program.

  14. Allometric scaling theory applied to FIA biomass estimation

    Treesearch

    David C. Chojnacky

    2002-01-01

    Tree biomass estimates in the Forest Inventory and Analysis (FIA) database are derived from numerous methodologies whose abundance and complexity raise questions about consistent results throughout the U.S. A new model based on allometric scaling theory ("WBE") offers simplified methodology and a theoretically sound basis for improving the reliability and...

  15. Asynchronous Processing of a Constellation of Geostationary and Polar-Orbiting Satellites for Fire Detection and Smoke Estimation

    NASA Astrophysics Data System (ADS)

    Hyer, E. J.; Peterson, D. A.; Curtis, C. A.; Schmidt, C. C.; Hoffman, J.; Prins, E. M.

    2014-12-01

    The Fire Locating and Monitoring of Burning Emissions (FLAMBE) system converts satellite observations of thermally anomalous pixels into spatially and temporally continuous estimates of smoke release from open biomass burning. This system currently processes data from a constellation of 5 geostationary and 2 polar-orbiting sensors. Additional sensors, including NPP VIIRS and the imager on the Korea COMS-1 geostationary satellite, will soon be added. This constellation experiences schedule changes and outages of various durations, making the set of available scenes for fire detection highly variable on an hourly and daily basis. Adding to the complexity, the latency of the satellite data is variable between and within sensors. FLAMBE shares with many fire detection systems the goal of detecting as many fires as possible as early as possible, but the FLAMBE system must also produce a consistent estimate of smoke production with minimal artifacts from the changing constellation. To achieve this, NRL has developed a system of asynchronous processing and cross-calibration that permits satellite data to be used as it arrives, while preserving the consistency of the smoke emission estimates. This talk describes the asynchronous data ingest methodology, including latency statistics for the constellation. We also provide an overview and show results from the system we have developed to normalize multi-sensor fire detection for consistency.

  16. Design and Implementation of an Intelligent Cost Estimation Model for Decision Support System Software

    DTIC Science & Technology

    1990-09-01

    The COCOMO model, which stands for COnstructive COst MOdel, was developed by Barry Boehm and is ... a cost estimation model which uses an expert system to automate the Intermediate COnstructive Cost Estimation MOdel (COCOMO), developed by Barry W. Boehm ...
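
    The Intermediate COCOMO equations themselves are published (Boehm, 1981): effort is a power law in delivered source size, scaled by the product of fifteen cost-driver multipliers. A small sketch using the standard published coefficients (the thesis's expert-system layer for choosing the multipliers is not reproduced):

        def intermediate_cocomo(kloc, eaf=1.0, mode="organic"):
            """Returns (effort in person-months, schedule in months); eaf is the
            product of the 15 cost-driver effort multipliers."""
            a, b = {"organic": (3.2, 1.05),
                    "semi-detached": (3.0, 1.12),
                    "embedded": (2.8, 1.20)}[mode]
            effort = a * kloc ** b * eaf
            c, d = {"organic": (2.5, 0.38),
                    "semi-detached": (2.5, 0.35),
                    "embedded": (2.5, 0.32)}[mode]
            return effort, c * effort ** d

        effort_pm, schedule_mo = intermediate_cocomo(32.0, eaf=1.17, mode="semi-detached")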

  17. A methodology for designing robust multivariable nonlinear control systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Grunberg, D. B.

    1986-01-01

    A new methodology is described for the design of nonlinear dynamic controllers for nonlinear multivariable systems providing guarantees of closed-loop stability, performance, and robustness. The methodology is an extension of the Linear-Quadratic-Gaussian with Loop-Transfer-Recovery (LQG/LTR) methodology for linear systems, thus hinging upon the idea of constructing an approximate inverse operator for the plant. A major feature of the methodology is a unification of both the state-space and input-output formulations. In addition, new results on stability theory, nonlinear state estimation, and optimal nonlinear regulator theory are presented, including the guaranteed global properties of the extended Kalman filter and optimal nonlinear regulators.

  18. Overview: The Design, Adoption, and Analysis of a Visual Document Mining Tool for Investigative Journalists.

    PubMed

    Brehmer, Matthew; Ingram, Stephen; Stray, Jonathan; Munzner, Tamara

    2014-12-01

    For an investigative journalist, a large collection of documents obtained from a Freedom of Information Act request or a leak is both a blessing and a curse: such material may contain multiple newsworthy stories, but it can be difficult and time consuming to find relevant documents. Standard text search is useful, but even if the search target is known it may not be possible to formulate an effective query. In addition, summarization is an important non-search task. We present Overview, an application for the systematic analysis of large document collections based on document clustering, visualization, and tagging. This work contributes to the small set of design studies which evaluate a visualization system "in the wild", and we report on six case studies where Overview was voluntarily used by self-initiated journalists to produce published stories. We find that the frequently-used language of "exploring" a document collection is both too vague and too narrow to capture how journalists actually used our application. Our iterative process, including multiple rounds of deployment and observations of real world usage, led to a much more specific characterization of tasks. We analyze and justify the visual encoding and interaction techniques used in Overview's design with respect to our final task abstractions, and propose generalizable lessons for visualization design methodology.

  19. Daily estimates of soil ingestion in children.

    PubMed Central

    Stanek, E J; Calabrese, E J

    1995-01-01

    Soil ingestion estimates play an important role in risk assessment of contaminated sites, and estimates of soil ingestion in children are of special interest. Current estimates of soil ingestion are trace-element specific and vary widely among elements. Although expressed as daily estimates, the actual estimates have been constructed by averaging soil ingestion over a study period of several days. The wide variability has resulted in uncertainty as to which method of estimation of soil ingestion is best. We developed a methodology for calculating a single estimate of soil ingestion for each subject for each day. Because the daily soil ingestion estimate represents the median estimate of eligible daily trace-element-specific soil ingestion estimates for each child, this median estimate is not trace-element specific. Summary estimates for individuals and weeks are calculated using these daily estimates. Using this methodology, the median daily soil ingestion estimate for 64 children participating in the 1989 Amherst soil ingestion study is 13 mg/day or less for 50% of the children and 138 mg/day or less for 95% of the children. Mean soil ingestion estimates (for up to an 8-day period) were 45 mg/day or less for 50% of the children, whereas 95% of the children reported a mean soil ingestion of 208 mg/day or less. Daily soil ingestion estimates were used subsequently to estimate the mean and variance in soil ingestion for each child and to extrapolate a soil ingestion distribution over a year, assuming that soil ingestion followed a log-normal distribution. PMID:7768230
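
    The per-child-per-day construction described above can be sketched as a tracer mass balance followed by a median across elements; the tracer names and values below are illustrative only.

        import statistics

        def soil_mg_per_day(fecal_ug, dietary_ug, soil_ug_per_g):
            """One tracer element: tracer excreted minus tracer from food and
            medicine, divided by the tracer's soil concentration; mg soil/day."""
            return (fecal_ug - dietary_ug) / soil_ug_per_g * 1000.0

        def child_day_estimate(per_element_estimates):
            """Median of the eligible element-specific estimates for that day."""
            return statistics.median(per_element_estimates)

        day_estimate = child_day_estimate([
            soil_mg_per_day(210.0, 195.0, 60.0),    # e.g., aluminum
            soil_mg_per_day(480.0, 450.0, 250.0),   # e.g., silicon
            soil_mg_per_day(3.1, 2.9, 4.0),         # e.g., titanium
        ])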

  20. A Systematic Overview of Reviews for Complementary and Alternative Therapies in the Treatment of the Fibromyalgia Syndrome

    PubMed Central

    Häuser, Winfried; Dobos, Gustav; Langhorst, Jost

    2015-01-01

    Objectives. This systematic overview of reviews aimed to summarize evidence and methodological quality from systematic reviews of complementary and alternative medicine (CAM) for the fibromyalgia syndrome (FMS). Methods. The PubMed/MEDLINE, Cochrane Library, and Scopus databases were screened from their inception to Sept 2013 to identify systematic reviews and meta-analyses of CAM interventions for FMS. Methodological quality of reviews was rated using the AMSTAR instrument. Results. Altogether 25 systematic reviews were found; they investigated the evidence of CAM in general, exercise-based CAM therapies, manipulative therapies, Mind/Body therapies, acupuncture, hydrotherapy, phytotherapy, and homeopathy. Methodological quality of reviews ranged from lowest to highest possible quality. Consistently positive results were found for tai chi, yoga, meditation and mindfulness-based interventions, hypnosis or guided imagery, electromyogram (EMG) biofeedback, and balneotherapy/hydrotherapy. Inconsistent results concerned qigong, acupuncture, chiropractic interventions, electroencephalogram (EEG) biofeedback, and nutritional supplements. Inconclusive results were found for homeopathy and phytotherapy. Major methodological flaws included missing details on data extraction process, included or excluded studies, study details, and adaptation of conclusions based on quality assessment. Conclusions. Despite a growing body of scientific evidence of CAM therapies for the management of FMS systematic reviews still show methodological flaws limiting definite conclusions about their efficacy and safety. PMID:26246841

  1. A Systematic Overview of Reviews for Complementary and Alternative Therapies in the Treatment of the Fibromyalgia Syndrome.

    PubMed

    Lauche, Romy; Cramer, Holger; Häuser, Winfried; Dobos, Gustav; Langhorst, Jost

    2015-01-01

    Objectives. This systematic overview of reviews aimed to summarize evidence and methodological quality from systematic reviews of complementary and alternative medicine (CAM) for the fibromyalgia syndrome (FMS). Methods. The PubMed/MEDLINE, Cochrane Library, and Scopus databases were screened from their inception to Sept 2013 to identify systematic reviews and meta-analyses of CAM interventions for FMS. Methodological quality of reviews was rated using the AMSTAR instrument. Results. Altogether 25 systematic reviews were found; they investigated the evidence of CAM in general, exercise-based CAM therapies, manipulative therapies, Mind/Body therapies, acupuncture, hydrotherapy, phytotherapy, and homeopathy. Methodological quality of reviews ranged from lowest to highest possible quality. Consistently positive results were found for tai chi, yoga, meditation and mindfulness-based interventions, hypnosis or guided imagery, electromyogram (EMG) biofeedback, and balneotherapy/hydrotherapy. Inconsistent results concerned qigong, acupuncture, chiropractic interventions, electroencephalogram (EEG) biofeedback, and nutritional supplements. Inconclusive results were found for homeopathy and phytotherapy. Major methodological flaws included missing details on data extraction process, included or excluded studies, study details, and adaptation of conclusions based on quality assessment. Conclusions. Despite a growing body of scientific evidence of CAM therapies for the management of FMS systematic reviews still show methodological flaws limiting definite conclusions about their efficacy and safety.

  2. Simplified Methodology to Estimate the Maximum Liquid Helium (LHe) Cryostat Pressure from a Vacuum Jacket Failure

    NASA Technical Reports Server (NTRS)

    Ungar, Eugene K.; Richards, W. Lance

    2015-01-01

    The aircraft-based Stratospheric Observatory for Infrared Astronomy (SOFIA) is a platform for multiple infrared astronomical observation experiments. These experiments carry sensors cooled to liquid helium temperatures. The liquid helium supply is contained in large (i.e., 10 liters or more) vacuum-insulated dewars. Should the dewar vacuum insulation fail, the inrushing air will condense and freeze on the dewar wall, resulting in a large heat flux on the dewar's contents. The heat flux results in a rise in pressure and the actuation of the dewar pressure relief system. A previous NASA Engineering and Safety Center (NESC) assessment provided recommendations for the wall heat flux that would be expected from a loss of vacuum and detailed an appropriate method to use in calculating the maximum pressure that would occur in a loss of vacuum event. This method involved building a detailed supercritical helium compressible flow thermal/fluid model of the vent stack and exercising the model over the appropriate range of parameters. The experimenters designing science instruments for SOFIA are not experts in compressible supercritical flows and do not generally have access to the thermal/fluid modeling packages that are required to build detailed models of the vent stacks. Therefore, the SOFIA Program engaged the NESC to develop a simplified methodology to estimate the maximum pressure in a liquid helium dewar after the loss of vacuum insulation. The method would allow the university-based science instrument development teams to conservatively determine the cryostat's vent neck sizing during preliminary design of new SOFIA Science Instruments. This report details the development of the simplified method, the method itself, and the limits of its applicability. The simplified methodology provides an estimate of the dewar pressure after a loss of vacuum insulation that can be used for the initial design of the liquid helium dewar vent stacks. However, since it is not an exact

  3. Baseline Study Methodology for Future Phases of Research on Nuclear Power Plant Control Room Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Blanc, Katya Lee; Bower, Gordon Ross; Hill, Rachael Ann

    In order to provide a basis for industry adoption of advanced technologies, the Control Room Upgrades Benefits Research Project will investigate the benefits of including advanced technologies as part of control room modernization. This report describes the background, methodology, and research plan for the first in a series of full-scale studies to test the effects of advanced technology in NPP control rooms. This study will test the effect of Advanced Overview Displays in the partner utility's control room simulator.

  4. Computing Inter-Rater Reliability for Observational Data: An Overview and Tutorial

    PubMed Central

    Hallgren, Kevin A.

    2012-01-01

    Many research designs require the assessment of inter-rater reliability (IRR) to demonstrate consistency among observational ratings provided by multiple coders. However, many studies use incorrect statistical procedures, fail to fully report the information necessary to interpret their results, or do not address how IRR affects the power of their subsequent analyses for hypothesis testing. This paper provides an overview of methodological issues related to the assessment of IRR with a focus on study design, selection of appropriate statistics, and the computation, interpretation, and reporting of some commonly-used IRR statistics. Computational examples include SPSS and R syntax for computing Cohen’s kappa and intra-class correlations to assess IRR. PMID:22833776
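
    The article provides SPSS and R syntax; the same computation is straightforward in any language. A minimal Python version of Cohen's kappa for two coders assigning nominal codes:

        from collections import Counter

        def cohens_kappa(rater1, rater2):
            """kappa = (p_o - p_e) / (1 - p_e) for two raters' nominal codes."""
            n = len(rater1)
            p_o = sum(a == b for a, b in zip(rater1, rater2)) / n   # observed agreement
            c1, c2 = Counter(rater1), Counter(rater2)
            p_e = sum(c1[k] * c2[k] for k in c1) / (n * n)          # chance agreement
            return (p_o - p_e) / (1.0 - p_e)

        kappa = cohens_kappa(["yes", "no", "yes", "yes"], ["yes", "no", "no", "yes"])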

  5. Decentralized and self-centered estimation architecture for formation flying of spacecraft

    NASA Technical Reports Server (NTRS)

    Kang, B. H.; Hadaegh, F. Y.; Scharf, D. P.; Ke, N. -P.

    2001-01-01

    Formation estimation methodologies for distributed spacecraft systems are formulated and analyzed. A generic form of the formation estimation problem is described by defining a common hardware configuration, observation graph, and feasible estimation topologies.

  6. Estimation of fecundability from survey data.

    PubMed

    Goldman, N; Westoff, C F; Paul, L E

    1985-01-01

    The estimation of fecundability from survey data is plagued by methodological problems such as misreporting of dates of birth and marriage and the occurrence of premarital exposure to the risk of conception. Nevertheless, estimates of fecundability from World Fertility Survey data for women married in recent years appear to be plausible for most of the surveys analyzed here and are quite consistent with estimates reported in earlier studies. The estimates presented in this article are all derived from the first interval, the interval between marriage or consensual union and the first live birth conception.
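
    Under the classical geometric waiting-time model often used with first-interval data, fecundability p is the per-month probability of conception, and its maximum-likelihood estimate is the number of women divided by their total months of exposure. A minimal sketch (the textbook homogeneous model, not necessarily the authors' exact estimator):

        def fecundability_mle(months_to_conception):
            """Geometric model: P(conception in month k) = p * (1 - p)**(k - 1)."""
            n = len(months_to_conception)
            return n / sum(months_to_conception)

        p_hat = fecundability_mle([3, 1, 7, 2, 5])   # five women -> p ~ 0.28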

  7. Combat Stress: A Collateral Effect in the Operational Effectiveness Loss Multiplier (OELM) Methodology

    DTIC Science & Technology

    2015-02-01


  8. Cost Estimation of Naval Ship Acquisition.

    DTIC Science & Technology

    1983-12-01

    one a 9-subsystem model, the other a single total cost model. The models were developed using the linear least squares regression technique with ... Keywords: Cost estimation; Acquisition; Parametric cost estimate; linear ...

  9. Methodology for Estimating Thermal and Neutron Embrittlement of Cast Austenitic Stainless Steels During Service in Light Water Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chopra, O. K.; Rao, A. S.

    2016-04-28

    Cast austenitic stainless steel (CASS) materials, which have a duplex structure consisting of austenite and ferrite phases, are susceptible to thermal embrittlement during reactor service. In addition, the prolonged exposure of these materials, which are used in reactor core internals, to neutron irradiation changes their microstructure and microchemistry, and these changes degrade their fracture properties even further. This paper presents a revision of the procedure and correlations presented in NUREG/CR-4513, Rev. 1 (Aug. 1994) for predicting the change in fracture toughness and tensile properties of CASS components due to thermal aging during service in light water reactors (LWRs) at 280–330 °C (535–625 °F). The methodology is applicable to CF-3, CF-3M, CF-8, and CF-8M materials with a ferrite content of up to 40%. The fracture toughness, tensile strength, and Charpy-impact energy of aged CASS materials are estimated from known material information. Embrittlement is characterized in terms of room-temperature (RT) Charpy-impact energy. The extent or degree of thermal embrittlement at “saturation” (i.e., the minimum impact energy that can be achieved for a material after long-term aging) is determined from the chemical composition of the material. Charpy-impact energy as a function of the time and temperature of reactor service is estimated from the kinetics of thermal embrittlement, which are also determined from the chemical composition. The fracture toughness J-R curve for the aged material is then obtained by correlating RT Charpy-impact energy with fracture toughness parameters. A common “predicted lower-bound” J-R curve for CASS materials of unknown chemical composition is also defined for a given grade of material, range of ferrite content, and temperature. In addition, guidance is provided for evaluating the combined effects of thermal and neutron embrittlement of CASS materials used in the reactor core internal components.
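
    The correlations themselves are in NUREG/CR-4513; structurally, the estimation chains an Arrhenius time-temperature parameter into a sigmoidal decline of RT Charpy energy toward the composition-determined saturation value. The skeleton below is runnable but uses placeholder constants, not the report's fits:

        import math

        def aging_parameter(hours, temp_c, q_kj_mol=100.0):
            """Service time shifted to a 400 C reference via Arrhenius kinetics;
            the activation energy here is a placeholder (the report derives it
            from composition)."""
            R = 8.314e-3   # kJ/(mol K)
            t, t_ref = temp_c + 273.15, 673.15
            return math.log10(hours) - (q_kj_mol / (R * math.log(10))) * (1.0 / t - 1.0 / t_ref)

        def rt_charpy(cv_unaged, cv_saturation, p, p_half=3.0, width=0.8):
            """Sigmoidal interpolation between unaged and saturation RT impact
            energy; shape constants are illustrative."""
            frac_unaged = 1.0 / (1.0 + math.exp((p - p_half) / width))
            return cv_saturation + (cv_unaged - cv_saturation) * frac_unaged

        cv = rt_charpy(160.0, 40.0, aging_parameter(2.0e5, 320.0))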

  10. Postmortem aviation forensic toxicology: an overview.

    PubMed

    Chaturvedi, Arvind K

    2010-05-01

    An overview of the subtopic aviation combustion toxicology of the field of aerospace toxicology has been published. In a continuation of the overview, the findings associated with postmortem aviation forensic toxicology are being summarized in the present overview. A literature search for the period of 1960-2007 was performed. The important findings related to postmortem toxicology were evaluated. In addition to a brief introduction, this overview is divided into the sections of analytical methods; carboxyhemoglobin and blood cyanide ion; ethanol; drugs; result interpretation; glucose and hemoglobin A(1c); and references. Specific details of the subject matter were discussed. It is anticipated that this overview will be an outline source for aviation forensic toxicology within the field of aerospace toxicology.

  11. NOAA Office of Exploration and Research > Education > Overview

    Science.gov Websites


  12. Methodology and Estimates of Scour at Selected Bridge Sites in Alaska

    USGS Publications Warehouse

    Heinrichs, Thomas A.; Kennedy, Ben W.; Langley, Dustin E.; Burrows, Robert L.

    2001-01-01

    The U.S. Geological Survey estimated scour depths at 325 bridges in Alaska as part of a cooperative agreement with the Alaska Department of Transportation and Public Facilities. The department selected these sites from approximately 806 State-owned bridges as potentially susceptible to scour during extreme floods. Pier scour and contraction scour were computed for the selected bridges by using methods recommended by the Federal Highway Administration. The U.S. Geological Survey used a four-step procedure to estimate scour: (1) Compute magnitudes of the 100- and 500-year floods. (2) Determine cross-section geometry and hydraulic properties for each bridge site. (3) Compute the water-surface profile for the 100- and 500-year floods. (4) Compute contraction and pier scour. This procedure is unique because the cross sections were developed from existing data on file to make a quantitative estimate of scour. This screening method has the advantage of providing scour depths and bed elevations for comparison with bridge-foundation elevations without the time and expense of a field survey. Four examples of bridge-scour analyses are summarized in the appendix.
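
    The pier-scour computation in step (4) typically follows the FHWA HEC-18 (CSU) equation. A minimal sketch with the correction factors collapsed to plausible defaults:

        def pier_scour_depth_m(y1, b, v1, k1=1.0, k2=1.0, k3=1.1, k4=1.0, g=9.81):
            """HEC-18 / CSU: ys = 2.0 y1 K1 K2 K3 K4 (b/y1)^0.65 Fr^0.43
            y1: approach flow depth (m), b: pier width (m), v1: velocity (m/s);
            K1-K4 are pier shape, attack angle, bed condition, and armoring
            factors (defaults here are illustrative)."""
            fr = v1 / (g * y1) ** 0.5          # approach Froude number
            return 2.0 * y1 * k1 * k2 * k3 * k4 * (b / y1) ** 0.65 * fr ** 0.43

        ys = pier_scour_depth_m(y1=3.0, b=1.2, v1=2.0)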

  13. Measures and Indicators of Vgi Quality: AN Overview

    NASA Astrophysics Data System (ADS)

    Antoniou, V.; Skopeliti, A.

    2015-08-01

    The evaluation of VGI quality has been a very interesting and popular issue amongst academics and researchers. Various metrics and indicators have been proposed for evaluating VGI quality elements. Various efforts have focused on the use of well-established methodologies for the evaluation of VGI quality elements against authoritative data. In this paper, a number of research papers have been reviewed and summarized in a detailed report on measures for each spatial data quality element. Emphasis is given on the methodology followed and the data used in order to assess and evaluate the quality of the VGI datasets. However, as the use of authoritative data is not always possible many researchers have turned their focus on the analysis of new quality indicators that can function as proxies for the understanding of VGI quality. In this paper, the difficulties in using authoritative datasets are briefly presented and new proposed quality indicators are discussed, as recorded through the literature review. We classify theses new indicators in four main categories that relate with: i) data, ii) demographics, iii) socio-economic situation and iv) contributors. This paper presents a dense, yet comprehensive overview of the research on this field and provides the basis for the ongoing academic effort to create a practical quality evaluation method through the use of appropriate quality indicators.

  14. Overview of Automotive Core Tools: Applications and Benefits

    NASA Astrophysics Data System (ADS)

    Doshi, Jigar A.; Desai, Darshak

    2017-08-01

    Continuous improvement of product and process quality is always challenging and creative task in today's era of globalization. Various quality tools are available and used for the same. Some of them are successful and few of them are not. Considering the complexity in the continuous quality improvement (CQI) process various new techniques are being introduced by the industries, as well as proposed by researchers and academia. Lean Manufacturing, Six Sigma, Lean Six Sigma is some of the techniques. In recent years, there are new tools being opted by the industry, especially automotive, called as Automotive Core Tools (ACT). The intention of this paper is to review the applications and benefits along with existing research on Automotive Core Tools with special emphasis on continuous quality improvement. The methodology uses an extensive review of literature through reputed publications—journals, conference proceedings, research thesis, etc. This paper provides an overview of ACT, its enablers, and exertions, how it evolved into sophisticated methodologies and benefits used in organisations. It should be of value to practitioners of Automotive Core Tools and to academics who are interested in how CQI can be achieved using ACT. It needs to be stressed here that this paper is not intended to scorn Automotive Core Tools, rather, its purpose is limited only to provide a balance on the prevailing positive views toward ACT.

  15. Methodology for computing the burden of disease of adverse events following immunization.

    PubMed

    McDonald, Scott A; Nijsten, Danielle; Bollaerts, Kaatje; Bauwens, Jorgen; Praet, Nicolas; van der Sande, Marianne; Bauchau, Vincent; de Smedt, Tom; Sturkenboom, Miriam; Hahné, Susan

    2018-03-24

    Composite disease burden measures such as disability-adjusted life-years (DALY) have been widely used to quantify the population-level health impact of disease or injury, but application has been limited for the estimation of the burden of adverse events following immunization. Our objective was to assess the feasibility of adapting the DALY approach for estimating adverse event burden. We developed a practical methodological framework, explicitly describing all steps involved: acquisition of relative or absolute risks and background event incidence rates, selection of disability weights and durations, and computation of the years lived with disability (YLD) measure, with appropriate estimation of uncertainty. We present a worked example, in which YLD is computed for 3 recognized adverse reactions following 3 childhood vaccination types, based on background incidence rates and relative/absolute risks retrieved from the literature. YLD provided extra insight into the health impact of an adverse event over presentation of incidence rates only, as severity and duration are additionally incorporated. As well as providing guidance for the deployment of DALY methodology in the context of adverse events associated with vaccination, we also identified where data limitations potentially occur. Burden of disease methodology can be applied to estimate the health burden of adverse events following vaccination in a systematic way. As with all burden of disease studies, interpretation of the estimates must consider the quality and accuracy of the data sources contributing to the DALY computation. © 2018 The Authors. Pharmacoepidemiology & Drug Safety Published by John Wiley & Sons Ltd.
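
    To make the YLD computation concrete, here is a minimal sketch of the core arithmetic under a relative-risk design. All numbers (population, background rate, relative risk, disability weight, duration) are hypothetical placeholders; the framework in the paper additionally restricts the relative risk to a defined post-vaccination risk window and propagates uncertainty, which this sketch omits.

      def yld_from_relative_risk(pop, background_rate, rr, weight, duration_yr):
          """YLD for one adverse event type under a relative-risk design.
          pop: vaccinated cohort size; background_rate: background event
          incidence (events per person-year); rr: relative risk;
          weight: disability weight; duration_yr: mean duration (years).
          Simplification: the RR is applied to a full person-year rather
          than to a shorter post-vaccination risk window."""
          excess_cases = pop * background_rate * (rr - 1.0)
          return excess_cases, excess_cases * weight * duration_yr

      # hypothetical inputs: 1e6 children, background 3 per 1000 py,
      # RR 2.0, disability weight 0.07, duration ~2 days
      cases, yld = yld_from_relative_risk(1_000_000, 0.003, 2.0, 0.07, 2 / 365)
      print(f"{cases:.0f} excess cases -> {yld:.1f} YLD")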

  16. Overview and benchmark analysis of fuel cell parameters estimation for energy management purposes

    NASA Astrophysics Data System (ADS)

    Kandidayeni, M.; Macias, A.; Amamou, A. A.; Boulon, L.; Kelouwani, S.; Chaoui, H.

    2018-03-01

    Proton exchange membrane fuel cells (PEMFCs) have become the center of attention for energy conversion in many areas such as the automotive industry, where they confront a highly dynamic behavior resulting in variation of their characteristics. In order to ensure appropriate modeling of PEMFCs, accurate parameter estimation is required. However, parameter estimation of PEMFC models is highly challenging due to their multivariate, nonlinear, and complex nature. This paper comprehensively reviews PEMFC model parameter estimation methods with a specific view to online identification algorithms, which are considered the basis of global energy management strategy design, to estimate the linear and nonlinear parameters of a PEMFC model in real time. In this respect, different PEMFC models with different categories and purposes are discussed first. Subsequently, a thorough investigation of PEMFC parameter estimation methods in the literature is conducted in terms of applicability. Three potential algorithms for online applications, Recursive Least Squares (RLS), the Kalman filter, and the extended Kalman filter (EKF), which have escaped attention in previous works, have then been utilized to identify the parameters of two well-known semi-empirical models in the literature, those of Squadrito et al. and Amphlett et al. Ultimately, the achieved results and future challenges are discussed.
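
    As an illustration of the online-identification idea, the following sketch applies recursive least squares with a forgetting factor to a linear-in-parameters polarization model. The three-term Tafel-like model and the current-voltage samples are stand-in assumptions, not the Squadrito or Amphlett formulations used in the paper.

      import numpy as np

      def rls_update(theta, P, x, y, lam=0.98):
          """One recursive least squares step with forgetting factor lam."""
          Px = P @ x
          k = Px / (lam + x @ Px)              # gain vector
          theta = theta + k * (y - x @ theta)  # correct parameters
          P = (P - np.outer(k, Px)) / lam      # update covariance
          return theta, P

      # Stand-in linear-in-parameters polarization model:
      #   V = a0 + a1*ln(i) + a2*i   (not the paper's exact formulations)
      theta, P = np.zeros(3), np.eye(3) * 1e3
      for i, v in [(0.1, 0.95), (0.3, 0.88), (0.6, 0.83), (0.9, 0.79)]:
          x = np.array([1.0, np.log(i), i])
          theta, P = rls_update(theta, P, x, v)
      print("identified parameters:", theta.round(3))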

  17. Orion Passive Thermal: Control Overview

    NASA Technical Reports Server (NTRS)

    Alvarez-Hermandez, Angel; Miller, Stephen W.

    2009-01-01

    A general overview of the NASA Orion Passive Thermal Control System (PTCS) is presented. The topics include: 1) Orion in CxP Hierarchy; 2) General Orion Description/Orientation; and 3) Orion PTCS Overview.

  18. Pseudo-spectral methodology for a quantitative assessment of the cover of in-stream vegetation in small streams

    NASA Astrophysics Data System (ADS)

    Hershkovitz, Yaron; Anker, Yaakov; Ben-Dor, Eyal; Schwartz, Guy; Gasith, Avital

    2010-05-01

    In-stream vegetation is a key ecosystem component in many fluvial ecosystems, having cascading effects on stream conditions and biotic structure. Traditionally, ground-level surveys (e.g. grid and transect analyses) are commonly used for estimating cover of aquatic macrophytes. Nonetheless, this methodological approach is highly time-consuming and usually yields information which is practically limited to habitat and sub-reach scales. In contrast, remote-sensing techniques (e.g. satellite imagery and airborne photography) enable collection of large datasets over section, stream and basin scales, in relatively short time and at reasonable cost. However, the commonly used spatial high resolution (1 m) is often inadequate for examining aquatic vegetation on habitat or sub-reach scales. We examined the utility of a pseudo-spectral methodology, using RGB digital photography, for estimating the cover of in-stream vegetation in a small Mediterranean-climate stream. We compared this methodology with a traditional ground-level grid survey and with an airborne hyper-spectral remote sensing survey (AISA-ES). The study was conducted along a 2 km section of an intermittent stream (Taninim stream, Israel). When studied, the stream was dominated by patches of watercress (Nasturtium officinale) and mats of filamentous algae (Cladophora glomerata). The extent of vegetation cover at the habitat and section scales (10^0 and 10^4 m, respectively) was estimated by the pseudo-spectral methodology, using an airborne Roli camera with a Phase-One P 45 (39 MP) CCD image acquisition unit. The swaths were taken at an elevation of about 460 m, giving a spatial resolution of about 4 cm (NADIR). For measuring vegetation cover at the section scale (10^4 m) we also used a 'push-broom' AISA-ES hyper-spectral swath having a sensor configuration of 182 bands (350-2500 nm) at an elevation of ca. 1,200 m (i.e. spatial resolution of ca. 1 m). Simultaneously, with every swath we used an Analytical

  19. Overview of Meta-Analyses of the Prevention of Mental Health, Substance Use and Conduct Problems

    PubMed Central

    Sandler, Irwin; Wolchik, Sharlene A.; Cruden, Gracelyn; Mahrer, Nicole E.; Ahn, Soyeon; Brincks, Ahnalee; Brown, C. Hendricks

    2014-01-01

    This paper presents findings from an overview of meta-analyses of the effects of prevention and promotion programs to prevent mental health, substance use and conduct problems. The review of 48 meta-analyses found small but significant effects in reducing depression, anxiety, anti-social behavior and substance use. Further, these effects are sustained over time. Meta-analyses often found that the effects were heterogeneous. A conceptual model is proposed to guide the study of moderators of program effects in future meta-analyses, and methodological issues in synthesizing findings across preventive interventions are discussed. PMID:24471372

  20. Ambient Vibration Testing for Story Stiffness Estimation of a Heritage Timber Building

    PubMed Central

    Min, Kyung-Won; Kim, Junhee; Park, Sung-Ah; Park, Chan-Soo

    2013-01-01

    This paper investigates dynamic characteristics of a historic wooden structure by ambient vibration testing, presenting a novel estimation methodology of story stiffness for the purpose of vibration-based structural health monitoring. For the ambient vibration testing, measured structural responses are analyzed by two output-only system identification methods (i.e., frequency domain decomposition and stochastic subspace identification) to estimate modal parameters. The proposed methodology of story stiffness estimation is based on an eigenvalue problem derived from a vibratory rigid body model. Using the identified natural frequencies, the eigenvalue problem is efficiently solved and uniquely yields story stiffness. It is noteworthy that application of the proposed methodology is not necessarily confined to the wooden structure exemplified in the paper. PMID:24227999
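
    A minimal numerical sketch of the inverse problem: given floor masses and the natural frequencies identified by FDD/SSI, choose story stiffnesses so that a lumped-mass shear model reproduces the identified frequencies. The paper solves a closed-form eigenvalue problem; the least-squares fit below is a generic stand-in, and the masses and frequencies are hypothetical.

      import numpy as np
      from scipy.linalg import eigh
      from scipy.optimize import least_squares

      def natural_freqs(k, m):
          """Natural frequencies (Hz) of an N-story shear model with
          story stiffnesses k and floor masses m."""
          n = len(k)
          K = np.zeros((n, n))
          for i in range(n):
              K[i, i] = k[i] + (k[i + 1] if i + 1 < n else 0.0)
              if i + 1 < n:
                  K[i, i + 1] = K[i + 1, i] = -k[i + 1]
          w2 = eigh(K, np.diag(m), eigvals_only=True)
          return np.sqrt(w2) / (2 * np.pi)

      m = np.array([8000.0, 6000.0])       # floor masses (kg), assumed known
      f_id = np.array([2.1, 5.6])          # identified frequencies (Hz)
      fit = least_squares(lambda k: natural_freqs(k, m) - f_id,
                          x0=[1e6, 1e6], bounds=(0.0, np.inf))
      print("estimated story stiffnesses (N/m):", fit.x.round(0))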

  1. Indirect Lightning Safety Assessment Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ong, M M; Perkins, M P; Brown, C G

    2009-04-24

    Lightning is a safety hazard for high-explosives (HE) and their detonators. The methodology for estimating the risk from indirect lightning effects will be presented. It has two parts: a method to determine the likelihood of a detonation given a lightning strike, and an approach for estimating the likelihood of a strike. The results of these two parts produce an overall probability of a detonation. The probability calculations are complex for five reasons: (1) lightning strikes are stochastic and relatively rare, (2) the quality of the Faraday cage varies from one facility to the next, (3) RF coupling is inherently a complex subject, (4) performance data for abnormally stressed detonators is scarce, and (5) the arc plasma physics is not well understood. Therefore, a rigorous mathematical analysis would be too complex. Instead, our methodology takes a more practical approach, combining rigorous mathematical calculations where possible with empirical data when necessary. Where there is uncertainty, we compensate with conservative approximations. The goal is to determine a conservative estimate of the odds of a detonation. In Section 2, the methodology will be explained. This report will discuss topics at a high level. The reasons for selecting an approach will be justified. For those interested in technical details, references will be provided. In Section 3, a simple hypothetical example will be given to reinforce the concepts. While the methodology will touch on all the items shown in Figure 1, the focus of this report is the indirect effect, i.e., determining the odds of a detonation from given EM fields. Professor Martin Uman from the University of Florida has been characterizing and defining extreme lightning strikes. Using Professor Uman's research, Dr. Kimball Merewether at Sandia National Laboratory in Albuquerque calculated the EM fields inside a Faraday
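
    The top-level probability combination described above can be sketched in a few lines. The ground flash density, collection area, and conditional detonation probability below are hypothetical placeholders; in the report these inputs come from lightning statistics and the coupling and detonator analyses, not from single point values.

      import math

      def annual_detonation_probability(flash_density, area_km2, p_det):
          """P(at least one detonation per year): lightning strikes as a
          Poisson process with rate flash_density*area_km2 (strikes/yr),
          each causing a detonation with probability p_det."""
          rate = flash_density * area_km2 * p_det
          return 1.0 - math.exp(-rate)

      # hypothetical: 4 flashes/km^2/yr, 0.01 km^2 facility, P(det|strike)=1e-3
      print(f"{annual_detonation_probability(4.0, 0.01, 1e-3):.1e} per year")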

  2. Estimating lifetime and age-conditional probabilities of developing cancer.

    PubMed

    Wun, L M; Merrill, R M; Feuer, E J

    1998-01-01

    Lifetime and age-conditional risk estimates of developing cancer provide a useful summary to the public of the current cancer risk and how this risk compares with earlier periods and among select subgroups of society. These reported estimates, commonly quoted in the popular press, have the potential to promote early detection efforts, to increase cancer awareness, and to serve as an aid in study planning. However, they can also be easily misunderstood and frightening to the general public. The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute and the American Cancer Society have recently begun including in annual reports lifetime and age-conditional risk estimates of developing cancer. These risk estimates are based on incidence rates that reflect new cases of the cancer in a population free of the cancer. To compute these estimates involves a cancer prevalence adjustment that is computed cross-sectionally from current incidence and mortality data derived within a multiple decrement life table. This paper presents a detailed description of the methodology for deriving lifetime and age-conditional risk estimates of developing cancer. In addition, an extension is made which, using a triple decrement life table, adjusts for a surgical procedure that removes individuals from the risk of developing a given cancer. Two important results which provide insights into the basic methodology are included in the discussion. First, the lifetime risk estimate does not depend on the cancer prevalence adjustment, although this is not the case for age-conditional risk estimates. Second, the lifetime risk estimate is always smaller when it is corrected for a surgical procedure that takes people out of the risk pool to develop the cancer. The methodology is applied to corpus and uterus NOS cancers, with a correction made for hysterectomy prevalence. The interpretation and limitations of risk estimates are also discussed.
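
    The life-table logic reduces to accumulating, over ages, the probability of being alive and cancer-free times the hazard of a first diagnosis. A toy sketch follows (hypothetical hazards; the cross-sectional prevalence adjustment and the surgical-removal correction described in the paper are omitted):

      import numpy as np

      def lifetime_risk(incidence, mortality):
          """Lifetime risk of a first cancer diagnosis from a double-
          decrement life table: exit either by diagnosis or by death
          from other causes (annual hazards per person, by age)."""
          alive_cancer_free, risk = 1.0, 0.0
          for h_ca, h_mort in zip(incidence, mortality):
              risk += alive_cancer_free * h_ca
              alive_cancer_free *= 1.0 - h_ca - h_mort
          return risk

      ages = np.arange(100)
      incidence = 1e-5 * np.exp(ages / 18.0)   # toy hazards, not SEER data
      mortality = 2e-4 * np.exp(ages / 12.0)
      print(f"lifetime risk ~ {lifetime_risk(incidence, mortality):.1%}")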

  3. The effects of survey question wording on rape estimates: evidence from a quasi-experimental design.

    PubMed

    Fisher, Bonnie S

    2009-02-01

    The measurement of rape is among the leading methodological issues in the violence against women field. Methodological discussion continues to focus on decreasing measurement errors and improving the accuracy of rape estimates. The current study used a quasi-experimental design to examine the effect of survey question wording on estimates of completed and attempted rape and verbal threats of rape. Specifically, the study statistically compares self-reported rape estimates from two nationally representative studies of college women's sexual victimization experiences, the National College Women Sexual Victimization study and the National Violence Against College Women study. Results show significant differences between the two sets of rape estimates, with National Violence Against College Women study rape estimates ranging from 4.4% to 10.4% lower than the National College Women Sexual Victimization study rape estimates. Implications for future methodological research are discussed.

  4. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications

    NASA Astrophysics Data System (ADS)

    Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David

    2014-01-01

    Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, the 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.

  5. Status of Activities to Implement a Sustainable System of MC&A Equipment and Methodological Support at Rosatom Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.D. Sanders

    Under the U.S.-Russian Material Protection, Control and Accounting (MPC&A) Program, the Material Control and Accounting Measurements (MCAM) Project has supported a joint U.S.-Russian effort to coordinate improvements of the Russian MC&A measurement system. These efforts have resulted in the development of a MC&A Equipment and Methodological Support (MEMS) Strategic Plan (SP), developed by the Russian MEM Working Group. The MEMS SP covers implementation of MC&A measurement equipment, as well as the development, attestation and implementation of measurement methodologies and reference materials at the facility and industry levels. This paper provides an overview of the activities conducted under the MEMS SP, as well as a status on current efforts to develop reference materials, implement destructive and nondestructive assay measurement methodologies, and implement sample exchange, scrap and holdup measurement programs across Russian nuclear facilities.

  6. A collaborative approach for estimating terrestrial wildlife abundance

    USGS Publications Warehouse

    Ransom, Jason I.; Kaczensky, Petra; Lubow, Bruce C.; Ganbaatar, Oyunsaikhan; Altansukh, Nanjid

    2012-01-01

    Accurately estimating abundance of wildlife is critical for establishing effective conservation and management strategies. Aerial methodologies for estimating abundance are common in developed countries, but they are often impractical for remote areas of developing countries where many of the world's endangered and threatened fauna exist. The alternative terrestrial methodologies can be constrained by limitations on access, technology, and human resources, and have rarely been comprehensively conducted for large terrestrial mammals at landscape scales. We attempted to overcome these problems by incorporating local people into a simultaneous point count of Asiatic wild ass (Equus hemionus) and goitered gazelle (Gazella subgutturosa) across the Great Gobi B Strictly Protected Area, Mongolia. Paired observers collected abundance and covariate metrics at 50 observation points and we estimated population sizes using distance sampling theory, but also assessed individual observer error to examine potential bias introduced by the large number of minimally trained observers. We estimated that 5671 (95% CI = 3611–8907) wild asses and 5909 (95% CI = 3762–9279) gazelle inhabited the 11,027 km2 study area at the time of our survey, and found that the methodology developed was robust in absorbing the logistical challenges and wide range of observer abilities. This initiative serves as a functional model for estimating terrestrial wildlife abundance while integrating local people into scientific and conservation projects. This, in turn, creates vested interest in conservation by the people who are most influential in, and most affected by, the outcomes.
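
    For the distance-sampling step, here is a minimal sketch of a point-count density estimate under a half-normal detection function, for which the maximum-likelihood scale and the effective detection area have closed forms. The detection distances are simulated placeholders; the actual analysis also modeled covariates and observer effects.

      import numpy as np

      def point_count_density(r, n_points):
          """Density from point counts with a half-normal detection
          function g(r) = exp(-r^2 / (2 sigma^2)), no truncation.
          MLE: sigma^2 = sum(r^2) / (2n); effective area per point
          is 2*pi*sigma^2; density = n / (n_points * effective area)."""
          r = np.asarray(r, dtype=float)
          sigma2 = np.sum(r ** 2) / (2 * r.size)
          return r.size / (n_points * 2 * np.pi * sigma2)

      rng = np.random.default_rng(1)
      detections_m = np.abs(rng.normal(0.0, 300.0, size=120))  # placeholder
      d = point_count_density(detections_m, n_points=50)
      print(f"density ~ {d * 1e6:.1f} animals per km^2")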

  7. Thunderstorm Program General Overview

    DTIC Science & Technology

    2014-12-19

    DISTRIBUTION A. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION IS UNLIMITED. Thunderstorm provides OSD, interagency partners, Combatant Commanders, Services, academia

  8. Ares I Operability Overview

    NASA Technical Reports Server (NTRS)

    Shaughnessy, Raymond W.

    2009-01-01

    A general overview of Ares I Operability is presented. The contents include: 1) Vehicle and Ops Concept Overviews; 2) What does operability mean to the Ares I Project?; 3) What is the Ares Project doing to influence operability into the flight hardware designs?; and 4) How do we measure Ares I Project success in infusing operability?

  9. Non-surgical interventions for adolescents with idiopathic scoliosis: an overview of systematic reviews.

    PubMed

    Płaszewski, Maciej; Bettany-Saltikov, Josette

    2014-01-01

    Non-surgical interventions for adolescents with idiopathic scoliosis remain highly controversial. Despite the publication of numerous reviews, no explicit methodological evaluation of papers labeled as, or having the layout of, a systematic review addressing this subject matter is available. Analysis and comparison of the content, methodology, and evidence base from systematic reviews regarding non-surgical interventions for adolescents with idiopathic scoliosis. Systematic overview of systematic reviews. Articles meeting the minimal criteria for a systematic review, regarding any non-surgical intervention for adolescent idiopathic scoliosis, with any outcomes measured, were included. Multiple general and systematic review specific databases, guideline registries, reference lists and websites of institutions were searched. The AMSTAR tool was used to critically appraise the methodology, and the Oxford Centre for Evidence Based Medicine and the Joanna Briggs Institute's hierarchies were applied to analyze the levels of evidence from included reviews. From 469 citations, twenty one papers were included for analysis. Five reviews assessed the effectiveness of scoliosis-specific exercise treatments, four assessed manual therapies, five evaluated bracing, four assessed different combinations of interventions, and one evaluated usual physical activity. Two reviews addressed the adverse effects of bracing. Two papers were high quality Cochrane reviews, three were of moderate, and the remaining sixteen were of low or very low methodological quality. The level of evidence of these reviews ranged from 1 or 1+ to 4, and in some reviews, due to their low methodological quality and/or poor reporting, this could not be established. Higher quality reviews indicate that generally there is insufficient evidence to make a judgment on whether non-surgical interventions in adolescent idiopathic scoliosis are effective. Papers labeled as systematic reviews need to be considered in terms

  10. Aerial survey methodology for bison population estimation in Yellowstone National Park

    USGS Publications Warehouse

    Hess, Steven C.

    2002-01-01

    I developed aerial survey methods for statistically rigorous bison population estimation in Yellowstone National Park to support sound resource management decisions and to understand bison ecology. Survey protocols, data recording procedures, a geographic framework, and seasonal stratifications were based on field observations from February 1998-September 2000. The reliability of this framework and strata were tested with long-term data from 1970-1997. I simulated different sample survey designs and compared them to high-effort censuses of well-defined large areas to evaluate effort, precision, and bias. Sample survey designs require much effort and extensive information on the current spatial distribution of bison and therefore do not offer any substantial reduction in time and effort over censuses. I conducted concurrent ground surveys, or 'double sampling', to estimate detection probability during aerial surveys. Group size distribution and habitat strongly affected detection probability. In winter, 75% of the groups and 92% of individual bison were detected on average from aircraft, while in summer, 79% of groups and 97% of individual bison were detected. I also used photography to quantify the bias in counting large groups of bison and found that undercounting increased with group size and could reach 15%. I compared survey conditions between seasons and identified optimal time windows for conducting surveys in both winter and summer. These windows account for the habitats and total area bison occupy, and for group size distribution. Bison became increasingly scattered over the Yellowstone region in smaller groups and occupied more unfavorable habitats as winter progressed. Therefore, the best conditions for winter surveys occur early in the season (Dec-Jan). In summer, bison were most spatially aggregated and occurred in the largest groups by early August. Low variability between surveys and high detection probability provide population estimates

  11. Non-linear mixed effects modeling - from methodology and software development to driving implementation in drug development science.

    PubMed

    Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis

    2005-04-01

    Few scientific contributions have made significant impact unless there was a champion who had the vision to see the potential for its use in seemingly disparate areas-and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.

  12. Estimating animal population density using passive acoustics.

    PubMed

    Marques, Tiago A; Thomas, Len; Martin, Stephen W; Mellinger, David K; Ward, Jessica A; Moretti, David J; Harris, Danielle; Tyack, Peter L

    2013-05-01

    Reliable estimation of the size or density of wild animal populations is very important for effective wildlife management, conservation and ecology. Currently, the most widely used methods for obtaining such estimates involve either sighting animals from transect lines or some form of capture-recapture on marked or uniquely identifiable individuals. However, many species are difficult to sight, and cannot be easily marked or recaptured. Some of these species produce readily identifiable sounds, providing an opportunity to use passive acoustic data to estimate animal density. In addition, even for species for which other visually based methods are feasible, passive acoustic methods offer the potential for greater detection ranges in some environments (e.g. underwater or in dense forest), and hence potentially better precision. Automated data collection means that surveys can take place at times and in places where it would be too expensive or dangerous to send human observers. Here, we present an overview of animal density estimation using passive acoustic data, a relatively new and fast-developing field. We review the types of data and methodological approaches currently available to researchers and we provide a framework for acoustics-based density estimation, illustrated with examples from real-world case studies. We mention moving sensor platforms (e.g. towed acoustics), but then focus on methods involving sensors at fixed locations, particularly hydrophones to survey marine mammals, as acoustic-based density estimation research to date has been concentrated in this area. Primary among these are methods based on distance sampling and spatially explicit capture-recapture. The methods are also applicable to other aquatic and terrestrial sound-producing taxa. We conclude that, despite being in its infancy, density estimation based on passive acoustic data likely will become an important method for surveying a number of diverse taxa, such as sea mammals, fish, birds

  14. Overview of entry risk predictions

    NASA Astrophysics Data System (ADS)

    Mrozinski, R.; Mendeck, G.; Cutri-Kohart, R.

    Risk to people on the ground from uncontrolled entries of spacecraft is a primary concern when analyzing end-of-life disposal options for satellites. Countries must balance this risk with the need to mitigate an exponentially growing space debris population. Currently the United States does this via guidelines that call for a satellite to be disposed of in a controlled manner if an uncontrolled entry would be too risky to people on the ground. This risk is measured by a quantity called "casualty expectation", or Ec, defined as the expected number of people suffering death or injury due to a spacecraft entry event. If Ec exceeds 1 in 10,000, U.S. guidelines state that the entry should be controlled rather than uncontrolled. Since this guideline can have serious impacts on the cost, lifetime, and even the mission and functionality of a satellite, it is critical that this quantity be estimated well, and that decision makers understand all assumptions and limitations inherent in the resulting value. This paper discusses several issues regarding estimates of casualty expectation, beginning with an overview of relevant United States policies and guidelines. The equation the space industry typically uses to estimate casualty expectation is presented, along with a look at the sensitivity of the results to the typical assumptions, models, and initial condition uncertainties. Differences in these modeling issues with respect to launch failure Ec estimates are included in the discussion. An alternate quantity to assess risks due to spacecraft entries is introduced. "Probability of casualty", or Pc, is defined as the probability of one or more instances of people suffering death or injury due to a spacecraft entry event. The equation to estimate Pc is derived, where the same assumptions, modeling, and initial condition issues for Ec apply. Several examples are then given of both Ec and Pc estimate calculations. Due to the difficult issues in
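
    The standard casualty-expectation arithmetic can be sketched directly: each surviving debris fragment contributes its casualty area times the population density at its impact point, and Pc follows from Ec under a Poisson assumption. The fragment areas and densities below are hypothetical.

      import numpy as np

      def casualty_measures(areas_m2, densities_per_m2):
          """Ec and Pc for the surviving fragments of one entry event.
          Each fragment contributes casualty area x population density;
          Pc = 1 - exp(-Ec) under a Poisson assumption."""
          expected = np.asarray(areas_m2) * np.asarray(densities_per_m2)
          ec = expected.sum()
          return ec, 1.0 - np.exp(-ec)

      # hypothetical fragment casualty areas (m^2) and densities (1/m^2)
      ec, pc = casualty_measures([0.5, 1.2, 8.0], [1e-5, 3e-5, 2e-5])
      print(f"Ec = {ec:.2e}, Pc = {pc:.2e}")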

  15. Regression to fuzziness method for estimation of remaining useful life in power plant components

    NASA Astrophysics Data System (ADS)

    Alamaniotis, Miltiadis; Grelle, Austin; Tsoukalas, Lefteri H.

    2014-10-01

    Mitigation of severe accidents in power plants requires the reliable operation of all systems and the on-time replacement of mechanical components. Therefore, the continuous surveillance of power systems is a crucial concern for the overall safety, cost control, and on-time maintenance of a power plant. In this paper a methodology called regression to fuzziness is presented that estimates the remaining useful life (RUL) of power plant components. The RUL is defined as the difference between the time that a measurement was taken and the estimated failure time of that component. The methodology aims to compensate for a potential lack of historical data by modeling an expert's operational experience and expertise applied to the system. It initially identifies critical degradation parameters and their associated value range. Once completed, the operator's experience is modeled through fuzzy sets which span the entire parameter range. This model is then synergistically used with linear regression and a component's failure point to estimate the RUL. The proposed methodology is tested on estimating the RUL of a turbine (the basic electrical generating component of a power plant) in three different cases. Results demonstrate the benefits of the methodology for components for which operational data are not readily available and emphasize the significance of the selection of fuzzy sets and the effect of knowledge representation on the predicted output. To verify its effectiveness, the methodology was benchmarked against the data-based simple linear regression model used for predictions, which was shown to perform equal to or worse than the presented methodology. Furthermore, the methodology comparison highlighted the improvement in estimation offered by the adoption of appropriate fuzzy sets for parameter representation.
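
    A much-simplified sketch of the two ingredients follows: a regression extrapolated to a failure threshold, and a fuzzy membership grade encoding expert knowledge of a parameter's normal range. The paper couples these synergistically; here they are shown separately, with hypothetical vibration data and membership bounds.

      import numpy as np

      def rul_by_regression(t, x, failure_level):
          """Extrapolate a linear degradation trend to the failure level;
          RUL = predicted failure time minus the last measurement time."""
          slope, intercept = np.polyfit(t, x, 1)
          return (failure_level - intercept) / slope - t[-1]

      def tri_membership(x, lo, peak, hi):
          """Triangular fuzzy membership grade over a parameter range."""
          if x <= lo or x >= hi:
              return 0.0
          return (x - lo) / (peak - lo) if x < peak else (hi - x) / (hi - peak)

      t = np.array([0.0, 100.0, 200.0, 300.0])     # hours (hypothetical)
      vib = np.array([4.1, 5.0, 5.8, 6.9])         # vibration level (mm/s)
      print(f"RUL ~ {rul_by_regression(t, vib, failure_level=12.0):.0f} h, "
            f"'normal' grade of last reading = {tri_membership(6.9, 3, 5, 9):.2f}")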

  16. Interventions for the prevention of OHSS in ART cycles: an overview of Cochrane reviews.

    PubMed

    Mourad, Selma; Brown, Julie; Farquhar, Cindy

    2017-01-23

    Ovarian hyperstimulation syndrome (OHSS) in assisted reproductive technology (ART) cycles is a treatment-induced disease that has an estimated prevalence of 20% to 33% in its mild form and 3% to 8% in its moderate or severe form. These numbers might even be higher for high-risk women such as those with polycystic ovaries or a high oocyte yield from ovum pickup. The objective of this overview is to identify and summarise all evidence from Cochrane systematic reviews on interventions for prevention or treatment of moderate, severe and overall OHSS in couples with subfertility who are undergoing ART cycles. Published Cochrane systematic reviews reporting on moderate, severe or overall OHSS as an outcome in ART cycles were eligible for inclusion in this overview. We also identified Cochrane submitted protocols and title registrations for future inclusion in the overview. The evidence is current to 12 December 2016. We identified reviews, protocols and titles by searching the Cochrane Gynaecology and Fertility Group Database of Systematic Reviews and Archie (the Cochrane information management system) in July 2016 on the effectiveness of interventions for outcomes of moderate, severe and overall OHSS. We undertook selection of systematic reviews, data extraction and quality assessment in duplicate. We used the AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool to assess the quality of included reviews, and we used GRADE methods to assess the quality of the evidence for each outcome. We summarised the characteristics of included reviews in the text and in additional tables. We included a total of 27 reviews in this overview. The reviews were generally of high quality according to AMSTAR ratings, and included studies provided evidence that ranged from very low to high in quality. Ten reviews had not been updated in the past three years. Seven reviews described interventions that provided a beneficial effect in reducing OHSS rates, and we categorised

  17. Estimating Children's Soil/Dust Ingestion Rates through ...

    EPA Pesticide Factsheets

    Background: Soil/dust ingestion rates are important variables in assessing children’s health risks in contaminated environments. Current estimates are based largely on soil tracer methodology, which is limited by analytical uncertainty, small sample size, and short study duration. Objectives: The objective was to estimate site-specific soil/dust ingestion rates through reevaluation of the lead absorption dose–response relationship using new bioavailability data from the Bunker Hill Mining and Metallurgical Complex Superfund Site (BHSS) in Idaho, USA. Methods: The U.S. Environmental Protection Agency (EPA) in vitro bioavailability methodology was applied to archived BHSS soil and dust samples. Using age-specific biokinetic slope factors, we related bioavailable lead from these sources to children’s blood lead levels (BLLs) monitored during cleanup from 1988 through 2002. Quantitative regression analyses and exposure assessment guidance were used to develop candidate soil/dust source partition scenarios estimating lead intake, allowing estimation of age-specific soil/dust ingestion rates. These ingestion rate and bioavailability estimates were simultaneously applied to the U.S. EPA Integrated Exposure Uptake Biokinetic Model for Lead in Children to determine those combinations best approximating observed BLLs. Results: Absolute soil and house dust bioavailability averaged 33% (SD ± 4%) and 28% (SD ± 6%), respectively. Estimated BHSS age-specific soil/du

  18. Comprehensive Psychopathological Assessment Based on the Association for Methodology and Documentation in Psychiatry (AMDP) System: Development, Methodological Foundation, Application in Clinical Routine, and Research.

    PubMed

    Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang

    2017-01-01

    The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, both in initial as well as further training and specialization in their fields, this particular aspect of their work receives only scant attention. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been in existence and available as a tool to serve precisely the purpose of offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was based on the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in the context of investigations into a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important also in clinical practice and, today, represents the most used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper intends to offer an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion will in turn focus on the System's underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored.

  19. On methods of estimating cosmological bulk flows

    NASA Astrophysics Data System (ADS)

    Nusser, Adi

    2016-01-01

    We explore similarities and differences between several estimators of the cosmological bulk flow, B, from the observed radial peculiar velocities of galaxies. A distinction is made between two theoretical definitions of B as a dipole moment of the velocity field weighted by a radial window function. One definition involves the three-dimensional (3D) peculiar velocity, while the other is based on its radial component alone. Different methods attempt to infer B for either of these definitions, which coincide only for the case of a velocity field which is constant in space. We focus on the Wiener Filtering (WF) and the Constrained Minimum Variance (CMV) methodologies. Both methodologies require a prior expressed in terms of the radial velocity correlation function. Hoffman et al. compute B in Top-Hat windows from a WF realization of the 3D peculiar velocity field. Feldman et al. infer B directly from the observed velocities for the second definition of B. The WF methodology could easily be adapted to the second definition, in which case it will be equivalent to the CMV with the exception of the imposed constraint. For a prior with vanishing correlations or very noisy data, CMV reproduces the standard Maximum Likelihood estimation for B of the entire sample, independently of the radial weighting function. Therefore, this estimator is likely more susceptible to observational biases that could be present in measurements of distant galaxies. Finally, two additional estimators are proposed.
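
    The standard Maximum Likelihood estimator mentioned above has a closed form: modeling each radial velocity as the projection of a common bulk flow plus Gaussian noise leads to a 3x3 linear solve. A sketch with a simulated catalogue (the directions, noise level and true flow are arbitrary test values):

      import numpy as np

      def ml_bulk_flow(rhat, u, sigma):
          """Maximum-likelihood bulk flow B from radial peculiar
          velocities u_i = B . rhat_i + noise (dispersion sigma_i)."""
          w = 1.0 / sigma ** 2
          A = np.einsum("i,ij,ik->jk", w, rhat, rhat)   # 3x3 normal matrix
          b = np.einsum("i,i,ij->j", w, u, rhat)
          return np.linalg.solve(A, b)

      rng = np.random.default_rng(0)
      rhat = rng.normal(size=(500, 3))
      rhat /= np.linalg.norm(rhat, axis=1, keepdims=True)
      sigma = np.full(500, 250.0)                       # km/s, arbitrary
      u = rhat @ np.array([300.0, 0.0, 0.0]) + rng.normal(0.0, 250.0, 500)
      print("recovered B (km/s):", ml_bulk_flow(rhat, u, sigma).round(1))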

  20. Temporal variability patterns in solar radiation estimations

    NASA Astrophysics Data System (ADS)

    Vindel, José M.; Navarro, Ana A.; Valenzuela, Rita X.; Zarzalejo, Luis F.

    2016-06-01

    In this work, solar radiation estimations obtained from a satellite and a numerical weather prediction model in mainland Spain have been compared. Similar comparisons have been formerly carried out, but in this case, the methodology used is different: the temporal variability of both sources of estimation has been compared with the annual evolution of the radiation associated with the different study climate zones. The methodology is based on obtaining behavior patterns, using a Principal Component Analysis, following the annual evolution of solar radiation estimations. Indeed, the adjustment degree to these patterns in each point (assessed from maps of correlation) may be associated with the annual radiation variation (assessed from the interquartile range), which is associated, in turn, with different climate zones. In addition, the goodness of each estimation source has been assessed by comparing it with ground-based pyranometer measurements. For the study, radiation data from Satellite Application Facilities and data corresponding to the reanalysis carried out by the European Centre for Medium-Range Weather Forecasts have been used.
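
    A minimal sketch of the pattern-extraction step: principal components of the site-by-month radiation matrix give annual-evolution patterns, and each site's correlation with the leading pattern can be mapped against climate zones. The synthetic field below is a placeholder for the SAF satellite and ECMWF reanalysis estimations used in the paper.

      import numpy as np

      def temporal_patterns(radiation, n_modes=2):
          """Leading annual-evolution patterns of a (sites x 12 months)
          radiation matrix, plus each site's correlation with the first
          pattern (which can be mapped against climate zones)."""
          anom = radiation - radiation.mean(axis=0)
          _, _, vt = np.linalg.svd(anom, full_matrices=False)
          patterns = vt[:n_modes]
          corr = np.array([np.corrcoef(s, patterns[0])[0, 1] for s in anom])
          return patterns, corr

      rng = np.random.default_rng(3)
      cycle = np.sin(np.linspace(0.0, 2.0 * np.pi, 12))   # synthetic annual cycle
      data = np.outer(rng.uniform(0.5, 2.0, 100), cycle)  # 100 fake sites
      data += rng.normal(0.0, 0.2, (100, 12))
      patterns, corr = temporal_patterns(data)
      print("pattern 1:", patterns[0].round(2), "mean |corr|:", np.abs(corr).mean().round(2))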

  1. The National Visitor Use Monitoring methodology and final results for round 1

    Treesearch

    S.J. Zarnoch; E.M. White; D.B.K. English; Susan M. Kocis; Ross Arnold

    2011-01-01

    A nationwide, systematic monitoring process has been developed to provide improved estimates of recreation visitation on National Forest System lands. Methodology is presented to provide estimates of site visits and national forest visits based on an onsite sampling design of site-days and last-exiting recreationists. Stratification of the site days, based on site type...

  2. Guidebook on Methods to Estimate Non-Motorized Travel : Overview of Methods

    DOT National Transportation Integrated Search

    1999-07-01

    This guidebook provides a means for practitioners to better understand and estimate bicycle and pedestrian travel and to address transportation planning needs. The guidebook describes and compares the various methods that can be used to forecast non-m...

  3. How EIA Estimates Natural Gas Production

    EIA Publications

    2004-01-01

    The Energy Information Administration (EIA) publishes estimates monthly and annually of the production of natural gas in the United States. The estimates are based on data EIA collects from gas producing states and data collected by the U. S. Minerals Management Service (MMS) in the Department of Interior. The states and MMS collect this information from producers of natural gas for various reasons, most often for revenue purposes. Because the information is not sufficiently complete or timely for inclusion in EIA's Natural Gas Monthly (NGM), EIA has developed estimation methodologies to generate monthly production estimates that are described in this document.

  4. Experimental Methodology for Estimation of Local Heat Fluxes and Burning Rates in Steady Laminar Boundary Layer Diffusion Flames.

    PubMed

    Singh, Ajay V; Gollner, Michael J

    2016-06-01

    Modeling the realistic burning behavior of condensed-phase fuels has remained out of reach, in part because of an inability to resolve the complex interactions occurring at the interface between gas-phase flames and condensed-phase fuels. The current research provides a technique to explore the dynamic relationship between a combustible condensed fuel surface and gas-phase flames in laminar boundary layers. Experiments have previously been conducted in both forced and free convective environments over both solid and liquid fuels. A unique methodology, based on the Reynolds Analogy, was used to estimate local mass burning rates and flame heat fluxes for these laminar boundary layer diffusion flames utilizing local temperature gradients at the fuel surface. Local mass burning rates and convective and radiative heat feedback from the flames were measured in both the pyrolysis and plume regions by using temperature gradients mapped near the wall by a two-axis traverse system. These experiments are time-consuming and can be challenging to design as the condensed fuel surface burns steadily for only a limited period of time following ignition. The temperature profiles near the fuel surface need to be mapped during steady burning of a condensed fuel surface at a very high spatial resolution in order to capture reasonable estimates of local temperature gradients. Careful corrections for radiative heat losses from the thermocouples are also essential for accurate measurements. For these reasons, the whole experimental setup needs to be automated with a computer-controlled traverse mechanism, eliminating most errors due to positioning of a micro-thermocouple. An outline of steps to reproducibly capture near-wall temperature gradients and use them to assess local burning rates and heat fluxes is provided.
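
    The core reduction from a measured near-wall temperature gradient to a convective flux, together with the thermocouple radiation correction mentioned above, can be sketched as follows. The gas conductivity, convection coefficient, emissivity and temperatures are illustrative assumptions, not values from the experiments.

      def flux_to_surface(t_gas_K, t_wall_K, dy_m, k_gas=0.026):
          """Convective flux (W/m^2) from the near-wall temperature
          gradient, q = k * dT/dy, one-sided difference over dy;
          k_gas at an assumed film temperature."""
          return k_gas * (t_gas_K - t_wall_K) / dy_m

      def radiation_corrected_T(t_tc_K, t_surr_K, h_conv, emissivity=0.8):
          """Correct a bare-bead thermocouple for radiative loss:
          steady state h*(T_gas - T_tc) = eps*sigma*(T_tc^4 - T_surr^4)."""
          sigma = 5.67e-8
          return t_tc_K + emissivity * sigma * (t_tc_K ** 4 - t_surr_K ** 4) / h_conv

      # illustrative traverse point 0.25 mm above the fuel surface
      t_gas = radiation_corrected_T(1400.0, 300.0, h_conv=400.0)
      q = flux_to_surface(t_gas, t_wall_K=600.0, dy_m=0.25e-3)
      print(f"corrected gas T ~ {t_gas:.0f} K, q ~ {q / 1000:.0f} kW/m^2")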

  5. ACID RAIN MODELING

    EPA Science Inventory

    This paper provides an overview of existing statistical methodologies for the estimation of site-specific and regional trends in wet deposition. The interaction of atmospheric processes and emissions tend to produce wet deposition data patterns that show large spatial and tempora...

  6. Evaluation of probable maximum snow accumulation: Development of a methodology for climate change studies

    NASA Astrophysics Data System (ADS)

    Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick

    2016-06-01

    Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so the ensuing spring PMF is a reasonable estimate. This is of particular importance in times of climate change (CC), since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology, with precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool to estimate the relative change of the PMSA. Absolute results are in the same order of magnitude as those obtained with the traditional method and observed data, but are also found to depend strongly on the climate projection used and show spatial variability.
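
    The moisture-maximization core of the methodology is a simple ratio. In sketch form (hypothetical storm values; the paper works with monthly, non-stationary maxima of precipitable water from regional climate model output and accumulates maximized events over snowstorm durations):

      def maximized_snowfall(event_swe_mm, event_pw_mm, max_pw_mm, cap=2.0):
          """Moisture maximization: scale the event snowfall (as snow
          water equivalent) by max precipitable water / event
          precipitable water, capped to avoid unphysical ratios."""
          return event_swe_mm * min(max_pw_mm / event_pw_mm, cap)

      # hypothetical storm: 45 mm SWE with 12 mm PW; monthly max PW 20 mm
      print(f"maximized snowfall ~ {maximized_snowfall(45.0, 12.0, 20.0):.0f} mm SWE")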

  7. Cognitive training and plasticity: Theoretical perspective and methodological consequences

    PubMed Central

    Willis, Sherry L.; Schaie, K. Warner

    2013-01-01

    Purpose: To provide an overview of cognitive plasticity concepts and findings from a lifespan developmental perspective. Methods: After an evaluation of the general concept of cognitive plasticity, the most important approaches to study behavioral and brain plasticity are reviewed. This includes intervention studies, experimental approaches, cognitive trainings, the study of facilitating factors for strategy learning and strategy use, practice, and person-environment interactions. Transfer and durability of training-induced plasticity is discussed. Results: The review indicates that methodological and conceptual advances are needed to improve the match between levels of behavioral and brain plasticity targeted in current developmental research and study designs. Conclusions: The results suggest that the emphasis of plasticity studies on treatment effectiveness needs to be complemented by a strong commitment to the grounding of the intervention in a conceptual framework. PMID:19847065

  8. REVIEW OF INDOOR EMISSION SOURCE MODELS: PART 2. PARAMETER ESTIMATION

    EPA Science Inventory

    This review consists of two sections. Part I provides an overview of 46 indoor emission source models. Part 2 (this paper) focuses on parameter estimation, a topic that is critical to modelers but has never been systematically discussed. A perfectly valid model may not be a usefu...

  9. Cell Science-02 Payload Overview

    NASA Technical Reports Server (NTRS)

    Mitchell, Sarah Diane

    2014-01-01

    The presentation provides a general overview of the Cell Science-02 science and payload operations to the NASA Payload Operations Integrated Working Group. The overview includes a description of the science objectives and specific aims, manifest status, and operations concept.

  10. Balancing benefit and risk of medicines: a systematic review and classification of available methodologies.

    PubMed

    Mt-Isa, Shahrul; Hallgreen, Christine E; Wang, Nan; Callréus, Torbjörn; Genov, Georgy; Hirsch, Ian; Hobbiger, Stephen F; Hockley, Kimberley S; Luciani, Davide; Phillips, Lawrence D; Quartey, George; Sarac, Sinan B; Stoeckert, Isabelle; Tzoulaki, Ioanna; Micaleff, Alain; Ashby, Deborah

    2014-07-01

    The need for formal and structured approaches for benefit-risk assessment of medicines is increasing, as is the complexity of the scientific questions addressed before making decisions on the benefit-risk balance of medicines. We systematically collected, appraised and classified available benefit-risk methodologies to facilitate and inform their future use. A systematic review of publications identified benefit-risk assessment methodologies. Methodologies were appraised on their fundamental principles, features, graphical representations, assessability and accessibility. We created a taxonomy of methodologies to facilitate understanding and choice. We identified 49 methodologies, critically appraised and classified them into four categories: frameworks, metrics, estimation techniques and utility survey techniques. Eight frameworks describe qualitative steps in benefit-risk assessment and eight quantify benefit-risk balance. Nine metric indices include threshold indices to measure either benefit or risk; health indices measure quality-of-life over time; and trade-off indices integrate benefits and risks. Six estimation techniques support benefit-risk modelling and evidence synthesis. Four utility survey techniques elicit robust value preferences from relevant stakeholders to the benefit-risk decisions. Methodologies to help benefit-risk assessments of medicines are diverse and each is associated with different limitations and strengths. There is not a 'one-size-fits-all' method, and a combination of methods may be needed for each benefit-risk assessment. The taxonomy introduced herein may guide choice of adequate methodologies. Finally, we recommend 13 of 49 methodologies for further appraisal for use in the real-life benefit-risk assessment of medicines. Copyright © 2014 John Wiley & Sons, Ltd.

  11. Refining a methodology for determining the economic impacts of transportation improvements.

    DOT National Transportation Integrated Search

    2012-07-01

    Estimating the economic impact of transportation improvements has previously proven to be a difficult task. After an exhaustive literature review, it was clear that the transportation profession lacked standards and methodologies for determining econ...

  12. Estimating Soil Hydraulic Parameters using Gradient Based Approach

    NASA Astrophysics Data System (ADS)

    Rai, P. K.; Tripathi, S.

    2017-12-01

    The conventional way of estimating parameters of a differential equation is to minimize the error between the observations and their estimates. The estimates are produced from forward solution (numerical or analytical) of the differential equation assuming a set of parameters. Parameter estimation using the conventional approach requires high computational cost, setting-up of initial and boundary conditions, and formation of difference equations in case the forward solution is obtained numerically. Gaussian process based approaches like Gaussian Process Ordinary Differential Equation (GPODE) and Adaptive Gradient Matching (AGM) have been developed to estimate the parameters of Ordinary Differential Equations without explicitly solving them. Claims have been made that these approaches can straightforwardly be extended to Partial Differential Equations; however, this has never been demonstrated. This study extends the AGM approach to PDEs and applies it to estimating parameters of the Richards equation. Unlike the conventional approach, the AGM approach does not require setting-up of initial and boundary conditions explicitly, which is often difficult in real world application of the Richards equation. The developed methodology was applied to synthetic soil moisture data. It was seen that the proposed methodology can estimate the soil hydraulic parameters correctly and can be a potential alternative to the conventional method.
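
    A toy version of gradient matching for an ODE conveys the idea without the Gaussian-process machinery: estimate derivatives from the data, then tune parameters so the model right-hand side matches them, never solving the equation forward. The drainage model and synthetic series are placeholders, far simpler than the Richards equation treated in the study, and np.gradient stands in for the GP derivative.

      import numpy as np
      from scipy.optimize import minimize

      # Toy gradient matching: estimate (a, b) of d(theta)/dt = -a*theta**b
      # by matching the model RHS to derivatives estimated from the data.
      t = np.linspace(0.0, 5.0, 40)
      theta_obs = 0.35 * (1.0 + 0.8 * t) ** -1.25     # synthetic drainage data
      dtheta_dt = np.gradient(theta_obs, t)           # data-based derivative

      def mismatch(p):
          a, b = p
          return np.sum((dtheta_dt + a * theta_obs ** b) ** 2)

      fit = minimize(mismatch, x0=[0.1, 1.0], method="Nelder-Mead")
      print("estimated (a, b):", np.round(fit.x, 2))  # true ~ (2.32, 1.80)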

  13. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Xiao-Ying; Yao, Juan; He, Hua

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  14. Modular Exposure Disaggregation Methodologies for Catastrophe Modelling using GIS and Remotely-Sensed Data

    NASA Astrophysics Data System (ADS)

    Foulser-Piggott, R.; Saito, K.; Spence, R.

    2012-04-01

    Loss estimates produced by catastrophe models are dependent on the quality of the input data, including both the hazard and exposure data. Currently, some of the exposure data input into a catastrophe model is aggregated over an area, and therefore an estimate of the risk in this area may have a low level of accuracy. In order to obtain a more detailed and accurate loss estimate, it is necessary to have higher resolution exposure data. However, high resolution exposure data is not commonly available worldwide, and therefore methods to infer building distribution and characteristics at higher resolution from existing information must be developed. This study is focussed on the development of disaggregation methodologies for exposure data which, if implemented in current catastrophe models, would lead to improved loss estimates. The new methodologies developed for disaggregating exposure data make use of GIS, remote sensing and statistical techniques. The main focus of this study is on earthquake risk; however, the methods developed are modular so that they may be applied to different hazards. A number of different methods are proposed in order to be applicable to different regions of the world which have different amounts of data available. The new methods give estimates of both the number of buildings in a study area and a distribution of building typologies, as well as a measure of the vulnerability of the building stock to hazard. For each method, a way to assess and quantify the uncertainties in the methods and results is proposed, with particular focus on developing an index to enable input data quality to be compared. The applicability of the methods is demonstrated through testing for two study areas, one in Japan and the second in Turkey, selected because of the occurrence of recent and damaging earthquake events. The testing procedure is to use the proposed methods to estimate the number of buildings damaged at different levels following a scenario earthquake
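
    The core allocation step common to such disaggregation methods is proportional (dasymetric) splitting of an aggregated count using covariate-derived weights. A minimal sketch with hypothetical weights; the study's modular methods add building typologies, vulnerability and uncertainty indices on top of this.

      import numpy as np

      def disaggregate_exposure(total_buildings, cell_weights):
          """Dasymetric disaggregation: split an aggregated building count
          over grid cells in proportion to weights derived from remotely
          sensed covariates (e.g. built-up area fraction)."""
          w = np.asarray(cell_weights, dtype=float)
          return total_buildings * w / w.sum()

      # hypothetical district of 12,000 buildings over five cells
      print(disaggregate_exposure(12_000, [0.0, 3.2, 8.5, 1.1, 7.7]).round(0))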

  15. Disability Overview

    MedlinePlus


  16. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating the failure risk of spaceflight systems, in order to assess flight readiness and identify risk control measures, is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of the statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
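    A minimal sketch of the Monte Carlo core of such an approach follows, with invented distributions rather than the documented PFA software: parameter and modeling-accuracy uncertainties are propagated through a simple stress-versus-strength failure model to estimate a failure probability for one failure mode. In PFA, this prior estimate would then be updated with test and flight experience; the sketch stops at the Monte Carlo step.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000

      # Uncertain analysis parameters (illustrative distributions).
      stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)    # MPa
      strength = rng.lognormal(mean=np.log(420.0), sigma=0.08, size=n)  # MPa
      model_error = rng.normal(1.0, 0.05, size=n)                       # modeling-accuracy factor

      p_fail = np.mean(stress * model_error > strength)
      print(f"estimated failure probability: {p_fail:.2e}")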

  17. New methodology to reconstruct in 2-D the cuspal enamel of modern human lower molars.

    PubMed

    Modesto-Mata, Mario; García-Campos, Cecilia; Martín-Francés, Laura; Martínez de Pinillos, Marina; García-González, Rebeca; Quintino, Yuliet; Canals, Antoni; Lozano, Marina; Dean, M Christopher; Martinón-Torres, María; Bermúdez de Castro, José María

    2017-08-01

    In recent years, different methodologies have been developed to reconstruct worn teeth. In this article, we propose a new 2-D methodology to reconstruct the worn enamel of lower molars. Our main goals are to reconstruct molars with a high level of accuracy when measuring relevant histological variables and to validate the methodology by calculating the errors associated with the measurements. This methodology is based on polynomial regression equations and has been validated using two different dental variables: cuspal enamel thickness and crown height of the protoconid. In order to perform the validation process, simulated worn modern human molars were employed. The associated measurement errors were also estimated by applying methodologies previously proposed by other authors. The mean percentage error estimated in reconstructed molars for these two variables, in comparison with their real values, is -2.17% for the cuspal enamel thickness of the protoconid and -3.18% for the crown height of the protoconid. This significantly improves on the results of other methodologies, both in interobserver error and in the accuracy of the measurements. The new methodology based on polynomial regressions can be confidently applied to the reconstruction of cuspal enamel of lower molars, as it improves the accuracy of the measurements and reduces the interobserver error. The present study shows that it is important to validate all methodologies in order to know their associated errors. This new methodology can be easily exported to other modern human populations, the human fossil record, and the forensic sciences. © 2017 Wiley Periodicals, Inc.
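    The sketch below conveys the flavor of a polynomial-regression reconstruction (it is not the authors' validated 2-D procedure): a polynomial is fitted to the preserved portion of an idealized enamel profile and extrapolated over the worn region; the profile and numbers are invented.

      import numpy as np

      x = np.linspace(0.0, 10.0, 50)            # position along the crown (mm)
      profile = 5.0 - 0.05 * (x - 5.0) ** 2     # idealized cuspal outline
      preserved = x < 6.0                       # beyond 6 mm the cusp is "worn" away

      coef = np.polyfit(x[preserved], profile[preserved], deg=2)
      reconstructed = np.polyval(coef, x[~preserved])   # estimate of the lost outline
      print(reconstructed[:5].round(3))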

  18. Artificial Neural Networks: an overview and their use in the analysis of the AMPHORA-3 dataset.

    PubMed

    Buscema, Paolo Massimo; Massini, Giulia; Maurelli, Guido

    2014-10-01

    Artificial Adaptive Systems (AAS) are theories in which generative algebras create artificial models that simulate natural phenomena. Artificial Neural Networks (ANNs) are the most widespread and best-known learning-system models within the AAS. This article presents an overview of ANNs, noting their advantages and limitations for analyzing dynamic, complex, non-linear, multidimensional processes. An example of a specific ANN application to alcohol consumption in Spain during 1961-2006, carried out as part of the EU AMPHORA-3 project, is presented. The study's limitations are noted, and future research needs involving ANN methodologies are suggested.

  19. Development of a methodology for probable maximum precipitation estimation over the American River watershed using the WRF model

    NASA Astrophysics Data System (ADS)

    Tan, Elcin

    A new physically based methodology for probable maximum precipitation (PMP) estimation is developed over the American River Watershed (ARW) using the Weather Research and Forecasting (WRF-ARW) model. A persistent moisture flux convergence pattern, called the Pineapple Express, is analyzed for 42 historical extreme precipitation events, and it is found that the Pineapple Express causes extreme precipitation over the basin of interest. The average correlation between moisture flux convergence and maximum precipitation across the 42 events is estimated as 0.71. The performance of the WRF model for precipitation is verified by means of calibration and independent validation: the calibration procedure is performed only for the first-ranked flood event (the 1997 case), whereas the model is validated against all 42 historical cases. Three nested model domains are set up with horizontal resolutions of 27 km, 9 km, and 3 km over the basin of interest. As a result of chi-square goodness-of-fit tests, the hypothesis that "the WRF model can be used in the determination of PMP over the ARW for both areal-average and point estimates" is accepted at the 5% level of significance. The sensitivity of precipitation to model physics options is determined using 28 combinations of microphysics, atmospheric boundary layer, and cumulus parameterization schemes. It is concluded that the best option is the triplet of Thompson microphysics, Grell 3D ensemble cumulus, and the YSU boundary layer (TGY), based on the 42 historical cases, and this TGY triplet is used for all analyses in this research. Four techniques are proposed to evaluate physically possible maximum precipitation using the WRF: (1) perturbation of atmospheric conditions; (2) shifts in atmospheric conditions; (3) replacement of atmospheric conditions among historical events; and (4) creation of a thermodynamically possible worst-case scenario. Moreover, the effect of climate change on precipitation is discussed, with emphasis on temperature increase, in order to determine the

  20. Validating a new methodology for strain estimation from cardiac cine MRI

    NASA Astrophysics Data System (ADS)

    Elnakib, Ahmed; Beache, Garth M.; Gimel'farb, Georgy; Inanc, Tamer; El-Baz, Ayman

    2013-10-01

    This paper focuses on validating a novel framework for estimating the functional strain from cine cardiac magnetic resonance imaging (CMRI). The framework consists of three processing steps. First, the left ventricle (LV) wall borders are segmented using a level-set based deformable model. Second, the points on the wall borders are tracked during the cardiac cycle based on solving the Laplace equation between the LV edges. Finally, the circumferential and radial strains are estimated at the inner, mid-wall, and outer borders of the LV wall. The proposed framework is validated using synthetic phantoms of the material strains that account for the physiological features and the LV response during the cardiac cycle. Experimental results on simulated phantom images confirm the accuracy and robustness of our method.
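    To give a feel for the Laplace-based tracking step, the toy sketch below (invented geometry, not the authors' code) relaxes Laplace's equation between two concentric boundaries standing in for the endocardial and epicardial borders; point correspondences would then follow the gradient of the resulting potential across the wall.

      import numpy as np

      phi = np.zeros((40, 40))
      yy, xx = np.mgrid[0:40, 0:40]
      r = np.hypot(yy - 20.0, xx - 20.0)
      inner = r <= 8.0      # inside the "endocardial" border: phi fixed at 0
      outer = r >= 14.0     # outside the "epicardial" border: phi fixed at 1
      phi[outer] = 1.0
      wall = ~inner & ~outer

      for _ in range(2000):  # Jacobi relaxation on the wall cells only
          neighbors = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                              np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
          phi[wall] = neighbors[wall]

      print(phi[20, 9:15].round(2))  # potential rises smoothly across the wall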

  1. A revised load estimation procedure for the Susquehanna, Potomac, Patuxent, and Choptank rivers

    USGS Publications Warehouse

    Yochum, Steven E.

    2000-01-01

    The U.S. Geological Survey's Chesapeake Bay River Input Program has updated the nutrient and suspended-sediment load data base for the Susquehanna, Potomac, Patuxent, and Choptank Rivers using a multiple-window, center-estimate regression methodology. The revised method optimizes the seven-parameter regression approach that has been used historically by the program. The revised method estimates load using the fifth or center year of a sliding 9-year window. Each year a new model is run for each site and constituent, the most recent year is added, and the previous 4 years of estimates are updated. The fifth year in the 9-year window is considered the best estimate and is kept in the data base. The last year of estimation shows the most change from the previous year's estimate, and this change approaches a minimum at the fifth year. Differences between loads computed using this revised methodology and the loads populating the historical data base have been noted, but the load estimates do not typically change drastically. The data base resulting from the application of this revised methodology is populated by annual and monthly load estimates that are known with greater certainty than in the previous load data base.
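    A schematic of the sliding-window, center-estimate idea follows; the per-window model is reduced to a mean for brevity (the program's actual seven-parameter load regression is far richer), and the data are invented.

      import numpy as np

      rng = np.random.default_rng(2)
      years = np.arange(1980, 2000)
      annual_load = 100.0 + 2.0 * (years - 1980) + rng.normal(0.0, 5.0, years.size)

      window = 9
      center_estimates = {}
      for i in range(years.size - window + 1):
          block = annual_load[i:i + window]             # refit the model on this 9-year window
          center_year = int(years[i + window // 2])
          center_estimates[center_year] = block.mean()  # keep only the center (5th) year

      print(center_estimates)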

  2. OVERVIEW.

    ERIC Educational Resources Information Center

    ROSENBERG, SHELDON

    THIS OVERVIEW CHAPTER INTRODUCES THE FORTHCOMING "DEVELOPMENTS IN APPLIED PSYCHOLINGUISTICS RESEARCH," S. ROSENBERG AND J.H. KOPLIN, EDITORS, WHICH WILL BE PUBLISHED IN 1968 BY MACMILLAN COMPANY. IT WAS DESIGNED TO SERVE AN INTEGRATIVE FUNCTION--TO IDENTIFY SOME OF THE MAJOR IDEAS AND CONCERNS OF THE CONTRIBUTORS, TO IDENTIFY SOME OF THEIR…

  3. Automotive Manufacturer Risk Analysis : Meeting the Automotive Fuel Economy Standards

    DOT National Transportation Integrated Search

    1979-08-01

    An overview of the methodology and some findings are presented of a study which assessed the impact of the automotive fuel economy standards (AFES) on the four major U.S. automakers. A risk model was used to estimate the financial performance of the ...

  4. Multilevel Modeling: A Review of Methodological Issues and Applications

    ERIC Educational Resources Information Center

    Dedrick, Robert F.; Ferron, John M.; Hess, Melinda R.; Hogarty, Kristine Y.; Kromrey, Jeffrey D.; Lang, Thomas R.; Niles, John D.; Lee, Reginald S.

    2009-01-01

    This study analyzed the reporting of multilevel modeling applications of a sample of 99 articles from 13 peer-reviewed journals in education and the social sciences. A checklist, derived from the methodological literature on multilevel modeling and focusing on the issues of model development and specification, data considerations, estimation, and…

  5. 13 CFR 142.1 - Overview of regulations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Business Credit and Assistance; SMALL BUSINESS ADMINISTRATION; PROGRAM FRAUD CIVIL REMEDIES ACT REGULATIONS; Overview and Definitions. § 142.1 Overview of regulations. (a) Statutory basis. This...

  6. 40 CFR 1065.601 - Overview.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Protection of Environment; ENVIRONMENTAL PROTECTION AGENCY (CONTINUED); AIR POLLUTION CONTROLS; ENGINE-TESTING PROCEDURES; Calculations and Data Requirements. § 1065.601 Overview. (a) This subpart describes how to— (1) Use...

  7. 40 CFR 1065.601 - Overview.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Protection of Environment; ENVIRONMENTAL PROTECTION AGENCY (CONTINUED); AIR POLLUTION CONTROLS; ENGINE-TESTING PROCEDURES; Calculations and Data Requirements. § 1065.601 Overview. (a) This subpart describes how to— (1) Use...

  8. 40 CFR 1065.601 - Overview.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Protection of Environment; ENVIRONMENTAL PROTECTION AGENCY (CONTINUED); AIR POLLUTION CONTROLS; ENGINE-TESTING PROCEDURES; Calculations and Data Requirements. § 1065.601 Overview. (a) This subpart describes how to— (1) Use...

  9. 40 CFR 1065.601 - Overview.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Protection of Environment; ENVIRONMENTAL PROTECTION AGENCY (CONTINUED); AIR POLLUTION CONTROLS; ENGINE-TESTING PROCEDURES; Calculations and Data Requirements. § 1065.601 Overview. (a) This subpart describes how to— (1) Use...

  10. 40 CFR 1065.601 - Overview.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Protection of Environment; ENVIRONMENTAL PROTECTION AGENCY (CONTINUED); AIR POLLUTION CONTROLS; ENGINE-TESTING PROCEDURES; Calculations and Data Requirements. § 1065.601 Overview. (a) This subpart describes how to— (1) Use...

  11. Methadone Treatment: Overview and Bibliography.

    ERIC Educational Resources Information Center

    Greenfield, Lawrence; Tang, Beth Archibald

    This overview focuses on methadone treatment. Briefly, it describes the clinical uses of methadone for substance abuse treatment, explores dosage guidelines, and discusses counseling components. This overview also reviews research data on the application of methadone treatment to special populations, such as pregnant women, polydrug users, and…

  12. An Energy Overview of Romania

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    anon.

    2003-10-20

    The DOE Office of Fossil Energy is maintaining a web site that is meant to provide useful business- and energy-related information about countries and regions of the world for exporters, project developers, and researchers. The site consists of more than 130 country pages (organized into seven different world regions), with each country page having its own set of links to information sources about that country. There are also more than 30 Country Energy Overviews at the web site -- each of these is a comprehensive review of a specific country's entire energy situation, including sections on Energy Policy, Oil, Natural Gas, Coal, Hydroelectric/Renewables, Nuclear Power, Energy Transmission Infrastructure, Electricity, Electric Industry Overview, Environmental Activities, Privatization, Trade, and Economic Situation. The specific country highlighted in this Country Energy Overview is Romania. The site is designed to be dynamic. Updates to the overviews will be made as need and resources permit.

  13. An Energy Overview of Venezuela

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    anon.

    2003-10-20

    The DOE Office of Fossil Energy is maintaining a web site that is meant to provide useful business- and energy-related information about countries and regions of the world for exporters, project developers, and researchers. The site consists of more than 130 country pages (organized into seven different world regions), with each country page having its own set of links to information sources about that country. There are also more than 30 Country Energy Overviews at the web site -- each of these is a comprehensive review of a specific country's entire energy situation, including sections on Energy Policy, Oil, Natural Gas, Coal, Hydroelectric/Renewables, Nuclear Power, Energy Transmission Infrastructure, Electricity, Electric Industry Overview, Environmental Activities, Privatization, Trade, and Economic Situation. The specific country highlighted in this Country Energy Overview is Venezuela. The site is designed to be dynamic. Updates to the overviews will be made as need and resources permit.

  14. An Energy Overview of Argentina

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    anon.

    2003-10-20

    The DOE Office of Fossil Energy is maintaining a web site that is meant to provide useful business- and energy-related information about countries and regions of the world for exporters, project developers, and researchers. The site consists of more than 130 country pages (organized into seven different world regions), with each country page having its own set of links to information sources about that country. There are also more than 30 Country Energy Overviews at the web site -- each of these is a comprehensive review of a specific country's entire energy situation, including sections on Energy Policy, Oil, Natural Gas, Coal, Hydroelectric/Renewables, Nuclear Power, Energy Transmission Infrastructure, Electricity, Electric Industry Overview, Environmental Activities, Privatization, Trade, and Economic Situation. The specific country highlighted in this Country Energy Overview is Argentina. The site is designed to be dynamic. Updates to the overviews will be made as need and resources permit.

  15. Constellation Architecture Team-Lunar Scenario 12.0 Habitation Overview

    NASA Technical Reports Server (NTRS)

    Kennedy, Kriss J.; Toups, Larry D.; Rudisill, Marianne

    2010-01-01

    This paper describes the Constellation Architecture Team Lunar Scenario 12.0 (LS-12) surface habitation approach and the concepts developed during study definition. The Lunar Scenario 12 architecture study focused on two primary habitation approaches: a horizontally oriented habitation module (LS-12.0) and a vertically oriented habitation module (LS-12.1). The paper provides an overview of the LS-12.0 lunar surface campaign, the associated outpost architecture, habitation functionality, concept description, system integration strategy, and mass and power resource estimates. The Scenario 12 architecture resulted from combining attributes of three previous scenarios, Scenario 4 "Optimized Exploration," Scenario 5 "Fission Surface Power System," and Scenario 8 "Initial Extensive Mobility," along with an added emphasis on defining the excursion ConOps while the crew is away from the outpost location. The paper then describes the CxAT-Lunar Scenario 12.0 habitation concepts and their functionality. The Crew Operations area includes basic crew accommodations such as sleeping, eating, hygiene, and stowage. The EVA Operations area includes additional EVA capability beyond the suitlock function, such as suit maintenance, spares stowage, and suit stowage. The Logistics Operations area includes enhanced accommodations for 180 days, such as enhanced life support systems hardware, consumable stowage, spares stowage, interconnection to the other habitation elements, a common interface mechanism for future growth, and mating to a pressurized rover or Pressurized Logistics Module (PLM). The Mission & Science Operations area includes enhanced outpost autonomy such as an IVA glove box, life support, medical operations, and exercise equipment.

  16. Quantitative software models for the estimation of cost, size, and defects

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Bright, L.; Decker, B.; Lum, K.; Mikulski, C.; Powell, J.

    2002-01-01

    The presentation will provide a brief overview of the SQI measurement program as well as describe each of these models and how they are currently being used in supporting JPL project, task and software managers to estimate and plan future software systems and subsystems.

  17. JMAT 2.0 Operating Room Requirements Estimation Study

    DTIC Science & Technology

    2011-05-25

    Health Research Center, 140 Sylvester Rd., San Diego, CA 92106-3521. Report No. 11-10J, supported by the Office of the Assistant... The study presents (a) an expected-value methodology for estimating OR requirements in a theater hospital; (b) algorithms for estimating a special-case OR table requirement assuming the probabilities of entering the OR are either 1 or 0; and (c) an Excel worksheet that calculates the special-case OR table estimates

  18. Research on atmospheric volcanic emissions - An overview

    NASA Technical Reports Server (NTRS)

    Friend, J. P.; Bandy, A. R.; Moyers, J. L.; Zoller, W. H.; Stoiber, R. E.; Torres, A. L.; Rose, W. I., Jr.; Mccormick, M. P.; Woods, D. C.

    1982-01-01

    Atmospheric abundances and the geochemical cycle of certain volatile compounds and elements may be largely influenced or entirely controlled by magmatic sources. However, better estimates of the magnitude and variability of volcanic emissions are required if the importance of this natural source of atmospheric constituents and the resulting effect on atmospheric chemistry are to be elucidated. The project 'Research on Atmospheric Volcanic Emissions' (RAVE) is concerned with the improvement of knowledge of both geological and chemical phenomena attending these emissions by means of comprehensive instrumentation on board a research aircraft making simultaneous measurements of plume constituents. A description is presented of the equipment and the procedures used in the RAVE field study of Mt. St. Helens' plume. An overview of the results is also provided.

  19. High-intensity interval training using whole-body exercises: training recommendations and methodological overview.

    PubMed

    Machado, Alexandre F; Baker, Julien S; Figueira Junior, Aylton J; Bocalini, Danilo S

    2017-05-04

    HIIT whole-body (HWB) exercise is a new calisthenics-based exercise programme approach that can be considered an effective and safe method to improve physical fitness and body composition. HWB is a method that can be applied to different populations and ages. The purpose of this study was to describe possible methodologies for performing physical training based on whole-body exercises in healthy subjects. HWB sessions consist of a repeated high-intensity exercise stimulus and include monitoring of effort time, recuperation time, and session time. The exercise intensity is related to the maximal number of movements possible in a given time; therefore, the exercise sessions can be characterized as maximal. The intensity can be recorded using ratings of perceived exertion. Weekly training frequency and exercise selection should be structured according to each subject's functional fitness. Using this simple method, there is potential for greater adherence to physical activity, which can promote health benefits to all members of society. © 2017 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.

  20. U.S. DOE methodology for the development of geologic storage potential for carbon dioxide at the national and regional scale

    USGS Publications Warehouse

    Goodman, Angela; Hakala, J. Alexandra; Bromhal, Grant; Deel, Dawn; Rodosta, Traci; Frailey, Scott; Small, Michael; Allen, Doug; Romanov, Vyacheslav; Fazio, Jim; Huerta, Nicolas; McIntyre, Dustin; Kutchko, Barbara; Guthrie, George

    2011-01-01

    A detailed description of the United States Department of Energy (US-DOE) methodology for estimating CO2 storage potential in oil and gas reservoirs, saline formations, and unmineable coal seams is provided. Oil and gas reservoirs are assessed at the field level, while saline formations and unmineable coal seams are assessed at the basin level. The US-DOE methodology is intended for external users such as the Regional Carbon Sequestration Partnerships (RCSPs), future project developers, and governmental entities to produce high-level CO2 resource assessments of potential CO2 storage reservoirs in the United States and Canada at the regional and national scale; however, the methodology is general enough that it could be applied globally. The purpose of the US-DOE CO2 storage methodology, definitions of storage terms, and a CO2 storage classification are provided, and the methodology for calculating CO2 storage resource estimates is outlined. The Log Odds Method, applied with Monte Carlo sampling, is presented in detail for estimating the CO2 storage efficiency needed for CO2 storage resource estimates at the regional and national scale. The CO2 storage potential reported in the US-DOE assessment is intended to be distributed online through a geographic information system in NatCarb and made available in hard copy in the Carbon Sequestration Atlas of the United States and Canada. The US-DOE methodology will be continuously refined, incorporating results of the Development Phase projects conducted by the RCSPs from 2008 to 2018. Estimates will be formally updated every two years in subsequent versions of the Carbon Sequestration Atlas of the United States and Canada.
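    The sketch below shows the general shape of such a volumetric estimate with Monte Carlo sampling of the storage-efficiency factor E: a triangular distribution stands in for the Log Odds treatment of efficiency, and every input value is an invented placeholder rather than a US-DOE figure.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000

      A = 1.0e9      # formation area, m^2
      h = 50.0       # net thickness, m
      phi = 0.15     # porosity
      rho = 700.0    # CO2 density at reservoir conditions, kg/m^3
      E = rng.triangular(0.01, 0.02, 0.06, size=n)   # sampled storage efficiency

      G_Mt = A * h * phi * rho * E / 1.0e9           # storage resource, Mt CO2
      p10, p50, p90 = np.percentile(G_Mt, [10, 50, 90])
      print(f"storage resource (Mt CO2): P10={p10:.0f}, P50={p50:.0f}, P90={p90:.0f}")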

  1. CSBF Engineering Overview

    NASA Astrophysics Data System (ADS)

    Orr, Dwayne

    The Columbia Scientific Balloon Facility (CSBF) at Palestine, Texas provides operational and engineering support for the launch of NASA scientific balloons. Over the years, with the support of the NASA Balloon Program Office, CSBF has developed unique flight systems focused on providing a highly reliable, cost-effective medium for giving scientists access to a near-space environment. This paper provides an overview of the CSBF flight systems, with an emphasis on recent developments and plans for the future.

  2. Association between component costs, study methodologies, and foodborne illness-related factors with the cost of nontyphoidal Salmonella illness.

    PubMed

    McLinden, Taylor; Sargeant, Jan M; Thomas, M Kate; Papadopoulos, Andrew; Fazil, Aamir

    2014-09-01

    Nontyphoidal Salmonella spp. are one of the most common causes of bacterial foodborne illness. Variability in cost inventories and study methodologies limits the possibility of meaningfully interpreting and comparing cost-of-illness (COI) estimates, reducing their usefulness. However, little is known about the relative effect these factors have on a COI estimate. This is important for comparing existing estimates and when designing new COI studies. COI estimates, identified through a scoping review, were used to investigate the association of descriptive, component-cost, methodological, and foodborne-illness-related factors (such as chronic sequelae and under-reporting) with the cost of nontyphoidal Salmonella spp. illness. The standardized cost of nontyphoidal Salmonella spp. illness from 30 estimates reported in 29 studies ranged from $0.01568 to $41.22 United States dollars (USD)/person/year (2012). The mean cost was $10.37 USD/person/year (2012). The following factors were found to be significant in multiple linear regression (p≤0.05): the number of direct component cost categories included in an estimate (0-4, particularly long-term care costs) and chronic sequelae costs (inclusion/exclusion), both of which had positive associations with the cost of nontyphoidal Salmonella spp. illness. Factors related to study methodology were not significant. Our findings indicate that study methodology may not be as influential as other factors, such as the number of direct component cost categories included in an estimate and the costs incurred due to chronic sequelae. These may therefore be the most important factors to consider when designing, interpreting, and comparing cost-of-foodborne-illness studies.

  3. 49 CFR 1.1 - Overview.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    Transportation; Office of the Secretary of Transportation; ORGANIZATION AND DELEGATION OF POWERS AND DUTIES; General. § 1.1 Overview. This part describes the organization of the United States Department of Transportation and...

  4. 49 CFR 1.1 - Overview.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    Transportation; Office of the Secretary of Transportation; ORGANIZATION AND DELEGATION OF POWERS AND DUTIES; General. § 1.1 Overview. This part describes the organization of the United States Department of Transportation and...

  5. 49 CFR 1.1 - Overview.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    Transportation; Office of the Secretary of Transportation; ORGANIZATION AND DELEGATION OF POWERS AND DUTIES; General. § 1.1 Overview. This part describes the organization of the United States Department of Transportation and...

  6. Overview of vegetation monitoring data, 1952--1983. Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duncan, J.P.

    1994-03-01

    This report is a result of the Hanford Environmental Dose Reconstruction (HEDR) Project. The goal of the HEDR Project is to estimate the radiation dose that individuals could have received from emissions since 1944 at the Hanford Site near Richland, Washington. Members of the HEDR Project's Environmental Monitoring Data Task have developed databases of historical environmental measurements of such emissions. The HEDR Project is conducted by Battelle, Pacific Northwest Laboratories. This report is the third in a series that documents the information available on measurements of iodine-131 concentrations in vegetation. The first two reports provide the data for 1945--1951. This report provides an overview of the historical documents, which contain vegetation data for 1952--1983. The overview is organized according to the documents available for any given year. Each section, covering one year, contains a discussion of the media sampled, the sampling locations, significant events if there were any, emission quantities, constituents measured, and a list of the documents with complete reference information. Because the emissions which affected vegetation were significantly less after 1951, the vegetation monitoring data after that date have not been used in the HEDR Project. However, access to these data may be of interest to the public. This overview is, therefore, being published.

  7. Top Level Space Cost Methodology (TLSCM)

    DTIC Science & Technology

    1997-12-02

    Contents include: Software; ACEIT; Ground Rules and Assumptions; Typical Life Cycle Cost Distribution; Methodologies (Cost/Budget Threshold; Analogy, which is based on real-time Air Force and space programs, Ref. 25:2-8, 2-9). ACEIT: Automated Cost Estimating Integrated Tools (ACEIT), Tecolote Research, Inc. There is a way to use the ACEIT cost program to get a print-out of an expanded WBS; therefore, find someone that has ACEIT experience and

  8. Experimental Methodology for Estimation of Local Heat Fluxes and Burning Rates in Steady Laminar Boundary Layer Diffusion Flames

    PubMed Central

    Singh, Ajay V.; Gollner, Michael J.

    2016-01-01

    Modeling the realistic burning behavior of condensed-phase fuels has remained out of reach, in part because of an inability to resolve the complex interactions occurring at the interface between gas-phase flames and condensed-phase fuels. The current research provides a technique to explore the dynamic relationship between a combustible condensed fuel surface and gas-phase flames in laminar boundary layers. Experiments have previously been conducted in both forced and free convective environments over both solid and liquid fuels. A unique methodology, based on the Reynolds Analogy, was used to estimate local mass burning rates and flame heat fluxes for these laminar boundary layer diffusion flames utilizing local temperature gradients at the fuel surface. Local mass burning rates and convective and radiative heat feedback from the flames were measured in both the pyrolysis and plume regions by using temperature gradients mapped near the wall by a two-axis traverse system. These experiments are time-consuming and can be challenging to design as the condensed fuel surface burns steadily for only a limited period of time following ignition. The temperature profiles near the fuel surface need to be mapped during steady burning of a condensed fuel surface at a very high spatial resolution in order to capture reasonable estimates of local temperature gradients. Careful corrections for radiative heat losses from the thermocouples are also essential for accurate measurements. For these reasons, the whole experimental setup needs to be automated with a computer-controlled traverse mechanism, eliminating most errors due to positioning of a micro-thermocouple. An outline of steps to reproducibly capture near-wall temperature gradients and use them to assess local burning rates and heat fluxes is provided. PMID:27285827
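    The post-processing step at the heart of the technique can be sketched as follows, with invented numbers (the published work pairs this with the Reynolds Analogy to convert the wall gradient into a local mass burning rate): fit the near-wall portion of a measured temperature profile, take dT/dy at the surface, and apply Fourier's law.

      import numpy as np

      y = np.array([0.05, 0.10, 0.15, 0.20, 0.25]) * 1e-3  # distance from the fuel surface, m
      T = np.array([620.0, 655.0, 688.0, 719.0, 748.0])    # gas temperature, K (toy profile)

      k_gas = 0.05                              # gas thermal conductivity, W/(m K)
      dTdy_wall = np.polyfit(y, T, 1)[0]        # near-wall temperature gradient, K/m
      q_conv = k_gas * dTdy_wall                # convective flux toward the surface, W/m^2
      print(f"dT/dy = {dTdy_wall:.3e} K/m, q'' = {q_conv / 1000.0:.1f} kW/m^2")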

  9. 2009 Space Shuttle Probabilistic Risk Assessment Overview

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.; Canga, Michael A.; Boyer, Roger L.; Thigpen, Eric B.

    2010-01-01

    Loss of a Space Shuttle during flight has severe consequences, including loss of a significant national asset; loss of national confidence and pride; and, most importantly, loss of human life. The Shuttle Probabilistic Risk Assessment (SPRA) is used to identify risk contributors and their significance, thus assisting management in determining how to reduce risk. In 2006, an overview of the SPRA Iteration 2.1 was presented at PSAM 8 [1]. Like all successful PRAs, the SPRA is a living PRA and has undergone revisions since PSAM 8. The latest revision to the SPRA is Iteration 3.1, and it will not be the last as the Shuttle program progresses and more is learned. This paper discusses the SPRA scope, overall methodology, and results, as well as provides risk insights. The scope, assumptions, uncertainties, and limitations of this assessment provide risk-informed perspective to aid management's decision-making process. In addition, this paper compares the Iteration 3.1 analysis and results to the Iteration 2.1 analysis and results presented at PSAM 8.

  10. Mortality estimation from carcass searches using the R-package carcass: a tutorial

    USGS Publications Warehouse

    Korner-Nievergelt, Fränzi; Behr, Oliver; Brinkmann, Robert; Etterson, Matthew A.; Huso, Manuela M. P.; Dalthorp, Daniel; Korner-Nievergelt, Pius; Roth, Tobias; Niermann, Ivo

    2015-01-01

    This article is a tutorial for the R-package carcass. It starts with a short overview of common methods used to estimate mortality based on carcass searches. Then, it guides step by step through a simple example. First, the proportion of animals that fall into the search area is estimated. Second, carcass persistence time is estimated based on experimental data. Third, searcher efficiency is estimated. Fourth, these three estimated parameters are combined to obtain the probability that an animal killed is found by an observer. Finally, this probability is used together with the observed number of carcasses found to obtain an estimate for the total number of killed animals together with a credible interval.
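    The carcass package itself is written in R; the Python sketch below only mirrors the final combination step of the tutorial, under invented inputs: the three estimated components multiply into a detection probability, and a Horvitz-Thompson-style correction turns the carcass count into a mortality estimate (the package additionally supplies a credible interval, which this sketch omits).

      p_area = 0.60      # proportion of killed animals falling inside the searched area
      p_persist = 0.70   # probability a carcass persists until a search
      p_find = 0.80      # searcher efficiency
      c_found = 12       # carcasses actually found

      p_detect = p_area * p_persist * p_find
      n_killed = c_found / p_detect
      print(f"detection probability = {p_detect:.3f}, estimated total killed = {n_killed:.1f}")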

  11. Off-Highway Gasoline Consumption Estimation Models Used in the Federal Highway Administration Attribution Process: 2008 Updates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hwang, Ho-Ling; Davis, Stacy Cagle

    2009-12-01

    This report is designed to document the analysis process and estimation models currently used by the Federal Highway Administration (FHWA) to estimate off-highway gasoline consumption and public-sector fuel consumption. An overview of the entire FHWA attribution process is provided along with specifics related to the latest update (2008) of the Off-Highway Gasoline Use Model and the Public Use of Gasoline Model. The Off-Highway Gasoline Use Model is made up of five individual modules, one for each of the off-highway categories: agricultural, industrial and commercial, construction, aviation, and marine. This 2008 update of the off-highway models was the second major update (the first was conducted during 2002-2003) since they were originally developed in the mid-1990s. The agricultural model methodology, specifically, underwent a significant revision because of changes in data availability since 2003. Some revision to the model was necessary due to removal of certain data elements used in the original estimation method. The revised agricultural model also made use of some newly available information published by the data source agency in recent years. The other model methodologies were not drastically changed, though many data elements were updated to improve the accuracy of these models. Note that components of the Public Use of Gasoline Model were not updated in 2008. A major challenge in updating the estimation methods applied by the public-use model is that they would have to rely on significant new data collection efforts. In addition, due to resource limitations, several components of the models (both off-highway and public-use models) that utilized regression modeling approaches were not recalibrated in the 2008 study. An investigation of the Environmental Protection Agency's NONROAD2005 model was also carried out under the 2008 model update. Results generated from the NONROAD2005 model were analyzed, examined, and compared, to the extent

  12. Rediscovery of Good-Turing estimators via Bayesian nonparametrics.

    PubMed

    Favaro, Stefano; Nipoti, Bernardo; Teh, Yee Whye

    2016-03-01

    The problem of estimating discovery probabilities originated in the context of statistical ecology, and in recent years it has become popular due to its frequent appearance in challenging applications arising in genetics, bioinformatics, linguistics, design of experiments, machine learning, etc. A full range of statistical approaches, parametric and nonparametric as well as frequentist and Bayesian, has been proposed for estimating discovery probabilities. In this article, we investigate the relationships between the celebrated Good-Turing approach, a frequentist nonparametric approach developed in the 1940s, and a Bayesian nonparametric approach recently introduced in the literature. Specifically, under the assumption of a two-parameter Poisson-Dirichlet prior, we show that Bayesian nonparametric estimators of discovery probabilities are asymptotically equivalent, for a large sample size, to suitably smoothed Good-Turing estimators. As a by-product of this result, we introduce and investigate a methodology for deriving exact and asymptotic credible intervals to be associated with the Bayesian nonparametric estimators of discovery probabilities. The proposed methodology is illustrated through a comprehensive simulation study and the analysis of Expressed Sequence Tags data generated by sequencing a benchmark complementary DNA library. © 2015, The International Biometric Society.
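    For orientation, the classical Good-Turing estimator referenced above is essentially one line: the probability that the next observation is a previously unseen species is estimated by the fraction of singletons in the sample. A minimal sketch with a made-up sample:

      from collections import Counter

      sample = ["a", "b", "a", "c", "d", "a", "b", "e", "f", "a"]
      counts = Counter(sample)

      n = len(sample)
      n1 = sum(1 for c in counts.values() if c == 1)   # species seen exactly once
      print(f"Good-Turing discovery probability: {n1 / n:.2f}")  # 4/10 = 0.40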

  13. Early Childhood Longitudinal Study, Birth Cohort (ECLS-B): Methodology Report for the 9-Month Data Collection (2001-02). Volume 2: Sampling. NCES 2005-147

    ERIC Educational Resources Information Center

    Bethel, James; Green, James L.; Nord, Christine; Kalton, Graham; West, Jerry

    2005-01-01

    This report is Volume 2 of the methodology report that provides information about the development, design, and conduct of the 9-month data collection of the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B). This volume begins with a brief overview of the ECLS-B, but focuses on the sample design, calculation of response rates, development…

  14. Estimation of under-reporting in epidemics using approximations.

    PubMed

    Gamado, Kokouvi; Streftaris, George; Zachary, Stan

    2017-06-01

    Under-reporting in epidemics, when it is ignored, leads to under-estimation of the infection rate and therefore of the reproduction number. In the case of stochastic models with temporal data, a usual approach for dealing with such issues is to apply data augmentation techniques through Bayesian methodology. Departing from earlier literature approaches implemented using reversible jump Markov chain Monte Carlo (RJMCMC) techniques, we make use of approximations to obtain faster estimation with simple MCMC. Comparisons among the methods developed here, and with the RJMCMC approach, are carried out and highlight that approximation-based methodology offers useful alternative inference tools for large epidemics, with a good trade-off between time cost and accuracy.
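    As a toy illustration of the bias at stake (the paper's actual machinery is Bayesian data augmentation over the unobserved cases, not this shortcut): if each case is reported independently with probability p, a naive attack rate can be corrected by dividing the reported count by p. All numbers below are invented.

      reported = 340        # observed case count
      p_report = 0.68       # assumed reporting probability
      population = 50_000

      naive_rate = reported / population
      corrected_rate = (reported / p_report) / population
      print(f"naive attack rate: {naive_rate:.4f}, corrected: {corrected_rate:.4f}")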

  15. Methodological Considerations in Social Cost Studies of Addictive Substances: A Systematic Literature Review.

    PubMed

    Verhaeghe, Nick; Lievens, Delfine; Annemans, Lieven; Vander Laenen, Freya; Putman, Koen

    2016-01-01

    Use of alcohol, tobacco, illicit drugs, and psychoactive pharmaceuticals is associated with a higher likelihood of developing several diseases and injuries and, as a consequence, with considerable health-care expenditures. There is as yet a lack of consistent methodologies for estimating the economic impact of addictive substances on society. The aim was to assess the methodological approaches applied in social cost studies estimating the economic impact of alcohol, tobacco, illicit drugs, and psychoactive pharmaceuticals. A systematic literature review through the electronic databases Medline (PubMed) and Web of Science was performed. Studies in English published from 1997 examining the social costs of the addictive substances alcohol, tobacco, illicit drugs, and psychoactive pharmaceuticals were eligible for inclusion. Twelve social cost studies met the inclusion criteria. In all studies, direct and indirect costs were measured, but intangible costs were seldom taken into account. A wide variety in the cost items included across studies was observed. Sensitivity analyses to address the uncertainty around certain cost estimates were conducted in eight of the studies considered in the review. Differences in the cost items included in cost-of-illness studies limit comparison across studies. It is clearly difficult to deal with all the consequences of substance use in cost-of-illness studies. Future social cost studies should be based on sound methodological principles in order to produce more reliable estimates of the economic burden of substance use.

  16. Traditional Chinese medicine injection for angina pectoris: an overview of systematic reviews.

    PubMed

    Luo, Jing; Shang, Qinghua; Han, Mei; Chen, Keji; Xu, Hao

    2014-01-01

    Traditional Chinese medicine (TCM) injection is widely used to treat angina pectoris in China. This overview aims to systematically summarize the general characteristics of systematic reviews (SRs) on TCM injection in treating angina, and assess the methodological and reporting quality of these reviews. We searched PubMed, Embase, the Cochrane Library and four Chinese databases from inception until March 2013. Data were extracted according to a preset form. The AMSTAR and PRISMA checklists were used to explore the methodological quality and reporting characteristics of included reviews, respectively. All data analyses were descriptive. 46 SRs involving over 57,463 participants with angina reviewing 23 kinds of TCM injections were included. The main outcomes evaluated in the reviews were symptoms (43/46, 93.5%), surrogate outcomes (42/46, 91.3%) and adverse events (41/46, 87.0%). Few reviews evaluated endpoints (7/46, 15.2%) and quality of life (1/46, 2.2%). One third of the reviews (16/46, 34.8%) drew definitely positive conclusions while the others (30/46, 65.2%) suggested potential benefits mainly in symptoms, electrocardiogram and adverse events. With many serious flaws such as lack of a protocol and inappropriate data synthesis, the overall methodological and reporting quality of the reviews was limited. While many SRs of TCM injection on the treatment of angina suggested potential benefits or definitely positive effects, stakeholders should not accept the findings of these reviews uncritically due to the limited methodological and reporting quality. Future SRs should be appropriately conducted and reported according to international standards such as AMSTAR and PRISMA, rather than published in large numbers.

  17. The South African Tuberculosis Care Cascade: Estimated Losses and Methodological Challenges

    PubMed Central

    Naidoo, Pren; Theron, Grant; Rangaka, Molebogeng X; Chihota, Violet N; Vaughan, Louise; Brey, Zameer O; Pillay, Yogan

    2017-01-01

    Background: While tuberculosis incidence and mortality are declining in South Africa, meeting the goals of the End TB Strategy requires an invigorated programmatic response informed by accurate data. Enumerating the losses at each step in the care cascade enables appropriate targeting of interventions and resources. Methods: We estimated the tuberculosis burden and the number and proportion of individuals with tuberculosis who accessed tests, had tuberculosis diagnosed, initiated treatment, and successfully completed treatment, for all tuberculosis cases, for those with drug-susceptible tuberculosis (including human immunodeficiency virus (HIV)-coinfected cases), and for those with rifampicin-resistant tuberculosis. Estimates were derived from national electronic tuberculosis register data, laboratory data, and published studies. Results: The overall tuberculosis burden was estimated to be 532,005 cases (range, 333,760–764,480 cases), with successful completion of treatment in 53% of cases. Losses occurred at multiple steps: 5% at test access, 13% at diagnosis, 12% at treatment initiation, and 17% at successful treatment completion. Overall losses were similar among all drug-susceptible cases and those with HIV coinfection (54% and 52%, respectively, successfully completed treatment). Losses were substantially higher among rifampicin-resistant cases, with only 22% successfully completing treatment. Conclusions: Although the vast majority of individuals with tuberculosis engaged the public health system, just over half were successfully treated. Urgent efforts are required to improve implementation of existing policies and protocols to close gaps in tuberculosis diagnosis, treatment initiation, and successful treatment completion. PMID:29117342

  18. The South African Tuberculosis Care Cascade: Estimated Losses and Methodological Challenges.

    PubMed

    Naidoo, Pren; Theron, Grant; Rangaka, Molebogeng X; Chihota, Violet N; Vaughan, Louise; Brey, Zameer O; Pillay, Yogan

    2017-11-06

    While tuberculosis incidence and mortality are declining in South Africa, meeting the goals of the End TB Strategy requires an invigorated programmatic response informed by accurate data. Enumerating the losses at each step in the care cascade enables appropriate targeting of interventions and resources. We estimated the tuberculosis burden and the number and proportion of individuals with tuberculosis who accessed tests, had tuberculosis diagnosed, initiated treatment, and successfully completed treatment, for all tuberculosis cases, for those with drug-susceptible tuberculosis (including human immunodeficiency virus (HIV)-coinfected cases), and for those with rifampicin-resistant tuberculosis. Estimates were derived from national electronic tuberculosis register data, laboratory data, and published studies. The overall tuberculosis burden was estimated to be 532,005 cases (range, 333,760-764,480 cases), with successful completion of treatment in 53% of cases. Losses occurred at multiple steps: 5% at test access, 13% at diagnosis, 12% at treatment initiation, and 17% at successful treatment completion. Overall losses were similar among all drug-susceptible cases and those with HIV coinfection (54% and 52%, respectively, successfully completed treatment). Losses were substantially higher among rifampicin-resistant cases, with only 22% successfully completing treatment. Although the vast majority of individuals with tuberculosis engaged the public health system, just over half were successfully treated. Urgent efforts are required to improve implementation of existing policies and protocols to close gaps in tuberculosis diagnosis, treatment initiation, and successful treatment completion. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
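    The cascade arithmetic reported in both records can be reproduced directly, since the step losses are expressed as percentages of the estimated total burden:

      burden = 532_005
      losses = {"test access": 0.05, "diagnosis": 0.13,
                "treatment initiation": 0.12, "successful completion": 0.17}

      remaining = 1.0
      for step, loss in losses.items():
          remaining -= loss
          print(f"after {step}: {remaining:.0%} ({remaining * burden:,.0f} cases)")
      # ends at 53%, matching the reported successful-completion figure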

  19. 10 CFR 436.23 - Estimated simple payback time.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Methodology and Procedures for Life Cycle Cost Analyses. § 436.23 Estimated simple payback time. The estimated simple payback time is the number of years required for the cumulative value of energy or water cost savings, less future non-fuel or non-water costs, to equal the investment costs of the building energy or...
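    The definition translates into a cumulative cash-flow test: the estimated simple payback time is the first year in which cumulative savings, net of non-fuel costs, reach the investment. A sketch with invented cash flows:

      investment = 10_000.0
      annual_savings = [2_500.0, 2_500.0, 2_600.0, 2_700.0, 2_700.0, 2_800.0]
      annual_nonfuel = [200.0] * len(annual_savings)

      cumulative, payback = 0.0, None
      for year, (s, c) in enumerate(zip(annual_savings, annual_nonfuel), start=1):
          cumulative += s - c                       # savings less non-fuel costs
          if payback is None and cumulative >= investment:
              payback = year
      print(f"estimated simple payback time: {payback} years")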

  20. 10 CFR 436.23 - Estimated simple payback time.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    Methodology and Procedures for Life Cycle Cost Analyses. § 436.23 Estimated simple payback time. The estimated simple payback time is the number of years required for the cumulative value of energy or water cost savings, less future non-fuel or non-water costs, to equal the investment costs of the building energy or...