Sample records for article presents estimates

  1. A cross-country Exchange Market Pressure (EMP) dataset.

    PubMed

    Desai, Mohit; Patnaik, Ila; Felman, Joshua; Shah, Ajay

    2017-06-01

    The data presented in this article are related to the research article titled "An exchange market pressure measure for cross country analysis" (Patnaik et al. [1]). In this article, we present the dataset of Exchange Market Pressure (EMP) values for 139 countries along with their conversion factors, ρ (rho). Exchange Market Pressure, expressed as a percentage change in the exchange rate, measures the change in the exchange rate that would have taken place had the central bank not intervened. The conversion factor ρ can be interpreted as the change in the exchange rate associated with $1 billion of intervention. Estimates of the conversion factor ρ allow us to calculate a monthly time series of EMP for 139 countries. Additionally, the dataset contains the 68% confidence interval (high and low values) for the point estimates of ρ. Using the standard errors of the estimates of ρ, we obtain one-sigma intervals around the mean estimates of EMP values. These values are also reported in the dataset.
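
    The EMP arithmetic described above reduces to one line per month. A minimal sketch, with an assumed sign convention and invented numbers (the dataset's actual schema may differ):

    ```python
    # Sketch of the EMP construction described above (assumed convention):
    # EMP (in % exchange rate change) = observed % change in exchange rate
    #                                   + rho * intervention (in $ billions),
    # where rho converts $1 billion of intervention into an equivalent % change.

    def emp(pct_change_fx, intervention_bn, rho, rho_se=None):
        """Return EMP and, if rho_se is given, a one-sigma (68%) interval."""
        point = pct_change_fx + rho * intervention_bn
        if rho_se is None:
            return point
        half_width = abs(intervention_bn) * rho_se  # uncertainty from rho only
        return point, (point - half_width, point + half_width)

    # Example: the currency fell 1.2% while the bank sold $0.5bn of reserves.
    print(emp(-1.2, 0.5, rho=-2.0, rho_se=0.4))
    ```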

  2. National health expenditures, 1990

    PubMed Central

    Levit, Katharine R.; Lazenby, Helen C.; Cowan, Cathy A.; Letsch, Suzanne W.

    1991-01-01

    During 1990, health expenditures as a share of gross national product rose to 12.2 percent, up from 11.6 percent in 1989. This dramatic increase is the second largest in the past three decades. The national health expenditure estimates presented in this article document rapidly rising health care costs and provide a context for understanding the health care financing crisis facing the Nation today. The 1990 national health expenditures incorporate the most recently available data. They differ from the historical estimates presented in the preceding article: the length of time and complicated process of producing projections required the use of 1989 national health expenditures, the data available prior to the completion of the 1990 estimates presented here. PMID:10114934

  3. Virtual Estimator for Piecewise Linear Systems Based on Observability Analysis

    PubMed Central

    Morales-Morales, Cornelio; Adam-Medina, Manuel; Cervantes, Ilse; Vela-Valdés, Luis G.; García Beltrán, Carlos Daniel

    2013-01-01

    This article proposes a virtual sensor for piecewise linear systems based on an observability analysis that is a function of a commutation law related to the system's output. This virtual sensor is also known as a state estimator. In addition, it presents an active-mode detector for the case in which the commutation sequences of each linear subsystem are arbitrary and unknown. To this end, the article proposes a set of virtual estimators that discern the commutation paths of the system and allow their output to be estimated. A methodology for testing the observability of discrete-time piecewise linear systems is also proposed. An academic example is presented to show the obtained results. PMID:23447007
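
    The observability test described above amounts to a rank condition per linear mode. A hedged sketch for a discrete-time system, with made-up mode matrices:

    ```python
    # Hypothetical sketch: rank test of the observability matrix for each linear
    # mode of a discrete-time piecewise linear system x[k+1] = A_i x[k], y = C_i x.
    import numpy as np

    def observability_matrix(A, C):
        n = A.shape[0]
        blocks = [C @ np.linalg.matrix_power(A, k) for k in range(n)]
        return np.vstack(blocks)

    modes = {
        1: (np.array([[0.9, 1.0], [0.0, 0.8]]), np.array([[1.0, 0.0]])),
        2: (np.array([[0.5, 0.0], [0.0, 0.7]]), np.array([[1.0, 0.0]])),
    }
    for i, (A, C) in modes.items():
        O = observability_matrix(A, C)
        rank = np.linalg.matrix_rank(O)
        print(f"mode {i}: observable = {rank == A.shape[0]}")
    ```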

  4. Improved estimates of fixed reproducible tangible wealth, 1929-95

    DOT National Transportation Integrated Search

    1997-05-01

    This article presents revised estimates of the value of fixed reproducible tangible wealth in the United States for 1929-95; these estimates incorporate the definitional and statistical improvements introduced in last year's comprehensive revis...

  5. Cost of Equity Estimation in Fuel and Energy Sector Companies Based on CAPM

    NASA Astrophysics Data System (ADS)

    Kozieł, Diana; Pawłowski, Stanisław; Kustra, Arkadiusz

    2018-03-01

    The article presents cost of equity estimation for capital groups from the fuel and energy sector, listed on the Warsaw Stock Exchange, based on the Capital Asset Pricing Model (CAPM). The objective of the article was to perform a valuation of equity with the application of CAPM, based on actual financial and stock exchange data, and to carry out a sensitivity analysis of that cost depending on the financing structure of the entity. The objective formulated in this manner determined the article's structure. It focuses on substantive analyses related to the nature of equity and methods of estimating its cost, with special attention given to the CAPM. In the practical section, the cost of equity was estimated according to the CAPM methodology for leading fuel and energy companies, such as Tauron GE and PGE. Simultaneously, a sensitivity analysis of that cost was performed depending on the structure of financing the company's operation.
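
    A minimal sketch of the CAPM arithmetic, with the financing-structure sensitivity handled through the Hamada relation; the Hamada step and all numbers are illustrative assumptions, not the article's data:

    ```python
    # CAPM sketch: cost of equity r_e = r_f + beta * (r_m - r_f).
    def capm_cost_of_equity(r_f, beta, r_m):
        return r_f + beta * (r_m - r_f)

    # Sensitivity to leverage via the Hamada relation (an assumed extension):
    # beta_levered = beta_unlevered * (1 + (1 - tax) * debt / equity).
    def levered_beta(beta_u, tax, debt, equity):
        return beta_u * (1.0 + (1.0 - tax) * debt / equity)

    r_f, r_m = 0.03, 0.08
    for d_e in (0.0, 0.5, 1.0):                      # debt/equity ratios
        beta = levered_beta(0.9, tax=0.19, debt=d_e, equity=1.0)
        print(d_e, round(capm_cost_of_equity(r_f, beta, r_m), 4))
    ```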

  6. Error estimation in the neural network solution of ordinary differential equations.

    PubMed

    Filici, Cristian

    2010-06-01

    In this article a method of error estimation for the neural approximation of the solution of an Ordinary Differential Equation is presented. Some examples of the application of the method support the theory presented. Copyright 2010. Published by Elsevier Ltd.

  7. Simulation data for an estimation of the maximum theoretical value and confidence interval for the correlation coefficient.

    PubMed

    Rocco, Paolo; Cilurzo, Francesco; Minghetti, Paola; Vistoli, Giulio; Pedretti, Alessandro

    2017-10-01

    The data presented in this article are related to the article titled "Molecular Dynamics as a tool for in silico screening of skin permeability" (Rocco et al., 2017) [1]. Knowledge of the confidence interval and maximum theoretical value of the correlation coefficient r can prove useful to estimate the reliability of developed predictive models, in particular when there is great variability in compiled experimental datasets. In this Data in Brief article, data from purposely designed numerical simulations are presented to show how much the maximum r value is worsened by increasing the data uncertainty. The corresponding confidence interval of r is determined by using the Fisher r → Z transform.
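
    The Fisher r → Z interval mentioned above is short enough to sketch directly; the 95% critical value and sample sizes here are illustrative:

    ```python
    # Sketch of the Fisher r -> Z confidence interval for a correlation r.
    import math

    def r_confidence_interval(r, n, z_crit=1.96):
        """95% CI for a correlation r computed from n pairs."""
        z = math.atanh(r)                 # Fisher transform
        se = 1.0 / math.sqrt(n - 3)
        lo, hi = z - z_crit * se, z + z_crit * se
        return math.tanh(lo), math.tanh(hi)

    print(r_confidence_interval(0.80, 30))   # wide with small n
    print(r_confidence_interval(0.80, 300))  # narrows as n grows
    ```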

  8. Estimating Local Food Capacity in Publicly Funded Institutions

    ERIC Educational Resources Information Center

    Knight, Andrew J.; Chopra, Hema M.

    2013-01-01

    This article presents three approaches to estimate the size of the publicly funded institutional marketplace to determine what opportunities exist for local farmers and fishers. First, we found that estimates from national foodservice sales statistics over-estimate local capacity opportunities. Second, analyzing budgets of publicly funded…

  9. Effect of Visual Field Presentation on Action Planning (Estimating Reach) in Children

    ERIC Educational Resources Information Center

    Gabbard, Carl; Cordova, Alberto

    2012-01-01

    In this article, the authors examined the effects of target information presented in different visual fields (lower, upper, central) on estimates of reach via use of motor imagery in children (5-11 years old) and young adults. Results indicated an advantage for estimating reach movements for targets placed in lower visual field (LoVF), with all…

  10. Estimation of sample size and testing power (Part 4).

    PubMed

    Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo

    2012-01-01

    Sample size estimation is necessary for any experimental or survey research. An appropriate estimation of sample size based on known information and statistical knowledge is of great significance. This article introduces methods of sample size estimation for difference tests under a one-factor, two-level design, including the estimation formulas and their realization both directly from the formulas and through the POWER procedure of SAS software, for quantitative as well as qualitative data. In addition, the article presents worked examples, which will help researchers implement the repetition principle during the research design phase.
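
    For the one-factor, two-level quantitative case, the classical formula can be sketched as follows (a minimal stand-in for the SAS POWER procedure, with invented inputs):

    ```python
    # Per-group sample size for comparing two means (two-sided test):
    #   n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * (sigma / delta)^2
    import math
    from scipy.stats import norm

    def n_per_group(delta, sigma, alpha=0.05, power=0.80):
        """Approximate n per group for a two-sided two-sample z/t test."""
        z_a = norm.ppf(1 - alpha / 2)
        z_b = norm.ppf(power)
        return math.ceil(2 * (z_a + z_b) ** 2 * (sigma / delta) ** 2)

    print(n_per_group(delta=5.0, sigma=10.0))  # ~63 per group
    ```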

  11. Line transect estimation of population size: the exponential case with grouped data

    USGS Publications Warehouse

    Anderson, D.R.; Burnham, K.P.; Crain, B.R.

    1979-01-01

    Gates, Marshall, and Olson (1968) investigated the line transect method of estimating grouse population densities in the case where sighting probabilities are exponential. This work is followed by a simulation study in Gates (1969). A general overview of line transect analysis is presented by Burnham and Anderson (1976). These articles all deal with the ungrouped data case. In the present article, an analysis of line transect data is formulated under the Gates framework of exponential sighting probabilities and in the context of grouped data.
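
    A hedged sketch of the grouped-data case under the exponential sighting model: the scale parameter is fitted by multinomial maximum likelihood over distance classes, and density follows as D = n/(2L·λ̂). Distance classes, counts and transect length are invented:

    ```python
    # Grouped perpendicular distances x with density f(x) = (1/lam) exp(-x/lam);
    # the density estimate is D = n / (2 * L * lam_hat). Numbers are made up.
    import numpy as np
    from scipy.optimize import minimize_scalar

    edges = np.array([0.0, 10.0, 20.0, 30.0, np.inf])  # distance classes (m)
    counts = np.array([54, 31, 19, 8])                  # sightings per class

    def neg_loglik(lam):
        cdf = 1.0 - np.exp(-edges / lam)                # exponential CDF
        probs = np.diff(cdf)                            # cell probabilities
        return -np.sum(counts * np.log(probs))

    lam_hat = minimize_scalar(neg_loglik, bounds=(1.0, 100.0),
                              method="bounded").x
    L = 5000.0                                          # transect length (m)
    D = counts.sum() / (2.0 * L * lam_hat)              # animals per m^2
    print(lam_hat, D)
    ```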

  12. Sample Size Estimation: The Easy Way

    ERIC Educational Resources Information Center

    Weller, Susan C.

    2015-01-01

    This article presents a simple approach to making quick sample size estimates for basic hypothesis tests. Although there are many sources available for estimating sample sizes, methods are not often integrated across statistical tests, levels of measurement of variables, or effect sizes. A few parameters are required to estimate sample sizes and…

  13. Estimation of the object orientation and location with the use of MEMS sensors

    NASA Astrophysics Data System (ADS)

    Sawicki, Aleksander; Walendziuk, Wojciech; Idzkowski, Adam

    2015-09-01

    The article presents the implementation of algorithms for estimating orientation in 3D space and the displacement of an object in 2D space. Moreover, general methods of storing orientation using Euler angles, a quaternion and a rotation matrix are presented. The experimental part presents the results of a complementary filter implementation. In the study, an experimental microprocessor module based on the STM32f4 Discovery board and a myRIO hardware platform equipped with an FPGA were used. The attempt to track an object in two-dimensional space, shown in the final part of this article, was made with the use of the equipment mentioned above.
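
    Of the three storage methods named above, the Euler-to-matrix and matrix-to-quaternion conversions are compact enough to sketch; the ZYX convention and test angles are assumptions:

    ```python
    # ZYX Euler angles -> rotation matrix -> quaternion (one common convention).
    import numpy as np

    def euler_zyx_to_matrix(yaw, pitch, roll):
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cr, sr = np.cos(roll), np.sin(roll)
        Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        return Rz @ Ry @ Rx

    def matrix_to_quaternion(R):
        # Stable only for trace > -1; enough for a demonstration.
        w = 0.5 * np.sqrt(1.0 + np.trace(R))
        x = (R[2, 1] - R[1, 2]) / (4 * w)
        y = (R[0, 2] - R[2, 0]) / (4 * w)
        z = (R[1, 0] - R[0, 1]) / (4 * w)
        return np.array([w, x, y, z])

    R = euler_zyx_to_matrix(0.3, 0.1, -0.2)
    print(matrix_to_quaternion(R))
    ```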

  14. Myopia Glasses and Optical Power Estimation: An Easy Experiment

    ERIC Educational Resources Information Center

    Ribeiro, Jair Lúcio Prados

    2015-01-01

    Human eye optics is a common high school physics topic and students usually show a great interest during our presentation of this theme. In this article, we present an easy way to estimate a diverging lens' optical power from a simple experiment involving myopia eyeglasses and a smartphone flashlight.

  15. Myopia Glasses and Optical Power Estimation: An Easy Experiment

    NASA Astrophysics Data System (ADS)

    Ribeiro, Jair Lúcio Prados

    2015-02-01

    Human eye optics is a common high school physics topic and students usually show a great interest during our presentation of this theme. In this article, we present an easy way to estimate a diverging lens' optical power from a simple experiment involving myopia eyeglasses and a smartphone flashlight.

  16. Index cost estimate based BIM method - Computational example for sports fields

    NASA Astrophysics Data System (ADS)

    Zima, Krzysztof

    2017-07-01

    The paper presents an example of cost estimation in the early phase of a project. A fragment of a relational database containing solutions, descriptions, the geometry of construction objects and unit costs of sports facilities is shown. Calculations with the Index Cost Estimate Based BIM method, using Case-Based Reasoning, are presented as well. The article presents local and global similarity measurement and an example of a BIM-based quantity takeoff process. The outcome of the cost calculations based on the CBR method is presented as the final result.
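
    A minimal sketch of the local/global similarity aggregation used in Case-Based Reasoning; attributes, weights and unit costs are invented placeholders, not the paper's database:

    ```python
    # Global similarity as a weighted aggregate of per-attribute local similarities.
    def local_sim(a, b, value_range):
        return 1.0 - abs(a - b) / value_range   # numeric attributes

    def global_sim(case, query, weights, ranges):
        total = sum(weights.values())
        return sum(w * local_sim(case[k], query[k], ranges[k])
                   for k, w in weights.items()) / total

    # Illustrative sports-field cases: area (m^2) and perimeter (m) only.
    cases = {"field_A": {"area": 7000, "perim": 360, "unit_cost": 95.0},
             "field_B": {"area": 9500, "perim": 410, "unit_cost": 88.0}}
    query = {"area": 7400, "perim": 370}
    weights, ranges = {"area": 0.7, "perim": 0.3}, {"area": 10000, "perim": 500}

    best = max(cases, key=lambda c: global_sim(cases[c], query, weights, ranges))
    print(best, cases[best]["unit_cost"])   # reuse cost from most similar case
    ```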

  17. Applying a Weighted Maximum Likelihood Latent Trait Estimator to the Generalized Partial Credit Model

    ERIC Educational Resources Information Center

    Penfield, Randall D.; Bergeron, Jennifer M.

    2005-01-01

    This article applies a weighted maximum likelihood (WML) latent trait estimator to the generalized partial credit model (GPCM). The relevant equations required to obtain the WML estimator using the Newton-Raphson algorithm are presented, and a simulation study is described that compared the properties of the WML estimator to those of the maximum…

  18. Using Multisite Experiments to Study Cross-Site Variation in Treatment Effects: A Hybrid Approach with Fixed Intercepts and A Random Treatment Coefficient

    ERIC Educational Resources Information Center

    Bloom, Howard S.; Raudenbush, Stephen W.; Weiss, Michael J.; Porter, Kristin

    2017-01-01

    The present article considers a fundamental question in evaluation research: "By how much do program effects vary across sites?" The article first presents a theoretical model of cross-site impact variation and a related estimation model with a random treatment coefficient and fixed site-specific intercepts. This approach eliminates…

  19. Compatible estimators of the components of change for a rotating panel forest inventory design

    Treesearch

    Francis A. Roesch

    2007-01-01

    This article presents two approaches for estimating the components of forest change utilizing data from a rotating panel sample design. One approach uses a variant of the exponentially weighted moving average estimator and the other approach uses mixed estimation. Three general transition models were each combined with a single compatibility model for the mixed...

  20. Diagnostics monitor of the braking efficiency in the on board diagnostics system for the motor vehicles

    NASA Astrophysics Data System (ADS)

    Gajek, Andrzej

    2016-09-01

    The article presents a diagnostics monitor for checking brake efficiency under various road conditions in cars equipped with a pressure sensor in the brake (ESP) system. At present, the brake efficiency of vehicles is estimated periodically under stand conditions on the basis of brake force measurements, or under road conditions on the basis of braking deceleration. The presented method allows the periodic stand tests of the brakes to be complemented by the current on-board diagnostics (OBD) system. The first part of the article presents the theoretical dependences between vehicle deceleration and brake pressure. The influence on deceleration of the vehicle mass, initial braking speed, brake temperature, aerodynamic drag, rolling resistance, engine resistance, state of the road surface and road slope angle is analysed, as is the manner of determining these parameters. The results of the initial investigation are presented. At the end of the article, the strategy for estimating and signalling irregular deceleration values is presented.
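
    The theoretical dependence described above can be caricatured as a force balance; every coefficient below is an invented placeholder for a vehicle-specific calibration:

    ```python
    # Loose sketch: m * a = k_b * p + F_aero + F_roll + m * g * sin(slope).
    import math

    def expected_deceleration(p_brake, v, m, slope_rad,
                              k_b=150.0,      # N per bar, assumed calibration
                              c_aero=0.40,    # 0.5*rho*Cd*A, assumed
                              f_roll=0.012):  # rolling resistance coefficient
        g = 9.81
        F = (k_b * p_brake                   # brake force from circuit pressure
             + c_aero * v ** 2               # aerodynamic drag
             + f_roll * m * g                # rolling resistance
             + m * g * math.sin(slope_rad))  # road slope (positive = uphill)
        return F / m                         # m/s^2

    print(expected_deceleration(p_brake=60.0, v=20.0, m=1500.0, slope_rad=0.0))
    ```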

  1. Using Work Breakdown Structure Models to Develop Unit Treatment Costs

    EPA Science Inventory

    This article presents a new cost modeling approach called work breakdown structure (WBS), designed to develop unit costs for drinking water technologies. WBS involves breaking the technology into its discrete components for the purposes of estimating unit costs. The article dem...

  2. A New Approach to Estimate the Age of the Earth and the Age of the Universe

    NASA Astrophysics Data System (ADS)

    Ben Salem, Kamel

    2011-01-01

    In a previous article, we proposed estimates for the age of the Universe and for the date of stabilization of its general structure on the basis of a given age of the Earth equal to 4.6 billion years. In the present article, we propose a new approach to estimate, more accurately and at the same time, the age of the Earth and that of the Universe, starting from verse 4 of Sura 70 of the Qur'an. The procedure we followed, which is detailed in this article, should in our view contribute to enlightening the debate on the question. We must add that our approach can in no case be considered as based on "concordism" or conjecture. Indeed, it rests on rigorous mathematical computations.

  3. Channel Temperature Estimates for Microwave AlGaN/GaN Power HEMTS on SiC and Sapphire

    NASA Technical Reports Server (NTRS)

    Freeman, Jon C.

    2003-01-01

    A simple technique to estimate the channel temperature of generic AlGaN/GaN HEMTs on SiC or sapphire, while incorporating the temperature dependence of the thermal conductivity, is presented. The procedure is validated by comparing its predictions with the experimentally measured temperatures in devices presented in three recently published articles.

  4. [Potentials in the regionalization of health indicators using small-area estimation methods : Exemplary results based on the 2009, 2010 and 2012 GEDA studies].

    PubMed

    Kroll, Lars Eric; Schumann, Maria; Müters, Stephan; Lampert, Thomas

    2017-12-01

    Nationwide health surveys can be used to estimate regional differences in health. Using traditional estimation techniques, the spatial depth of these estimates is limited due to the constrained sample size. So far - without special refreshment samples - results have only been available for the more populous federal states of Germany. An alternative is regression-based small-area estimation. These models can generate smaller-scale data but are also subject to greater statistical uncertainty because of the model assumptions. In the present article, exemplary regionalized estimates of the self-rated health status of respondents, based on the studies "Gesundheit in Deutschland aktuell" (GEDA studies) 2009, 2010 and 2012, are compared. The aim of the article is to analyze the range of regional estimates in order to assess the usefulness of these techniques for health reporting more adequately. The results show that the estimated prevalence is relatively stable when using different samples. Important determinants of the variation of the estimates are the achieved sample size at the district level and the type of district (cities vs. rural regions). Overall, the present study shows that small-area modeling of prevalence is associated with additional uncertainty compared to conventional estimates, which should be taken into account when interpreting the corresponding findings.

  5. Insights into Engineering Education Administration.

    ERIC Educational Resources Information Center

    American Society for Engineering Education, Washington, DC.

    Twelve articles that are designed to provide ideas to engineering department heads are presented. Articles and authors are as follows: "Estimating Undergraduate Student Capacity for an Engineering Department," (T. W. F. Russell, R. L. Daughtery, A. F. Graziano); "Financial Evaluation of Education Programs," (George DePuy and Ralph Swalm); "The…

  6. New robust statistical procedures for the polytomous logistic regression models.

    PubMed

    Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro

    2018-05-17

    This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real-life examples are presented to justify the requirement of suitable robust statistical procedures in place of likelihood-based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article is further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.

  7. A Comparison of Factor Score Estimation Methods in the Presence of Missing Data: Reliability and an Application to Nicotine Dependence

    ERIC Educational Resources Information Center

    Estabrook, Ryne; Neale, Michael

    2013-01-01

    Factor score estimation is a controversial topic in psychometrics, and the estimation of factor scores from exploratory factor models has historically received a great deal of attention. However, both confirmatory factor models and the existence of missing data have generally been ignored in this debate. This article presents a simulation study…

  8. Data-Rate Estimation for Autonomous Receiver Operation

    NASA Technical Reports Server (NTRS)

    Tkacenko, A.; Simon, M. K.

    2005-01-01

    In this article, we present a series of algorithms for estimating the data rate of a signal whose admissible data rates are integer base, integer powered multiples of a known basic data rate. These algorithms can be applied to the Electra radio currently used in the Deep Space Network (DSN), which employs data rates having the above relationship. The estimation is carried out in an autonomous setting in which very little a priori information is assumed. It is done by exploiting an elegant property of the split symbol moments estimator (SSME), which is traditionally used to estimate the signal-to-noise ratio (SNR) of the received signal. By quantizing the assumed symbol-timing error or jitter, we present an all-digital implementation of the SSME which can be used to jointly estimate the data rate, SNR, and jitter. Simulation results presented show that these joint estimation algorithms perform well, even in the low SNR regions typically encountered in the DSN.

  9. An introduction to Bayesian statistics in health psychology.

    PubMed

    Depaoli, Sarah; Rus, Holly M; Clifton, James P; van de Schoot, Rens; Tiemensma, Jitske

    2017-09-01

    The aim of the current article is to provide a brief introduction to Bayesian statistics within the field of health psychology. Bayesian methods are increasing in prevalence in applied fields, and they have been shown in simulation research to improve the estimation accuracy of structural equation models, latent growth curve (and mixture) models, and hierarchical linear models. Likewise, Bayesian methods can be used with small sample sizes since they do not rely on large sample theory. In this article, we discuss several important components of Bayesian statistics as they relate to health-based inquiries. We discuss the incorporation and impact of prior knowledge into the estimation process and the different components of the analysis that should be reported in an article. We present an example implementing Bayesian estimation in the context of blood pressure changes after participants experienced an acute stressor. We conclude with final thoughts on the implementation of Bayesian statistics in health psychology, including suggestions for reviewing Bayesian manuscripts and grant proposals. We have also included an extensive amount of online supplementary material to complement the content presented here, including Bayesian examples using many different software programmes and an extensive sensitivity analysis examining the impact of priors.
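
    A toy sketch of the prior-to-posterior step the authors discuss, using the conjugate normal-normal model rather than any particular software package; all numbers are invented:

    ```python
    # Posterior for a mean under a normal prior and normal likelihood
    # (known sampling variance) - the simplest case of Bayesian updating.
    def posterior_normal(prior_mean, prior_var, data_mean, data_var, n):
        precision = 1.0 / prior_var + n / data_var
        post_var = 1.0 / precision
        post_mean = post_var * (prior_mean / prior_var + n * data_mean / data_var)
        return post_mean, post_var

    # Prior belief: the stressor raises systolic BP by ~8 mmHg (sd 5).
    # Observed: mean change of 12 mmHg in n=25 participants, sd 10.
    print(posterior_normal(8.0, 25.0, 12.0, 100.0, 25))
    ```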

  10. [The reference pricing of pharmaceuticals in European countries].

    PubMed

    Gildeyeva, G N; Starykh, D A

    2013-01-01

    The article presents an analysis of various approaches to the estimation of pharmaceutical prices under existing systems of pharmaceutical support. Pricing is considered in relation to existing systems of pharmaceutical support based on the principles of insurance and co-financing. A detailed analysis of the methodology for estimating reference prices of pharmaceuticals in different European countries is presented. The experience of European countries in evaluating the interchangeability of pharmaceuticals is discussed.

  11. Estimating Green Net National Product for Puerto Rico: An Economic Measure of Sustainability (Journal article)

    EPA Science Inventory

    This paper presents the data sources and methodology used to estimate Green Net National Product (GNNP), an economic metric of sustainability, for Puerto Rico. Using the change in GNNP as a one-sided test of weak sustainability (i.e., positive growth in GNNP is not enough to show...

  12. National health expenditures, 1988

    PubMed Central

    1990-01-01

    Every year, analysts in the Health Care Financing Administration present figures on what our Nation spends for health. As the result of a comprehensive re-examination of the definitions, concepts, methods, and data sources used to prepare those figures, this year's report contains new estimates of national health expenditures for calendar years 1960 through 1988. Significant changes have been made to estimates of spending for professional services and to estimates of what consumers pay out of pocket for health care. In the first article, trends in use of and expenditure for various types of goods and services are discussed, as well as trends in the sources of funds used to finance health care. In a companion article, the benchmark process is described in more detail, as are the data sources and methods used to prepare annual estimates of health expenditures. PMID:10113395

  13. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

    In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state analysis based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided toward the quantitative estimation of material properties.

  14. Estimation of fecundability from survey data.

    PubMed

    Goldman, N; Westoff, C F; Paul, L E

    1985-01-01

    The estimation of fecundability from survey data is plagued by methodological problems such as misreporting of dates of birth and marriage and the occurrence of premarital exposure to the risk of conception. Nevertheless, estimates of fecundability from World Fertility Survey data for women married in recent years appear to be plausible for most of the surveys analyzed here and are quite consistent with estimates reported in earlier studies. The estimates presented in this article are all derived from the first interval, the interval between marriage or consensual union and the first live birth conception.

  15. Dataset on the cost estimation for spent filter backwash water (SFBW) treatment.

    PubMed

    Ebrahimi, Afshin; Mahdavi, Mokhtar; Pirsaheb, Meghdad; Alimohammadi, Fariborz; Mahvi, Amir Hossein

    2017-12-01

    The dataset presented in this article is related to the research article entitled "Hybrid coagulation-UF processes for spent filter backwash water treatment: a comparison studies for PAFCl and FeCl 3 as a pre-treatment" (Ebrahimi et al., 2017) [1]. This article reports the cost estimation for treating spent filter backwash water (SFBW) produced during water treatment in Isfahan, Iran, by various methods, including primary sedimentation, coagulation & flocculation, second clarification, ultrafiltration (UF) and recirculation of settled SFBW to the water treatment plant (WTP) entrance. Coagulation was conducted with PAFCl and FeCl3 as pre-polymerized and traditional coagulants, respectively. The cost estimation showed that, contrary to expectations, recirculation of settled SFBW to the WTP entrance is more expensive than the other methods, costing about $37,814,817.6. The cheapest option, separate primary sedimentation with coagulation & flocculation in the WTP, costs about $4,757,200 and $950,213 when FeCl3 and PAFCl are used as coagulant, respectively.

  16. Bigger is Better, but at What Cost? Estimating the Economic Value of Incremental Data Assets.

    PubMed

    Dalessandro, Brian; Perlich, Claudia; Raeder, Troy

    2014-06-01

    Many firms depend on third-party vendors to supply data for commercial predictive modeling applications. An issue that has received very little attention in the prior research literature is the estimation of a fair price for purchased data. In this work we present a methodology for estimating the economic value of adding incremental data to predictive modeling applications and present two case studies. The methodology starts with estimating the effect that incremental data has on model performance in terms of common classification evaluation metrics. This effect is then translated into economic units, which gives an expected economic value that the firm might realize with the acquisition of a particular data asset. With this estimate a firm can then set a data acquisition price that targets a particular return on investment. This article presents the methodology in full detail and illustrates it in the context of two marketing case studies.
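
    A minimal sketch of the value-to-price logic described above; the conversion from model lift to dollars is a stand-in for a firm-specific estimate:

    ```python
    # Translate a modeling metric gain into dollars, then back out a price
    # consistent with a target return on investment.
    def incremental_value(lift_in_conversions, value_per_conversion, volume):
        """Expected annual dollars from the performance gain."""
        return lift_in_conversions * value_per_conversion * volume

    def max_price_for_roi(value, target_roi):
        """Highest acquisition price consistent with a target ROI."""
        return value / (1.0 + target_roi)

    value = incremental_value(0.002, 40.0, 1_000_000)  # +0.2% conv., $40 each
    print(value, max_price_for_roi(value, target_roi=0.5))
    ```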

  17. L-statistics for Repeated Measurements Data With Application to Trimmed Means, Quantiles and Tolerance Intervals.

    PubMed

    Assaad, Houssein I; Choudhary, Pankaj K

    2013-01-01

    The L-statistics form an important class of estimators in nonparametric statistics. Its members include trimmed means and sample quantiles and functions thereof. This article is devoted to theory and applications of L-statistics for repeated measurements data, wherein the measurements on the same subject are dependent and the measurements from different subjects are independent. This article has three main goals: (a) Show that the L-statistics are asymptotically normal for repeated measurements data. (b) Present three statistical applications of this result, namely, location estimation using trimmed means, quantile estimation and construction of tolerance intervals. (c) Obtain a Bahadur representation for sample quantiles. These results are generalizations of similar results for independently and identically distributed data. The practical usefulness of these results is illustrated by analyzing a real data set involving measurement of systolic blood pressure. The properties of the proposed point and interval estimators are examined via simulation.
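
    Two of the L-statistics named above are quick to illustrate on a single sample (the repeated-measurements machinery of the article is not reproduced here):

    ```python
    # Trimmed mean and sample quantiles on illustrative systolic BP values.
    import numpy as np
    from scipy import stats

    x = np.array([118., 121., 125., 127., 130., 133., 139., 142., 171.])
    print(stats.trim_mean(x, proportiontocut=0.10))  # 10% trimmed mean
    print(np.quantile(x, [0.25, 0.5, 0.75]))         # sample quantiles
    ```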

  18. State Education Finance and Governance Profile: South Carolina

    ERIC Educational Resources Information Center

    Conrad, Zachary

    2010-01-01

    This article presents the state education finance and governance profile of South Carolina. In the state of South Carolina, the population in 1990 was estimated at 3,486,310, and as of July 2008 the population was estimated at 4,479,800. In terms of education funding, the K-12 education General Fund appropriation is $2,441,044,733 for Fiscal Year…

  19. On the Utility of National Datasets and Resource Cost Models for Estimating Faculty Instructional Costs in Higher Education

    ERIC Educational Resources Information Center

    Morphew, Christopher; Baker, Bruce

    2007-01-01

    In this article, the authors present the results of a research study in which they used two national datasets to construct and examine a model that estimates relative faculty instructional costs for specific undergraduate degree programs and also identifies differences in these costs by region and institutional type. They conducted this research…

  20. A Bayesian Model for the Estimation of Latent Interaction and Quadratic Effects When Latent Variables Are Non-Normally Distributed

    ERIC Educational Resources Information Center

    Kelava, Augustin; Nagengast, Benjamin

    2012-01-01

    Structural equation models with interaction and quadratic effects have become a standard tool for testing nonlinear hypotheses in the social sciences. Most of the current approaches assume normally distributed latent predictor variables. In this article, we present a Bayesian model for the estimation of latent nonlinear effects when the latent…

  1. State-Space Modeling of Dynamic Psychological Processes via the Kalman Smoother Algorithm: Rationale, Finite Sample Properties, and Applications

    ERIC Educational Resources Information Center

    Song, Hairong; Ferrer, Emilio

    2009-01-01

    This article presents a state-space modeling (SSM) technique for fitting process factor analysis models directly to raw data. The Kalman smoother via the expectation-maximization algorithm to obtain maximum likelihood parameter estimates is used. To examine the finite sample properties of the estimates in SSM when common factors are involved, a…
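
    The filtering recursions underlying the Kalman smoother can be sketched in scalar form; the smoother and EM steps of the SSM approach build on exactly these predict/update equations. Parameters and data below are synthetic:

    ```python
    # Bare-bones Kalman filter forward pass for a scalar latent process.
    import numpy as np

    def kalman_filter(y, A, C, Q, R, x0, P0):
        x, P, xs = x0, P0, []
        for yt in y:
            # predict
            x, P = A * x, A * P * A + Q
            # update
            K = P * C / (C * P * C + R)                  # Kalman gain
            x, P = x + K * (yt - C * x), (1 - K * C) * P
            xs.append(x)
        return np.array(xs)

    rng = np.random.default_rng(0)
    truth = np.cumsum(rng.normal(0, 0.1, 100))           # latent random walk
    y = truth + rng.normal(0, 0.5, 100)                  # noisy indicator
    print(kalman_filter(y, A=1.0, C=1.0, Q=0.01, R=0.25, x0=0.0, P0=1.0)[-5:])
    ```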

  2. Passing Decisions in Football: Introducing an Empirical Approach to Estimating the Effects of Perceptual Information and Associative Knowledge.

    PubMed

    Steiner, Silvan

    2018-01-01

    The importance of various information sources in decision-making in interactive team sports is debated. While some highlight the role of the perceptual information provided by the current game context, others point to the role of knowledge-based information that athletes have regarding their team environment. Recently, an integrative perspective considering the simultaneous involvement of both of these information sources in decision-making in interactive team sports has been presented. In a theoretical example concerning passing decisions, the simultaneous involvement of perceptual and knowledge-based information has been illustrated. However, no precast method of determining the contribution of these two information sources empirically has been provided. The aim of this article is to bridge this gap and present a statistical approach to estimating the effects of perceptual information and associative knowledge on passing decisions. To this end, a sample dataset of scenario-based passing decisions is analyzed. This article shows how the effects of perceivable team positionings and athletes' knowledge about their fellow team members on passing decisions can be estimated. Ways of transferring this approach to real-world situations and implications for future research using more representative designs are presented.

  3. Passing Decisions in Football: Introducing an Empirical Approach to Estimating the Effects of Perceptual Information and Associative Knowledge

    PubMed Central

    Steiner, Silvan

    2018-01-01

    The importance of various information sources in decision-making in interactive team sports is debated. While some highlight the role of the perceptual information provided by the current game context, others point to the role of knowledge-based information that athletes have regarding their team environment. Recently, an integrative perspective considering the simultaneous involvement of both of these information sources in decision-making in interactive team sports has been presented. In a theoretical example concerning passing decisions, the simultaneous involvement of perceptual and knowledge-based information has been illustrated. However, no precast method of determining the contribution of these two information sources empirically has been provided. The aim of this article is to bridge this gap and present a statistical approach to estimating the effects of perceptual information and associative knowledge on passing decisions. To this end, a sample dataset of scenario-based passing decisions is analyzed. This article shows how the effects of perceivable team positionings and athletes' knowledge about their fellow team members on passing decisions can be estimated. Ways of transferring this approach to real-world situations and implications for future research using more representative designs are presented. PMID:29623057
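
    A hypothetical sketch of the kind of model the approach implies: a logistic regression of pass choices on one perceptual and one knowledge-based predictor. Variable names, effect sizes and data are invented:

    ```python
    # Probability of choosing a pass target as a function of a perceptual
    # variable (how open the teammate is) and a knowledge variable (ability).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    openness = rng.uniform(0, 1, 200)      # perceptual information
    ability = rng.uniform(0, 1, 200)       # associative knowledge
    logit = -2.0 + 2.5 * openness + 1.5 * ability
    passed = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    model = LogisticRegression().fit(np.column_stack([openness, ability]), passed)
    print(model.coef_, model.intercept_)   # relative weight of the two sources
    ```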

  4. Writing a review article - Are you making these mistakes?

    PubMed

    Daldrup-Link, Heike E

    2018-01-01

    An explosion of scientific publications over the last decades has increased the need for review articles: Carefully crafted scientific review articles can provide the novice reader with an overview of a new subject and provide the expert with a synthesis of scientific evidence, proof of reproducibility of published data and pooled estimates of common truth through meta-analyses. Unfortunately, while there are ample presentations and published guidelines for the preparation of scientific articles available, detailed information about how to properly prepare scientific review articles is relatively scarce. This perspective summarizes possible mistakes that can lead to misinformation in scientific review articles with the goal to help authors to improve the scientific contribution of their review article and thereby, increase the respective value of these articles for the scientific community.

  5. Private health insurance: New measures of a complex and changing industry

    PubMed Central

    Arnett, Ross H.; Trapnell, Gordon R.

    1984-01-01

    Private health insurance benefit payments are an integral component of estimates of national health expenditures. Recent analyses indicate that the insurance industry has undergone significant changes since the mid-1970's. As a result of these study findings and corresponding changes to estimating techniques, private health insurance estimates have been revised upward. This has had a major impact on national health expenditure estimates. This article describes the changes that have occurred in the industry, discusses some of the implications of those changes, presents a new methodology to measure private health insurance and the resulting estimate levels, and then examines concepts that underpin these estimates. PMID:10310950

  6. Complementary filter implementation in the dynamic language Lua

    NASA Astrophysics Data System (ADS)

    Sadowski, Damian; Sawicki, Aleksander; Lukšys, Donatas; Slanina, Zdenek

    2017-08-01

    The article presents a complementary filter implementation, used for the estimation of the pitch angle, in the Lua scripting language. Inertial sensors, an accelerometer and a gyroscope, were used in the study. Methods of estimating angles from acceleration and angular velocity sensors are presented in the theoretical part of the article, along with the operating principle of the complementary filter. A prototype Butterworth analogue filter and its digital equivalent were designed. The practical implementation was performed with the use of a PC and a DISCOVERY evaluation board equipped with an STM32F01 processor, an L3GD20 gyroscope and an LS303DLHC accelerometer. Measurement data were transmitted over a UART serial interface and then processed with the use of Lua software and the luaRS232 programming library. The practical implementation was divided into two stages. In the first part, measurement data were recorded and then processed with the help of a complementary filter. In the second step, the coroutines mechanism was used to filter data in real time.
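
    The fusion step at the heart of the pipeline is a one-line filter; a minimal sketch for pitch (the gain, rates and samples are assumptions, not the article's settings):

    ```python
    # Complementary filter for pitch: low-pass the accelerometer angle,
    # high-pass the integrated gyro rate, and blend with gain alpha.
    import math

    def pitch_from_accel(ax, ay, az):
        return math.atan2(-ax, math.sqrt(ay * ay + az * az))

    def complementary_step(pitch, gyro_rate, ax, ay, az, dt, alpha=0.98):
        return (alpha * (pitch + gyro_rate * dt)
                + (1.0 - alpha) * pitch_from_accel(ax, ay, az))

    pitch = 0.0
    for ax, ay, az, g in [(0.0, 0.0, 1.0, 0.02)] * 50:   # fake samples, 100 Hz
        pitch = complementary_step(pitch, g, ax, ay, az, dt=0.01)
    print(pitch)
    ```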

  7. Verification bias an underrecognized source of error in assessing the efficacy of medical imaging.

    PubMed

    Petscavage, Jonelle M; Richardson, Michael L; Carr, Robert B

    2011-03-01

    Diagnostic tests are validated by comparison against a "gold standard" reference test. When the reference test is invasive or expensive, it may not be applied to all patients. This can result in biased estimates of the sensitivity and specificity of the diagnostic test. This type of bias is called "verification bias," and is a common problem in imaging research. The purpose of our study is to estimate the prevalence of verification bias in the recent radiology literature. All issues of the American Journal of Roentgenology (AJR), Academic Radiology, Radiology, and European Journal of Radiology (EJR) between November 2006 and October 2009 were reviewed for original research articles mentioning sensitivity or specificity as endpoints. Articles were read to determine whether verification bias was present and searched for author recognition of verification bias in the design. During 3 years, these journals published 2969 original research articles. A total of 776 articles used sensitivity or specificity as an outcome. Of these, 211 articles demonstrated potential verification bias. The fraction of articles with potential bias was respectively 36.4%, 23.4%, 29.5%, and 13.4% for AJR, Academic Radiology, Radiology, and EJR. The total fraction of papers with potential bias in which the authors acknowledged this bias was 17.1%. Verification bias is a common and frequently unacknowledged source of error in efficacy studies of diagnostic imaging. Bias can often be eliminated by proper study design. When it cannot be eliminated, it should be estimated and acknowledged. Published by Elsevier Inc.

  8. Exergetic analysis of autonomous power complex for drilling rig

    NASA Astrophysics Data System (ADS)

    Lebedev, V. A.; Karabuta, V. S.

    2017-10-01

    The article considers the issue of increasing the energy efficiency of the power equipment of a drilling rig. At present, diverse types of power plants are used in power supply systems. When designing and choosing a power plant, one of the main criteria is its energy efficiency, the main indicator being the effective efficiency factor calculated by the method of thermal balances. In this article, it is suggested to use the exergy method to determine energy efficiency, which makes it possible to estimate the degree of thermodynamic perfection of the system, illustrated by the example of a gas turbine plant: a relative estimate (the exergetic efficiency factor) and an absolute estimate. An exergetic analysis of a gas turbine plant operating in a simple scheme was carried out using the program WaterSteamPro. Exergy losses in the equipment elements are calculated.
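
    The exergy bookkeeping reduces to the flow-exergy definition e = (h − h0) − T0(s − s0) and a ratio; a minimal sketch in which the property values would come from tables such as WaterSteamPro (the numbers below are placeholders):

    ```python
    # Specific flow exergy relative to the dead state, and the exergetic
    # efficiency as the ratio of useful exergy out to exergy in.
    def flow_exergy(h, s, h0, s0, T0):
        """Flow exergy (kJ/kg) from enthalpy h, entropy s and dead state."""
        return (h - h0) - T0 * (s - s0)

    def exergetic_efficiency(exergy_out, exergy_in):
        return exergy_out / exergy_in

    e_in = flow_exergy(h=3200.0, s=6.8, h0=104.9, s0=0.367, T0=298.15)
    e_out = flow_exergy(h=2500.0, s=7.2, h0=104.9, s0=0.367, T0=298.15)
    print(e_in, e_out, exergetic_efficiency(e_out, e_in))
    ```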

  9. Structural nested mean models for assessing time-varying effect moderation.

    PubMed

    Almirall, Daniel; Ten Have, Thomas; Murphy, Susan A

    2010-03-01

    This article considers the problem of assessing causal effect moderation in longitudinal settings in which treatment (or exposure) is time varying and so are the covariates said to moderate its effect. Intermediate causal effects that describe time-varying causal effects of treatment conditional on past covariate history are introduced and considered as part of Robins' structural nested mean model. Two estimators of the intermediate causal effects, and their standard errors, are presented and discussed: The first is a proposed two-stage regression estimator. The second is Robins' G-estimator. The results of a small simulation study that begins to shed light on the small versus large sample performance of the estimators, and on the bias-variance trade-off between the two estimators are presented. The methodology is illustrated using longitudinal data from a depression study.

  10. A Bayesian Approach to a Multiple-Group Latent Class-Profile Analysis: The Timing of Drinking Onset and Subsequent Drinking Behaviors among U.S. Adolescents

    ERIC Educational Resources Information Center

    Chung, Hwan; Anthony, James C.

    2013-01-01

    This article presents a multiple-group latent class-profile analysis (LCPA) by taking a Bayesian approach in which a Markov chain Monte Carlo simulation is employed to achieve more robust estimates for latent growth patterns. This article describes and addresses a label-switching problem that involves the LCPA likelihood function, which has…

  11. Reliability and Maintainability Data for Lead Lithium Cooling Systems

    DOE PAGES

    Cadwallader, Lee

    2016-11-16

    This article presents component failure rate data for use in assessment of lead lithium cooling systems. Best estimate data applicable to this liquid metal coolant is presented. Repair times for similar components are also referenced in this work. These data support probabilistic safety assessment and reliability, availability, maintainability and inspectability analyses.

  12. Pollution abatement and control expenditures

    DOT National Transportation Integrated Search

    1996-09-01

    BEA's pollution abatement and control (PAC) program is being discontinued. The estimates presented in this article are the last of the annual series. BEA is reallocating resources away from some existing programs in order to move ahead with the mos...

  13. Estimating the lifetime risk of cancer associated with multiple CT scans.

    PubMed

    Ivanov, V K; Kashcheev, V V; Chekin, S Yu; Menyaylo, A N; Pryakhin, E A; Tsyb, A F; Mettler, F A

    2014-12-01

    Multiple CT scans are often done on the same patient, resulting in an increased risk of cancer. Prior publications have estimated risks on a population basis, often using effective dose. Simply adding up the risks from single scans does not correctly account for the survival function. A methodology for estimating personal radiation risks attributable to multiple CT imaging, using organ doses, is presented in this article. The estimated magnitude of the attributable risk fraction for the possible development of radiation-induced cancer indicates the necessity for strong clinical justification when ordering multiple CT scans.
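
    The survival-function point can be illustrated with a deliberately simplified calculation: treating per-scan risks as probabilities and multiplying escape probabilities keeps the cumulative estimate below naive addition. This is an illustration, not the article's methodology:

    ```python
    # Naive addition vs probabilistic combination of per-scan risks.
    risks = [0.001] * 30            # assumed lifetime risk per CT scan

    naive = sum(risks)              # simple addition
    combined = 1.0
    for r in risks:
        combined *= (1.0 - r)       # probability of escaping every exposure
    combined = 1.0 - combined

    print(naive, combined)          # 0.030 vs ~0.0296
    ```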

  14. A log-linear model approach to estimation of population size using the line-transect sampling method

    USGS Publications Warehouse

    Anderson, D.R.; Burnham, K.P.; Crain, B.R.

    1978-01-01

    The technique of estimating wildlife population size and density using the belt or line-transect sampling method has been used in many past projects, such as the estimation of density of waterfowl nestling sites in marshes, and is being used currently in such areas as the assessment of Pacific porpoise stocks in regions of tuna fishing activity. A mathematical framework for line-transect methodology has only emerged in the last 5 yr. In the present article, we extend this mathematical framework to a line-transect estimator based upon a log-linear model approach.

  15. Upper Grades Ideas.

    ERIC Educational Resources Information Center

    Thornburg, David; Beane, Pam

    1983-01-01

    Presents programming ideas using LOGO, an activity for converting a flowchart into a computer program, and a Pascal program for generating music using paddles. Includes the article "Helping Computers Adapt to Kids" by Philip Nothnagle; a program for estimating the length of lines is included. (JN)

  16. Reliability and safety, and the risk of construction damage in mining areas

    NASA Astrophysics Data System (ADS)

    Skrzypczak, Izabela; Kogut, Janusz P.; Kokoszka, Wanda; Oleniacz, Grzegorz

    2018-04-01

    This article concerns the reliability and safety of building structures in mining areas, with a particular emphasis on the quantitative risk analysis of buildings. The issues of threat assessment and risk estimation, in the design of facilities in mining exploitation areas, are presented here, indicating the difficulties and ambiguities associated with their quantification and quantitative analysis. This article presents the concept of quantitative risk assessment of the impact of mining exploitation, in accordance with ISO 13824 [1]. The risk analysis is illustrated through an example of a construction located within an area affected by mining exploitation.

  17. Body Weight Estimation for Dose-Finding and Health Monitoring of Lying, Standing and Walking Patients Based on RGB-D Data

    PubMed Central

    May, Stefan

    2018-01-01

    This paper describes the estimation of the body weight of a person in front of an RGB-D camera. A survey of different methods for body weight estimation based on depth sensors is given. First, an estimation of people standing in front of a camera is presented. Second, an approach based on a stream of depth images is used to obtain the body weight of a person walking towards a sensor. The algorithm first extracts features from a point cloud and forwards them to an artificial neural network (ANN) to obtain an estimation of body weight. Besides the algorithm for the estimation, this paper further presents an open-access dataset based on measurements from a trauma room in a hospital as well as data from visitors of a public event. In total, the dataset contains 439 measurements. The article illustrates the efficiency of the approach with experiments with persons lying down in a hospital, standing persons, and walking persons. Applicable scenarios for the presented algorithm are body weight-related dosing of emergency patients. PMID:29695098

  18. Body Weight Estimation for Dose-Finding and Health Monitoring of Lying, Standing and Walking Patients Based on RGB-D Data.

    PubMed

    Pfitzner, Christian; May, Stefan; Nüchter, Andreas

    2018-04-24

    This paper describes the estimation of the body weight of a person in front of an RGB-D camera. A survey of different methods for body weight estimation based on depth sensors is given. First, an estimation of people standing in front of a camera is presented. Second, an approach based on a stream of depth images is used to obtain the body weight of a person walking towards a sensor. The algorithm first extracts features from a point cloud and forwards them to an artificial neural network (ANN) to obtain an estimation of body weight. Besides the algorithm for the estimation, this paper further presents an open-access dataset based on measurements from a trauma room in a hospital as well as data from visitors of a public event. In total, the dataset contains 439 measurements. The article illustrates the efficiency of the approach with experiments with persons lying down in a hospital, standing persons, and walking persons. Applicable scenarios for the presented algorithm are body weight-related dosing of emergency patients.
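
    A hypothetical miniature of the pipeline described in both records: geometric features standing in for the point-cloud features, feeding a small neural network regressor. Features, target relation and data are invented:

    ```python
    # Toy feature-to-weight regressor standing in for the point-cloud + ANN
    # pipeline; all values are synthetic.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(42)
    n = 400
    volume = rng.uniform(0.05, 0.12, n)        # estimated body volume (m^3)
    height = rng.uniform(1.5, 2.0, n)          # stature from the depth image (m)
    surface = rng.uniform(1.4, 2.4, n)         # surface area proxy (m^2)
    weight = 1020 * volume + 5 * height + rng.normal(0, 2.0, n)  # fake target

    X = np.column_stack([volume, height, surface])
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                         random_state=0).fit(X, weight)
    print(model.predict(X[:3]), weight[:3])
    ```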

  19. Can Reliability of Multiple Component Measuring Instruments Depend on Response Option Presentation Mode?

    ERIC Educational Resources Information Center

    Menold, Natalja; Raykov, Tenko

    2016-01-01

    This article examines the possible dependency of composite reliability on presentation format of the elements of a multi-item measuring instrument. Using empirical data and a recent method for interval estimation of group differences in reliability, we demonstrate that the reliability of an instrument need not be the same when polarity of the…

  20. Testing for Two-Way Interactions in the Multigroup Common Factor Model

    ERIC Educational Resources Information Center

    van Smeden, Maarten; Hessen, David J.

    2013-01-01

    In this article, a 2-way multigroup common factor model (MG-CFM) is presented. The MG-CFM can be used to estimate interaction effects between 2 grouping variables on 1 or more hypothesized latent variables. For testing the significance of such interactions, a likelihood ratio test is presented. In a simulation study, the robustness of the…

  1. Neural Correlates of Moral Sensitivity and Moral Judgment Associated with Brain Circuitries of Selfhood: A Meta-Analysis

    ERIC Educational Resources Information Center

    Han, Hyemin

    2017-01-01

    The present study meta-analyzed 45 experiments with 959 subjects and 463 activation foci reported in 43 published articles that investigated the neural mechanism of moral functions by comparing neural activity between the moral task conditions and non-moral task conditions with the Activation Likelihood Estimation method. The present study…

  2. Performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data.

    PubMed

    Yelland, Lisa N; Salter, Amy B; Ryan, Philip

    2011-10-15

    Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
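
    A sketch of the approach evaluated above, using statsmodels' GEE with a Poisson family and an exchangeable working correlation; the dataset and effect size are invented:

    ```python
    # Modified Poisson regression with GEE for clustered binary outcomes.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_clusters, m = 100, 4
    groups = np.repeat(np.arange(n_clusters), m)
    treat = rng.binomial(1, 0.5, n_clusters)[groups]      # cluster-level exposure
    y = rng.binomial(1, np.where(treat == 1, 0.30, 0.20)) # binary outcome

    X = sm.add_constant(treat)
    model = sm.GEE(y, X, groups=groups, family=sm.families.Poisson(),
                   cov_struct=sm.cov_struct.Exchangeable())
    res = model.fit()
    print(np.exp(res.params[1]))   # relative risk estimate (~1.5)
    ```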

  3. Using the Ridge Regression Procedures to Estimate the Multiple Linear Regression Coefficients

    NASA Astrophysics Data System (ADS)

    Gorgees, Hazim Mansoor; Mahdi, Fatimah Assim

    2018-05-01

    This article is concerned with comparing the performance of different types of ordinary ridge regression estimators that have been proposed for estimating the regression parameters when near-exact linear relationships among the explanatory variables are present. For this situation we employ data obtained from the tagi gas filling company during the period 2008-2010. The main result is that the method based on the condition number performs better than the other stated methods, since it has a smaller mean square error (MSE).
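
    The ordinary ridge estimator being compared has a closed form, β̂(k) = (XᵀX + kI)⁻¹Xᵀy; a minimal sketch on deliberately collinear data (the choice k = 1 is arbitrary, not the condition-number rule):

    ```python
    # Ordinary ridge regression: beta_hat(k) = (X'X + kI)^(-1) X'y.
    import numpy as np

    def ridge(X, y, k):
        p = X.shape[1]
        return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

    rng = np.random.default_rng(3)
    x1 = rng.normal(size=50)
    x2 = x1 + rng.normal(scale=0.01, size=50)   # near-exact collinearity
    X = np.column_stack([x1, x2])
    y = x1 + rng.normal(scale=0.1, size=50)
    print(ridge(X, y, k=0.0))    # unstable OLS-like solution
    print(ridge(X, y, k=1.0))    # shrunken, stabilized coefficients
    ```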

  4. A framework for analyzing the economic tradeoffs between urban commerce and security against terrorism.

    PubMed

    Rose, Adam; Avetisyan, Misak; Chatterjee, Samrat

    2014-08-01

    This article presents a framework for economic consequence analysis of terrorism countermeasures. It specifies major categories of direct and indirect costs, benefits, spillover effects, and transfer payments that must be estimated in a comprehensive assessment. It develops a spreadsheet tool for data collection, storage, and refinement, as well as estimation of the various components of the necessary economic accounts. It also illustrates the usefulness of the framework in the first assessment of the tradeoffs between enhanced security and changes in commercial activity in an urban area, with explicit attention to the role of spillover effects. The article also contributes a practical user interface to the model for emergency managers. © 2014 Society for Risk Analysis.

  5. Addendum to the article: Misuse of null hypothesis significance testing: Would estimation of positive and negative predictive values improve certainty of chemical risk assessment?

    PubMed

    Bundschuh, Mirco; Newman, Michael C; Zubrod, Jochen P; Seitz, Frank; Rosenfeldt, Ricki R; Schulz, Ralf

    2015-03-01

    We argued recently that the positive predictive value (PPV) and the negative predictive value (NPV) are valuable metrics to include during null hypothesis significance testing: They inform the researcher about the probability of statistically significant and non-significant test outcomes actually being true. Although commonly misunderstood, a reported p value estimates only the probability of obtaining the results, or more extreme results, if the null hypothesis of no effect were true. Calculations of the more informative PPV and NPV require an a priori estimate of the probability (R). The present document discusses challenges of estimating R.
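
    A sketch of the PPV/NPV calculation the authors advocate, given α, the type II error β, and a prior probability π that the effect is real (π = R/(1+R) when R is given as prior odds):

    ```python
    # PPV: probability a significant result is true; NPV: probability a
    # non-significant result is truly null.
    def ppv_npv(alpha, beta, pi):
        power = 1.0 - beta
        ppv = power * pi / (power * pi + alpha * (1.0 - pi))
        npv = ((1.0 - alpha) * (1.0 - pi)
               / ((1.0 - alpha) * (1.0 - pi) + beta * pi))
        return ppv, npv

    print(ppv_npv(alpha=0.05, beta=0.20, pi=0.25))  # PPV ~0.84, NPV ~0.93
    ```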

  6. Self-Centered Management Skills and Knowledge Appropriation by Students in High Schools and Private Secondary Schools of the City of Maroua

    ERIC Educational Resources Information Center

    Oyono, Tadjuidje Michel

    2016-01-01

    Knowledge, in its process of appropriation, requires the learner to mobilize an efficient management strategy of adapted competencies. In its problematic, the present article presents the theoretical perspective of Desaunay (1985), who estimates that three fundamental competences (relational, technical and affective) have…

  7. Forest Plots in Excel: Moving beyond a Clump of Trees to a Forest of Visual Information

    ERIC Educational Resources Information Center

    Derzon, James H.; Alford, Aaron A.

    2013-01-01

    Forest plots provide an effective means of presenting a wealth of information in a single graphic. Whether used to illustrate multiple results in a single study or the cumulative knowledge of an entire field, forest plots have become an accepted and generally understood way of presenting many estimates simultaneously. This article explores…
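
    Although the article works in Excel, the same layout is easy to sketch in Python for readers outside that toolchain; the studies and intervals below are invented for illustration.

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical study-level odds ratios with 95% CIs
studies = ["Study A", "Study B", "Study C", "Pooled"]
or_est  = np.array([1.8, 1.2, 2.4, 1.6])
ci_lo   = np.array([1.1, 0.8, 1.3, 1.2])
ci_hi   = np.array([2.9, 1.8, 4.4, 2.1])

y = np.arange(len(studies))[::-1]
err = np.vstack([or_est - ci_lo, ci_hi - or_est])
plt.errorbar(or_est, y, xerr=err, fmt="s", color="k", capsize=3)
plt.axvline(1.0, linestyle="--", color="grey")   # line of no effect
plt.yticks(y, studies)
plt.xscale("log")
plt.xlabel("Odds ratio (log scale)")
plt.tight_layout()
plt.show()
```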

  8. Asynchronous variational integration using continuous assumed gradient elements.

    PubMed

    Wolff, Sebastian; Bucher, Christian

    2013-03-01

    Asynchronous variational integration (AVI) is a tool which improves the numerical efficiency of explicit time stepping schemes when applied to finite element meshes with local spatial refinement. This is achieved by associating an individual time step length to each spatial domain. Furthermore, long-term stability is ensured by its variational structure. This article presents AVI in the context of finite elements based on a weakened weak form (W2) Liu (2009) [1], exemplified by continuous assumed gradient elements Wolff and Bucher (2011) [2]. The article presents the main ideas of the modified AVI, gives implementation notes and a recipe for estimating the critical time step.

  9. Proceedings of the 1997 Northeastern Recreation Research Symposium

    Treesearch

    Hans G. Vogelsong, Editor

    1998-01-01

    Contains articles presented at the 1997 Northeastern Recreation Research Symposium. Contents cover recreation; protected areas and social science; water based recreation management studies; forest recreation management studies; outdoor recreation management studies; estimation of economic impact of recreation and tourism; place meaning and attachment; tourism studies;...

  10. Muzea jako przedmiot zainteresowania turystyki kulturowej na przykładzie województwa łódzkiego

    NASA Astrophysics Data System (ADS)

    Krakowiak, Beata

    2009-01-01

    The article presents the tourist potential of the museums located in the Łódzkie voivodeship. The author explains what attracts visitors to these institutions, characterizes the expectations tourists bring to the museums analyzed, and assesses them.

  11. Adapting the M3 Surveillance Metrics for an Unknown Baseline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamada, Michael Scott; Abes, Jeff I.; Jaramillo, Brandon Michael Lee

    The original M3 surveillance metrics assume that the baseline is known. In this article, adapted M3 metrics are presented for the case in which the baseline is unknown and must be estimated from available data. How much available data is enough is also discussed.

  12. A Bayesian approach to estimating variance components within a multivariate generalizability theory framework.

    PubMed

    Jiang, Zhehan; Skorupski, William

    2017-12-12

    In many behavioral research areas, multivariate generalizability theory (mG theory) has been typically used to investigate the reliability of certain multidimensional assessments. However, traditional mG-theory estimation (namely, using frequentist approaches) has limits, leading researchers to fail to take full advantage of the information that mG theory can offer regarding the reliability of measurements. Alternatively, Bayesian methods provide more information than frequentist approaches can offer. This article presents instructional guidelines on how to implement mG-theory analyses in a Bayesian framework; in particular, BUGS code is presented to fit commonly seen designs from mG theory, including single-facet designs, two-facet crossed designs, and two-facet nested designs. In addition to concrete examples that are closely related to the selected designs and the corresponding BUGS code, a simulated dataset is provided to demonstrate the utility and advantages of the Bayesian approach. This article is intended to serve as a tutorial reference for applied researchers and methodologists conducting mG-theory studies.
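
    For orientation, a minimal frequentist sketch of the single-facet crossed p x i design that the article's BUGS code addresses: variance components are estimated from expected mean squares, and the generalizability coefficient follows. The Bayesian analysis in the article replaces these point estimates with full posteriors.

```python
import numpy as np

rng = np.random.default_rng(2)
n_p, n_i = 50, 8                      # persons x items, fully crossed
sp, si, se = 1.0, 0.3, 0.6            # true SDs of the three components
Y = (rng.normal(0, sp, (n_p, 1)) + rng.normal(0, si, (1, n_i))
     + rng.normal(0, se, (n_p, n_i)))

grand = Y.mean()
ms_p = n_i * np.sum((Y.mean(axis=1) - grand) ** 2) / (n_p - 1)
ms_i = n_p * np.sum((Y.mean(axis=0) - grand) ** 2) / (n_i - 1)
resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0, keepdims=True) + grand
ms_res = np.sum(resid ** 2) / ((n_p - 1) * (n_i - 1))

var_pi = ms_res                       # person-by-item (+ error) component
var_p = (ms_p - ms_res) / n_i         # person component
var_i = (ms_i - ms_res) / n_p         # item component
g_coef = var_p / (var_p + var_pi / n_i)   # generalizability coefficient
print(var_p, var_i, var_pi, g_coef)
```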

  13. Optimization and experimental validation of a thermal cycle that maximizes entropy coefficient fisher identifiability for lithium iron phosphate cells

    NASA Astrophysics Data System (ADS)

    Mendoza, Sergio; Rothenberger, Michael; Hake, Alison; Fathy, Hosam

    2016-03-01

    This article presents a framework for optimizing the thermal cycle to estimate a battery cell's entropy coefficient at 20% state of charge (SOC). Our goal is to maximize Fisher identifiability: a measure of the accuracy with which a parameter can be estimated. Existing protocols in the literature for estimating entropy coefficients demand excessive laboratory time. Identifiability optimization makes it possible to achieve comparable accuracy levels in a fraction of the time. This article demonstrates this result for a set of lithium iron phosphate (LFP) cells. We conduct a 24-h experiment to obtain benchmark measurements of their entropy coefficients. We optimize a thermal cycle to maximize parameter identifiability for these cells. This optimization proceeds with respect to the coefficients of a Fourier discretization of this thermal cycle. Finally, we compare the estimated parameters using (i) the benchmark test, (ii) the optimized protocol, and (iii) a 15-h test from the literature (by Forgez et al.). The results are encouraging for two reasons. First, they confirm the simulation-based prediction that the optimized experiment can produce accurate parameter estimates in 2 h, compared to 15-24 h. Second, the optimized experiment also estimates a thermal time constant representing the effects of thermal capacitance and convection heat transfer.
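
    A toy illustration of the identifiability idea, assuming a simplified observation model (not the authors' cell model) in which terminal voltage drifts linearly with temperature through the entropy coefficient c: V_t = V0 + c(T_t - T_ref) + noise. The Fisher information for c then rewards thermal cycles with large temperature excursions, which is why an optimized cycle can match a long benchmark in far less time.

```python
import numpy as np

def fisher_info(T, T_ref=25.0, sigma_v=1e-4):
    """Fisher information for c in the toy model
    V_t = V0 + c*(T_t - T_ref) + N(0, sigma_v^2), V0 known."""
    s = T - T_ref
    return np.sum(s ** 2) / sigma_v ** 2

t = np.linspace(0, 2.0, 200)                   # a 2-hour cycle, in hours
slow_ramp = 25 + 5 * t / 2                     # gentle ramp, 25 -> 30 C
big_swing = 25 + 15 * np.sin(2 * np.pi * t)    # wide Fourier-type swing
for name, T in [("slow ramp", slow_ramp), ("big swing", big_swing)]:
    I = fisher_info(T)
    # Cramer-Rao lower bound: best achievable 1-sigma error on c
    print(name, "info:", I, " 1-sigma bound:", 1 / np.sqrt(I), "V/K")
```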

  14. Reliability reporting across studies using the Buss Durkee Hostility Inventory.

    PubMed

    Vassar, Matt; Hale, William

    2009-01-01

    Empirical research on anger and hostility has pervaded the academic literature for more than 50 years. Accurate measurement of anger/hostility and subsequent interpretation of results requires that the instruments yield strong psychometric properties. For consistent measurement, reliability estimates must be calculated with each administration, because changes in sample characteristics may alter the scale's ability to generate reliable scores. Therefore, the present study was designed to address reliability reporting practices for a widely used anger assessment, the Buss Durkee Hostility Inventory (BDHI). Of the 250 published articles reviewed, 11.2% calculated and presented reliability estimates for the data at hand, 6.8% cited estimates from a previous study, and 77.1% made no mention of score reliability. Mean alpha estimates of scores for BDHI subscales generally fell below acceptable standards. Additionally, no detectable pattern was found between reporting practices and publication year or journal prestige. Areas for future research are also discussed.
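
    A minimal sketch of the per-administration reliability calculation at issue (coefficient alpha), run here on simulated item responses rather than BDHI data:

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha for an items matrix X (rows = respondents,
    columns = items); the statistic that should be re-estimated on
    every administration of a scale."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(3)
true_score = rng.normal(size=(200, 1))
items = true_score + rng.normal(scale=1.0, size=(200, 10))
print(cronbach_alpha(items))   # ~0.9 for these simulated data
```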

  15. Vehicle Lateral State Estimation Based on Measured Tyre Forces

    PubMed Central

    Tuononen, Ari J.

    2009-01-01

    Future active safety systems need more accurate information about the state of vehicles. This article proposes a method to evaluate the lateral state of a vehicle based on measured tyre forces. The tyre forces of two tyres are estimated from optically measured tyre carcass deflections and transmitted wirelessly to the vehicle body. The two remaining tyres are so-called virtual tyre sensors, the forces of which are calculated from the real tyre sensor estimates. The Kalman filter estimator for lateral vehicle state based on measured tyre forces is presented, together with a simple method to define adaptive measurement error covariance depending on the driving condition of the vehicle. The estimated yaw rate and lateral velocity are compared with the validation sensor measurements. PMID:22291535
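
    A heavily simplified sketch of this estimator type, assuming a planar bicycle model in which the measured tyre forces enter as known inputs and a yaw-rate gyro supplies the measurement update; all parameters are illustrative, not the article's, and the adaptive covariance logic is only indicated in a comment.

```python
import numpy as np

# Toy planar model: state x = [v_y, r] (lateral velocity, yaw rate).
m, Iz, a, b, vx, dt = 1500.0, 2500.0, 1.2, 1.4, 20.0, 0.01

F = np.array([[1.0, -vx * dt],
              [0.0, 1.0]])                 # state transition
Q = np.diag([0.05, 0.005]) * dt            # process noise
H = np.array([[0.0, 1.0]])                 # we observe yaw rate only

x = np.zeros(2)
P = np.eye(2)

def kf_step(x, P, Fyf, Fyr, r_meas, R=1e-4):
    # Predict: inject measured front/rear tyre forces through the model
    u = np.array([(Fyf + Fyr) / m * dt, (a * Fyf - b * Fyr) / Iz * dt])
    x = F @ x + u
    P = F @ P @ F.T + Q
    # Update with the gyro yaw rate; R could be made adaptive to the
    # driving condition, as the article proposes
    S = H @ P @ H.T + R
    K = P @ H.T / S
    x = x + (K * (r_meas - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = kf_step(x, P, Fyf=2000.0, Fyr=1500.0, r_meas=0.05)
print(x)   # estimated [lateral velocity, yaw rate]
```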

  16. Nuclear Waste: Increasing Scale and Sociopolitical Impacts

    ERIC Educational Resources Information Center

    La Porte, Todd R.

    1978-01-01

    Discusses the impact of radioactive waste management system on social and political development. The article also presents (1) types of information necessary to estimate the costs and consequences of radioactive waste management; and (2) an index of radioactive hazards to improve the basis for policy decisions. (HM)

  17. Modeled Estimates of Soil and Dust Ingestion Rates for Children

    EPA Science Inventory

    Daily soil/dust ingestion rates typically used in exposure and risk assessments are based on tracer element studies, which have a number of limitations and do not separate contributions from soil and dust. This article presents an alternate approach of modeling soil and dust inge...

  18. Multi-population Genomic Relationships for Estimating Current Genetic Variances Within and Genetic Correlations Between Populations.

    PubMed

    Wientjes, Yvonne C J; Bijma, Piter; Vandenplas, Jérémie; Calus, Mario P L

    2017-10-01

    Different methods are available to calculate multi-population genomic relationship matrices. Since those matrices differ in base population, it is anticipated that the method used to calculate genomic relationships affects the estimate of genetic variances, covariances, and correlations. The aim of this article is to define the multi-population genomic relationship matrix to estimate current genetic variances within and genetic correlations between populations. The genomic relationship matrix containing two populations consists of four blocks, one block for population 1, one block for population 2, and two blocks for relationships between the populations. It is known, based on literature, that by using current allele frequencies to calculate genomic relationships within a population, current genetic variances are estimated. In this article, we theoretically derived the properties of the genomic relationship matrix to estimate genetic correlations between populations and validated it using simulations. When the scaling factor of across-population genomic relationships is equal to the product of the square roots of the scaling factors for within-population genomic relationships, the genetic correlation is estimated unbiasedly even though estimated genetic variances do not necessarily refer to the current population. When this property is not met, the correlation based on estimated variances should be multiplied by a correction factor based on the scaling factors. In this study, we present a genomic relationship matrix which directly estimates current genetic variances as well as genetic correlations between populations. Copyright © 2017 by the Genetics Society of America.
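
    A minimal sketch of the matrix described, assuming genotypes coded 0/1/2 and population-specific current allele frequencies; the across-population block is scaled by the geometric mean (square root of the product) of the two within-population scaling factors, the property the article identifies as yielding unbiased genetic correlations.

```python
import numpy as np

def multi_pop_grm(M1, M2, p1, p2):
    """Two-population genomic relationship matrix from 0/1/2 genotype
    matrices M1, M2 and current allele frequencies p1, p2."""
    Z1 = M1 - 2 * p1                 # centre genotypes per population
    Z2 = M2 - 2 * p2
    s1 = 2 * np.sum(p1 * (1 - p1))   # within-population scaling factors
    s2 = 2 * np.sum(p2 * (1 - p2))
    G11 = Z1 @ Z1.T / s1
    G22 = Z2 @ Z2.T / s2
    G12 = Z1 @ Z2.T / np.sqrt(s1 * s2)   # geometric-mean scaling
    return np.block([[G11, G12], [G12.T, G22]])

rng = np.random.default_rng(4)
n1, n2, n_snp = 5, 4, 1000
M1 = rng.binomial(2, 0.3, (n1, n_snp)).astype(float)
M2 = rng.binomial(2, 0.6, (n2, n_snp)).astype(float)
p1, p2 = M1.mean(axis=0) / 2, M2.mean(axis=0) / 2
G = multi_pop_grm(M1, M2, p1, p2)
print(G.shape)   # (9, 9): two within blocks plus the across block
```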

  19. Utility of Equations to Estimate Peak Oxygen Uptake and Work Rate From a 6-Minute Walk Test in Patients With COPD in a Clinical Setting.

    PubMed

    Kirkham, Amy A; Pauhl, Katherine E; Elliott, Robyn M; Scott, Jen A; Doria, Silvana C; Davidson, Hanan K; Neil-Sztramko, Sarah E; Campbell, Kristin L; Camp, Pat G

    2015-01-01

    To determine the utility of equations that use 6-minute walk test (6MWT) results to estimate peak oxygen uptake (VO2) and peak work rate in patients with chronic obstructive pulmonary disease (COPD) in a clinical setting. This study included a systematic review to identify published equations estimating peak VO2 and peak work rate in watts in COPD patients, and a retrospective chart review of data from a hospital-based pulmonary rehabilitation program. The following variables were abstracted from the records of 42 consecutively enrolled COPD patients: measured peak VO2 and peak work rate achieved during a cycle ergometer cardiopulmonary exercise test, 6MWT distance, age, sex, weight, height, forced expiratory volume in 1 second, forced vital capacity, and lung diffusion capacity. Peak VO2 and peak work rate were estimated from 6MWT distance using the published equations. The error associated with using estimated peak VO2 or peak work rate to prescribe aerobic exercise intensities of 60% and 80% was calculated. Eleven equations from 6 studies were identified. Agreement between estimated and measured values was poor to moderate (intraclass correlation coefficients = 0.11-0.63). The error associated with using estimated peak VO2 or peak work rate to prescribe exercise intensities of 60% and 80% of measured values ranged from mean differences of 12 to 35 and 16 to 47 percentage points, respectively. There is poor to moderate agreement between measured peak VO2 and peak work rate and estimates from equations that use 6MWT distance, and the use of the estimated values for prescription of aerobic exercise intensity would result in large error. Equations estimating peak VO2 and peak work rate are of low utility for prescribing exercise intensity in pulmonary rehabilitation programs.

  20. Application of Novel Lateral Tire Force Sensors to Vehicle Parameter Estimation of Electric Vehicles.

    PubMed

    Nam, Kanghyun

    2015-11-11

    This article presents methods for estimating lateral vehicle velocity and tire cornering stiffness, which are key parameters in vehicle dynamics control, using lateral tire force measurements. Lateral tire forces acting on each tire are directly measured by load-sensing hub bearings that were invented and further developed by NSK Ltd. For estimating the lateral vehicle velocity, tire force models considering lateral load transfer effects are used, and a recursive least square algorithm is adapted to identify the lateral vehicle velocity as an unknown parameter. Using the estimated lateral vehicle velocity, tire cornering stiffness, which is an important tire parameter dominating the vehicle's cornering responses, is estimated. For the practical implementation, the cornering stiffness estimation algorithm based on a simple bicycle model is developed and discussed. Finally, proposed estimation algorithms were evaluated using experimental test data.
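
    A generic sketch of the recursive least squares machinery the article adapts, here with a forgetting factor and applied to a toy time-varying scalar parameter; the tyre force model and vehicle data are not reproduced.

```python
import numpy as np

class RLS:
    """Recursive least squares with forgetting factor lam, the estimator
    type used to identify an unknown parameter (in the article, lateral
    velocity inside a tyre force model) from streaming measurements."""
    def __init__(self, n, lam=0.98):
        self.theta = np.zeros(n)
        self.P = np.eye(n) * 1e3
        self.lam = lam

    def update(self, phi, y):
        phi = np.asarray(phi, dtype=float)
        e = y - phi @ self.theta                        # prediction error
        k = self.P @ phi / (self.lam + phi @ self.P @ phi)
        self.theta = self.theta + k * e
        self.P = (self.P - np.outer(k, phi) @ self.P) / self.lam
        return self.theta

# Toy use: track a slowly varying scalar parameter from noisy data
rng = np.random.default_rng(5)
est = RLS(n=1)
for t in range(500):
    true = 1.0 + 0.3 * np.sin(t / 100)
    phi = np.array([rng.normal(1.0, 0.3)])   # regressor
    y = phi[0] * true + rng.normal(0, 0.05)  # measurement
    theta = est.update(phi, y)
print(theta)   # close to the current true value, ~1.0 + 0.3*sin(5)
```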

  1. First estimates of the global and regional incidence of neonatal herpes infection.

    PubMed

    Looker, Katharine J; Magaret, Amalia S; May, Margaret T; Turner, Katherine M E; Vickerman, Peter; Newman, Lori M; Gottlieb, Sami L

    2017-03-01

    Neonatal herpes is a rare but potentially devastating condition with an estimated 60% fatality rate without treatment. Transmission usually occurs during delivery from mothers with herpes simplex virus type 1 (HSV-1) or type 2 (HSV-2) genital infection. However, the global burden has never been quantified to our knowledge. We developed a novel methodology for burden estimation and present first WHO global and regional estimates of the annual number of neonatal herpes cases during 2010-15. We applied previous estimates of HSV-1 and HSV-2 prevalence and incidence in women aged 15-49 years to 2010-15 birth rates to estimate infections during pregnancy. We then applied published risks of neonatal HSV transmission according to whether maternal infection was incident or prevalent with HSV-1 or HSV-2 to generate annual numbers of incident neonatal infections. We estimated the number of incident neonatal infections by maternal age, and we generated separate estimates for each WHO region, which were then summed to obtain global estimates of the number of neonatal herpes infections. Globally the overall rate of neonatal herpes was estimated to be about ten cases per 100 000 livebirths, equivalent to a best-estimate of 14 000 cases annually roughly (4000 for HSV-1; 10 000 for HSV-2). We estimated that the most neonatal herpes cases occurred in Africa, due to high maternal HSV-2 infection and high birth rates. HSV-1 contributed more cases than HSV-2 in the Americas, Europe, and Western Pacific. High rates of genital HSV-1 infection and moderate HSV-2 prevalence meant the Americas had the highest overall rate. However, our estimates are highly sensitive to the core assumptions, and considerable uncertainty exists for many settings given sparse underlying data. These neonatal herpes estimates mark the first attempt to quantify the global burden of this rare but serious condition. Better collection of primary data for neonatal herpes is crucially needed to reduce uncertainty and refine future estimates. These data are particularly important in resource-poor settings where we may have underestimated cases. Nevertheless, these first estimates suggest development of new HSV prevention measures such as vaccines could have additional benefits beyond reducing genital ulcer disease and HSV-associated HIV transmission, through prevention of neonatal herpes. World Health Organization. Copyright © 2017 World Health Organization; licensee Elsevier. This is an Open Access article published under the CC BY-NC-ND 3.0 IGO license which permits users to download and share the article for non-commercial purposes, so long as the article is reproduced in the whole without changes, and provided the original source is properly cited. This article shall not be used or reproduced in association with the promotion of commercial products, services or any entity. There should be no suggestion that WHO endorses any specific organisation, products or services. The use of the WHO logo is not permitted. This notice should be preserved along with the article's original URL.

  2. Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles.

    PubMed

    Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F

    2016-09-16

    Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor: a deep integration of a monocular camera and a 2D laser rangefinder from which the motion of the MAV is estimated. This realization is expected to be more flexible in terms of environments compared to laser-scan-matching approaches. The estimated ego-motion is then integrated into the MAV's navigation system. First, however, the pose between the two sensors must be known, and an improved calibration method is proposed to obtain it. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results.
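
    A minimal sketch of the P3P step using OpenCV's solver, with invented 3D-to-2D correspondences consistent with an identity pose; the article's calibration and navigation pipeline is not reproduced.

```python
import numpy as np
import cv2  # OpenCV; assumed available

# Hypothetical data: four 3D points (e.g. from the laser rangefinder,
# expressed in the object frame) and their 2D camera projections.
object_pts = np.array([[0.0, 0.0, 1.0],
                       [0.2, 0.0, 1.1],
                       [0.0, 0.2, 0.9],
                       [0.2, 0.2, 1.0]], dtype=np.float64)
image_pts = np.array([[320.0, 240.0],
                      [429.09, 240.0],
                      [320.0, 373.33],
                      [440.0, 360.0]], dtype=np.float64)
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])          # pinhole intrinsics, no distortion

# cv2.SOLVEPNP_P3P needs exactly 4 correspondences: 3 define the P3P
# problem, the 4th disambiguates among its candidate solutions.
ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None,
                              flags=cv2.SOLVEPNP_P3P)
if ok:
    R, _ = cv2.Rodrigues(rvec)   # rotation matrix of the estimated pose
    print(R, tvec)               # ~identity rotation, ~zero translation
```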

  3. Demographic characteristics and health care use and expenditures by the aged in the United States: 1977-1984

    PubMed Central

    Waldo, Daniel R.; Lazenby, Helen C.

    1984-01-01

    In recent years, increasing attention has been given to the use and financing of health care for the aged. The authors of this article summarize much of the data related to that use, and present original estimates of health spending in 1984 on behalf of the aged. The estimates are designed to indicate trends in health expenditures and are tied to aggregate personal health care expenditures from the National Health Accounts. PMID:10310847

  4. Dynamic regime marginal structural mean models for estimation of optimal dynamic treatment regimes, Part II: proofs of results.

    PubMed

    Orellana, Liliana; Rotnitzky, Andrea; Robins, James M

    2010-03-03

    In this companion article to "Dynamic Regime Marginal Structural Mean Models for Estimation of Optimal Dynamic Treatment Regimes, Part I: Main Content" [Orellana, Rotnitzky and Robins (2010), IJB, Vol. 6, Iss. 2, Art. 7] we present (i) proofs of the claims in that paper, (ii) a proposal for the computation of a confidence set for the optimal index when this lies in a finite set, and (iii) an example to aid the interpretation of the positivity assumption.

  5. Estimating and Visualizing Nonlinear Relations among Latent Variables: A Semiparametric Approach

    ERIC Educational Resources Information Center

    Pek, Jolynn; Sterba, Sonya K.; Kok, Bethany E.; Bauer, Daniel J.

    2009-01-01

    The graphical presentation of any scientific finding enhances its description, interpretation, and evaluation. Research involving latent variables is no exception, especially when potential nonlinear effects are suspect. This article has multiple aims. First, it provides a nontechnical overview of a semiparametric approach to modeling nonlinear…

  6. The Rangeland Hydrology and Erosion Model: A dynamic approach for predicting soil loss on rangelands

    USDA-ARS?s Scientific Manuscript database

    In this study we present the improved Rangeland Hydrology and Erosion Model (RHEM V2.3), a process-based erosion prediction tool specific for rangeland application. The article provides the mathematical formulation of the model and parameter estimation equations. Model performance is assessed agains...

  7. Estimating Creativity with a Multiple-Measurement Approach within Scientific and Artistic Domains

    ERIC Educational Resources Information Center

    Agnoli, Sergio; Corazza, Giovanni E.; Runco, Mark A.

    2016-01-01

    This article presents the structure and the composition of a newly developed multifaceted test battery for the measurement of creativity within scientific and artistic domains. By integrating existing procedures for the evaluation of creativity, the new battery promises to become a comprehensive assessment of creativity, encompassing both…

  8. Job-Sharing: Another Way to Work

    ERIC Educational Resources Information Center

    Rich, Les

    1978-01-01

    A permanent part-time work force estimated at sixteen to seventeen million is one of the fastest-growing segments of the work population. The article discusses and presents some examples of job sharing (two persons handling one job) as a means of increasing employment, avoiding layoffs, and meeting individual needs. (MF)

  9. Sexual Victimization of Youth

    ERIC Educational Resources Information Center

    Small, Kevonne; Zweig, Janine M.

    2007-01-01

    An estimated 7.0% to 8.1% of American youth report being sexually victimized at some point in their life time. This article presents a background to youth sexual victimization, focusing on prevalence data, challenging issues when studying this problem, risk factors, and common characteristics of perpetrators. Additionally, a type of sexual…

  10. Stochastic Approximation Methods for Latent Regression Item Response Models

    ERIC Educational Resources Information Center

    von Davier, Matthias; Sinharay, Sandip

    2010-01-01

    This article presents an application of a stochastic approximation expectation maximization (EM) algorithm using a Metropolis-Hastings (MH) sampler to estimate the parameters of an item response latent regression model. Latent regression item response models are extensions of item response theory (IRT) to a latent variable model with covariates…

  11. Computer Aided Evaluation of Higher Education Tutors' Performance

    ERIC Educational Resources Information Center

    Xenos, Michalis; Papadopoulos, Thanos

    2007-01-01

    This article presents a method for computer-aided tutor evaluation: Bayesian Networks are used for organizing the collected data about tutors and for enabling accurate estimations and predictions about future tutor behavior. The model provides indications about each tutor's strengths and weaknesses, which enables the evaluator to exploit strengths…

  12. A National Look at Children and Families Entering Early Intervention

    ERIC Educational Resources Information Center

    Scarborough, Anita A.; Spiker, Donna; Mallik, Sangeeta; Hebbeler, Kathleen M.; Bailey Jr., Donald B.; Simeonsson, Rune J.

    2004-01-01

    The National Early Intervention Longitudinal Study (NEILS) is the first study of Part C of the Individuals With Disabilities Education Act (IDEA) early intervention system with a nationally representative sample of infants and toddlers with disabilities. This article presents national estimates of characteristics of infants and toddlers and their…

  13. Reliability Generalization of Scores on the Spielberger State-Trait Anxiety Inventory.

    ERIC Educational Resources Information Center

    Barnes, Laura L. B.; Harp, Diane; Jung, Woo Sik

    2002-01-01

    Conducted a reliability generalization study for the State-Trait Anxiety Inventory (C. Spielberger, 1983) by reviewing and classifying 816 research articles. Average reliability coefficients were acceptable for both internal consistency and test-retest reliability, but variation was present among the estimates. Other differences are discussed.…

  14. Diagnostic Procedures for Detecting Nonlinear Relationships between Latent Variables

    ERIC Educational Resources Information Center

    Bauer, Daniel J.; Baldasaro, Ruth E.; Gottfredson, Nisha C.

    2012-01-01

    Structural equation models are commonly used to estimate relationships between latent variables. Almost universally, the fitted models specify that these relationships are linear in form. This assumption is rarely checked empirically, largely for lack of appropriate diagnostic techniques. This article presents and evaluates two procedures that can…

  15. Passage-Based Bibliographic Coupling: An Inter-Article Similarity Measure for Biomedical Articles

    PubMed Central

    Liu, Rey-Long

    2015-01-01

    Biomedical literature is an essential source of biomedical evidence. To translate the evidence for biomedicine study, researchers often need to carefully read multiple articles about specific biomedical issues. These articles thus need to be highly related to each other. They should share similar core contents, including research goals, methods, and findings. However, given an article r, it is challenging for search engines to retrieve highly related articles for r. In this paper, we present a technique PBC (Passage-based Bibliographic Coupling) that estimates inter-article similarity by seamlessly integrating bibliographic coupling with the information collected from context passages around important out-link citations (references) in each article. Empirical evaluation shows that PBC can significantly improve the retrieval of those articles that biomedical experts believe to be highly related to specific articles about gene-disease associations. PBC can thus be used to improve search engines in retrieving the highly related articles for any given article r, even when r is cited by very few (or even no) articles. The contribution is essential for those researchers and text mining systems that aim at cross-validating the evidence about specific gene-disease associations. PMID:26440794
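
    A toy sketch loosely inspired by the idea, not the authors' estimator: shared references (bibliographic coupling) are weighted by the textual similarity of the context passages around the corresponding citations. All function names and data are invented for illustration.

```python
from collections import Counter
import math

def text_sim(a, b):
    """Cosine similarity of bag-of-words term counts, a toy stand-in
    for comparing citation context passages."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def pbc_like_sim(cites_a, cites_b):
    """Passage-weighted bibliographic coupling between two articles.
    cites_* map reference-id -> citation context passage; each shared
    reference contributes the similarity of its two context passages,
    normalised by the size of the union of references."""
    shared = set(cites_a) & set(cites_b)
    union = set(cites_a) | set(cites_b)
    if not union:
        return 0.0
    return sum(text_sim(cites_a[r], cites_b[r]) for r in shared) / len(union)

a = {"ref1": "gene X mutation increases disease risk",
     "ref2": "microarray methods for expression profiling"}
b = {"ref1": "mutation of gene X linked to disease risk in cohort",
     "ref3": "protein folding dynamics"}
print(pbc_like_sim(a, b))
```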

  16. Passage-Based Bibliographic Coupling: An Inter-Article Similarity Measure for Biomedical Articles.

    PubMed

    Liu, Rey-Long

    2015-01-01

    Biomedical literature is an essential source of biomedical evidence. To translate the evidence for biomedicine study, researchers often need to carefully read multiple articles about specific biomedical issues. These articles thus need to be highly related to each other. They should share similar core contents, including research goals, methods, and findings. However, given an article r, it is challenging for search engines to retrieve highly related articles for r. In this paper, we present a technique PBC (Passage-based Bibliographic Coupling) that estimates inter-article similarity by seamlessly integrating bibliographic coupling with the information collected from context passages around important out-link citations (references) in each article. Empirical evaluation shows that PBC can significantly improve the retrieval of those articles that biomedical experts believe to be highly related to specific articles about gene-disease associations. PBC can thus be used to improve search engines in retrieving the highly related articles for any given article r, even when r is cited by very few (or even no) articles. The contribution is essential for those researchers and text mining systems that aim at cross-validating the evidence about specific gene-disease associations.

  17. Radiation Doses and Associated Risk From the Fukushima Nuclear Accident.

    PubMed

    Ishikawa, Tetsuo

    2017-03-01

    The magnitude of dose due to the Fukushima Daiichi Accident was estimated by the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) 2013 report published in April 2014. Following this, the UNSCEAR white paper, which comprises a digest of new information for the 2013 Fukushima report, was published in October 2015. Another comprehensive report on radiation dose due to the accident is the International Atomic Energy Agency (IAEA) report on the Fukushima Daiichi Accident published in August 2015. Although the UNSCEAR and IAEA publications summarize well the doses received by residents, they review only literature published before the end of December 2014 and the end of March 2015, respectively. However, some studies on dose estimation have been published since then. In addition, the UNSCEAR 2013 report states it was likely that some overestimation had been introduced generally by the methodology used by the Committee. For example, effects of decontamination were not considered in the estimated lifetime external doses. Decontamination is in progress for most living areas in Fukushima Prefecture, which could reduce long-term external dose to residents. This article mainly reviews recent English language articles that may add new information to the UNSCEAR and IAEA publications. Generally, recent articles suggest lower doses than those presented by the UNSCEAR 2013 report.

  18. Self-Estimation of Blood Alcohol Concentration: A Review

    PubMed Central

    Aston, Elizabeth R.; Liguori, Anthony

    2013-01-01

    This article reviews the history of blood alcohol concentration (BAC) estimation training, which trains drinkers to discriminate distinct BAC levels and thus avoid excessive alcohol consumption. BAC estimation training typically combines education concerning alcohol metabolism with attention to subjective internal cues associated with specific concentrations. Estimation training was originally conceived as a component of controlled drinking programs. However, dependent drinkers were unsuccessful in BAC estimation, likely due to extreme tolerance. In contrast, moderate drinkers successfully acquired this ability. A subsequent line of research translated laboratory estimation studies to naturalistic settings by studying large samples of drinkers in their preferred drinking environments. Thus far, naturalistic studies have provided mixed results regarding the most effective form of BAC feedback. BAC estimation training is important because it imparts an ability to perceive individualized impairment that may be present below the legal limit for driving. Consequently, the training can be a useful component for moderate drinkers in drunk driving prevention programs. PMID:23380489

  19. Application of Novel Lateral Tire Force Sensors to Vehicle Parameter Estimation of Electric Vehicles

    PubMed Central

    Nam, Kanghyun

    2015-01-01

    This article presents methods for estimating lateral vehicle velocity and tire cornering stiffness, which are key parameters in vehicle dynamics control, using lateral tire force measurements. Lateral tire forces acting on each tire are directly measured by load-sensing hub bearings that were invented and further developed by NSK Ltd. For estimating the lateral vehicle velocity, tire force models considering lateral load transfer effects are used, and a recursive least square algorithm is adapted to identify the lateral vehicle velocity as an unknown parameter. Using the estimated lateral vehicle velocity, tire cornering stiffness, which is an important tire parameter dominating the vehicle’s cornering responses, is estimated. For the practical implementation, the cornering stiffness estimation algorithm based on a simple bicycle model is developed and discussed. Finally, proposed estimation algorithms were evaluated using experimental test data. PMID:26569246

  20. Survival analysis for the missing censoring indicator model using kernel density estimation techniques

    PubMed Central

    Subramanian, Sundarraman

    2008-01-01

    This article concerns asymptotic theory for a new estimator of a survival function in the missing censoring indicator model of random censorship. Specifically, the large sample results for an inverse probability-of-non-missingness weighted estimator of the cumulative hazard function, so far not available, are derived, including an almost sure representation with rate for a remainder term, and uniform strong consistency with rate of convergence. The estimator is based on a kernel estimate for the conditional probability of non-missingness of the censoring indicator. Expressions for its bias and variance, in turn leading to an expression for the mean squared error as a function of the bandwidth, are also obtained. The corresponding estimator of the survival function, whose weak convergence is derived, is asymptotically efficient. A numerical study, comparing the performances of the proposed and two other currently existing efficient estimators, is presented. PMID:18953423

  1. Survival analysis for the missing censoring indicator model using kernel density estimation techniques.

    PubMed

    Subramanian, Sundarraman

    2006-01-01

    This article concerns asymptotic theory for a new estimator of a survival function in the missing censoring indicator model of random censorship. Specifically, the large sample results for an inverse probability-of-non-missingness weighted estimator of the cumulative hazard function, so far not available, are derived, including an almost sure representation with rate for a remainder term, and uniform strong consistency with rate of convergence. The estimator is based on a kernel estimate for the conditional probability of non-missingness of the censoring indicator. Expressions for its bias and variance, in turn leading to an expression for the mean squared error as a function of the bandwidth, are also obtained. The corresponding estimator of the survival function, whose weak convergence is derived, is asymptotically efficient. A numerical study, comparing the performances of the proposed and two other currently existing efficient estimators, is presented.

  2. Marginal regression analysis of recurrent events with coarsened censoring times.

    PubMed

    Hu, X Joan; Rosychuk, Rhonda J

    2016-12-01

    Motivated by an ongoing pediatric mental health care (PMHC) study, this article presents weakly structured methods for analyzing doubly censored recurrent event data where only coarsened information on censoring is available. The study extracted administrative records of emergency department visits from provincial health administrative databases. The available information of each individual subject is limited to a subject-specific time window determined up to concealed data. To evaluate time-dependent effect of exposures, we adapt the local linear estimation with right censored survival times under the Cox regression model with time-varying coefficients (cf. Cai and Sun, Scandinavian Journal of Statistics 2003, 30, 93-111). We establish the pointwise consistency and asymptotic normality of the regression parameter estimator, and examine its performance by simulation. The PMHC study illustrates the proposed approach throughout the article. © 2016, The International Biometric Society.

  3. Methodical approaches to value assessment and determination of the capitalization level of high-rise construction

    NASA Astrophysics Data System (ADS)

    Smirnov, Vitaly; Dashkov, Leonid; Gorshkov, Roman; Burova, Olga; Romanova, Alina

    2018-03-01

    The article presents an analysis of methodological approaches to cost estimation and to determining the capitalization level of high-rise construction objects. Factors determining the value of real estate are considered, and three main approaches to estimating the value of real estate objects are given. The main methods of capitalization estimation are analyzed, and the most suitable method for determining the level of capitalization of high-rise buildings is proposed. To increase the value of real estate objects, the authors propose measures that can significantly increase the capitalization of the enterprise through more efficient use of intangible assets and goodwill.
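
    One standard formula underlying the capitalization methods compared here is income-approach direct capitalization, value = net operating income / capitalization rate; the figures below are illustrative only.

```python
def direct_capitalization(noi, cap_rate):
    """Income-approach direct capitalization:
    value = net operating income / capitalization rate."""
    return noi / cap_rate

# A building producing 4.5 M per year, capitalized at 7.5%
print(direct_capitalization(noi=4_500_000, cap_rate=0.075))  # 60 M
```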

  4. Method and apparatus for measurement of orientation in an anisotropic medium

    DOEpatents

    Gilmore, Robert Snee; Kline, Ronald Alan; Deaton, Jr., John Broddus

    1999-01-01

    A method and apparatus are provided for simultaneously measuring the anisotropic orientation and the thickness of an article. The apparatus comprises a transducer assembly which propagates longitudinal and transverse waves through the article and which receives reflections of the waves. A processor is provided to measure respective transit times of the longitudinal and shear waves propagated through the article and to calculate respective predicted transit times of the longitudinal and shear waves based on an estimated thickness, an estimated anisotropic orientation, and an elasticity of the article. The processor adjusts the estimated thickness and the estimated anisotropic orientation to reduce the difference between the measured transit times and the respective predicted transit times of the longitudinal and shear waves.
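
    A sketch of the adjustment loop the patent describes, assuming an invented weak-anisotropy speed model v(theta) rather than the patent's elasticity model: thickness and orientation are fitted jointly so that predicted longitudinal and shear transit times match the measured ones.

```python
import numpy as np
from scipy.optimize import least_squares

def speeds(theta):
    """Toy orientation-dependent wave speeds (m/s), illustrative only."""
    vL = 6000.0 + 200.0 * np.cos(2 * theta)   # longitudinal
    vS = 3100.0 + 150.0 * np.cos(2 * theta)   # shear
    return vL, vS

def predicted_times(params):
    """Pulse-echo transit times (microseconds) for thickness d and
    anisotropy angle theta."""
    d, theta = params
    vL, vS = speeds(theta)
    return np.array([2 * d / vL, 2 * d / vS]) * 1e6

true_d, true_theta = 0.02, 0.4                # 20 mm, 0.4 rad
t_meas = predicted_times([true_d, true_theta])

# Jointly adjust thickness and orientation to reduce the difference
# between measured and predicted transit times, as in the patent.
fit = least_squares(lambda p: predicted_times(p) - t_meas,
                    x0=[0.015, 0.2],
                    bounds=([0.001, -np.pi / 2], [0.1, np.pi / 2]))
print(fit.x)   # d ~ 0.02; theta ~ +/-0.4 (cos(2*theta) symmetry)
```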

  5. An Application of Structural Equation Modeling for Developing Good Teaching Characteristics Ontology

    ERIC Educational Resources Information Center

    Phiakoksong, Somjin; Niwattanakul, Suphakit; Angskun, Thara

    2013-01-01

    Ontology is a knowledge representation technique which aims to make knowledge explicit by defining the core concepts and their relationships. The Structural Equation Modeling (SEM) is a statistical technique which aims to explore the core factors from empirical data and estimates the relationship between these factors. This article presents an…

  6. Maternal Employment and Caring for Children with Disabilities. Data Trends #95

    ERIC Educational Resources Information Center

    Research and Training Center on Family Support and Children's Mental Health, 2004

    2004-01-01

    "Data Trends" reports present summaries of research on mental health services for children and adolescents and their families. The article summarized in this "Data Trends" addresses several gaps in previous research estimating the impact of caregiving on employment. For instance, prior studies employ a variety of disability definitions, making it…

  7. Sourcing for Parameter Estimation and Study of Logistic Differential Equation

    ERIC Educational Resources Information Center

    Winkel, Brian J.

    2012-01-01

    This article offers modelling opportunities in which the phenomena of the spread of disease, perception of changing mass, growth of technology, and dissemination of information can be described by one differential equation--the logistic differential equation. It presents two simulation activities for students to generate real data, as well as…

  8. Guide to the Table

    ERIC Educational Resources Information Center

    Occupational Outlook Quarterly, 2010

    2010-01-01

    This article presents a table that provides a snapshot of how employment is expected to change in 289 occupations. For each occupation, it shows estimated employment in 2008, the projected numeric change in employment (that is, how many jobs are expected to be gained or lost) over the 2008-18 decade, and the projected percent change in employment…

  9. The Ultimate Practitioner

    ERIC Educational Resources Information Center

    Richardson, Joan

    2011-01-01

    A solid idea coupled with savvy marketing has enabled Rick DuFour's vision of professional learning communities to revolutionize how teachers work with each other. The author of 13 books and 80 articles has presented to an estimated quarter million educators over the last eight years and is beginning to reach beyond the U.S. with books and…

  10. Learning about and from a Distribution of Program Impacts Using Multisite Trials

    ERIC Educational Resources Information Center

    Raudenbush, Stephen W.; Bloom, Howard S.

    2015-01-01

    The present article provides a synthesis of the conceptual and statistical issues involved in using multisite randomized trials to learn about and from a distribution of heterogeneous program impacts across individuals and/or program sites. Learning "about" such a distribution involves estimating its mean value, detecting and quantifying…

  11. Do Charter Schools Improve Student Achievement?

    ERIC Educational Resources Information Center

    Clark, Melissa A.; Gleason, Philip M.; Tuttle, Christina Clark; Silverberg, Marsha K.

    2015-01-01

    This article presents findings from a lottery-based study of the impacts of a broad set of 33 charter middle schools across 13 states on student achievement. To estimate charter school impacts, we compare test score outcomes of students admitted to these schools through the randomized admissions lotteries with outcomes of applicants who were not…

  12. Modelling Sublimation of Carbon Dioxide

    ERIC Educational Resources Information Center

    Winkel, Brian

    2012-01-01

    In this article, the author reports results of efforts to model the sublimation of carbon dioxide and the associated kinetics-order and parameter-estimation issues in the model. The reader is offered two sets of data and several approaches to determining the rate of sublimation of a piece of solid dry ice. Several models are presented…

  13. An Economic Wellbeing Index for the Spanish Provinces: A Data Envelopment Analysis Approach

    ERIC Educational Resources Information Center

    Murias, Pilar; Martinez, Fidel; De Miguel, Carlos

    2006-01-01

    This article presents the estimation of a synthetic economic wellbeing index using Data Envelopment Analysis (DEA). The DEA is a multidimensional technique that has its origins in efficiency analysis, but its usage within the social indicators context is particularly appropriate. It allows the researcher to take advantage of the inherent…

  14. Copula Models for Sociology: Measures of Dependence and Probabilities for Joint Distributions

    ERIC Educational Resources Information Center

    Vuolo, Mike

    2017-01-01

    Often in sociology, researchers are confronted with nonnormal variables whose joint distribution they wish to explore. Yet, assumptions of common measures of dependence can fail or estimating such dependence is computationally intensive. This article presents the copula method for modeling the joint distribution of two random variables, including…
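
    A minimal sketch of the workhorse construction, a Gaussian copula joining two nonnormal margins; rho controls the dependence, and Kendall's tau has a closed form for comparison. Margins and rho are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
rho = 0.7
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=10_000)
u = stats.norm.cdf(z)                      # dependent Uniform(0,1) pair

x = stats.expon(scale=2.0).ppf(u[:, 0])    # continuous exponential margin
y = stats.poisson(mu=3.0).ppf(u[:, 1])     # discrete Poisson margin

# For a Gaussian copula, Kendall's tau = (2/pi) * arcsin(rho);
# ties in the discrete margin pull the sample value down somewhat.
tau, _ = stats.kendalltau(x, y)
print(tau, 2 / np.pi * np.arcsin(rho))
```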

  15. The Validity of the Comparative Interrupted Time Series Design for Evaluating the Effect of School-Level Interventions.

    PubMed

    Jacob, Robin; Somers, Marie-Andree; Zhu, Pei; Bloom, Howard

    2016-06-01

    In this article, we examine whether a well-executed comparative interrupted time series (CITS) design can produce valid inferences about the effectiveness of a school-level intervention. This article also explores the trade-off between bias reduction and precision loss across different methods of selecting comparison groups for the CITS design and assesses whether choosing matched comparison schools based only on preintervention test scores is sufficient to produce internally valid impact estimates. We conduct a validation study of the CITS design based on the federal Reading First program as implemented in one state using results from a regression discontinuity design as a causal benchmark. Our results contribute to the growing base of evidence regarding the validity of nonexperimental designs. We demonstrate that the CITS design can, in our example, produce internally valid estimates of program impacts when multiple years of preintervention outcome data (test scores in the present case) are available and when a set of reasonable criteria are used to select comparison organizations (schools in the present case). © The Author(s) 2016.
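
    A minimal sketch of a CITS regression on simulated school-by-year data: each group keeps its own baseline trend, and the treated-by-post interaction is the impact estimate. Data, effect sizes, and the clustering choice are invented, not those of the Reading First study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
rows = []
for school in range(40):
    treated = school < 20                       # 20 treated, 20 comparison
    base = rng.normal(50, 5)
    for year in range(-4, 3):                   # intervention at year 0
        post = int(year >= 0)
        y = (base + 0.8 * year
             + (2.5 if treated and post else 0.0) + rng.normal(0, 2))
        rows.append(dict(score=y, year=year, post=post,
                         treated=int(treated), school=school))
df = pd.DataFrame(rows)

# CITS: deviations from each group's own preintervention trend,
# differenced across groups; treated:post is the level-shift impact.
m = smf.ols("score ~ treated * (year + post + post:year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school"]})
print(m.params["treated:post"])   # ~2.5 in this simulation
```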

  16. Studying radiation hardness of a cadmium tungstate crystal based radiation detector

    NASA Astrophysics Data System (ADS)

    Shtein, M. M.; Smekalin, L. F.; Stepanov, S. A.; Zatonov, I. A.; Tkacheva, T. V.; Usachev, E. Yu

    2016-06-01

    This article considers the radiation hardness of an X-ray detector used in the production of non-destructive testing instruments and inspection systems. Experiments were carried out to estimate the radiation hardness of a detector based on a cadmium tungstate crystal and of its structural components individually. The article describes the layout of the experimental facility used for the radiation hardness measurements. The dependence of the photodiode current on radiation dose is presented for excitation either by the light flux of a scintillator or by an external light source. Experiments were also carried out to estimate the radiation hardness of two types of optical glue used in detector production, one based on silicone rubber and the other on epoxy. Using a spectrophotometer and a cobalt gun, the relative light transmission factor of each glue sample was measured at different wavelengths as a function of radiation dose. The data obtained are presented together with a comprehensive analysis of the results, and it was determined which of the glue samples is most suitable for the production of detectors operating under strong radiation.

  17. Analysis and meta-analysis of single-case designs: an introduction.

    PubMed

    Shadish, William R

    2014-04-01

    The last 10 years have seen great progress in the analysis and meta-analysis of single-case designs (SCDs). This special issue includes five articles that provide an overview of current work on that topic, including standardized mean difference statistics, multilevel models, Bayesian statistics, and generalized additive models. Each article analyzes a common example across articles and presents syntax or macros for how to do them. These articles are followed by commentaries from single-case design researchers and journal editors. This introduction briefly describes each article and then discusses several issues that must be addressed before we can know what analyses will eventually be best to use in SCD research. These issues include modeling trend, modeling error covariances, computing standardized effect size estimates, assessing statistical power, incorporating more accurate models of outcome distributions, exploring whether Bayesian statistics can improve estimation given the small samples common in SCDs, and the need for annotated syntax and graphical user interfaces that make complex statistics accessible to SCD researchers. The article then discusses reasons why SCD researchers are likely to incorporate statistical analyses into their research more often in the future, including changing expectations and contingencies regarding SCD research from outside SCD communities, changes and diversity within SCD communities, corrections of erroneous beliefs about the relationship between SCD research and statistics, and demonstrations of how statistics can help SCD researchers better meet their goals. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  18. PubMed related articles: a probabilistic topic-based model for content similarity

    PubMed Central

    Lin, Jimmy; Wilbur, W John

    2007-01-01

    Background: We present a probabilistic topic-based model for content similarity called pmra that underlies the related article search feature in PubMed. Whether or not a document is about a particular topic is computed from term frequencies, modeled as Poisson distributions. Unlike previous probabilistic retrieval models, we do not attempt to estimate relevance; rather, our focus is "relatedness", the probability that a user would want to examine a particular document given known interest in another. We also describe a novel technique for estimating parameters that does not require human relevance judgments; instead, the process is based on the existence of MeSH® in MEDLINE®. Results: The pmra retrieval model was compared against bm25, a competitive probabilistic model that shares theoretical similarities. Experiments using the test collection from the TREC 2005 genomics track show a small but statistically significant improvement of pmra over bm25 in terms of precision. Conclusion: Our experiments suggest that the pmra model provides an effective ranking algorithm for related article search. PMID:17971238

  19. Is herpes zoster vaccination likely to be cost-effective in Canada?

    PubMed

    Peden, Alexander D; Strobel, Stephenson B; Forget, Evelyn L

    2014-05-30

    To synthesize the current literature detailing the cost-effectiveness of the herpes zoster (HZ) vaccine, and to provide Canadian policy-makers with cost-effectiveness measurements in a Canadian context. This article builds on an existing systematic review of the HZ vaccine that offers a quality assessment of 11 recent articles. We first replicated this study, and then two assessors reviewed the articles and extracted information on vaccine effectiveness, cost of HZ, other modelling assumptions and QALY estimates. Then we transformed the results into a format useful for Canadian policy decisions. Results expressed in different currencies from different years were converted into 2012 Canadian dollars using Bank of Canada exchange rates and a Consumer Price Index deflator. Modelling assumptions that varied between studies were synthesized. We tabled the results for comparability. The Szucs systematic review presented a thorough methodological assessment of the relevant literature. However, the various studies presented results in a variety of currencies, and based their analyses on disparate methodological assumptions. Most of the current literature uses Markov chain models to estimate HZ prevalence. Cost assumptions, discount rate assumptions, assumptions about vaccine efficacy and waning and epidemiological assumptions drove variation in the outcomes. This article transforms the results into a table easily understood by policy-makers. The majority of the current literature shows that HZ vaccination is cost-effective at the price of $100,000 per QALY. Few studies showed that vaccination cost-effectiveness was higher than this threshold, and only under conservative assumptions. Cost-effectiveness was sensitive to vaccine price and discount rate.
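
    The conversion step described is simple arithmetic; a sketch with placeholder exchange-rate and CPI figures (illustrative values, not Bank of Canada data):

```python
def to_2012_cad(amount, fx_to_cad, cpi_2012, cpi_source_year):
    """Convert a cost reported in a foreign currency and price year
    into 2012 Canadian dollars: exchange-rate conversion, then CPI
    inflation adjustment."""
    return amount * fx_to_cad * (cpi_2012 / cpi_source_year)

# e.g. US$45,000/QALY reported in 2008 prices (all numbers illustrative)
print(to_2012_cad(45_000, fx_to_cad=1.03,
                  cpi_2012=121.7, cpi_source_year=114.1))  # ~49,400 CAD
```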

  20. Tobacco Influence on Taste and Smell: Systematic Review of the Literature

    PubMed Central

    Da Ré, Allessandra Fraga; Gurgel, Léia Gonçalves; Buffon, Gabriela; Moura, Weluma Evelyn Rodrigues; Marques Vidor, Deisi Cristina Gollo; Maahs, Márcia Angelica Peter

    2018-01-01

    Introduction: In Brazil, estimates show that 14.7% of the adult population smokes, and changes in smell and taste arising from tobacco consumption are largely present in this population, which is an aggravating factor for these dysfunctions. Objectives: The objective of this study is to systematically review the findings in the literature about the influence of smoking on smell and taste. Data Synthesis: Our research covered articles published from January 1980 to August 2014 in the following databases: MEDLINE (accessed through PubMed), LILACS, Cochrane Library, and SciELO. We conducted separate lines of research: one concerning smell and the other, taste. We analyzed all the articles that presented randomized controlled studies involving the relation between smoking and smell and taste. Articles that presented unclear methodologies and those whose main results did not target the smell or taste of the subjects were excluded. Titles and abstracts of the articles identified by the research strategy were evaluated by researchers. We included four studies, two of which were exclusively about smell: the first noted the relation between the perception of puff strength and nicotine content; the second did not find any differences in the thresholds and discriminative capacity between smokers and nonsmokers. One article considered only taste and supported the relation between smoking and flavor; another considered both sensory modalities and observed positive results for the relation immediately after smoking cessation. Conclusion: Three of the four studies presented positive results for the researched variables. PMID:29371903

  1. Incorporating GIS building data and census housing statistics for sub-block-level population estimation

    USGS Publications Warehouse

    Wu, S.-S.; Wang, L.; Qiu, X.

    2008-01-01

    This article presents a deterministic model for sub-block-level population estimation based on the total building volumes derived from geographic information system (GIS) building data and three census block-level housing statistics. To assess the model, we generated artificial blocks by aggregating census block areas and calculating the respective housing statistics. We then applied the model to estimate populations for sub-artificial-block areas and assessed the estimates against the census populations of those areas. Our analyses indicate that the average percent error of population estimation for sub-artificial-block areas is comparable to that for sub-census-block areas of the same size relative to the associated blocks. The smaller the sub-block-level areas, the higher the population estimation errors. For example, the average percent error for residential areas is approximately 0.11 percent for 100 percent block areas and 35 percent for 5 percent block areas.
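
    A simplified reading of such a model as pure volume weighting (the article's version also incorporates the three block-level housing statistics), with invented volumes:

```python
def subblock_population(block_pop, volumes, target):
    """Allocate a census block's population to a sub-block area in
    proportion to its share of the block's residential building volume.
    A simplified sketch, not the article's full model."""
    total = sum(volumes.values())
    return block_pop * volumes[target] / total

vols = {"A": 120_000.0, "B": 45_000.0, "C": 35_000.0}  # m^3, from GIS data
print(subblock_population(block_pop=850, volumes=vols, target="A"))  # 510
```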

  2. Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles

    PubMed Central

    Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F.

    2016-01-01

    Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor: a deep integration of a monocular camera and a 2D laser rangefinder from which the motion of the MAV is estimated. This realization is expected to be more flexible in terms of environments compared to laser-scan-matching approaches. The estimated ego-motion is then integrated into the MAV’s navigation system. First, however, the pose between the two sensors must be known, and an improved calibration method is proposed to obtain it. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results. PMID:27649203

  3. Optimal Window and Lattice in Gabor Transform. Application to Audio Analysis.

    PubMed

    Lachambre, Helene; Ricaud, Benjamin; Stempfel, Guillaume; Torrésani, Bruno; Wiesmeyr, Christoph; Onchis-Moaca, Darian

    2015-01-01

    This article deals with the use of optimal lattice and optimal window in Discrete Gabor Transform computation. In the case of a generalized Gaussian window, extending earlier contributions, we introduce an additional local window adaptation technique for non-stationary signals. We illustrate our approach and the earlier one by addressing three time-frequency analysis problems to show the improvements achieved by the use of optimal lattice and window: close frequencies distinction, frequency estimation and SNR estimation. The results are presented, when possible, with real world audio signals.
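
    A minimal sketch of the close-frequencies experiment, using scipy's STFT with a Gaussian window as a stand-in for a Discrete Gabor Transform; the window length and hop play the role of the lattice parameters the article optimizes. Signal and settings are illustrative.

```python
import numpy as np
from scipy import signal

fs = 8000
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 440 * t) + np.sin(2 * np.pi * 466 * t)  # close tones

win_len, hop = 1024, 256                   # "lattice": window and hop size
window = signal.windows.gaussian(win_len, std=win_len / 6)
f, tt, Z = signal.stft(x, fs=fs, window=window, nperseg=win_len,
                       noverlap=win_len - hop)

# With this window the 440/466 Hz pair resolves into two spectral peaks;
# a much shorter window would blur them into one.
mid = Z.shape[1] // 2
peak_bins = np.argsort(np.abs(Z[:, mid]))[-2:]
print(f[peak_bins])   # two bins, near 440 Hz and 466 Hz
```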

  4. Dynamic Regime Marginal Structural Mean Models for Estimation of Optimal Dynamic Treatment Regimes, Part II: Proofs of Results*

    PubMed Central

    Orellana, Liliana; Rotnitzky, Andrea; Robins, James M.

    2010-01-01

    In this companion article to “Dynamic Regime Marginal Structural Mean Models for Estimation of Optimal Dynamic Treatment Regimes, Part I: Main Content” [Orellana, Rotnitzky and Robins (2010), IJB, Vol. 6, Iss. 2, Art. 7] we present (i) proofs of the claims in that paper, (ii) a proposal for the computation of a confidence set for the optimal index when this lies in a finite set, and (iii) an example to aid the interpretation of the positivity assumption. PMID:20405047

  5. Estimation and Spatiotemporal Analysis of Methane Emissions from Agriculture in China

    NASA Astrophysics Data System (ADS)

    Fu, Chao; Yu, Guirui

    2010-10-01

    Estimating and analyzing the temporal and spatial patterns of methane emissions from agriculture (MEA) will help China formulate mitigation and adaptation strategies for the nation's agricultural sector. Based on the Tier 2 method presented in the 2006 guidelines of the Intergovernmental Panel on Climate Change (IPCC) and on existing reports, this article presents a systematic estimation of MEA in China from 1990 to 2006, with a particular emphasis on trends and spatial distribution. Results from our study indicate that China's MEA rose from 16.37 Tg yr-1 in 1990 to 19.31 Tg yr-1 in 2006, with an average annual increase of 1.04%. Over the study period, while emissions from field burning of crop residues remained rather low, those from rice cultivation and from livestock decreased and increased, respectively, showing opposing trends that chiefly resulted from changes in the cultivated areas for different rice seasons and changes in the populations of different animal species. Over the same period, China's high-MEA regions shifted generally northward, chiefly as a result of reduced emissions from rice cultivation in most of China's southern provinces and substantial growth in emissions from livestock enteric fermentation in most of China's northern, northeastern, and northwestern provinces. While this article provides significant information on MEA estimates for China, it also notes the uncertainties involved in estimating emissions from each source category. We conclude that China's MEA will likely continue to increase in the future and recommend a demonstration study on MEA mitigation along the middle and lower reaches of the Yellow River. We further recommend enhanced data monitoring and statistical analysis, which will be essential for preparation of the national greenhouse gas (GHG) inventory.
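
    The arithmetic behind a Tier 2-style livestock inventory reduces to activity data times emission factors, summed over categories. The sketch below illustrates this; the animal numbers and emission factors are placeholders, not the article's data.

    ```python
    # Minimal sketch of Tier 2-style inventory arithmetic:
    # emissions = activity level x emission factor, summed over categories.
    # All numbers below are placeholders.

    def methane_emissions_tg(populations: dict[str, float],
                             ef_kg_per_head_yr: dict[str, float]) -> float:
        """Total CH4 emissions in Tg/yr from per-head emission factors."""
        kg = sum(populations[k] * ef_kg_per_head_yr[k] for k in populations)
        return kg / 1e9  # kg -> Tg

    populations = {"dairy_cattle": 12e6, "buffalo": 22e6}       # head
    emission_factors = {"dairy_cattle": 68.0, "buffalo": 55.0}  # kg CH4/head/yr
    print(f"{methane_emissions_tg(populations, emission_factors):.2f} Tg/yr")
    ```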

  6. Verification bias: an under-recognized source of error in assessing the efficacy of MRI of the menisci.

    PubMed

    Richardson, Michael L; Petscavage, Jonelle M

    2011-11-01

    The sensitivity and specificity of magnetic resonance imaging (MRI) for diagnosis of meniscal tears has been studied extensively, with tears usually verified by surgery. However, surgically unverified cases are often not considered in these studies, leading to verification bias, which can falsely increase the sensitivity and decrease the specificity estimates. Our study suggests that such bias may be very common in the meniscal MRI literature, and illustrates techniques to detect and correct for such bias. PubMed was searched for articles estimating sensitivity and specificity of MRI for meniscal tears. These were assessed for verification bias, deemed potentially present if a study included any patients whose MRI findings were not surgically verified. Retrospective global sensitivity analysis (GSA) was performed when possible. Thirty-nine of the 314 studies retrieved from PubMed specifically dealt with meniscal tears. All 39 included unverified patients, and hence, potential verification bias. Only seven articles included sufficient information to perform GSA. Of these, one showed definite verification bias, two showed no bias, and four others showed bias within certain ranges of disease prevalence. Only 9 of 39 acknowledged the possibility of verification bias. Verification bias is underrecognized and potentially common in published estimates of the sensitivity and specificity of MRI for the diagnosis of meniscal tears. When possible, it should be avoided by proper study design. If unavoidable, it should be acknowledged. Investigators should tabulate unverified as well as verified data. Finally, verification bias should be estimated; if present, corrected estimates of sensitivity and specificity should be used. Our online web-based calculator makes this process relatively easy. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
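
    The article's global sensitivity analysis is not reproduced here, but a minimal sketch of one standard correction for verification bias, the Begg and Greenes (1983) approach, shows the flavor of such adjustments. It assumes verification depends only on the test result; all counts below are invented.

    ```python
    # Begg-Greenes correction for verification bias (a standard method,
    # not the article's GSA). Disease probabilities are estimated within
    # the surgically verified subset for each MRI result, then reweighted
    # by the MRI result frequencies in the full cohort.

    def begg_greenes(n_pos, n_neg, v_pos_dis, v_pos_nodis, v_neg_dis, v_neg_nodis):
        """n_pos/n_neg: all patients by MRI result.
        v_*: verified (surgical) counts by MRI result and disease status."""
        p_pos = n_pos / (n_pos + n_neg)                   # P(T+)
        p_neg = 1 - p_pos
        p_d_pos = v_pos_dis / (v_pos_dis + v_pos_nodis)   # P(D+|T+), verified
        p_d_neg = v_neg_dis / (v_neg_dis + v_neg_nodis)   # P(D+|T-), verified
        sens = p_pos * p_d_pos / (p_pos * p_d_pos + p_neg * p_d_neg)
        spec = (p_neg * (1 - p_d_neg) /
                (p_pos * (1 - p_d_pos) + p_neg * (1 - p_d_neg)))
        return sens, spec

    # 300 MRI-positive and 700 MRI-negative patients; surgery verified 200
    # of the positives but only 50 of the negatives (invented counts).
    print(begg_greenes(300, 700, v_pos_dis=180, v_pos_nodis=20,
                       v_neg_dis=5, v_neg_nodis=45))
    ```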

  7. A comparative review of estimates of the proportion unchanged genes and the false discovery rate

    PubMed Central

    Broberg, Per

    2005-01-01

    Background In the analysis of microarray data one generally produces a vector of p-values that give, for each gene, the likelihood of obtaining equally strong evidence of change by pure chance. The distribution of these p-values is a mixture of two components corresponding to the changed genes and the unchanged ones. The focus of this article is how to estimate the proportion of unchanged genes and the false discovery rate (FDR) and how to make inferences based on these concepts. Six published methods for estimating the proportion of unchanged genes are reviewed, two alternatives are presented, and all are tested on both simulated and real data. All estimates but one make do without any parametric assumptions concerning the distributions of the p-values. Furthermore, the estimation and use of the FDR and the closely related q-value is illustrated with examples. Five published FDR estimators and one new one are presented and tested. Implementations in R code are available. Results A simulation model based on the distribution of real microarray data, plus two real data sets, was used to assess the methods. The proposed alternative methods for estimating the proportion of unchanged genes fared very well, and gave evidence of low bias and very low variance. Different methods perform well depending upon whether there are few or many regulated genes. Furthermore, the methods for estimating FDR showed a varying performance, and were sometimes misleading. The new method had a very low error. Conclusion The concept of the q-value or false discovery rate is useful in practical research, despite some theoretical and practical shortcomings. However, it seems possible to challenge the performance of the published methods, and there is likely scope for further developing the estimates of the FDR. The new methods provide the scientist with more options to choose a suitable method for any particular experiment. The article advocates the use of the conjoint information regarding false positive and negative rates as well as the proportion of unchanged genes when identifying changed genes. PMID:16086831
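
    As one concrete example of the estimators under review, the sketch below implements Storey's estimator of the proportion of unchanged genes (pi0) and the derived q-values. This is one published approach (Storey, 2002), not the article's new estimators; lambda is a tuning constant.

    ```python
    # Storey's pi0 estimator and q-values, one of the published methods
    # reviewed in the article. The simulated p-values mix uniform nulls
    # with a beta-distributed changed component.
    import numpy as np

    def storey_pi0(pvals, lam=0.5):
        """Estimate the proportion of true nulls from p-values above lam."""
        return min(1.0, float(np.mean(pvals > lam)) / (1.0 - lam))

    def qvalues(pvals, pi0):
        """q-values: smallest FDR at which each gene is called significant."""
        m = len(pvals)
        order = np.argsort(pvals)
        q = np.empty(m)
        running = 1.0
        # Walk from the largest p-value down, enforcing monotonicity.
        for rank, idx in zip(range(m, 0, -1), order[::-1]):
            running = min(running, pi0 * m * pvals[idx] / rank)
            q[idx] = running
        return q

    rng = np.random.default_rng(0)
    p = np.concatenate([rng.uniform(size=900),          # unchanged genes
                        rng.beta(0.1, 5.0, size=100)])  # changed genes
    pi0 = storey_pi0(p)
    print(round(pi0, 2), int((qvalues(p, pi0) < 0.05).sum()))
    ```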

  8. Analytical estimation shows low, depth-independent water loss due to vapor flux from deep aquifers

    NASA Astrophysics Data System (ADS)

    Selker, John S.

    2017-06-01

    Recent articles have provided estimates of evaporative flux from water tables in deserts that span 5 orders of magnitude. In this paper, we present an analytical calculation that indicates aquifer vapor flux to be limited to 0.01 mm/yr for sites where there is negligible recharge and the water table is well over 20 m below the surface. This value arises from the geothermal gradient, and therefore, is nearly independent of the actual depth of the aquifer. The value is in agreement with several numerical studies, but is 500 times lower than recently reported experimental values, and 100 times larger than an earlier analytical estimate.

  9. Neural networks for tracking of unknown SISO discrete-time nonlinear dynamic systems.

    PubMed

    Aftab, Muhammad Saleheen; Shafiq, Muhammad

    2015-11-01

    This article presents a Lyapunov-function-based neural network tracking (LNT) strategy for single-input, single-output (SISO) discrete-time nonlinear dynamic systems. The proposed LNT architecture is composed of two feedforward neural networks operating as controller and estimator. A Lyapunov-function-based backpropagation learning algorithm is used for online adjustment of the controller and estimator parameters. Error convergence of the controller and estimator and closed-loop system stability are analyzed using Lyapunov stability theory. Moreover, two simulation examples and one real-time experiment are investigated as case studies. The results validate the controller performance. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  10. A Note on the Score Reliability for the Satisfaction with Life Scale: An RG Study

    ERIC Educational Resources Information Center

    Vassar, Matt

    2008-01-01

    The purpose of the present study was to meta-analytically investigate the score reliability for the Satisfaction With Life Scale. Four-hundred and sixteen articles using the measure were located through electronic database searches and then separated to identify studies which had calculated reliability estimates from their own data. Sixty-two…

  11. CAT Model with Personalized Algorithm for Evaluation of Estimated Student Knowledge

    ERIC Educational Resources Information Center

    Andjelic, Svetlana; Cekerevac, Zoran

    2014-01-01

    This article presents an original model of computer adaptive testing and grade formation, based on scientifically recognized theories. The basis of the model is a personalized algorithm that selects questions depending on the accuracy of the answer to the previous question. The test is divided into three basic levels of difficulty, and the…

  12. [Somatotype of the patients with obesity and associated cardio-vascular pathology. Clinical and anthropological bonds].

    PubMed

    Bukavneva, N S; Pozdniakov, A L; Nikitiuk, D B

    2008-01-01

    The article presents the major anthropometric parameters of patients (male and female) with obesity combined with cardiovascular pathology, before and after treatment. A constitutional predisposition to obesity is identified. The efficacy of dietary therapy is estimated, and somatotypes of patients with obesity are characterized on the basis of the associated cardiovascular pathology.

  13. A Brief Guide to Modelling in Secondary School: Estimating Big Numbers

    ERIC Educational Resources Information Center

    Albarracín, Lluís; Gorgorió, Núria

    2015-01-01

    Fermi problems are problems which, due to their difficulty, can be satisfactorily solved by being broken down into smaller pieces that are solved separately. In this article, we present different sequences of activities involving Fermi problems that can be carried out in Secondary School classes. The aim of these activities is to discuss…

  14. Bayesian Inference for Growth Mixture Models with Latent Class Dependent Missing Data

    ERIC Educational Resources Information Center

    Lu, Zhenqiu Laura; Zhang, Zhiyong; Lubke, Gitta

    2011-01-01

    "Growth mixture models" (GMMs) with nonignorable missing data have drawn increasing attention in research communities but have not been fully studied. The goal of this article is to propose and to evaluate a Bayesian method to estimate the GMMs with latent class dependent missing data. An extended GMM is first presented in which class…

  15. Sample Size Calculations for Precise Interval Estimation of the Eta-Squared Effect Size

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2015-01-01

    Analysis of variance is one of the most frequently used statistical analyses in the behavioral, educational, and social sciences, and special attention has been paid to the selection and use of an appropriate effect size measure of association in analysis of variance. This article presents the sample size procedures for precise interval estimation…

  16. Transoral incision free fundoplication (TIF) – A new paradigm in the surgical treatment of GERD

    PubMed Central

    Yushuva, Arthur; McMahon, Michael; Goodman, Elliot

    2010-01-01

    An estimated 10 billion dollars is spent treating gastro-oesophageal reflux disease (GERD) in the USA every year. The present article reports a case of the safe and successful use of transoral incisionless fundoplication (TIF) using the EsophyX90™ device in the surgical treatment of GERD. PMID:24946319

  17. Transoral incision free fundoplication (TIF) - A new paradigm in the surgical treatment of GERD.

    PubMed

    Yushuva, Arthur; McMahon, Michael; Goodman, Elliot

    2010-07-01

    An estimated 10 billion dollars is spent treating gastro-oesophageal reflux disease (GERD) in the USA every year. The present article reports a case of the safe and successful use of transoral incisionless fundoplication (TIF) using the EsophyX90™ device in the surgical treatment of GERD. © JSCR.

  18. Estimating the Market Demand and Elasticity for Enrollment at an Institution

    ERIC Educational Resources Information Center

    Wohlgemuth, Darin

    2013-01-01

    This article presents an applied research framework that can be helpful in tuition and net price policy discussions. It is the classic microeconomic concept of market demand applied to enrollment management in higher education. The policy relevance includes measuring a response to price. For example, the results of this model will allow the…

  19. Measuring use value from recreation participation: comment

    Treesearch

    Donald B.K. English; J. Michael Bowker

    1994-01-01

    In a recent article in this Journal, Whitehead (1992) presents a method for estimating annual economic surplus for recreation trips to a natural resource site based on whether an individual participates in recreation at that site. Whitehead proposes his method as an alternative to the traditional two-stage travel cost approach. We contend that Whitehead's method...

  20. How Risky Is Marijuana Possession? Considering the Role of Age, Race, and Gender

    ERIC Educational Resources Information Center

    Nguyen, Holly; Reuter, Peter

    2012-01-01

    Arrest rates per capita for possession of marijuana have increased threefold over the last 20 years and now constitute the largest single arrest offense category. Despite the increase in arrest numbers, rates of use have remained stable during much of the same period. This article presents the first estimates of the arrest probabilities for…

  1. Analytical and quasi-Bayesian methods as development of the iterative approach for mixed radiation biodosimetry.

    PubMed

    Słonecka, Iwona; Łukasik, Krzysztof; Fornalski, Krzysztof W

    2018-06-04

    The present paper proposes two methods for calculating the components of the dose absorbed by the human body after exposure to a mixed neutron and gamma radiation field. The article presents a novel analytical approach that replaces the common iterative method, thus reducing calculation time. It also shows how the neutron and gamma doses can be estimated when their ratio in a mixed beam is not precisely known.

  2. Approaches to Estimate Consumer Exposure under TSCA

    EPA Pesticide Factsheets

    CEM contains a combination of models and default parameters which are used to estimate inhalation, dermal, and oral exposures to consumer products and articles for a wide variety of product and article use categories.

  3. Radiation research society 1952-2002. Physics as an element of radiation research.

    PubMed

    Inokuti, Mitio; Seltzer, Stephen M

    2002-07-01

    Since its inception in 1954, Radiation Research has published an estimated 8700 scientific articles up to August 2001, about 520 of which, or roughly 6%, are primarily related to physics. This average of about 11 articles per year indicates steadily continuing contributions by physicists, though there are appreciable fluctuations from year to year. These works of physicists concern radiation sources, dosimetry, instrumentation for measurements of radiation effects, fundamentals of radiation physics, mechanisms of radiation actions, and applications. In this review, we have selected some notable accomplishments for discussion and present an outlook for the future.

  4. Bayes Error Rate Estimation Using Classifier Ensembles

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Ghosh, Joydeep

    2003-01-01

    The Bayes error rate gives a statistical lower bound on the error achievable for a given classification problem and the associated choice of features. By reliably estimating this rate, one can assess the usefulness of the feature set that is being used for classification. Moreover, by comparing the accuracy achieved by a given classifier with the Bayes rate, one can quantify how effective that classifier is. Classical approaches for estimating or finding bounds for the Bayes error generally yield rather weak results for small sample sizes, unless the problem has some simple characteristics, such as Gaussian class-conditional likelihoods. This article shows how the outputs of a classifier ensemble can be used to provide reliable and easily obtainable estimates of the Bayes error with negligible extra computation. Three methods of varying sophistication are described. First, we present a framework that estimates the Bayes error when multiple classifiers, each providing an estimate of the a posteriori class probabilities, are combined through averaging. Second, we bolster this approach by adding an information-theoretic measure of output correlation to the estimate. Finally, we discuss a more general method that looks only at the class labels indicated by ensemble members and provides error estimates based on the disagreements among classifiers. The methods are illustrated for artificial data, a difficult four-class problem involving underwater acoustic data, and two problems from the Proben1 benchmarks. For data sets with known Bayes error, the combiner-based methods introduced in this article outperform existing methods. The estimates obtained by the proposed methods also seem quite reliable for the real-life data sets for which the true Bayes rates are unknown.
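
    The first, averaging-based method lends itself to a compact sketch: pool the posterior estimates from several classifiers and plug the average into the Bayes error formula err = E[1 - max_k p(k|x)]. The posteriors below are random placeholders standing in for trained models.

    ```python
    # Plug-in Bayes error estimate from ensemble-averaged posteriors,
    # sketching the simplest of the three methods described. Real
    # posteriors would come from trained classifiers; these are random.
    import numpy as np

    def plug_in_bayes_error(posteriors: np.ndarray) -> float:
        """posteriors: (n_classifiers, n_samples, n_classes) estimates."""
        avg = posteriors.mean(axis=0)              # ensemble average per sample
        return float(np.mean(1.0 - avg.max(axis=1)))

    rng = np.random.default_rng(1)
    # 5 classifiers, 1000 samples, 4 classes of Dirichlet "posteriors".
    raw = rng.dirichlet(alpha=[2.0, 1.0, 1.0, 1.0], size=(5, 1000))
    print(plug_in_bayes_error(raw))
    ```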

  5. Methodological quality and reporting of generalized linear mixed models in clinical medicine (2000-2012): a systematic review.

    PubMed

    Casals, Martí; Girabent-Farrés, Montserrat; Carrasco, Josep L

    2014-01-01

    Modeling count and binary data collected in hierarchical designs has increased the use of Generalized Linear Mixed Models (GLMMs) in medicine. This article presents a systematic review of the application and quality of results and information reported from GLMMs in the field of clinical medicine. A search using the Web of Science database was performed for published original articles in medical journals from 2000 to 2012. The search strategy included the topics "generalized linear mixed models", "hierarchical generalized linear models", and "multilevel generalized linear model", and the search was refined by the research domain science and technology. Papers reporting methodological considerations without application, and those not related to clinical medicine or not written in English, were excluded. A total of 443 articles were detected, with an increase over time in the number of articles. In total, 108 articles fit the inclusion criteria. Of these, 54.6% were declared to be longitudinal studies, whereas 58.3% and 26.9% were defined as repeated measurements and multilevel design, respectively. Twenty-two articles belonged to environmental and occupational public health, 10 articles to clinical neurology, 8 to oncology, and 7 to infectious diseases and pediatrics. The distribution of the response variable was reported in 88% of the articles, predominantly Binomial (n = 64) or Poisson (n = 22). Most of the useful information about GLMMs was not reported in most cases. Variance estimates of random effects were described in only 8 articles (9.2%). The model validation, the method of covariate selection and the method of goodness of fit were only reported in 8.0%, 36.8% and 14.9% of the articles, respectively. During recent years, the use of GLMMs in medical literature has increased to take into account the correlation of data when modeling qualitative data or counts. According to the current recommendations, the quality of reporting has room for improvement regarding the characteristics of the analysis, estimation method, validation, and selection of the model.

  6. Some Calculated Research Results of the Working Process Parameters of the Low Thrust Rocket Engine Operating on Gaseous Oxygen-Hydrogen Fuel

    NASA Astrophysics Data System (ADS)

    Ryzhkov, V.; Morozov, I.

    2018-01-01

    The paper presents calculated parameters of the combustion products in the tract of a low-thrust rocket engine with a thrust of P ∼ 100 N. The article contains the following data: streamlines, the distribution of total temperature in the longitudinal section of the engine chamber, the static temperature distribution in the cross section of the engine chamber, the velocity distribution of the combustion products in the outlet section of the engine nozzle, and the static temperature near the inner wall of the engine. The presented parameters make it possible to assess the efficiency of the mixture formation processes and the flow of combustion products in the engine chamber, and to estimate the thermal state of the structure.

  7. Estimating Dynamical Systems: Derivative Estimation Hints From Sir Ronald A. Fisher.

    PubMed

    Deboeck, Pascal R

    2010-08-06

    The fitting of dynamical systems to psychological data offers the promise of addressing new and innovative questions about how people change over time. One method of fitting dynamical systems is to estimate the derivatives of a time series and then examine the relationships between derivatives using a differential equation model. One common approach for estimating derivatives, Local Linear Approximation (LLA), produces estimates with correlated errors. Depending on the specific differential equation model used, such correlated errors can lead to severely biased estimates of differential equation model parameters. This article shows that the fitting of dynamical systems can be improved by estimating derivatives in a manner similar to that used to fit orthogonal polynomials. Two applications using simulated data compare the proposed method and a generalized form of LLA when used to estimate derivatives and when used to estimate differential equation model parameters. A third application estimates the frequency of oscillation in observations of the monthly deaths from bronchitis, emphysema, and asthma in the United Kingdom. These data are publicly available in the statistical program R, and functions in R for the method presented are provided.
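
    The article's R functions are not reproduced here, but the idea of estimating derivatives from local polynomial fits can be sketched briefly; the window half-width and polynomial order below are arbitrary tuning choices, and the example signal is synthetic.

    ```python
    # Derivative estimation by fitting a local polynomial in each sliding
    # window, in the spirit of the orthogonal-polynomial approach the
    # article advocates. Centring time at the window midpoint makes the
    # linear coefficient the derivative estimate at that point.
    import numpy as np

    def local_poly_derivatives(x: np.ndarray, t: np.ndarray,
                               half_window: int = 3, order: int = 2) -> np.ndarray:
        """First-derivative estimates at interior points of a time series."""
        derivs = np.full_like(x, np.nan, dtype=float)
        for i in range(half_window, len(x) - half_window):
            sl = slice(i - half_window, i + half_window + 1)
            coeffs = np.polynomial.polynomial.polyfit(t[sl] - t[i], x[sl], order)
            derivs[i] = coeffs[1]   # linear coefficient = derivative at t[i]
        return derivs

    t = np.linspace(0, 4 * np.pi, 200)
    x = np.sin(t) + 0.05 * np.random.default_rng(2).normal(size=t.size)
    dx = local_poly_derivatives(x, t)
    print(np.nanmean(np.abs(dx[3:-3] - np.cos(t[3:-3]))))  # small error
    ```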

  8. Item Response Theory Modeling of the Philadelphia Naming Test.

    PubMed

    Fergadiotis, Gerasimos; Kellough, Stacey; Hula, William D

    2015-06-01

    In this study, we investigated the fit of the Philadelphia Naming Test (PNT; Roach, Schwartz, Martin, Grewal, & Brecher, 1996) to an item-response-theory measurement model, estimated the precision of the resulting scores and item parameters, and provided a theoretical rationale for the interpretation of PNT overall scores by relating explanatory variables to item difficulty. This article describes the statistical model underlying the computer adaptive PNT presented in a companion article (Hula, Kellough, & Fergadiotis, 2015). Using archival data, we evaluated the fit of the PNT to 1- and 2-parameter logistic models and examined the precision of the resulting parameter estimates. We regressed the item difficulty estimates on three predictor variables: word length, age of acquisition, and contextual diversity. The 2-parameter logistic model demonstrated marginally better fit, but the fit of the 1-parameter logistic model was adequate. Precision was excellent for both person ability and item difficulty estimates. Word length, age of acquisition, and contextual diversity all independently contributed to variance in item difficulty. Item-response-theory methods can be productively used to analyze and quantify anomia severity in aphasia. Regression of item difficulty on lexical variables supported the validity of the PNT and interpretation of anomia severity scores in the context of current word-finding models.

  9. Nonparametric analysis of bivariate gap time with competing risks.

    PubMed

    Huang, Chiung-Yu; Wang, Chenguang; Wang, Mei-Cheng

    2016-09-01

    This article considers nonparametric methods for studying recurrent disease and death with competing risks. We first point out that comparisons based on the well-known cumulative incidence function can be confounded by different prevalence rates of the competing events, and that comparisons of the conditional distribution of the survival time given the failure event type are more relevant for investigating the prognosis of different patterns of disease recurrence. We then propose nonparametric estimators for the conditional cumulative incidence function as well as the conditional bivariate cumulative incidence function for the bivariate gap times, that is, the time to disease recurrence and the residual lifetime after recurrence. To quantify the association between the two gap times in the competing risks setting, a modified Kendall's tau statistic is proposed. The proposed estimators for the conditional bivariate cumulative incidence distribution and the association measure account for the induced dependent censoring for the second gap time. Uniform consistency and weak convergence of the proposed estimators are established. Hypothesis testing procedures for two-sample comparisons are discussed. Numerical simulation studies with practical sample sizes are conducted to evaluate the performance of the proposed nonparametric estimators and tests. An application to data from a pancreatic cancer study is presented to illustrate the methods developed in this article. © 2016, The International Biometric Society.

  10. Studies of the Hard X-ray Emission from the Filippov Type Plasma Focus Device, Dena

    NASA Astrophysics Data System (ADS)

    Tafreshi, M. A.; Saeedzadeh, E.

    2006-12-01

    This article is about the characteristics of the hard X-ray (HXR) emission from the Filippov-type plasma focus (PF) device Dena. The article begins with a brief presentation of Dena and of the mechanism of HXR production in PF devices. Then, using differential absorption spectrometry, the energy-resolved spectrum of the HXR emission from a 37 kJ discharge in Dena is estimated. The energy flux density and the energy fluence of this emission have also been calculated, at 1.9 kJ cm-2 s-1 and 9.4 × 10-5 J cm-2, respectively. Finally, after presenting radiographs of sheep bones and calf ribs, the medical applications of PF devices are discussed.

  11. Does exposure to simulated patient cases improve accuracy of clinicians' predictive value estimates of diagnostic test results? A within-subjects experiment at St Michael's Hospital, Toronto, Canada.

    PubMed

    Armstrong, Bonnie; Spaniol, Julia; Persaud, Nav

    2018-02-13

    Clinicians often overestimate the probability of a disease given a positive test result (positive predictive value; PPV) and the probability of no disease given a negative test result (negative predictive value; NPV). The purpose of this study was to investigate whether experiencing simulated patient cases (ie, an 'experience format') would promote more accurate PPV and NPV estimates compared with a numerical format. Participants were presented with information about three diagnostic tests for the same fictitious disease and were asked to estimate the PPV and NPV of each test. Tests varied with respect to sensitivity and specificity. Information about each test was presented once in the numerical format and once in the experience format. The study used a 2 (format: numerical vs experience) × 3 (diagnostic test: gold standard vs low sensitivity vs low specificity) within-subjects design. The study was completed online, via Qualtrics (Provo, Utah, USA). 50 physicians (12 clinicians and 38 residents) from the Department of Family and Community Medicine at St Michael's Hospital in Toronto, Canada, completed the study. All participants had completed at least 1 year of residency. Estimation accuracy was quantified by the mean absolute error (MAE; absolute difference between estimate and true predictive value). PPV estimation errors were larger in the numerical format (MAE=32.6%, 95% CI 26.8% to 38.4%) compared with the experience format (MAE=15.9%, 95% CI 11.8% to 20.0%, d =0.697, P<0.001). Likewise, NPV estimation errors were larger in the numerical format (MAE=24.4%, 95% CI 14.5% to 34.3%) than in the experience format (MAE=11.0%, 95% CI 6.5% to 15.5%, d =0.303, P=0.015). Exposure to simulated patient cases promotes accurate estimation of predictive values in clinicians. This finding carries implications for diagnostic training and practice. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
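
    The normative calculation the participants were asked to approximate follows directly from Bayes' theorem. The sketch below computes predictive values from sensitivity, specificity, and prevalence; the numbers are illustrative, not the study's test parameters.

    ```python
    # Predictive values from sensitivity, specificity, and prevalence via
    # Bayes' theorem; values below are illustrative.

    def predictive_values(sens: float, spec: float, prev: float):
        ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
        npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
        return ppv, npv

    # A test with 90% sensitivity and 80% specificity for a disease with
    # 10% prevalence: the PPV is far lower than intuition often suggests.
    ppv, npv = predictive_values(0.90, 0.80, 0.10)
    print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")  # PPV = 0.33, NPV = 0.99
    ```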

  12. A Simple Method for Deriving the Confidence Regions for the Penalized Cox’s Model via the Minimand Perturbation†

    PubMed Central

    Lin, Chen-Yen; Halabi, Susan

    2017-01-01

    We propose a minimand perturbation method to derive the confidence regions for the regularized estimators of Cox's proportional hazards model. Although the regularized estimation procedure produces a more stable point estimate, it remains challenging to provide an interval estimator or an analytic variance estimator for the associated point estimate. Based on the sandwich formula, the current variance estimator provides a simple approximation, but its finite sample performance is not entirely satisfactory. Besides, the sandwich formula can only provide variance estimates for the non-zero coefficients. In this article, we present a generic description for the perturbation method and then introduce a computation algorithm using the adaptive least absolute shrinkage and selection operator (LASSO) penalty. Through simulation studies, we demonstrate that our method can better approximate the limiting distribution of the adaptive LASSO estimator and produces more accurate inference compared with the sandwich formula. The simulation results also indicate the possibility of extending the applications to the adaptive elastic-net penalty. We further demonstrate our method using data from a phase III clinical trial in prostate cancer. PMID:29326496

  13. A Simple Method for Deriving the Confidence Regions for the Penalized Cox's Model via the Minimand Perturbation.

    PubMed

    Lin, Chen-Yen; Halabi, Susan

    2017-01-01

    We propose a minimand perturbation method to derive the confidence regions for the regularized estimators of Cox's proportional hazards model. Although the regularized estimation procedure produces a more stable point estimate, it remains challenging to provide an interval estimator or an analytic variance estimator for the associated point estimate. Based on the sandwich formula, the current variance estimator provides a simple approximation, but its finite sample performance is not entirely satisfactory. Besides, the sandwich formula can only provide variance estimates for the non-zero coefficients. In this article, we present a generic description for the perturbation method and then introduce a computation algorithm using the adaptive least absolute shrinkage and selection operator (LASSO) penalty. Through simulation studies, we demonstrate that our method can better approximate the limiting distribution of the adaptive LASSO estimator and produces more accurate inference compared with the sandwich formula. The simulation results also indicate the possibility of extending the applications to the adaptive elastic-net penalty. We further demonstrate our method using data from a phase III clinical trial in prostate cancer.

  14. Meeting NCLB Goals for Highly Qualified Teachers: Estimates by State from Survey Data

    ERIC Educational Resources Information Center

    Blank, Rolf K.; Langesen, Doreen; Laird, Elizabeth; Toye, Carla; de Mello, Victor Bandeira

    2004-01-01

    This article presents results of survey data showing teacher qualifications for their assignments that are comparable from state-to-state as well as data trends over time. The analysis is intended to help state leaders, educators, and others obtain a picture of highly qualified teachers in their state, and to be able to compare their state…

  15. The Use of a School Value-Added Model for Educational Improvement: A Case Study from the Portuguese Primary Education System

    ERIC Educational Resources Information Center

    Ferrão, Maria Eugénia; Couto, Alcino Pinto

    2014-01-01

    This article focuses on the use of a value-added approach for promoting school improvement. It presents yearly value-added estimates, analyses their stability over time, and discusses the contribution of this methodological approach for promoting school improvement programmes in the Portuguese system of evaluation. The value-added model is applied…

  16. Benefit segmentation of the fitness market.

    PubMed

    Brown, J D

    1992-01-01

    While considerable attention is being paid to the fitness and wellness needs of people by healthcare and related marketing organizations, little research attention has been directed to identifying market segments for fitness based on consumers' perceived benefits of fitness. This article describes three distinct segments of fitness consumers, comprising an estimated 50 percent of households. Implications for marketing strategies are also presented.

  17. The Impact of Private Sector Competition on Public Schooling in Kuwait: Some Socio-Educational Implications

    ERIC Educational Resources Information Center

    Al-Shehab, Ali Jasem

    2010-01-01

    With the diminishing model of the welfare state, public education in Kuwait is facing the challenges of the competition of private schools, while the private sector has always struggled against the monopolistic power of the public schools that educate a broad spectrum of K-12 students. This article presents estimates of the effect of private…

  18. Nonequilibrium processes of segregation and diffusion in metal-polymer tribosystems

    NASA Astrophysics Data System (ADS)

    Sidashov, A. V.; Kolesnikov, I. V.

    2017-12-01

    The article presents results on exchange-diffusion processes between chemical elements in metal-polymer tribosystems (between a metal wheel of rolling stock and a composite polymer brake shoe). The effect of the segregation processes on the strength characteristics of the working surface of the tribosystem is estimated by quantum chemical calculations, Auger electron spectroscopy, and X-ray photoelectron spectroscopy.

  19. CFD Simulations for Arc-Jet Panel Testing Capability Development Using Semi-Elliptical Nozzles

    NASA Technical Reports Server (NTRS)

    Gokcen, Tahir; Balboni, John A.; Hartman, G. Joseph

    2016-01-01

    This paper reports computational simulations in support of arc-jet panel testing capability development using semi-elliptical nozzles in a high enthalpy arc-jet facility at NASA Ames Research Center. Two different semi-elliptical nozzle configurations are proposed for testing panel test articles. Computational fluid dynamics simulations are performed to provide estimates of achievable panel surface conditions and useful test area for each configuration. The present analysis comprises three-dimensional simulations of the nonequilibrium flowfields in the semi-elliptical nozzles, the test box, and the flow over the panel test articles. Computations show that useful test areas for the two proposed nozzle options are 20.32 by 20.32 centimeters (8 by 8 inches) and 43.18 by 43.18 centimeters (17 by 17 inches). Estimated values of the maximum cold-wall heat flux and surface pressure are 155 watts per square centimeter and 39 kilopascals for the smaller panel test option, and 44 watts per square centimeter and 7 kilopascals for the larger panel test option. Other important properties of the predicted flowfields are presented, and factors that limit the useful test area in the semi-free jet test configuration are discussed.

  20. Hypovitaminosis A and its control

    PubMed Central

    Underwood, Barbara A.

    1978-01-01

    Hypovitaminosis A is considered to be the most common cause of blindness in the developing countries but it is not possible to estimate the prevalence of keratomalacia directly attributable to it. Subclinical hypovitaminosis A is not measurable at present in human subjects, but studies in animals indicate that the possibility of subclinical effects should not be ignored. The recommended procedure for identifying the "at risk" population involves a three-part survey to evaluate dietary intake, biochemical indices, and clinical signs. This article examines all three approaches in some detail, but in the present state of knowledge, none of them gives a satisfactory estimate of vitamin A status. For community assessment, the article discusses preliminary experience with a predictive model of the number of children in a population at risk of hypovitaminosis A that is based on associations noted repeatedly between protein-energy malnutrition and certain child-rearing practices, family economics, and morbidity. Criteria have been established for deciding on the need for a programme of prevention and the types of programme most appropriate in different situations are discussed. The methods of programme evaluation must take into account the stated objectives of the programme. PMID:310359

  1. Spatial distributions of heating, cooling, and industrial degree-days in Turkey

    NASA Astrophysics Data System (ADS)

    Yildiz, I.; Sosaoglu, B.

    2007-11-01

    The degree-day method is commonly used to estimate energy consumption for heating and cooling in residential, commercial and industrial buildings, as well as in greenhouses, livestock facilities, storage facilities and warehouses. This article presents monthly and yearly averages and spatial distributions of heating, cooling, and industrial degree-days at the base temperatures of 18 °C and 20 °C, 18 °C and 24 °C, and 7 °C and 13 °C, respectively, as well as the corresponding numbers of days, across Turkey. The findings presented here will facilitate the estimation of heating and cooling energy consumption for any residential, commercial, or industrial building in Turkey, for any period of time (monthly, seasonal, etc.). The analysis also makes it possible to compare and design alternative building systems in terms of energy efficiency. Using set-point temperatures to mark the start of the heating season is likewise possible with the information provided in this article. In addition, utility companies and firms that manufacture and market HVAC systems can use the findings of this study to determine demand and to shape marketing strategies and policies.
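
    The degree-day arithmetic underlying such tables is compact: daily heating (cooling) degree-days accumulate the shortfall (excess) of the daily mean temperature relative to a base temperature. A minimal sketch, with invented daily means:

    ```python
    # Heating/cooling degree-days from daily mean temperatures. The daily
    # means below are made up for illustration.

    def degree_days(daily_means, base: float, kind: str = "heating") -> float:
        if kind == "heating":
            return sum(max(0.0, base - t) for t in daily_means)
        return sum(max(0.0, t - base) for t in daily_means)

    january = [2.0, -1.5, 0.0, 4.0, 6.5, 3.0, -3.0]  # degrees C, one week
    print(degree_days(january, base=18.0, kind="heating"))  # 115.0
    ```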

  2. Energy Losses Estimation During Pulsed-Laser Seam Welding

    NASA Astrophysics Data System (ADS)

    Sebestova, Hana; Havelkova, Martina; Chmelickova, Hana

    2014-06-01

    The finite-element tool SYSWELD (ESI Group, Paris, France) was adapted to simulate pulsed-laser seam welding. Besides the temperature field distribution, one possible output of the welding simulation is the amount of absorbed power necessary to melt the required material volume, including energy losses. By comparing the absorbed or melting energy with the applied laser energy, welding efficiencies can be calculated. This article presents results of welding efficiency estimation based on combining experimental data with simulation outputs for pulsed Nd:YAG laser bead-on-plate welding of 0.6-mm-thick AISI 304 stainless steel sheets at different beam powers.
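
    The efficiency calculation can be sketched with one common definition of melting efficiency: the enthalpy needed to heat and fuse the melted volume divided by the delivered pulse energy. The material constants below are rough textbook values for AISI 304 and the measured quantities are invented; the article's SYSWELD-based procedure is more involved.

    ```python
    # Melting efficiency = enthalpy of the fused volume / pulse energy.
    # Constants are approximate handbook values for AISI 304; the melt
    # volume and pulse energy are placeholders.

    RHO = 7900.0      # density, kg/m^3
    C_P = 500.0       # specific heat, J/(kg K)
    L_F = 2.6e5       # latent heat of fusion, J/kg
    T_MELT = 1450.0   # melting temperature, deg C

    def melting_efficiency(melt_volume_m3: float, t0_c: float,
                           pulse_energy_j: float) -> float:
        mass = RHO * melt_volume_m3
        e_melt = mass * (C_P * (T_MELT - t0_c) + L_F)
        return e_melt / pulse_energy_j

    # A 0.2 mm^3 fusion zone per 8 J pulse, starting from 20 deg C.
    print(f"{melting_efficiency(0.2e-9, 20.0, 8.0):.1%}")  # about 19%
    ```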

  3. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation.

    PubMed

    Ribba, B; Grimm, H P; Agoram, B; Davies, M R; Gadkar, K; Niederer, S; van Riel, N; Timmis, J; van der Graaf, P H

    2017-08-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development to focus discussions on two critical methodological aspects of QSP model development: optimal structural granularity and parameter estimation. We here report in a perspective article a summary of presentations and discussions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  4. A new method for the estimation of high temperature radiant heat emittance by means of aero-acoustic levitation

    NASA Astrophysics Data System (ADS)

    Greffrath, Fabian; Prieler, Robert; Telle, Rainer

    2014-11-01

    A new method for the experimental estimation of radiant heat emittance at high temperatures has been developed which involves aero-acoustic levitation of samples, laser heating and contactless temperature measurement. Radiant heat emittance values are determined from the time dependent development of the sample temperature which requires analysis of both the radiant and convective heat transfer towards the surroundings by means of fluid dynamics calculations. First results for the emittance of a corundum sample obtained with this method are presented in this article and found in good agreement with literature values.

  5. Migration plans and hours of work in Malaysia.

    PubMed

    Gillin, E D; Sumner, D A

    1985-01-01

    "This article describes characteristics of prospective migrants in the Malaysian Family Life Survey and investigates how planning to move affects hours of work. [The authors] use ideas about intertemporal substitution...to discuss the response to temporary and permanent wage expectations on the part of potential migrants. [An] econometric section presents reduced-form estimates for wage rates and planned migration equations and two-stage least squares estimates for hours of work. Men currently planning a move were found to work fewer hours. Those originally planning only a temporary stay at their current location work more hours." excerpt

  6. Civil partnerships five years on.

    PubMed

    Ross, Helen; Gask, Karen; Berrington, Ann

    2011-01-01

    The Civil Partnership Act 2004, which came into force in December 2005 allowing same-sex couples in the UK to register their relationship for the first time, celebrated its fifth anniversary in December 2010. This article examines civil partnership in England and Wales, five years on from its introduction. The characteristics of those forming civil partnerships between 2005 and 2010 including age, sex and previous marital/civil partnership status are examined. These are then compared with the characteristics of those marrying over the same period. Further comparisons are also made between civil partnership dissolutions and divorce. The article presents estimates of the number of people currently in civil partnerships and children of civil partners. Finally the article examines attitudes towards same-sex and civil partner couples both in the UK and in other countries across Europe.

  7. Artificial Intelligence Estimation of Carotid-Femoral Pulse Wave Velocity using Carotid Waveform.

    PubMed

    Tavallali, Peyman; Razavi, Marianne; Pahlevan, Niema M

    2018-01-17

    In this article, we offer an artificial intelligence method to estimate the carotid-femoral Pulse Wave Velocity (PWV) non-invasively from one uncalibrated carotid waveform measured by tonometry and a few routine clinical variables. Since the signal-processing inputs to this machine learning algorithm are sensor agnostic, the presented method can accompany any medical instrument that provides a calibrated or uncalibrated carotid pressure waveform. Our results show that, for an unseen hold-back test set population in the age range of 20 to 69, our model can estimate PWV with a Root-Mean-Square Error (RMSE) of 1.12 m/sec compared to the reference method. These results indicate that the model is a reliable surrogate for PWV. Our study also showed that estimated PWV was significantly associated with an increased risk of CVDs.

  8. A complex symbol signal-to-noise ratio estimator and its performance

    NASA Technical Reports Server (NTRS)

    Feria, Y.

    1994-01-01

    This article presents an algorithm for estimating the signal-to-noise ratio (SNR) of signals that contain data on a downconverted suppressed carrier or the first harmonic of a square-wave subcarrier. This algorithm can be used to determine the performance of the full-spectrum combiner for the Galileo S-band (2.2- to 2.3-GHz) mission by measuring the input and output symbol SNR. A performance analysis of the algorithm shows that the estimator can estimate the complex symbol SNR using 10,000 symbols at a true symbol SNR of -5 dB with a mean of -4.9985 dB and a standard deviation of 0.2454 dB, and these analytical results are checked by simulations of 100 runs with a mean of -5.06 dB and a standard deviation of 0.2506 dB.
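
    The article's estimator is not reproduced here, but a generic moment-based (M2M4) SNR estimator for complex symbols gives a feel for the task, including why estimates at -5 dB carry a few tenths of a dB of spread. It assumes a constant-modulus signal in circular complex Gaussian noise; all parameter values are illustrative.

    ```python
    # M2M4 SNR estimator for complex symbols in complex AWGN: for a
    # constant-modulus signal, 2*M2^2 - M4 = S^2, so S and N follow from
    # the second and fourth moments. A textbook estimator, not
    # necessarily the article's algorithm.
    import numpy as np

    def m2m4_snr_db(y: np.ndarray) -> float:
        m2 = np.mean(np.abs(y) ** 2)
        m4 = np.mean(np.abs(y) ** 4)
        s = np.sqrt(max(2 * m2 ** 2 - m4, 0.0))   # signal power estimate
        n = max(m2 - s, 1e-12)                    # noise power estimate
        return 10 * np.log10(s / n)

    rng = np.random.default_rng(3)
    true_snr_db = -5.0
    symbols = rng.choice([-1.0, 1.0], size=10_000)   # BPSK, unit power
    noise_var = 10 ** (-true_snr_db / 10)            # total noise power
    noise = np.sqrt(noise_var / 2) * (rng.normal(size=10_000)
                                      + 1j * rng.normal(size=10_000))
    print(m2m4_snr_db(symbols + noise))  # close to -5 dB
    ```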

  9. Chapter 12: Survey Design and Implementation for Estimating Gross Savings Cross-Cutting Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurnik, Charles W; Baumgartner, Robert

    This chapter presents an overview of best practices for designing and executing survey research to estimate gross energy savings in energy efficiency evaluations. A detailed description of the specific techniques and strategies for designing questions, implementing a survey, and analyzing and reporting the survey procedures and results is beyond the scope of this chapter. So for each topic covered below, readers are encouraged to consult articles and books cited in References, as well as other sources that cover the specific topics in greater depth. This chapter focuses on the use of survey methods to collect data for estimating gross savings from energy efficiency programs.

  10. Analysis of nursing home use and bed supply: Wisconsin, 1983.

    PubMed Central

    Nyman, J A

    1989-01-01

    This article presents evidence that in 1983 excess demand was a prevailing characteristic of nursing home care markets in Wisconsin, a state with one of the highest bed to elderly population ratios. It further shows that excess demand is the source of at least three types of error in use-based estimates of the determinants of the need for nursing home care. First, if excess demand is present, estimates of the determinants of Medicaid use may simply represent a crowding out of Medicaid patients, driven by the determinants of private use. As a result, factors associated with greater overall need in an area will be correlated with fewer Medicaid patients in nursing homes, ceteris paribus. Second, estimates of the substitutability of home health care for nursing home care may be misleadingly insignificant if they are based on the bed supply-constrained behavior of Medicaid-eligible subjects. Third, because the determinants of bed supply become the determinants of overall use under excess-demand conditions, the determinants of use will reflect, to some extent, the nursing home's desire for profits. Because profitability considerations are reflected in use based estimates of need, these estimates are likely to be misleading. PMID:2681081

  11. ESTIMATION OF EARLY INTERNAL DOSES TO FUKUSHIMA RESIDENTS AFTER THE NUCLEAR DISASTER BASED ON THE ATMOSPHERIC DISPERSION SIMULATION.

    PubMed

    Kim, Eunjoo; Tani, Kotaro; Kunishima, Naoaki; Kurihara, Osamu; Sakai, Kazuo; Akashi, Makoto

    2016-11-01

    Estimating the early internal doses to residents in the Fukushima Daiichi Nuclear Power Station accident is a difficult task because limited human/environmental measurement data are available. Hence, the feasibility of using atmospheric dispersion simulations created by the Worldwide version of System for Prediction of Environmental Emergency Dose Information 2nd Version (WSPEEDI-II) in the estimation was examined in the present study. This examination was done by comparing the internal doses evaluated based on the human measurements with those calculated using time series air concentration maps (131I and 137Cs) generated by WSPEEDI-II. The results showed that the latter doses were several times higher than the former doses. However, this discrepancy could be minimised by taking into account personal behaviour data that will be available soon. This article also presents the development of a prototype system for estimating the internal dose based on the simulations. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. The Bayesian New Statistics: Hypothesis testing, estimation, meta-analysis, and power analysis from a Bayesian perspective.

    PubMed

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
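
    A minimal sketch of the estimation-with-uncertainty style the article advocates: a Beta-Binomial model with a flat prior and an equal-tailed credible interval. The counts are illustrative.

    ```python
    # Bayesian estimation of a success rate with a credible interval,
    # using a conjugate Beta(1, 1) prior; the counts are made up.
    from scipy.stats import beta

    successes, failures = 27, 13                     # observed data
    posterior = beta(1 + successes, 1 + failures)    # conjugate update

    mean = posterior.mean()
    lo, hi = posterior.ppf([0.025, 0.975])  # 95% equal-tailed credible interval
    print(f"rate = {mean:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
    ```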

  13. Estimation of the full-field dynamic response of a floating bridge using Kalman-type filtering algorithms

    NASA Astrophysics Data System (ADS)

    Petersen, Ø. W.; Øiseth, O.; Nord, T. S.; Lourens, E.

    2018-07-01

    Numerical predictions of the dynamic response of complex structures are often uncertain because of uncertainty in the assumed load effects. Inverse methods can estimate the true dynamic response of a structure through system inversion, combining measured acceleration data with a system model. This article presents a case study of full-field dynamic response estimation of a long-span floating bridge: the Bergøysund Bridge in Norway. This bridge is instrumented with a network of 14 triaxial accelerometers. The system model consists of 27 vibration modes with natural frequencies below 2 Hz, obtained from a tuned finite element model that takes the fluid-structure interaction with the surrounding water into account. Two methods, a joint input-state estimation algorithm and a dual Kalman filter, are applied to estimate the full-field response of the bridge. The results demonstrate that the displacements and the accelerations can be estimated at unmeasured locations with reasonable accuracy when the wave loads are the dominant source of excitation.

  14. A Sensor Fusion Method Based on an Integrated Neural Network and Kalman Filter for Vehicle Roll Angle Estimation.

    PubMed

    Vargas-Meléndez, Leandro; Boada, Beatriz L; Boada, María Jesús L; Gauchía, Antonio; Díaz, Vicente

    2016-08-31

    This article presents a novel estimator based on sensor fusion, which combines the Neural Network (NN) with a Kalman filter in order to estimate the vehicle roll angle. The NN estimates a "pseudo-roll angle" through variables that are easily measured from Inertial Measurement Unit (IMU) sensors. An IMU is a device that is commonly used for vehicle motion detection, and its cost has decreased during recent years. The pseudo-roll angle is introduced in the Kalman filter in order to filter noise and minimize the variance of the norm and maximum errors' estimation. The NN has been trained for J-turn maneuvers, double lane change maneuvers and lane change maneuvers at different speeds and road friction coefficients. The proposed method takes into account the vehicle non-linearities, thus yielding good roll angle estimation. Finally, the proposed estimator has been compared with one that uses the suspension deflections to obtain the pseudo-roll angle. Experimental results show the effectiveness of the proposed NN and Kalman filter-based estimator.
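
    The fusion step can be sketched as a scalar Kalman filter that propagates the roll angle with a gyro rate and corrects it with the network's pseudo-roll angle as the measurement. All noise settings and signals below are invented; the article's filter is more elaborate.

    ```python
    # Scalar Kalman filter fusing a gyro roll rate (prediction) with a
    # NN-style pseudo-roll measurement (correction). Noise levels and
    # signals are placeholders.
    import numpy as np

    def fuse_roll(gyro_rates, pseudo_rolls, dt=0.01, q=1e-4, r=4e-2):
        """q: process noise (gyro), r: measurement noise (pseudo-roll)."""
        roll, p = 0.0, 1.0
        out = []
        for rate, z in zip(gyro_rates, pseudo_rolls):
            roll += rate * dt          # predict with the roll-rate input
            p += q
            k = p / (p + r)            # Kalman gain
            roll += k * (z - roll)     # correct with the pseudo-measurement
            p *= (1 - k)
            out.append(roll)
        return np.array(out)

    t = np.arange(0, 5, 0.01)
    true_roll = 0.1 * np.sin(t)
    rng = np.random.default_rng(4)
    gyro = np.gradient(true_roll, 0.01) + rng.normal(0, 0.01, t.size)
    pseudo = true_roll + rng.normal(0, 0.2, t.size)
    est = fuse_roll(gyro, pseudo)
    print(np.mean(np.abs(est - true_roll)))  # smaller than pseudo-roll noise
    ```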

  15. A Sensor Fusion Method Based on an Integrated Neural Network and Kalman Filter for Vehicle Roll Angle Estimation

    PubMed Central

    Vargas-Meléndez, Leandro; Boada, Beatriz L.; Boada, María Jesús L.; Gauchía, Antonio; Díaz, Vicente

    2016-01-01

    This article presents a novel estimator based on sensor fusion, which combines the Neural Network (NN) with a Kalman filter in order to estimate the vehicle roll angle. The NN estimates a “pseudo-roll angle” through variables that are easily measured from Inertial Measurement Unit (IMU) sensors. An IMU is a device that is commonly used for vehicle motion detection, and its cost has decreased during recent years. The pseudo-roll angle is introduced in the Kalman filter in order to filter noise and minimize the variance of the norm and maximum errors’ estimation. The NN has been trained for J-turn maneuvers, double lane change maneuvers and lane change maneuvers at different speeds and road friction coefficients. The proposed method takes into account the vehicle non-linearities, thus yielding good roll angle estimation. Finally, the proposed estimator has been compared with one that uses the suspension deflections to obtain the pseudo-roll angle. Experimental results show the effectiveness of the proposed NN and Kalman filter-based estimator. PMID:27589763

  16. Dataset of anomalies and malicious acts in a cyber-physical subsystem.

    PubMed

    Laso, Pedro Merino; Brosset, David; Puentes, John

    2017-10-01

    This article presents a dataset produced to investigate how data and information quality estimations make it possible to detect anomalies and malicious acts in cyber-physical systems. Data were acquired using a cyber-physical subsystem consisting of liquid containers for fuel or water, along with its automated control and data-acquisition infrastructure. The described data consist of temporal series representing five operational scenarios - normal, anomalies, breakdown, sabotages, and cyber-attacks - corresponding to 15 different real situations. The dataset is publicly available in the .zip file published with the article, to investigate and compare faulty operation detection and characterization methods for cyber-physical systems.

  17. Quantile Regression for Recurrent Gap Time Data

    PubMed Central

    Luo, Xianghua; Huang, Chiung-Yu; Wang, Lan

    2014-01-01

    Summary Evaluating covariate effects on gap times between successive recurrent events is of interest in many medical and public health studies. While most existing methods for recurrent gap time analysis focus on modeling the hazard function of gap times, a direct interpretation of the covariate effects on the gap times is not available through these methods. In this article, we consider quantile regression that can provide direct assessment of covariate effects on the quantiles of the gap time distribution. Following the spirit of the weighted risk-set method by Luo and Huang (2011, Statistics in Medicine 30, 301–311), we extend the martingale-based estimating equation method considered by Peng and Huang (2008, Journal of the American Statistical Association 103, 637–649) for univariate survival data to analyze recurrent gap time data. The proposed estimation procedure can be easily implemented in existing software for univariate censored quantile regression. Uniform consistency and weak convergence of the proposed estimators are established. Monte Carlo studies demonstrate the effectiveness of the proposed method. An application to data from the Danish Psychiatric Central Register is presented to illustrate the methods developed in this article. PMID:23489055

  18. A new method for determining the mass ejected during the cometary outburst - Application to the Jupiter-family comets

    NASA Astrophysics Data System (ADS)

    Wesołowski, M.; Gronkowski, P.

    2018-07-01

    In the present article, we propose a new method for estimating the mass ejected from a comet's nucleus during an outburst in brightness. Cometary outbursts are often reported for both periodic and parabolic comets. An outburst is a sudden increase in a comet's brightness of more than one magnitude, typically by 2-5 mag; it is a brightening only and should not be confused with a destructive explosion. Long-term observations and studies of this phenomenon lead to the conclusion that the very probable direct cause of many outbursts is the ejection of part of the surface layer of the comet's nucleus together with an increase in the sublimation rate (Hughes (1990), Gronkowski (2007), Gronkowski and Wesołowski (2015)). The purpose of this article is to present a simple new method for estimating the mass ejected during this phenomenon. To estimate the mass released during an outburst, different plausible extinction coefficients for cometary matter were assumed. The scattering cross-sections of cometary grains were calculated precisely on the basis of Mie theory. The method was applied to the outburst of a hypothetical comet X/PC belonging to the Jupiter family and to the outburst of comet 17P/Holmes in 2007.

  19. Estimating research productivity and quality in assistive technology: a bibliometric analysis spanning four decades.

    PubMed

    Ryan, Cindy; Tewey, Betsy; Newman, Shelia; Turner, Tracy; Jaeger, Robert J

    2004-12-01

    To conduct a quantitative assessment of the number of papers in MEDLINE related to selected types of assistive technology (AT), to identify journals publishing significant numbers of papers related to AT, and to evaluate them with quantitative productivity and quality measures. Consecutive sample of all papers in MEDLINE identified by standard medical subject headings for selected types of AT from 1963-2003. Number of journals carrying AT papers, papers per journal (both total number and those specific to AT), journal impact factor, circulation, and number of AT citations per year over time for each area of AT. We present search terms, estimates of the numbers of AT citations in MEDLINE, the journals most likely to contain articles related to AT, journal impact factors, and journal circulations (when available). We also present the number of citations in various areas of AT over time from 1963-2003. Suggestions are presented for possible future modifications of the MEDLINE controlled vocabulary, based on terminology used in existing AT classification schemes, such as ISO 9999. Research papers in the areas of AT examined showed publication across a wide variety of journals. A number of journals publishing articles in AT have impact factors above the median. Some areas of AT have shown an increase in publications per year over time, while others have shown a more constant level of productivity.
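
    As a hedged sketch of how such yearly MEDLINE counts might be tallied programmatically today, the snippet below uses Biopython's Entrez E-utilities. The MeSH heading and contact address are placeholders, and the authors' actual search strategy may have differed.

    ```python
    from Bio import Entrez  # Biopython

    Entrez.email = "your.name@example.org"  # placeholder; NCBI asks for a contact address

    def medline_count(term, year):
        """Number of PubMed records matching `term` published in `year`."""
        handle = Entrez.esearch(db="pubmed", term=f"({term}) AND {year}[PDAT]", retmax=0)
        record = Entrez.read(handle)
        handle.close()
        return int(record["Count"])

    # "self-help devices" is one MeSH heading relevant to assistive technology;
    # the article's actual controlled-vocabulary terms may have differed.
    for year in (1963, 1973, 1983, 1993, 2003):
        print(year, medline_count('"self-help devices"[MeSH Terms]', year))
    ```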

  20. Predicting the Future as Bayesian Inference: People Combine Prior Knowledge with Observations when Estimating Duration and Extent

    ERIC Educational Resources Information Center

    Griffiths, Thomas L.; Tenenbaum, Joshua B.

    2011-01-01

    Predicting the future is a basic problem that people have to solve every day and a component of planning, decision making, memory, and causal reasoning. In this article, we present 5 experiments testing a Bayesian model of predicting the duration or extent of phenomena from their current state. This Bayesian model indicates how people should…
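
    A minimal numeric sketch of this kind of prediction rule, assuming the standard setup (the current value t is observed uniformly at random within the total span, and the prediction is the posterior median over the total). The power-law prior and all numbers are illustrative, not the authors' fitted priors.

    ```python
    import numpy as np

    def posterior_median_prediction(t_current, prior, grid):
        """Posterior-median prediction of a total duration/extent t_total,
        given the current value t and a prior over t_total.

        Likelihood: p(t | t_total) = 1/t_total for 0 < t < t_total
        (t is assumed to fall uniformly within the total interval)."""
        post = np.where(grid >= t_current, prior(grid) / grid, 0.0)
        post /= np.trapz(post, grid)
        cdf = np.cumsum(post) * (grid[1] - grid[0])
        return grid[np.searchsorted(cdf, 0.5)]

    grid = np.linspace(1.0, 1000.0, 200_000)
    power_law = lambda t: t ** -1.0   # assumed prior exponent (gamma = 1)

    # With a t^-1 prior the posterior median is 2*t (the "doubling" rule).
    print(posterior_median_prediction(30.0, power_law, grid))  # ~60
    ```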

  1. Test Review: C. R. Reynolds and B. Livingston "CMOCS--Children's Measure of Obsessive-Compulsive Symptoms." Los Angeles, CA: Western Psychological Services, 2010

    ERIC Educational Resources Information Center

    Lund, Emily M.; Dennison, Andrea; Ewing, Heidi K.; de Carvalho, Catharina F.

    2011-01-01

    This article presents a review of the Children's Measure of Obsessive-Compulsive Symptoms (CMOCS), a self-report screening measure of obsessive and compulsive thoughts and behaviors in children and adolescents aged 8 through 19 years. Obsessive-compulsive disorder (OCD) is estimated to affect 1% to 3% of the population over their lifetime. The…

  2. Feasibility of flow studies at NICA/MPD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geraksiev, N. S., E-mail: nikolay.geraksiev@gmail.com; Collaboration: MPD Collaboration

    In the light of recent developments in heavy ion physics, anisotropic flow measurements play a key role in a better understanding of hot and dense baryonic matter. The presented article gives a short introduction to the proposed NICA/MPD project, as well as a brief description of the event-plane method used to estimate the elliptic flow of reconstructed and identified hadrons (p, π, Λ).

  3. Estimating the Effect of Changes in Criterion Score Reliability on the Power of the "F" Test of Equality of Means

    ERIC Educational Resources Information Center

    Feldt, Leonard S.

    2011-01-01

    This article presents a simple, computer-assisted method of determining the extent to which increases in reliability increase the power of the "F" test of equality of means. The method uses a derived formula that relates the changes in the reliability coefficient to changes in the noncentrality of the relevant "F" distribution. A readily available…
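
    The derived formula itself is not given in the abstract; the sketch below assumes, for illustration only, that the noncentrality parameter scales proportionally with the criterion reliability coefficient, and uses SciPy's noncentral F distribution to convert noncentrality into power.

    ```python
    from scipy.stats import f, ncf

    def f_power(nc, df1, df2, alpha=0.05):
        """Power of the F test of equality of means at noncentrality nc."""
        crit = f.ppf(1.0 - alpha, df1, df2)
        return 1.0 - ncf.cdf(crit, df1, df2, nc)

    # Illustration (assumed numbers, not the article's): k = 4 groups, n = 20 each.
    k, n = 4, 20
    df1, df2 = k - 1, k * (n - 1)
    nc_old = 6.0                    # noncentrality at the original reliability (assumed)
    rho_old, rho_new = 0.70, 0.90

    # Assumed scaling: noncentrality proportional to criterion reliability.
    nc_new = nc_old * (rho_new / rho_old)

    print(f"power at rho={rho_old}: {f_power(nc_old, df1, df2):.3f}")
    print(f"power at rho={rho_new}: {f_power(nc_new, df1, df2):.3f}")
    ```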

  4. Local correction of quadrupole errors at LHC interaction regions using action and phase jump analysis on turn-by-turn beam position data

    NASA Astrophysics Data System (ADS)

    Cardona, Javier Fernando; García Bonilla, Alba Carolina; Tomás García, Rogelio

    2017-11-01

    This article shows that the effect of all quadrupole errors present in an interaction region with low β* can be modeled by an equivalent magnetic kick, which can be estimated from action and phase jumps found in beam position data. This equivalent kick is used to find the strengths that certain normal and skew quadrupoles located in the IR must have to make an effective correction in that region. Additionally, averaging techniques to reduce noise in beam position data, which allow precise estimates of equivalent kicks, are presented and mathematically justified. The complete procedure is tested with simulated data obtained from madx and with 2015 LHC experimental data. The analyses performed on the experimental data indicate that the strengths of the IR skew quadrupole correctors and normal quadrupole correctors can be estimated within a 10% uncertainty. Finally, the effect of IR corrections on the β* is studied, and a correction scheme that returns this parameter to its design value is proposed.

  5. Predicting thermal history a-priori for magnetic nanoparticle hyperthermia of internal carcinoma

    NASA Astrophysics Data System (ADS)

    Dhar, Purbarun; Sirisha Maganti, Lakshmi

    2017-08-01

    This article proposes a simple and realistic method by which a direct analytical expression can be derived for the temperature field within a tumour during magnetic nanoparticle hyperthermia. The approximate analytical expression for the thermal history within the tumour is derived from the lumped capacitance approach and accounts for the relevant therapy protocols and parameters. The method provides an easy framework for estimating hyperthermia protocol parameters promptly. The model has been validated against several experimental reports on animal models such as mice, rabbits, and hamsters, and against human clinical trials. It has been observed that the model is able to accurately estimate the thermal history within the carcinoma during hyperthermia therapy. The present approach may find application in the a-priori estimation of the thermal history in internal tumours for optimizing magnetic hyperthermia treatment protocols with respect to ablation time, tumour size, magnetic drug concentration, field strength, field frequency, nanoparticle material and size, tumour location, and so on.
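
    The article's exact expression is not reproduced in the abstract; the following is a minimal lumped-capacitance sketch of the idea (a single energy balance with constant heating and linear losses), with every parameter value assumed for illustration.

    ```python
    import math

    def tumour_temperature(t, P, h, A, rho, c, V, T_body=37.0):
        """Lumped-capacitance thermal history of a tumour under a constant
        heat input P [W] with linearized convective/perfusion losses h*A.

        Solves rho*c*V * dT/dt = P - h*A*(T - T_body), with T(0) = T_body."""
        tau = rho * c * V / (h * A)       # thermal time constant [s]
        T_inf = T_body + P / (h * A)      # steady-state temperature [deg C]
        return T_inf + (T_body - T_inf) * math.exp(-t / tau)

    # Illustrative (assumed) numbers for a ~1 cm spherical tumour.
    r = 0.005                             # tumour radius [m]
    V = 4.0 / 3.0 * math.pi * r ** 3
    A = 4.0 * math.pi * r ** 2
    for t in (0, 60, 300, 900):           # seconds into the therapy
        T = tumour_temperature(t, P=0.15, h=50.0, A=A, rho=1050.0, c=3600.0, V=V)
        print(f"t = {t:4d} s -> T = {T:.2f} C")
    ```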

  6. Effect of visual field presentation on action planning (estimating reach) in children.

    PubMed

    Gabbard, Carl; Cordova, Alberto

    2012-01-01

    In this article, the authors examined the effects of target information presented in different visual fields (lower, upper, central) on estimates of reach via motor imagery in children (5-11 years old) and young adults. Results indicated an advantage for estimating reach movements toward targets placed in the lower visual field (LoVF), with all groups having greater difficulty in the upper visual field (UpVF) condition, especially 5- and 7-year-olds. Complementing these results was an overall age-related increase in accuracy. Based in part on the equivalence hypothesis, which holds that motor imagery and motor planning and execution are similar, the findings support previous work on executed behaviors showing a LoVF bias for motor skill actions of the hand. Given that previous research hints that the UpVF may be biased toward visuospatial (perceptual) qualities, research in that area and its association with visuomotor processing (LoVF) should be considered.

  7. Time-varying effect moderation using the structural nested mean model: estimation using inverse-weighted regression with residuals

    PubMed Central

    Almirall, Daniel; Griffin, Beth Ann; McCaffrey, Daniel F.; Ramchand, Rajeev; Yuen, Robert A.; Murphy, Susan A.

    2014-01-01

    This article considers the problem of examining time-varying causal effect moderation using observational, longitudinal data in which treatment, candidate moderators, and possible confounders are time varying. The structural nested mean model (SNMM) is used to specify the moderated time-varying causal effects of interest in a conditional mean model for a continuous response given time-varying treatments and moderators. We present an easy-to-use estimator of the SNMM that combines an existing regression-with-residuals (RR) approach with an inverse-probability-of-treatment weighting (IPTW) strategy. The RR approach has been shown to identify the moderated time-varying causal effects if the time-varying moderators are also the sole time-varying confounders. The proposed IPTW+RR approach provides estimators of the moderated time-varying causal effects in the SNMM in the presence of an additional, auxiliary set of known and measured time-varying confounders. We use a small simulation experiment to compare IPTW+RR versus the traditional regression approach and to compare small and large sample properties of asymptotic versus bootstrap estimators of the standard errors for the IPTW+RR approach. This article clarifies the distinction between time-varying moderators and time-varying confounders. We illustrate the methodology in a case study to assess if time-varying substance use moderates treatment effects on future substance use. PMID:23873437
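
    As a hedged, single-decision-point illustration of the weighting ingredient only (the article's estimator additionally uses regression with residuals and handles time-varying treatments and moderators), the sketch below shows stabilized inverse-probability-of-treatment weights making a weighted regression recover a moderated treatment effect. The data-generating model and all coefficients are invented.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 100_000

    # Single-decision-point simulation: X = auxiliary confounder,
    # S = candidate moderator, A = treatment, Y = outcome.
    X = rng.normal(size=n)
    S = 0.5 * X + rng.normal(size=n)
    pA = 1 / (1 + np.exp(-(0.4 * X + 0.4 * S)))
    A = rng.binomial(1, pA)
    Y = 1 + 0.5 * S + 0.7 * X + A * (1.0 + 0.8 * S) + rng.normal(size=n)

    # Stabilized IPT weights: numerator conditions on the moderator only,
    # denominator also on the auxiliary confounder X.
    den = sm.Logit(A, sm.add_constant(np.column_stack([S, X]))).fit(disp=0).predict()
    num = sm.Logit(A, sm.add_constant(S)).fit(disp=0).predict()
    w = np.where(A == 1, num / den, (1 - num) / (1 - den))

    # In the weighted pseudo-population A is unconfounded given S, so a
    # weighted regression recovers the moderated effect (A and A*S terms).
    design = sm.add_constant(np.column_stack([S, A, A * S]))
    print(sm.WLS(Y, design, weights=w).fit().params)  # A ~= 1.0, A*S ~= 0.8
    ```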

  8. Modified ADALINE algorithm for harmonic estimation and selective harmonic elimination in inverters

    NASA Astrophysics Data System (ADS)

    Vasumathi, B.; Moorthi, S.

    2011-11-01

    In digital signal processing, algorithms for the estimation of harmonic components are very well developed. In power electronic applications, an objective such as fast system response is of primary importance. An effective method for the estimation of instantaneous harmonic components, combined with a conventional harmonic elimination technique, is presented in this article. The primary function is to eliminate undesirable higher harmonic components from the selected signal (current or voltage), and it requires only knowledge of the frequency of the component to be eliminated. A signal processing technique using a modified ADALINE algorithm is proposed for harmonic estimation. The proposed method remains effective as it converges to a minimum error and produces a finer estimate. A conventional control based on pulse width modulation for selective harmonic elimination is used to eliminate harmonic components after their estimation. The method can be applied to a wide range of equipment. The validity of the proposed method for estimating and eliminating voltage harmonics is demonstrated with a dc/ac inverter as a simulation example. The results are then compared with the existing ADALINE algorithm to illustrate its effectiveness.
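
    The "modified" update rule is not specified in the abstract; the sketch below implements the classic ADALINE (Widrow-Hoff LMS) harmonic estimator - the baseline the authors compare against - on a synthetic distorted waveform, with all signal parameters assumed.

    ```python
    import numpy as np

    fs, f0 = 10_000.0, 50.0          # sampling rate and fundamental [Hz] (assumed)
    t = np.arange(0, 0.2, 1 / fs)

    # Distorted inverter signal: fundamental plus 5th and 7th harmonics.
    y = (10.0 * np.sin(2 * np.pi * f0 * t)
         + 2.0 * np.sin(2 * np.pi * 5 * f0 * t + 0.3)
         + 1.0 * np.sin(2 * np.pi * 7 * f0 * t))

    harmonics = [1, 5, 7]
    # ADALINE inputs: a sine/cosine pair per tracked harmonic.
    X = np.column_stack([g(2 * np.pi * k * f0 * t)
                         for k in harmonics for g in (np.sin, np.cos)])

    w = np.zeros(X.shape[1])
    mu = 0.01                        # LMS learning rate (assumed)
    for x_n, y_n in zip(X, y):       # one weight update per sample
        e = y_n - x_n @ w            # instantaneous estimation error
        w += 2 * mu * e * x_n        # Widrow-Hoff (ADALINE) update

    for i, k in enumerate(harmonics):
        amp = np.hypot(w[2 * i], w[2 * i + 1])
        print(f"harmonic {k}: estimated amplitude ~ {amp:.2f}")
    ```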

  9. Sequential estimation and satellite data assimilation in meteorology and oceanography

    NASA Technical Reports Server (NTRS)

    Ghil, M.

    1986-01-01

    The central theme of this review article is the role that dynamics plays in estimating the state of the atmosphere and of the ocean from incomplete and noisy data. Objective analysis and inverse methods represent an attempt at relying mostly on the data and minimizing the role of dynamics in the estimation. Four-dimensional data assimilation tries to balance properly the roles of dynamical and observational information. Sequential estimation is presented as the proper framework for understanding this balance, and the Kalman filter as the ideal, optimal procedure for data assimilation. The optimal filter computes forecast error covariances of a given atmospheric or oceanic model exactly, and hence data assimilation should be closely connected with predictability studies. This connection is described, and consequences drawn for currently active areas of the atmospheric and oceanic sciences, namely, mesoscale meteorology, medium and long-range forecasting, and upper-ocean dynamics.
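
    A scalar toy version of the sequential estimation cycle described here (forecast with the model, then blend with the observation via the Kalman gain). Real atmospheric and oceanic systems differ mainly in the enormous state dimension and the cost of propagating the forecast error covariance.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy linear "forecast model" x_{k+1} = M x_k + noise, observed with noise.
    M, Q, R = 0.95, 0.02, 0.5          # dynamics, model-error and obs-error variances
    truth, obs = [1.0], []
    for _ in range(50):
        truth.append(M * truth[-1] + rng.normal(0, np.sqrt(Q)))
        obs.append(truth[-1] + rng.normal(0, np.sqrt(R)))

    x, P = 0.0, 1.0                    # initial state estimate and its variance
    for y in obs:
        # Forecast step: propagate the state and its error covariance.
        x, P = M * x, M * P * M + Q
        # Analysis step: optimally blend forecast and observation.
        K = P / (P + R)                # Kalman gain
        x = x + K * (y - x)
        P = (1 - K) * P

    print(f"final analysis {x:.3f}, truth {truth[-1]:.3f}, error variance {P:.3f}")
    ```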

  10. Estimation of electromagnetic dosimetric values from non-ionizing radiofrequency fields in an indoor commercial airplane environment.

    PubMed

    Aguirre, Erik; Arpón, Javier; Azpilicueta, Leire; López, Peio; de Miguel, Silvia; Ramos, Victoria; Falcone, Francisco

    2014-12-01

    In this article, the impact of the topology as well as the morphology of a complex indoor environment, such as a commercial aircraft, on dosimetric assessment is presented. By means of an in-house developed deterministic 3D ray-launching code, an estimate of electric field amplitude as a function of position is obtained for the complete volume of a commercial passenger airplane. Estimation of electromagnetic field exposure in this environment is challenging owing to the complexity and size of the scenario, as well as to the large metallic content, which gives rise to strong multipath components. By performing the calculation with a deterministic technique, the complete scenario can be considered with an optimized balance between accuracy and computational cost. The proposed method can aid in the assessment of electromagnetic dosimetry in the future deployment of embarked wireless systems in commercial aircraft.

  11. Towards a smart non-invasive fluid loss measurement system.

    PubMed

    Suryadevara, N K; Mukhopadhyay, S C; Barrack, L

    2015-04-01

    In this article, a smart non-invasive wireless sensing system for estimating the amount of fluid a person loses during physical activity is presented. The system measures three external body parameters: heart rate, galvanic skin response (GSR, or skin conductance), and skin temperature. These three parameters are entered into an empirically derived formula along with the user's body mass index, and an estimate of the amount of fluid lost is obtained. The core benefit of the developed system is the ease with which it can be combined with smart home monitoring systems to care for elderly people in ambient assisted living environments, as well as used in automobiles to monitor the body parameters of a motorist.

  12. Choosing the best index for the average score intraclass correlation coefficient.

    PubMed

    Shieh, Gwowen

    2016-09-01

    The intraclass correlation coefficient ICC(2) from a one-way random effects model is widely used to describe the reliability of mean ratings in behavioral, educational, and psychological research. Despite its apparent utility, the essential property of ICC(2) as a point estimator of the average score intraclass correlation coefficient is seldom examined. This article considers several potential measures and compares their performance with ICC(2). Analytical derivations and numerical examinations are presented to assess the bias and mean square error of the alternative estimators. The results suggest that more advantageous indices can be recommended over ICC(2) for their theoretical implications and computational ease.
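
    For concreteness, ICC(2) as described here (one-way random effects; reliability of the k-rater mean) can be computed from the ANOVA mean squares. The simulated ratings below are illustrative only.

    ```python
    import numpy as np

    def icc2_one_way(ratings):
        """ICC(2): reliability of the k-rater average under a one-way
        random effects model. `ratings` is an n-targets x k-raters array."""
        n, k = ratings.shape
        grand = ratings.mean()
        row_means = ratings.mean(axis=1)
        msb = k * ((row_means - grand) ** 2).sum() / (n - 1)               # between targets
        msw = ((ratings - row_means[:, None]) ** 2).sum() / (n * (k - 1))  # within targets
        return (msb - msw) / msb

    rng = np.random.default_rng(0)
    n, k = 30, 4
    targets = rng.normal(0, 1, size=(n, 1))            # true target effects
    data = targets + rng.normal(0, 0.8, size=(n, k))   # plus rater error
    print(round(icc2_one_way(data), 3))                # ~0.86 for these variances
    ```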

  13. A Web-based interface to calculate phonotactic probability for words and nonwords in English

    PubMed Central

    VITEVITCH, MICHAEL S.; LUCE, PAUL A.

    2008-01-01

    Phonotactic probability refers to the frequency with which phonological segments and sequences of phonological segments occur in words in a given language. We describe one method of estimating phonotactic probabilities based on words in American English. These estimates of phonotactic probability have been used in a number of previous studies and are now being made available to other researchers via a Web-based interface. Instructions for using the interface, as well as details regarding how the measures were derived, are provided in the present article. The Phonotactic Probability Calculator can be accessed at http://www.people.ku.edu/~mvitevit/PhonoProbHome.html. PMID:15641436
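
    The calculator's actual measures are log-frequency-weighted positional segment and biphone probabilities; the toy sketch below shows only the unweighted positional-segment idea on an invented mini-lexicon, to fix the concept.

    ```python
    from collections import defaultdict

    # Toy lexicon of phoneme-coded words (a real estimate would use a large
    # dictionary weighted by word frequency; this sketch uses raw counts).
    lexicon = ["k.ae.t", "k.ae.b", "b.ae.t", "t.ae.k", "k.ih.t"]

    pos_counts = defaultdict(int)   # (position, segment) -> count
    pos_totals = defaultdict(int)   # position -> number of words reaching it

    for word in lexicon:
        for i, seg in enumerate(word.split(".")):
            pos_counts[(i, seg)] += 1
            pos_totals[i] += 1

    def positional_probability(word):
        """Per-position segment probabilities for a word or nonword."""
        return [pos_counts[(i, s)] / pos_totals[i]
                for i, s in enumerate(word.split("."))]

    print(positional_probability("k.ae.t"))   # [0.6, 0.8, 0.6] for this lexicon
    ```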

  14. A class of Box-Cox transformation models for recurrent event data.

    PubMed

    Sun, Liuquan; Tong, Xingwei; Zhou, Xian

    2011-04-01

    In this article, we propose a class of Box-Cox transformation models for recurrent event data, which includes the proportional means models as special cases. The new model offers great flexibility in formulating the effects of covariates on the mean functions of counting processes while leaving the stochastic structure completely unspecified. For inference on the proposed models, we apply a profile pseudo-partial likelihood method to estimate the model parameters via estimating equation approaches, establish large sample properties of the estimators, and examine their performance in moderate-sized samples through simulation studies. In addition, some graphical and numerical procedures are presented for model checking. An application to a set of multiple-infection data taken from a clinical study on chronic granulomatous disease (CGD) is also illustrated.

  15. [Pedophilia. Prevalence, etiology, and diagnostics].

    PubMed

    Mokros, A; Osterheider, M; Nitschke, J

    2012-03-01

    Pedophilia is a disorder of sexual preference that increases the risk for committing sexual offenses against children. Consequently, pedophilia is not only relevant in psychiatric therapy and prognostics, but also greatly influences the public attitude towards criminality. Public opinion seems to equate pedophilia with child sexual abuse and vice versa which leads to stigmatization of patients and may impede treatment. The present paper provides information on recent studies on the potential origins of the disorder and introduces new diagnostic methods. Moreover, the article presents estimates on the prevalence of pedophilic sexual interest.

  16. The costs of nurse turnover, part 2: application of the Nursing Turnover Cost Calculation Methodology.

    PubMed

    Jones, Cheryl Bland

    2005-01-01

    This is the second article in a 2-part series focusing on nurse turnover and its costs. Part 1 (December 2004) described nurse turnover costs within the context of human capital theory, and using human resource accounting methods, presented the updated Nursing Turnover Cost Calculation Methodology. Part 2 presents an application of this method in an acute care setting and the estimated costs of nurse turnover that were derived. Administrators and researchers can use these methods and cost information to build a business case for nurse retention.

  17. National Health Expenditures, 1996

    PubMed Central

    Levit, Katharine R.; Lazenby, Helen C.; Braden, Bradley R.; Cowan, Cathy A.; Sensenig, Arthur L.; McDonnell, Patricia A.; Stiller, Jean M.; Won, Darleen K.; Martin, Anne B.; Sivarajan, Lekha; Donham, Carolyn S.; Long, Anna M.; Stewart, Madie W.

    1997-01-01

    The national health expenditures (NHE) series presented in this report for 1960-96 provides a view of the economic history of health care in the United States through spending for health care services and the sources financing that care. In 1996 NHE topped $1 trillion. At the same time, spending grew at the slowest rate, 4.4 percent, ever recorded in the current series. For the first time, this article presents estimates of Medicare managed care payments by type of service, as well as nursing home and home health spending in hospital-based facilities. PMID:10179997

  18. Reduced-rank technique for joint channel estimation in TD-SCDMA systems

    NASA Astrophysics Data System (ADS)

    Kamil Marzook, Ali; Ismail, Alyani; Mohd Ali, Borhanuddin; Sali, Adawati; Khatun, Sabira

    2013-02-01

    In time division-synchronous code division multiple access systems, increasing system capacity by inserting the largest possible number of users into one time slot (TS) requires additional estimation processes to estimate the joint channel matrix for the whole system. The increase in the number of channel parameters due to the increase in the number of users in one TS directly affects the precision of the estimator's performance. This article presents a novel channel estimation method with low complexity, which relies on reducing the rank order of the total channel matrix H. The proposed method exploits the rank deficiency of H to reduce the number of parameters that characterise this matrix. The adopted reduced-rank technique is based on a truncated singular value decomposition algorithm. The algorithms for reduced-rank joint channel estimation (JCE) are derived and compared against traditional full-rank JCEs: least squares (LS, or Steiner) and enhanced (LS or MMSE) algorithms. Simulation results for the normalised mean square error showed the superiority of the reduced-rank estimators. In addition, the channel impulse responses found by the reduced-rank estimator for all active users offer considerable performance improvement over the conventional estimator across the channel window length.
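
    A generic truncated-SVD least-squares sketch (not the article's exact JCE, and real-valued where baseband channel matrices would be complex), showing why discarding near-zero singular values helps when the system matrix is nearly rank-deficient. The dimensions, noise level, and rank deficiency are invented.

    ```python
    import numpy as np

    def reduced_rank_ls(A, y, rank):
        """Least-squares estimate of h in y = A h + n, with the SVD of A
        truncated to `rank` terms (real-valued stand-in for a complex H)."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return Vt[:rank].T @ ((U[:, :rank].T @ y) / s[:rank])

    rng = np.random.default_rng(0)
    m, p = 64, 32
    A = rng.normal(size=(m, p)) @ rng.normal(size=(p, p))
    A[:, -8:] = A[:, :8] + 1e-3 * rng.normal(size=(m, 8))  # nearly rank-deficient
    h = rng.normal(size=p)
    y = A @ h + 0.5 * rng.normal(size=m)

    for r in (p, p - 8):
        err = np.linalg.norm(reduced_rank_ls(A, y, r) - h)
        print(f"rank {r}: estimation error {err:.2f}")   # full rank blows up
    ```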

  19. An Anisotropic A posteriori Error Estimator for CFD

    NASA Astrophysics Data System (ADS)

    Feijóo, Raúl A.; Padra, Claudio; Quintana, Fernando

    In this article, a robust anisotropic adaptive algorithm is presented for solving compressible-flow equations using a stabilized CFD solver and automatic mesh generators. The association includes a mesh generator, a flow solver, and an a posteriori error-estimator code. The estimator was selected among several available choices (Almeida et al. (2000). Comput. Methods Appl. Mech. Engng, 182, 379-400; Borges et al. (1998). "Computational mechanics: new trends and applications". Proceedings of the 4th World Congress on Computational Mechanics, Bs.As., Argentina), giving a powerful computational tool. The main aim is to capture solution discontinuities, in this case shocks, using the least amount of computational resources, i.e. elements, compatible with a solution of good quality. This leads to high aspect-ratio elements (stretching). To achieve this, a directional error estimator was specifically selected. The numerical results show good behavior of the error estimator, resulting in strongly adapted meshes in a few steps, typically three or four iterations, enough to capture shocks using a moderate and well-distributed number of elements.

  20. A Bayesian Approach to More Stable Estimates of Group-Level Effects in Contextual Studies.

    PubMed

    Zitzmann, Steffen; Lüdtke, Oliver; Robitzsch, Alexander

    2015-01-01

    Multilevel analyses are often used to estimate the effects of group-level constructs. However, when using aggregated individual data (e.g., student ratings) to assess a group-level construct (e.g., classroom climate), the observed group mean might not provide a reliable measure of the unobserved latent group mean. In the present article, we propose a Bayesian approach that can be used to estimate a multilevel latent covariate model, which corrects for the unreliable assessment of the latent group mean when estimating the group-level effect. A simulation study was conducted to evaluate the choice of different priors for the group-level variance of the predictor variable and to compare the Bayesian approach with the maximum likelihood approach implemented in the software Mplus. Results showed that, under problematic conditions (i.e., small number of groups, predictor variable with a small ICC), the Bayesian approach produced more accurate estimates of the group-level effect than the maximum likelihood approach did.

  1. The truly remarkable universality of half a standard deviation: confirmation through another look.

    PubMed

    Norman, Geoffrey R; Sloan, Jeff A; Wyrwich, Kathleen W

    2004-10-01

    In this issue of Expert Review of Pharmacoeconomics and Outcomes Research, Farivar, Liu, and Hays present their findings in 'Another look at the half standard deviation estimate of the minimally important difference in health-related quality of life scores' (hereafter referred to as 'Another look'). These researchers have re-examined the May 2003 Medical Care article 'Interpretation of changes in health-related quality of life: the remarkable universality of half a standard deviation' (hereafter referred to as 'Remarkable') in the hope of supporting their hypothesis that the minimally important difference in health-related quality of life measures is undoubtedly closer to 0.3 standard deviations than 0.5. Nonetheless, despite their extensive wranglings - the exclusion of many articles that we included in our review, the inclusion of articles that we did not include, and the recalculation of effect sizes using the absolute value of the mean differences - in our opinion, the results of the 'Another look' article confirm the findings of the 'Remarkable' paper.

  2. A record of large earthquakes on the southern Hayward fault for the past 1800 years

    USGS Publications Warehouse

    Lienkaemper, J.J.; Williams, P.L.

    2007-01-01

    This is the second article presenting evidence of the occurrence and timing of paleoearthquakes on the southern Hayward fault as interpreted from trenches excavated within a sag pond at the Tyson's Lagoon site in Fremont, California. We use the information to estimate the mean value and aperiodicity of the fault's recurrence interval (RI): two fundamental parameters for estimation of regional seismic hazard. An earlier article documented the four most recent earthquakes, including the historic 1868 earthquake. In this article we present evidence for at least seven earlier paleoruptures since about A.D. 170. We document these events with evidence for ground rupture, such as the presence of blocky colluvium at the base of the main trace fault scarp, and by corroborating evidence such as simultaneous liquefaction or an increase in deformation immediately below event horizons. The mean RI is 170 ± 82 yr (1σ, standard deviation of the sample), aperiodicity is 0.48, and individual intervals may be expected to range from 30 to 370 yr (95.4% confidence). The mean RI is consistent with the recurrence model of the Working Group on California Earthquake Probabilities (2003) (mean, 161 yr; range, 99 yr [2.5%] to 283 yr [97.5%]). We note that the mean RI for the five most recent events may have been only 138 ± 58 yr (1σ). Hypothesis tests for the shorter RI do not demonstrate that any recent acceleration has occurred compared to the earlier period or the entire 1800-yr record, principally because of inherent uncertainties in the event ages.
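
    For readers unfamiliar with the terms, the mean RI and the aperiodicity (the coefficient of variation of the intervals) follow directly from the event dates. The dates below are invented placeholders, not the trench chronology.

    ```python
    import numpy as np

    # Illustrative paleoearthquake dates (years A.D.); not the Tyson's Lagoon data.
    event_years = np.array([170, 390, 560, 690, 875, 1010, 1150,
                            1300, 1470, 1630, 1725, 1868])

    intervals = np.diff(event_years)
    mean_ri = intervals.mean()
    sigma = intervals.std(ddof=1)        # 1-sigma sample standard deviation
    aperiodicity = sigma / mean_ri       # coefficient of variation of the RIs

    print(f"mean RI = {mean_ri:.0f} yr, sigma = {sigma:.0f} yr, "
          f"aperiodicity = {aperiodicity:.2f}")
    ```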

  3. Assigning value to energy storage systems at multiple points in an electrical grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balducci, Patrick J.; Alam, M. Jan E.; Hardy, Trevor D.

    This article presents a taxonomy for assigning benefits to the services provided by energy storage systems (ESSs), defines approaches for monetizing the value associated with these services, assigns values to major ESS applications by region based on a review of an extensive set of literature, and summarizes and evaluates the capabilities of several tools currently used to estimate value for specific ESS deployments.

  4. Assigning value to energy storage systems at multiple points in an electrical grid

    DOE PAGES

    Balducci, Patrick J.; Alam, M. Jan E.; Hardy, Trevor D.; ...

    2018-01-01

    This article presents a taxonomy for assigning benefits to the services provided by energy storage systems (ESSs), defines approaches for monetizing the value associated with these services, assigns values to major ESS applications by region based on a review of an extensive set of literature, and summarizes and evaluates the capabilities of several tools currently used to estimate value for specific ESS deployments.

  5. Forest resources, government policy, and investment location decisions of the forest products industry in the southern United States

    Treesearch

    Changyou Sun; Daowei Zhang

    2010-01-01

    In this article, the results of an initial attempt to estimate the effects of state attributes on plant location and investment expenditure are presented for the forest products industry in the southern United States. A conditional logit model was used to analyze new plant births, and a time-series cross-section model was used to assess total capital expenditure.

  6. Preparedness for radiological emergency situations in Austria.

    PubMed

    Ditto, Manfred

    2012-02-01

    This article presents the Austrian system of emergency preparedness for nuclear and radiological emergency situations. It describes, in particular, the legal basis; the roles and competencies of the competent authorities; international and bilateral conventions on early notification of nuclear accidents; the Austrian emergency plans; the Austrian radiation monitoring system; the prognosis and decision support systems in operation; and the results of an estimation of the possible impacts of nuclear power plant disasters on Austria.

  7. Mineral resource of the month: feldspar

    USGS Publications Warehouse

    ,

    2011-01-01

    The article focuses on feldspar, a mineral containing potassium, sodium, or a combination of the two, and its various applications. According to estimates by scientists, the mineral makes up about 60 percent of the Earth's crust and is commonly used for making glass and ceramics. Global mining of feldspar was about 20 million metric tons in 2010, with Italy, Turkey, and China mining 55 percent of the feldspar worldwide.

  8. Numerical research of the swirling supersonic gas flows in the self-vacuuming vortex tube

    NASA Astrophysics Data System (ADS)

    Volov, V. T.; Lyaskin, A. S.

    2018-03-01

    This article presents the results of simulations of a special type of vortex tube - the self-vacuuming vortex tube (SVVT) - for which extreme values of temperature separation and vacuum are realized. The main results of this study are the flow structure in the SVVT and estimates of the energy losses due to oblique shock waves, gas friction, sudden expansion, and the organization of vortex bundles in the SVVT.

  9. Mobile Phone Surveys for Collecting Population-Level Estimates in Low- and Middle-Income Countries: A Literature Review

    PubMed Central

    Pereira, Amanda; Farrenkopf, Brooke A; Labrique, Alain B; Pariyo, George W; Hyder, Adnan A

    2017-01-01

    Background: National and subnational level surveys are important for monitoring disease burden, prioritizing resource allocation, and evaluating public health policies. As mobile phone access and ownership become more common globally, mobile phone surveys (MPSs) offer an opportunity to supplement traditional public health household surveys. Objective: The objective of this study was to systematically review the current landscape of MPSs to collect population-level estimates in low- and middle-income countries (LMICs). Methods: Primary and gray literature from 7 online databases were systematically searched for studies that deployed MPSs to collect population-level estimates. Titles and abstracts were screened on primary inclusion and exclusion criteria by two research assistants. Articles that met primary screening requirements were read in full and screened for secondary eligibility criteria. Articles included in review were grouped into the following three categories by their survey modality: (1) interactive voice response (IVR), (2) short message service (SMS), and (3) human operator or computer-assisted telephone interviews (CATI). Data were abstracted by two research assistants. The conduct and reporting of the review conformed to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Results: A total of 6625 articles were identified through the literature review. Overall, 11 articles were identified that contained 19 MPS (CATI, IVR, or SMS) surveys to collect population-level estimates across a range of topics. MPSs were used in Latin America (n=8), the Middle East (n=1), South Asia (n=2), and sub-Saharan Africa (n=8). Nine articles presented results for 10 CATI surveys (10/19, 53%). Two articles discussed the findings of 6 IVR surveys (6/19, 32%). Three SMS surveys were identified from 2 articles (3/19, 16%). Approximately 63% (12/19) of MPS were delivered to mobile phone numbers collected from previously administered household surveys. The majority of MPS (11/19, 58%) were panel surveys where a cohort of participants, who often were provided a mobile phone upon a face-to-face enrollment, were surveyed multiple times. Conclusions: Very few reports of population-level MPS were identified. Of the MPS that were identified, the majority of surveys were conducted using CATI. Due to the limited number of identified IVR and SMS surveys, the relative advantages and disadvantages among the three survey modalities cannot be adequately assessed. The majority of MPS were sent to mobile phone numbers that were collected from a previously administered household survey. There is limited evidence on whether a random digit dialing (RDD) approach or a simple random sample of mobile network provided list of numbers can produce a population representative survey. PMID:28476725

  10. Estimating the size of key populations at higher risk of HIV infection: a summary of experiences and lessons presented during a technical meeting on size estimation among key populations in Asian countries

    PubMed Central

    Calleja, Jesus Maria Garcia; Zhao, Jinkou; Reddy, Amala; Seguy, Nicole

    2014-01-01

    Problem: Size estimates of key populations at higher risk of HIV exposure are recognized as critical for understanding the trajectory of the HIV epidemic and planning and monitoring an effective response, especially for countries with concentrated and low epidemics such as those in Asia. Context: To help countries estimate population sizes of key populations, global guidelines were updated in 2011 to reflect new technical developments and recent field experiences in applying these methods. Action: In September 2013, a meeting of programme managers and experts experienced with population size estimates (PSE) for key populations was held for 13 Asian countries. This article summarizes the key results presented, shares practical lessons learnt and reviews the methodological approaches from implementing PSE in 13 countries. Lessons learnt: It is important to build capacity to collect, analyse and use PSE data; establish a technical review group; and implement a transparent, well documented process. Countries should adapt global PSE guidelines and maintain operational definitions that are more relevant and useable for country programmes. Development of methods for non-venue-based key populations requires more investment and collaborative efforts between countries and among partners. PMID:25320676

  11. [Afterschool physical activity programs: Literature review].

    PubMed

    Reloba-Martínez, Sergio; Martín-Tamayo, Ignacio; Martínez-López, Emilio José; Guerrero-Almeida, Laura

    2015-01-01

    The purpose of this review was to analyze the scientific production on extra-curricular physical activity (PA) in Western children aged 6-12 years. Medline/PubMed, Scopus, and Google Scholar were used. The search covered articles published between January 1990 and May 2013. A total of 104 publications were analyzed. Body composition parameters are the ones most used to assess the results of the studies, followed by those that estimate maximal aerobic capacity. Intervention articles present very heterogeneous methodological features, but there are clear trends in the use of certain measures. As for the reviews, most are systematic and include meta-analyses. In these studies, body mass index (BMI) is the most used parameter.

  12. Growth and development issues in adolescents with ostomies: a primer for the WOC nurses.

    PubMed

    Mohr, Lynn D

    2012-01-01

    Caring for the adolescent (13-18 years of age) with an ostomy presents multiple challenges. The purpose of this article is to provide strategies to assist the WOC nurse in minimizing the potential impact of an ostomy on growth and development in this age group. This is relevant to the WOC nurse because it is estimated that between 6% and 14% of all adolescents have symptoms of irritable bowel disease, and many will require an ostomy; thus the WOC nurse will be called upon to provide care to this age group. The article discusses normal adolescent growth and development and provides strategies to support it.

  13. COSMIC monthly progress report

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Activities of the Computer Software Management and Information Center (COSMIC) are summarized for the month of April 1994. Tables showing the current inventory of programs available from COSMIC are presented and program processing and evaluation activities are summarized. Five articles were prepared for publication in the NASA Tech Brief Journal. These articles (included in this report) describe the following software items: GAP 1.0 - Groove Analysis Program, Version 1.0; SUBTRANS - Subband/Transform MATLAB Functions for Image Processing; CSDM - COLD-SAT Dynamic Model; CASRE - Computer Aided Software Reliability Estimation; and XOPPS - OEL Project Planner/Scheduler Tool. Activities in the areas of marketing, customer service, benefits identification, maintenance and support, and disseminations are also described along with a budget summary.

  14. The P Value Problem in Otolaryngology: Shifting to Effect Sizes and Confidence Intervals.

    PubMed

    Vila, Peter M; Townsend, Melanie Elizabeth; Bhatt, Neel K; Kao, W Katherine; Sinha, Parul; Neely, J Gail

    2017-06-01

    Effect sizes and confidence intervals are underreported in the current biomedical literature. The objective of this article is to discuss the recent paradigm shift encouraging the reporting of effect sizes and confidence intervals. Whereas P values inform us about whether an effect is distinguishable from chance, effect sizes inform us about the magnitude of the effect (clinical significance), and confidence intervals inform us about the range of plausible estimates for the general population mean (precision). Reporting effect sizes and confidence intervals is a necessary addition to the biomedical literature, and these concepts are reviewed in this article.
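
    A minimal sketch of the quantities the article advocates, computed on simulated group scores: Cohen's d for magnitude and a t-based 95% confidence interval for precision. The group sizes, means, and standard deviations are illustrative.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    a = rng.normal(5.0, 2.0, size=40)   # e.g., treatment group scores
    b = rng.normal(4.0, 2.0, size=40)   # e.g., control group scores

    # Cohen's d: mean difference scaled by the pooled standard deviation.
    n1, n2 = len(a), len(b)
    sp = np.sqrt(((n1 - 1) * a.var(ddof=1) + (n2 - 1) * b.var(ddof=1)) / (n1 + n2 - 2))
    d = (a.mean() - b.mean()) / sp

    # 95% CI for the raw mean difference (precision, not just P < .05).
    se = sp * np.sqrt(1 / n1 + 1 / n2)
    tcrit = stats.t.ppf(0.975, n1 + n2 - 2)
    diff = a.mean() - b.mean()
    print(f"d = {d:.2f}, mean difference = {diff:.2f} "
          f"(95% CI {diff - tcrit * se:.2f} to {diff + tcrit * se:.2f})")
    ```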

  15. September 11, 2001: then and now.

    PubMed

    Jameson, John R

    2002-01-01

    This article, written by a historian, uses the sequential questioning technique to present a selected historical and statistical overview of the tragic events of September 11, 2001, including: the hijackings; the suicide attacks in New York, Washington, D.C., and Pennsylvania; background on Osama bin Laden and al-Qaeda; rescue and recovery efforts; and a brief discussion of how the horrors of the day continue to affect the American people a year later. Especially sobering are the dollar costs of the attacks and the projected expenses of U.S. efforts to control the spread of international terrorism (estimated at $640 billion, just through fiscal year 2003). Throughout, the article draws on the experiences of the victims, the rescuers, and the survivors.

  16. Dental radiographic indicators, a key to age estimation

    PubMed Central

    Panchbhai, AS

    2011-01-01

    Objective: The present review article is aimed at describing the radiological methods utilized for human age identification. Methods: The application and importance of radiological methods in human age assessment were discussed through a literature survey. Results: Following a literature search, 46 articles were included in the study, and the relevant information is summarized in the article. Dental tissue is often preserved indefinitely after death. Implementation of radiography is based on the assessment of the extent of calcification of teeth and in turn the degree of formation of crown and root structures, along with the sequence and the stages of eruption. Several radiological techniques can be used to assist in both individual and general identification, including determination of gender, ethnic group and age. The radiographic method is a simpler and cheaper method of age identification compared with histological and biochemical methods. Radiographic and tomographic images have become an essential aid for human identification in forensic dentistry, particularly with the refinement of techniques and the incorporation of information technology resources. Conclusion: Based on an appropriate knowledge of the available methods, forensic dentists can choose the most appropriate one, since the validity of age estimation crucially depends on the method used and its proper application. A multifactorial approach will lead to optimum age assessment. The legal requirements also have to be considered. PMID:21493876

  17. Sojourning with the Homogeneous Poisson Process.

    PubMed

    Liu, Piaomu; Peña, Edsel A

    2016-01-01

    In this pedagogical article, distributional properties, some surprising, pertaining to the homogeneous Poisson process (HPP), when observed over a possibly random window, are presented. Properties of the gap-time that covered the termination time and the correlations among gap-times of the observed events are obtained. Inference procedures, such as estimation and model validation, based on event occurrence data over the observation window, are also presented. We envision that through the results in this paper, a better appreciation of the subtleties involved in the modeling and analysis of recurrent events data will ensue, since the HPP is arguably one of the simplest among recurrent event models. In addition, the use of the theorem of total probability, Bayes theorem, the iterated rules of expectation, variance and covariance, and the renewal equation could be illustrative when teaching distribution theory, mathematical statistics, and stochastic processes at both the undergraduate and graduate levels. This article is targeted towards both instructors and students.
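
    A minimal sketch of the setting: simulate an HPP on a fixed window, estimate the rate by its maximum likelihood estimator N(tau)/tau, and note the gap that covers the end of the window. The rate and window are arbitrary illustrative choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    lam, tau = 2.0, 10.0                 # true rate; observation window [0, tau]

    # Simulate one HPP path: exponential gap times accumulated until tau.
    gaps, t = [], 0.0
    while True:
        g = rng.exponential(1 / lam)
        if t + g > tau:
            break
        gaps.append(g)
        t += g

    n = len(gaps)
    print(f"events observed: {n}, MLE of rate: {n / tau:.2f}")
    # The inspection-paradox flavor the article exploits pedagogically: the gap
    # "covering" the end of the window is stochastically larger than a typical
    # gap; here it is only observed to exceed the censored residual tau - t.
    print(f"mean observed gap: {np.mean(gaps):.2f}, last gap > {tau - t:.2f}")
    ```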

  18. Three journal similarity metrics and their application to biomedical journals.

    PubMed

    D'Souza, Jennifer L; Smalheiser, Neil R

    2014-01-01

    In the present paper, we have created several novel journal similarity metrics. The MeSH odds ratio measures the topical similarity of any pair of journals, based on the major MeSH headings assigned to articles in MEDLINE. The second metric employed the 2009 Author-ity author name disambiguation dataset as a gold standard for estimating the author odds ratio. This gives a straightforward, intuitive answer to the question: Given two articles in PubMed that share the same author name (lastname, first initial), how does knowing only the identity of the journals (in which the articles were published) predict the relative likelihood that they are written by the same person vs. different persons? The article pair odds ratio detects the tendency of authors to publish repeatedly in the same journal, as well as in specific pairs of journals. The metrics can be applied not only to estimate the similarity of a pair of journals, but to provide novel profiles of individual journals as well. For example, for each journal, one can define the MeSH cloud as the number of other journals that are topically more similar to it than expected by chance, and the author cloud as the number of other journals that share more authors than expected by chance. These metrics for journal pairs and individual journals have been provided in the form of public datasets that can be readily studied and utilized by others.
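
    The exact contingency counts behind the MeSH odds ratio are not spelled out in the abstract; the sketch below shows one plausible 2x2 construction over a toy MeSH universe, purely to fix ideas, and is not the authors' definition.

    ```python
    def journal_pair_mesh_or(headings_a, headings_b, universe):
        """One plausible 2x2 odds-ratio construction for the topical similarity
        of two journals over a universe of MeSH headings: does carrying heading
        h in journal A raise the odds that journal B carries it too?"""
        a = len(headings_a & headings_b)      # headings used by both journals
        b = len(headings_a - headings_b)      # used by A only
        c = len(headings_b - headings_a)      # used by B only
        d = len(universe) - a - b - c         # used by neither
        return (a * d) / (b * c) if b and c else float("inf")

    universe = {f"D{i:06d}" for i in range(500)}   # toy MeSH descriptor ids
    u = sorted(universe)
    journal_a = set(u[:60])                        # headings seen in journal A
    journal_b = set(u[40:90])                      # headings seen in journal B
    print(round(journal_pair_mesh_or(journal_a, journal_b, universe), 2))  # ~6.8
    ```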

  19. Three Journal Similarity Metrics and Their Application to Biomedical Journals

    PubMed Central

    D′Souza, Jennifer L.; Smalheiser, Neil R.

    2014-01-01

    In the present paper, we have created several novel journal similarity metrics. The MeSH odds ratio measures the topical similarity of any pair of journals, based on the major MeSH headings assigned to articles in MEDLINE. The second metric employed the 2009 Author-ity author name disambiguation dataset as a gold standard for estimating the author odds ratio. This gives a straightforward, intuitive answer to the question: Given two articles in PubMed that share the same author name (lastname, first initial), how does knowing only the identity of the journals (in which the articles were published) predict the relative likelihood that they are written by the same person vs. different persons? The article pair odds ratio detects the tendency of authors to publish repeatedly in the same journal, as well as in specific pairs of journals. The metrics can be applied not only to estimate the similarity of a pair of journals, but to provide novel profiles of individual journals as well. For example, for each journal, one can define the MeSH cloud as the number of other journals that are topically more similar to it than expected by chance, and the author cloud as the number of other journals that share more authors than expected by chance. These metrics for journal pairs and individual journals have been provided in the form of public datasets that can be readily studied and utilized by others. PMID:25536326

  20. The trend of quality of publications in endodontic surgery: a 10-year systematic survey of the literature.

    PubMed

    Del Fabbro, Massimo; Corbella, Stefano; Tsesis, Igor; Taschieri, Silvio

    2015-03-01

    The aims of the present systematic literature analysis were to evaluate, over a 10-year period, the trend in the proportion of randomized controlled trials (RCT), systematic reviews (SR), and meta-analyses (MA) published on endodontic surgery, and to investigate whether the impact factor (IF) of the main endodontic journals correlates with the proportion of RCT, SR, and MA they publish. An electronic search for the RCT, SR, and MA published on the topic of endodontic surgery from 2001 to 2010 was performed on Medline and the Cochrane CENTRAL database using specific search terms combined with Boolean operators. Endodontic journal impact factors were retrieved from the Thomson Scientific database. The proportion of each study type among the total number of articles on endodontic surgery published per year was estimated, as was the correlation between the number of high-evidence-level studies published in the main endodontic journals and the IF of those journals per year. Of a total of 900 articles published in 2001-2010 on endodontic surgery, 114 were studies of high evidence level. A significant increase in the proportion of RCT, SR, and MA over the years was found. A modest to unclear correlation was found between journal IF and the number of high-evidence articles published. There is a positive trend over the years among researchers toward performing studies of good quality in endodontic surgery. The impact factor of endodontic journals is not consistently influenced by the publication of high-evidence-level articles.

  1. Axioms of adaptivity

    PubMed Central

    Carstensen, C.; Feischl, M.; Page, M.; Praetorius, D.

    2014-01-01

    This paper aims first at a simultaneous axiomatic presentation of the proof of optimal convergence rates for adaptive finite element methods and second at some refinements of particular questions like the avoidance of (discrete) lower bounds, inexact solvers, inhomogeneous boundary data, or the use of equivalent error estimators. Solely four axioms guarantee the optimality in terms of the error estimators. Compared to the state of the art in the contemporary literature, the improvements of this article can be summarized as follows: First, a general framework is presented which covers the existing literature on optimality of adaptive schemes. The abstract analysis covers linear as well as nonlinear problems and is independent of the underlying finite element or boundary element method. Second, efficiency of the error estimator is neither needed to prove convergence nor quasi-optimal convergence behavior of the error estimator. In this paper, efficiency exclusively characterizes the approximation classes involved in terms of the best-approximation error and data resolution and so the upper bound on the optimal marking parameters does not depend on the efficiency constant. Third, some general quasi-Galerkin orthogonality is not only sufficient, but also necessary for the R-linear convergence of the error estimator, which is a fundamental ingredient in the current quasi-optimality analysis due to Stevenson 2007. Finally, the general analysis allows for equivalent error estimators and inexact solvers as well as different non-homogeneous and mixed boundary conditions. PMID:25983390

  2. Redshift-Independent Distances in the NASA/IPAC Extragalactic Database Surpass 166,000 Estimates for 77,000 Galaxies

    NASA Astrophysics Data System (ADS)

    Steer, Ian

    2017-01-01

    Redshift-independent extragalactic distance estimates are used by researchers to establish the extragalactic distance scale, to underpin estimates of the Hubble constant, and to study peculiar velocities induced by gravitational attractions that perturb the motions of galaxies with respect to the “Hubble flow” of universal expansion. In 2006, the NASA/IPAC Extragalactic Database (NED) began providing users with a comprehensive tabulation of the redshift-independent extragalactic distance estimates published in the astronomical literature since 1980. A decade later, this compendium of distances (NED-D) surpassed 100,000 estimates for 28,000 galaxies, as reported in our recent journal article (Steer et al. 2016). Here, we are pleased to report NED-D has surpassed 166,000 distance estimates for 77,000 galaxies. Visualizations of the growth in data and of the statistical distributions of the most used distance indicators will be presented, along with an overview of the new data responsible for the most recent growth. We conclude with an outline of NED’s current plans to facilitate extragalactic research further by making greater use of redshift-independent distances. Additional information about other extensive updates to NED is presented at this meeting by Mazzarella et al. (2017). NED is operated by and this research is funded by the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  3. Scale Estimation and Correction of the Monocular Simultaneous Localization and Mapping (SLAM) Based on Fusion of 1D Laser Range Finder and Vision Data.

    PubMed

    Zhang, Zhuang; Zhao, Rujin; Liu, Enhai; Yan, Kun; Ma, Yuebo

    2018-06-15

    This article presents a new sensor fusion method for visual simultaneous localization and mapping (SLAM) through the integration of a monocular camera and a 1D laser range finder. Such a fusion method provides scale estimation and drift correction; it is not limited by volume (e.g., a stereo camera is constrained by its baseline) and it overcomes the limited depth range problem associated with SLAM for RGBD cameras. We first present the analytical feasibility of estimating the absolute scale through the fusion of 1D distance information and image information. Next, the analytical derivation of the laser-vision fusion is described in detail, based on local dense reconstruction of the image sequences. We also correct the scale drift of the monocular SLAM using the laser distance information, which is independent of the drift error. Finally, the application of this approach to both indoor and outdoor scenes is verified on the Technical University of Munich RGBD dataset and on self-collected data. We compare the scale estimation and drift correction of the proposed method with SLAM for a monocular camera and an RGBD camera.
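
    The paper's fusion relies on local dense reconstruction; as a much-reduced sketch of the core idea only, absolute scale can be recovered by robustly ratioing laser ranges against the (up-to-scale) monocular depths along the same ray. All data below are synthetic.

    ```python
    import numpy as np

    def estimate_scale(laser_ranges, mono_depths):
        """Estimate the metric scale of a monocular SLAM map from paired
        1D-laser ranges and up-to-scale monocular depths along the same ray.
        A robust median ratio is used instead of a least-squares fit."""
        ratios = np.asarray(laser_ranges) / np.asarray(mono_depths)
        return float(np.median(ratios))

    # Toy data: true scale 2.5, with noise and one spurious laser return.
    rng = np.random.default_rng(0)
    depth = rng.uniform(0.5, 4.0, size=200)             # SLAM depths (arbitrary units)
    laser = 2.5 * depth * (1 + 0.02 * rng.normal(size=200))
    laser[17] = 30.0                                     # outlier
    s = estimate_scale(laser, depth)
    print(f"estimated scale: {s:.3f} (metric map = {s:.3f} x SLAM map)")
    ```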

  4. Simulation methods to estimate design power: an overview for applied research.

    PubMed

    Arnold, Benjamin F; Hogan, Daniel R; Colford, John M; Hubbard, Alan E

    2011-06-20

    Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research.
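
    The article supplies R and Stata code; a Python re-sketch of the basic recipe (simulate the design, apply the planned test, count rejections) looks like this, with illustrative effect and sample sizes.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def simulated_power(n_per_arm, effect, sd, alpha=0.05, n_sims=2000):
        """Monte Carlo power of a two-arm comparison: simulate the trial,
        apply the planned test, and count rejections."""
        hits = 0
        for _ in range(n_sims):
            a = rng.normal(0.0, sd, n_per_arm)
            b = rng.normal(effect, sd, n_per_arm)
            if stats.ttest_ind(a, b).pvalue < alpha:
                hits += 1
        return hits / n_sims

    # E.g., a height-for-age z-score improvement of 0.3 SD (illustrative).
    for n in (100, 175, 250):
        print(n, simulated_power(n, effect=0.3, sd=1.0))
    ```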

  5. Simulation methods to estimate design power: an overview for applied research

    PubMed Central

    2011-01-01

    Background: Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. Methods: We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. Results: We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Conclusions: Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research. PMID:21689447

  6. Spirituality, religion, and health: over the last 15 years of field research (1999-2013).

    PubMed

    Lucchetti, Giancarlo; Lucchetti, Alessandra Lamas Granero

    2014-01-01

    Although several studies have examined the contribution of specific countries, journals, and authors in different scientific disciplines, little is known about the contribution of different world countries, journals, and authors to scientific research in the field of "Spirituality, religion, and health" (S/R). The present study aims to analyze the last 15 years of research in the field of spirituality and religiousness (S/R) through a bibliometric analysis. Using the Pubmed database, we retrieved all articles related to S/R field for the period 1999-2013. We then estimated the total number of publications, number of articles published per year, articles published per country, journals with most publications in S/R field, most productive authors, and most used keywords. We found a growth of publications in the last years, most from the United States and the United Kingdom and published in the English language. Noteworthy, some developing countries such as India, Brazil, Israel, and Iran are at higher positions in this list. The S/R articles were published in journals embracing all fields of research, including high impact journals. In the present study, we took a closer look at the field of "Spirituality, religion, and health," showing that this field of research has been constantly growing and consolidating in the scientific community.

  7. The effect of giving global coronary risk information to adults: a systematic review.

    PubMed

    Sheridan, Stacey L; Viera, Anthony J; Krantz, Mori J; Ice, Christa L; Steinman, Lesley E; Peters, Karen E; Kopin, Laurie A; Lungelow, Danielle

    2010-02-08

    Global coronary heart disease (CHD) risk estimation (ie, a quantitative estimate of a patient's chances of CHD calculated by combining risk factors in an empirical equation) is recommended as a starting point for primary prevention efforts in all US adults. Whether it improves outcomes is currently unknown. To assess the effect of providing global CHD risk information to adults, we performed a systematic evidence review. We searched MEDLINE for the years 1980 to 2008, Psych Info, CINAHL, and the Cochrane Database and included English-language articles that met prespecified inclusion criteria. Two reviewers independently reviewed titles, abstracts, and articles for inclusion and assessed study quality. We identified 20 articles, reporting on 18 unique fair or good quality studies (including 14 randomized controlled studies). These showed that global CHD risk information alone or with accompanying education increased the accuracy of perceived risk and probably increased intent to start therapy. Studies with repeated risk information or risk information and repeated doses of counseling showed small significant reductions in predicted CHD risk (absolute differences, -0.2% to -2% over 10 years in studies using risk estimates derived from Framingham equations). Studies providing global risk information at only 1 point in time seemed ineffective. Global CHD risk information seems to improve the accuracy of risk perception and may increase intent to initiate CHD prevention among individuals at moderate to high risk. The effect of global risk presentation on more distal outcomes is less clear and seems to be related to the intensity of accompanying interventions.

  8. Economic burden of underweight and overweight among adults in the Asia-Pacific region: a systematic review.

    PubMed

    Hoque, Mohammad Enamul; Mannan, Munim; Long, Kurt Z; Al Mamun, Abdullah

    2016-04-01

    To assess the economic burden of underweight and overweight among adults in the Asia-Pacific region. Systematic review of articles published until March 2015. Seventeen suitable articles were found, of which 13 assess the economic burden of overweight/obesity and estimate that it accounts for 1.5-9.9% of a country's total healthcare expenditure. Four articles on the economic burden of underweight estimate it at 2.5-3.8% of a country's total GDP. Using hospital data, four articles estimated extra healthcare costs, relative to normal-weight individuals, of 7-9.8% or more for overweight individuals and 17-22.3% or more for obese individuals. Despite methodological diversity across the studies, there is a consensus that both underweight and overweight impose a substantial financial burden on healthcare systems in the Asia-Pacific region. © 2016 John Wiley & Sons Ltd.

  9. Estimation of sample size and testing power (part 5).

    PubMed

    Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo

    2012-02-01

    Estimation of sample size and testing power is an important component of research design. This article introduces methods for sample size and testing power estimation of difference tests for quantitative and qualitative data with the single-group design, the paired design, or the crossover design. Specifically, it presents the formulas for sample size and testing power estimation for these three designs, their realization based on the formulas and on the POWER procedure of SAS software, and worked examples, which will help researchers implement the repetition principle.
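
    For orientation, the normal-approximation formula behind sample-size estimation for a paired-design difference test can be sketched as follows; the inputs are illustrative, and the article itself works through SAS's POWER procedure rather than this Python sketch.

    ```python
    # Hedged sketch: pairs needed for a two-sided paired difference test,
    # using the normal approximation n = ((z_{1-a/2} + z_{1-b}) * sigma_d / delta)^2.
    # delta (mean difference) and sigma_d (SD of differences) are illustrative.
    import math
    from scipy.stats import norm

    def paired_sample_size(delta, sigma_d, alpha=0.05, power=0.90):
        z_alpha = norm.ppf(1 - alpha / 2)   # two-sided critical value
        z_beta = norm.ppf(power)            # quantile for the target power
        return math.ceil(((z_alpha + z_beta) * sigma_d / delta) ** 2)

    print(paired_sample_size(delta=0.5, sigma_d=1.2))  # about 61 pairs
    ```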

  10. The Cost of Crime to Society: New Crime-Specific Estimates for Policy and Program Evaluation

    PubMed Central

    French, Michael T.; Fang, Hai

    2010-01-01

    Estimating the cost to society of individual crimes is essential to the economic evaluation of many social programs, such as substance abuse treatment and community policing. A review of the crime-costing literature reveals multiple sources, including published articles and government reports, which collectively represent the alternative approaches for estimating the economic losses associated with criminal activity. Many of these sources are based upon data that are more than ten years old, indicating a need for updated figures. This study presents a comprehensive methodology for calculating the cost to society of various criminal acts. Tangible and intangible losses are estimated using the most current data available. The selected approach, which incorporates both the cost-of-illness and the jury compensation methods, yields cost estimates for more than a dozen major crime categories, including several categories not found in previous studies. Updated crime cost estimates can help government agencies and other organizations execute more prudent policy evaluations, particularly benefit-cost analyses of substance abuse treatment or other interventions that reduce crime. PMID:20071107

  11. The cost of crime to society: new crime-specific estimates for policy and program evaluation.

    PubMed

    McCollister, Kathryn E; French, Michael T; Fang, Hai

    2010-04-01

    Estimating the cost to society of individual crimes is essential to the economic evaluation of many social programs, such as substance abuse treatment and community policing. A review of the crime-costing literature reveals multiple sources, including published articles and government reports, which collectively represent the alternative approaches for estimating the economic losses associated with criminal activity. Many of these sources are based upon data that are more than 10 years old, indicating a need for updated figures. This study presents a comprehensive methodology for calculating the cost to society of various criminal acts. Tangible and intangible losses are estimated using the most current data available. The selected approach, which incorporates both the cost-of-illness and the jury compensation methods, yields cost estimates for more than a dozen major crime categories, including several categories not found in previous studies. Updated crime cost estimates can help government agencies and other organizations execute more prudent policy evaluations, particularly benefit-cost analyses of substance abuse treatment or other interventions that reduce crime. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  12. Computation of nonlinear least squares estimator and maximum likelihood using principles in matrix calculus

    NASA Astrophysics Data System (ADS)

    Mahaboob, B.; Venkateswarlu, B.; Sankar, J. Ravi; Balasiddamuni, P.

    2017-11-01

    This paper uses matrix calculus techniques to obtain the Nonlinear Least Squares Estimator (NLSE), the Maximum Likelihood Estimator (MLE), and a linear pseudo model for the nonlinear regression model. David Pollard and Peter Radchenko [1] explained analytic techniques to compute the NLSE; the present paper instead introduces a method to compute the NLSE using principles of multivariate calculus, and is concerned with new optimization techniques for computing the MLE and NLSE. Anh [2] derived the NLSE and MLE of a heteroscedastic regression model. Lemcoff [3] discussed a procedure for obtaining a linear pseudo model for a nonlinear regression model. In this article a new technique is developed to obtain the linear pseudo model for the nonlinear regression model using multivariate calculus, and the linear pseudo model of Edmond Malinvaud [4] is explained in a different way. David Pollard et al. used empirical process techniques to study the asymptotics of the least squares estimator (LSE) for the fitting of nonlinear regression functions in 2006. Jae Myung [13] provided a conceptual guide to maximum likelihood estimation in his work "Tutorial on maximum likelihood estimation".
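
    As a rough companion to the matrix-calculus treatment, a Gauss-Newton iteration for an NLSE — linearize via the Jacobian, solve the normal equations, repeat — might look like the sketch below. The exponential model and data are assumptions for illustration, not the paper's examples.

    ```python
    # Minimal Gauss-Newton sketch for nonlinear least squares: at each step,
    # the model is linearized through its Jacobian and a least squares step
    # is solved. Model y = theta0 * exp(theta1 * x) is an illustrative choice.
    import numpy as np

    def gauss_newton(x, y, theta, n_iter=20):
        for _ in range(n_iter):
            f = theta[0] * np.exp(theta[1] * x)
            r = y - f                                   # residual vector
            J = np.column_stack([np.exp(theta[1] * x),  # df/dtheta0
                                 theta[0] * x * np.exp(theta[1] * x)])  # df/dtheta1
            step, *_ = np.linalg.lstsq(J, r, rcond=None)
            theta = theta + step
        return theta

    rng = np.random.default_rng(0)
    x = np.linspace(0, 2, 50)
    y = 2.0 * np.exp(0.7 * x) + rng.normal(0, 0.05, x.size)
    print(gauss_newton(x, y, theta=np.array([1.0, 0.1])))  # close to [2.0, 0.7]
    ```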

  13. Reconstructing the hidden states in time course data of stochastic models.

    PubMed

    Zimmer, Christoph

    2015-11-01

    Parameter estimation is central for analyzing models in systems biology. The relevance of stochastic modeling in the field is increasing, and therefore the need for tailored parameter estimation techniques is increasing as well. Challenges for parameter estimation are partial observability, measurement noise, and the computational complexity arising from the dimension of the parameter space. This article extends the "multiple shooting for stochastic systems" method, developed for inference in intrinsically stochastic systems. The treatment of extrinsic noise and the estimation of the unobserved states are improved by taking into account the correlation between unobserved and observed species. The article demonstrates the power of the method on different scenarios of a Lotka-Volterra model, including cases in which the prey population dies out or explodes, and on a calcium oscillation system. Besides showing how the new extension improves the accuracy of the parameter estimates, the article analyzes the accuracy of the state estimates. In contrast to previous approaches, the new approach is well able to estimate states and parameters for all the scenarios. As it does not need stochastic simulations, it is of the same order of speed as conventional least squares parameter estimation methods with respect to computational time. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
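
    The multiple-shooting extension itself is not reproduced here. For orientation only, the conventional least squares baseline that the article compares against in computational speed can be sketched for a deterministic Lotka-Volterra model; all parameter values, initial conditions, and noise levels below are illustrative assumptions.

    ```python
    # Conventional least squares fit of a deterministic Lotka-Volterra model
    # (the baseline, not the article's stochastic multiple-shooting method).
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    def lotka_volterra(t, z, a, b, c, d):
        prey, pred = z
        return [a * prey - b * prey * pred, c * prey * pred - d * pred]

    def residuals(params, t_obs, y_obs, z0):
        sol = solve_ivp(lotka_volterra, (t_obs[0], t_obs[-1]), z0,
                        t_eval=t_obs, args=tuple(params))
        return (sol.y.T - y_obs).ravel()

    true = (1.0, 0.1, 0.075, 1.5)
    t_obs = np.linspace(0, 10, 40)
    z0 = [10.0, 5.0]
    sol = solve_ivp(lotka_volterra, (0, 10), z0, t_eval=t_obs, args=true)
    rng = np.random.default_rng(1)
    y_obs = sol.y.T + rng.normal(0, 0.2, sol.y.T.shape)   # noisy observations

    fit = least_squares(residuals, x0=[0.8, 0.2, 0.1, 1.0],
                        args=(t_obs, y_obs, z0))
    print(fit.x)   # approximately the true parameters
    ```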

  14. Composite Partial Likelihood Estimation Under Length-Biased Sampling, With Application to a Prevalent Cohort Study of Dementia

    PubMed Central

    Huang, Chiung-Yu; Qin, Jing

    2013-01-01

    The Canadian Study of Health and Aging (CSHA) employed a prevalent cohort design to study survival after onset of dementia, where patients with dementia were sampled and the onset time of dementia was determined retrospectively. The prevalent cohort sampling scheme favors individuals who survive longer. Thus, the observed survival times are subject to length bias. In recent years, there has been a rising interest in developing estimation procedures for prevalent cohort survival data that not only account for length bias but also actually exploit the incidence distribution of the disease to improve efficiency. This article considers semiparametric estimation of the Cox model for the time from dementia onset to death under a stationarity assumption with respect to the disease incidence. Under the stationarity condition, the semiparametric maximum likelihood estimation is expected to be fully efficient yet difficult to perform for statistical practitioners, as the likelihood depends on the baseline hazard function in a complicated way. Moreover, the asymptotic properties of the semiparametric maximum likelihood estimator are not well-studied. Motivated by the composite likelihood method (Besag 1974), we develop a composite partial likelihood method that retains the simplicity of the popular partial likelihood estimator and can be easily performed using standard statistical software. When applied to the CSHA data, the proposed method estimates a significant difference in survival between the vascular dementia group and the possible Alzheimer’s disease group, while the partial likelihood method for left-truncated and right-censored data yields a greater standard error and a 95% confidence interval covering 0, thus highlighting the practical value of employing a more efficient methodology. To check the assumption of stable disease for the CSHA data, we also present new graphical and numerical tests in the article. The R code used to obtain the maximum composite partial likelihood estimator for the CSHA data is available in the online Supplementary Material, posted on the journal web site. PMID:24000265
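
    A small simulation makes the length-bias problem concrete: under stationary incidence, sampling prevalent cases favors long survivors, which for exponential survival roughly doubles the mean observed survival time. This sketch is illustrative and is not drawn from the CSHA data.

    ```python
    # Why prevalent-cohort sampling is length-biased: subjects still alive at
    # the sampling date are disproportionately the long survivors.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    onset = rng.uniform(0, 100, n)        # onset times under stationary incidence
    survival = rng.exponential(5.0, n)    # true survival after onset, mean 5

    sample_time = 100.0
    prevalent = onset + survival > sample_time   # alive at the sampling date
    print(survival.mean())                # ~5.0  (incident cohort)
    print(survival[prevalent].mean())     # ~10.0 (length-biased, about double)
    ```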

  15. RETRACTED: Crystal growth and spectroscopic characterization of Aloevera amino acid added lithium sulfate monohydrate: A non-linear optical crystal

    NASA Astrophysics Data System (ADS)

    Manimekalai, R.; Antony Joseph, A.; Ramachandra Raja, C.

    2014-03-01

    This article has been retracted: please see the Elsevier Policy on Article Withdrawal. This article has been retracted at the request of the authors. The paper reported Aloevera amino acid added lithium sulphate monohydrate (AALSMH) as a new nonlinear optical crystal. From the recorded high-performance liquid chromatography spectrum, by matching retention times with known compounds, the amino acids present in the extract were identified as homocystine, isoleucine, serine, leucine, and tyrosine. Thin layer chromatography and colorimetric estimation indicated the presence of isoleucine, which was also confirmed by the NMR spectrum. On this basis the authors concluded that AALSMH was a new nonlinear optical crystal. On further investigation, however, the lattice parameter values of AALSMH coincide with those of lithium sulphate, and the authors have therefore decided to withdraw the paper. They apologize for the inconvenience and time spent.

  16. Gene-Environment Interplay in Twin Models

    PubMed Central

    Hatemi, Peter K.

    2013-01-01

    In this article, we respond to Shultziner’s critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases. We correct this and other misunderstandings in the essay and find that gene-environment (GE) interplay is a well-articulated concept in behavior genetics and political science, operationalized as gene-environment correlation and gene-environment interaction. Both are incorporated into interpretations of the classical twin design (CTD) and estimated in numerous empirical studies through extensions of the CTD. We then conduct simulations to quantify the influence of GE interplay on estimates from the CTD. Due to the criticism’s mischaracterization of the CTD and GE interplay, combined with the absence of any empirical evidence to counter what is presented in the extant literature and this article, we conclude that the critique does not enhance our understanding of the processes that drive political traits, genetic or otherwise. PMID:24808718

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Chanyoung; Kim, Nam H.

    Structural elements, such as stiffened panels and lap joints, are basic components of aircraft structures. For aircraft structural design, designers select predesigned elements satisfying the design load requirement based on their load-carrying capabilities. Therefore, estimating the safety envelope of structural elements for load tolerances would be a good investment for design purposes. In this article, a method of estimating the safety envelope is presented using probabilistic classification, which can estimate a specific level of failure probability under both aleatory and epistemic uncertainties. An important contribution of this article is that the calculation uncertainty is reflected in building a safety envelope using a Gaussian process, and the effect of element test data on reducing the calculation uncertainty is incorporated by updating the Gaussian process model with the element test data. It is shown that even one element test can significantly reduce the calculation uncertainty due to lacking knowledge of the actual physics, so that conservativeness in a safety envelope is significantly reduced. The proposed approach was demonstrated with a cantilever beam example, which represents a structural element. The example shows that calculation uncertainty provides about 93% conservativeness against the uncertainty due to a few element tests. As a result, it is shown that even a single element test can increase the load tolerance modeled with the safety envelope by 20%.
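
    A hedged sketch of the general idea — fit a probabilistic classifier to pass/fail element data and read off the largest load whose predicted failure probability stays below a target — is given below using scikit-learn's Gaussian process classifier. The data, kernel, and 5% threshold are assumptions for illustration, not the article's cantilever beam setup.

    ```python
    # Probabilistic classification for a load-tolerance envelope: fit a GP
    # classifier to simulated pass/fail tests and find the largest load with
    # predicted failure probability below a target level.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(7)
    loads = rng.uniform(0, 2, 80).reshape(-1, 1)
    # Hypothetical truth: failure probability rises steeply around load = 1.2
    p_fail = 1 / (1 + np.exp(-10 * (loads.ravel() - 1.2)))
    failed = rng.uniform(size=80) < p_fail

    gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=0.5))
    gpc.fit(loads, failed)

    grid = np.linspace(0, 2, 201).reshape(-1, 1)
    prob = gpc.predict_proba(grid)[:, 1]
    target = 0.05
    print(grid[prob <= target].max())   # largest load with P(fail) <= 5%
    ```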

  18. Adaptive sampling in research on risk-related behaviors.

    PubMed

    Thompson, Steven K; Collins, Linda M

    2002-11-01

    This article introduces adaptive sampling designs to substance use researchers. Adaptive sampling is particularly useful when the population of interest is rare, unevenly distributed, hidden, or hard to reach. Examples of such populations are injection drug users, individuals at high risk for HIV/AIDS, and young adolescents who are nicotine dependent. In conventional sampling, the sampling design is based entirely on a priori information, and is fixed before the study begins. By contrast, in adaptive sampling, the sampling design adapts based on observations made during the survey; for example, drug users may be asked to refer other drug users to the researcher. In the present article several adaptive sampling designs are discussed. Link-tracing designs such as snowball sampling, random walk methods, and network sampling are described, along with adaptive allocation and adaptive cluster sampling. It is stressed that special estimation procedures taking the sampling design into account are needed when adaptive sampling has been used. These procedures yield estimates that are considerably better than conventional estimates. For rare and clustered populations adaptive designs can give substantial gains in efficiency over conventional designs, and for hidden populations link-tracing and other adaptive procedures may provide the only practical way to obtain a sample large enough for the study objectives.
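
    A toy link-tracing (snowball) sample over a hypothetical contact network illustrates the adaptive mechanism; the random network and referral process below are assumptions for illustration, and, as the article stresses, real estimates require design-based corrections that account for unequal inclusion probabilities.

    ```python
    # Toy snowball (link-tracing) sample of a hidden population.
    import random

    random.seed(3)
    n = 1000
    # Hypothetical contact network: each person can refer a few random others
    contacts = {i: random.sample(range(n), k=4) for i in range(n)}

    seeds = random.sample(range(n), k=5)
    sampled, frontier = set(seeds), list(seeds)
    while frontier and len(sampled) < 100:
        person = frontier.pop()
        for referral in contacts[person]:     # follow referrals (link tracing)
            if referral not in sampled:
                sampled.add(referral)
                frontier.append(referral)

    print(len(sampled))
    # Estimation from such a sample must use design-based weighting, since
    # well-connected individuals are over-represented.
    ```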

  19. Measuring reproductive health: review of community-based approaches to assessing morbidity.

    PubMed Central

    Sadana, R.

    2000-01-01

    This article begins by reviewing selected past approaches to estimating the prevalence of a range of morbidities through the use of household or community-based interview surveys in developed and developing countries. Subsequently, it reviews epidemiological studies that have used a range of methods to estimate the prevalence of reproductive morbidities. A detailed review of recent community or hospital based health interview validation studies that compare self-reported, clinical and laboratory measures is presented. Studies from Bangladesh, Bolivia, China, Egypt, India, Indonesia, Nigeria, Philippines and Turkey provide empirical evidence that self-reported morbidity and observed morbidity measure different phenomena and therefore different aspects of reproductive health and illness. Rather than estimating the prevalence of morbidity, interview-based surveys may provide useful information about the disability or burden associated with reproductive health and illness. PMID:10859858

  20. Economic Returns to Speaking "Standard Mandarin" among Migrants in China's Urban Labour Market

    ERIC Educational Resources Information Center

    Gao, Wenshu; Smyth, Russell

    2011-01-01

    This article uses data from the China Urban Labour Survey administered across 12 cities in 2005 to estimate the economic returns to speaking standard Mandarin among internal migrants in China's urban labour market. The article builds on studies that estimate the economic returns to international immigrants of being fluent in the major language of…

  1. Modeling Heterogeneity in Relationships between Initial Status and Rates of Change: Treating Latent Variable Regression Coefficients as Random Coefficients in a Three-Level Hierarchical Model

    ERIC Educational Resources Information Center

    Choi, Kilchan; Seltzer, Michael

    2010-01-01

    In studies of change in education and numerous other fields, interest often centers on how differences in the status of individuals at the start of a period of substantive interest relate to differences in subsequent change. In this article, the authors present a fully Bayesian approach to estimating three-level Hierarchical Models in which latent…

  2. The technological future of 7 T MRI hardware.

    PubMed

    Webb, A G; Van de Moortele, P F

    2016-09-01

    In this article we present our projections of future hardware developments on 7 T human MRI systems. These include compact cryogen-light magnets, improved gradient performance, integrated RF-receive and direct current shimming coil arrays, new RF technology with adaptive impedance matching, patient-specific specific absorption rate estimation and monitoring, and increased integration of physiological monitoring systems. Copyright © 2015 John Wiley & Sons, Ltd.

  3. Effect of Stress State on Fracture Features

    NASA Astrophysics Data System (ADS)

    Das, Arpan

    2018-02-01

    The present article comprehensively explores the influence of specimen thickness on two-dimensional quantitative estimates of different ductile fractographic features, correlating them with the tensile properties of a reactor pressure vessel steel tested at ambient temperature, where the initial crystallographic texture, inclusion content, and their distribution are kept unaltered. The changes in tensile fracture morphology of these steels are found to be directly attributable to the resulting stress-state history under tension for the given specimen dimensions.

  4. Body mass and stature estimation based on the first metatarsal in humans.

    PubMed

    De Groote, Isabelle; Humphrey, Louise T

    2011-04-01

    Archaeological assemblages often lack the complete long bones needed to estimate stature and body mass. The most accurate estimates of body mass and stature are produced using femoral head diameter and femur length. Foot bones including the first metatarsal preserve relatively well in a range of archaeological contexts. In this article we present regression equations using the first metatarsal to estimate femoral head diameter, femoral length, and body mass in a diverse human sample. The skeletal sample comprised 87 individuals (Andamanese, Australasians, Africans, Native Americans, and British). Results show that all first metatarsal measurements correlate moderately to highly (r = 0.62-0.91) with femoral head diameter and length. The proximal articular dorsoplantar diameter is the best single measurement to predict both femoral dimensions. Percent standard errors of the estimate are below 5%. Equations using two metatarsal measurements show a small increase in accuracy. Direct estimations of body mass (calculated from measured femoral head diameter using previously published equations) have an error of just over 7%. No direct stature estimation equations were derived due to the varied linear body proportions represented in the sample. The equations were tested on a sample of 35 individuals from Christ Church Spitalfields. Percentage differences in estimated and measured femoral head diameter and length were less than 1%. This study demonstrates that it is feasible to use the first metatarsal in the estimation of body mass and stature. The equations presented here are particularly useful for assemblages where the long bones are either missing or fragmented, and enable estimation of these fundamental population parameters in poorly preserved assemblages. Copyright © 2011 Wiley-Liss, Inc.

  5. Extinction rates in North American freshwater fishes, 1900-2010

    USGS Publications Warehouse

    Burkhead, Noel M.

    2012-01-01

    Widespread evidence shows that the modern rates of extinction in many plants and animals exceed background rates in the fossil record. In the present article, I investigate this issue with regard to North American freshwater fishes. From 1898 to 2006, 57 taxa became extinct, and three distinct populations were extirpated from the continent. Since 1989, the numbers of extinct North American fishes have increased by 25%. From the end of the nineteenth century to the present, modern extinctions varied by decade but significantly increased after 1950 (post-1950s mean = 7.5 extinct taxa per decade). In the twentieth century, freshwater fishes had the highest extinction rate worldwide among vertebrates. The modern extinction rate for North American freshwater fishes is conservatively estimated to be 877 times greater than the background extinction rate for freshwater fishes (one extinction every 3 million years). Reasonable estimates project that future increases in extinctions will range from 53 to 86 species by 2050.

  6. Extinction rates in North American freshwater fishes, 1900-2010

    USGS Publications Warehouse

    Burkhead, Noel M.

    2012-01-01

    Widespread evidence shows that the modern rates of extinction in many plants and animals exceed background rates in the fossil record. In the present article, I investigate this issue with regard to North American freshwater fishes. From 1898 to 2006, 57 taxa became extinct, and three distinct populations were extirpated from the continent. Since 1989, the numbers of extinct North American fishes have increased by 25%. From the end of the nineteenth century to the present, modern extinctions varied by decade but significantly increased after 1950 (post-1950s mean = 7.5 extinct taxa per decade). The modern extinction rate for North American freshwater fishes is conservatively estimated to be 877 times greater than the background extinction rate for freshwater fishes (one extinction every 3 million years). Reasonable estimates project that future increases in extinctions will range from 53 to 86 species by 2050.

  7. Personnel resources in physical therapy: an analysis of supply, career patterns, and methods to enhance availability.

    PubMed

    Gwyer, J

    1995-01-01

    Describing the ever-changing supply and demand for physical therapy personnel in the United States is an intricate, complex, and profoundly significant task for the profession. In this article, a review of data relating to the supply of physical therapy personnel in the work force and their typical career patterns is presented. The estimates of the numbers of physical therapists and physical therapist assistants are discussed, as are problems associated with such estimates. Studies of career patterns of physical therapists are compared. Changes in the participation rates of women in the physical therapy work force over the last three decades are described. Career expectations, defined as both length and pattern of work-force participation, of entering physical therapy professionals are presented. Strategies to adjust the work-force participation of personnel through changes in the educational process, career patterns, and practice patterns are discussed.

  8. Inverse probability weighting for covariate adjustment in randomized studies

    PubMed Central

    Li, Xiaochun; Li, Lingling

    2013-01-01

    Covariate adjustment in randomized clinical trials has the potential benefit of precision gain. It also has the potential pitfall of reduced objectivity, as it opens the possibility of selecting a “favorable” model that yields a strong treatment benefit estimate. Although there is a large volume of statistical literature targeting the first aspect, realistic solutions to enforce objective inference and improve precision are rare. As a typical randomized trial needs to accommodate many implementation issues beyond statistical considerations, maintaining objectivity is at least as important as precision gain, if not more so, particularly from the perspective of the regulatory agencies. In this article, we propose a two-stage estimation procedure based on inverse probability weighting to achieve better precision without compromising objectivity. The procedure is designed in a way such that the covariate adjustment is performed before seeing the outcome, effectively reducing the possibility of selecting a “favorable” model that yields a strong intervention effect. Both theoretical and numerical properties of the estimation procedure are presented. Application of the proposed method to a real data example is presented. PMID:24038458

  9. Inverse probability weighting for covariate adjustment in randomized studies.

    PubMed

    Shen, Changyu; Li, Xiaochun; Li, Lingling

    2014-02-20

    Covariate adjustment in randomized clinical trials has the potential benefit of precision gain. It also has the potential pitfall of reduced objectivity, as it opens the possibility of selecting a 'favorable' model that yields a strong treatment benefit estimate. Although there is a large volume of statistical literature targeting the first aspect, realistic solutions to enforce objective inference and improve precision are rare. As a typical randomized trial needs to accommodate many implementation issues beyond statistical considerations, maintaining objectivity is at least as important as precision gain, if not more so, particularly from the perspective of the regulatory agencies. In this article, we propose a two-stage estimation procedure based on inverse probability weighting to achieve better precision without compromising objectivity. The procedure is designed in a way such that the covariate adjustment is performed before seeing the outcome, effectively reducing the possibility of selecting a 'favorable' model that yields a strong intervention effect. Both theoretical and numerical properties of the estimation procedure are presented. Application of the proposed method to a real data example is presented. Copyright © 2013 John Wiley & Sons, Ltd.
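
    A generic sketch of the idea — construct inverse-probability weights from a treatment-on-covariates model before outcomes are examined, then form a weighted treatment contrast — follows in Python. The simulated data and the simple weighted-means estimator are illustrative stand-ins for the authors' two-stage procedure, not a reproduction of it.

    ```python
    # Generic IPW covariate adjustment in a randomized trial.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(5)
    n = 2000
    x = rng.normal(size=(n, 2))                   # baseline covariates
    t = rng.integers(0, 2, n)                     # randomized treatment
    y = 1.0 * t + x @ np.array([0.8, -0.5]) + rng.normal(size=n)

    # Stage 1 (outcome-blind): model treatment on covariates and build
    # inverse-probability weights.
    ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]
    w = t / ps + (1 - t) / (1 - ps)

    # Stage 2: weighted difference in means estimates the treatment effect.
    effect = (np.sum(w * t * y) / np.sum(w * t)
              - np.sum(w * (1 - t) * y) / np.sum(w * (1 - t)))
    print(effect)   # ~1.0, typically more precise than the unadjusted contrast
    ```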

  10. Boltzmann equation and hydrodynamics beyond Navier-Stokes.

    PubMed

    Bobylev, A V

    2018-04-28

    We consider in this paper the problem of derivation and regularization of higher (in Knudsen number) equations of hydrodynamics. The author's approach based on successive changes of hydrodynamic variables is presented in more detail for the Burnett level. The complete theory is briefly discussed for the linearized Boltzmann equation. It is shown that the best results in this case can be obtained by using the 'diagonal' equations of hydrodynamics. Rigorous estimates of accuracy of the Navier-Stokes and Burnett approximations are also presented. This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).

  11. Accounting for the move to ambulatory patient groups.

    PubMed

    Boyagian, H R; Dessingue, R F

    1998-07-01

    This article focuses on the cost accounting challenge that an ambulatory patient group (APG)-based prospective payment system presents to providers and the issues associated with that challenge. In particular, how can costs be identified, how can differences in costs be associated with alternative settings, and how do costs identified through a detailed resource costing methodology compare to estimates using alternative measures? The results presented suggest that decisions made based on current measures of ambulatory cost (i.e., charge-based measures) need to be reexamined. These decisions could include which services to provide, what setting is appropriate, and where market-share opportunities exist.

  12. Acquisition and extinction in autoshaping.

    PubMed

    Kakade, Sham; Dayan, Peter

    2002-07-01

    C. R. Gallistel and J. Gibbon (2000) presented quantitative data on the speed with which animals acquire behavioral responses during autoshaping, together with a statistical model of learning intended to account for them. Although this model captures the form of the dependencies among critical variables, its detailed predictions are substantially at variance with the data. In the present article, further key data on the speed of acquisition are used to motivate an alternative model of learning, in which animals can be interpreted as paying different amounts of attention to stimuli according to estimates of their differential reliabilities as predictors.

  13. Improving hydropower choices via an online and open access tool

    PubMed Central

    Vilela, Thais; Reid, John

    2017-01-01

    This paper describes and validates the HydroCalculator Tool developed by Conservation Strategy Fund. The HydroCalculator Tool allows researchers, policy-makers and citizens to easily assess hydropower feasibility, by calculating traditional financial indicators, such as the levelized cost of energy, as well as greenhouse gas emissions and the economic net present value including emissions costs. Currently, people other than project developers have limited or no access to such information, which stifles informed public debate on electric energy options. Within this context, the use of the HydroCalculator Tool may contribute to the debate, by facilitating access to information. To validate the tool’s greenhouse gas calculations, we replicate two peer-reviewed articles that estimate greenhouse gas emissions from different hydropower plants in the Amazon basin. The estimates calculated by the HydroCalculator Tool are similar to the ones found in both peer-reviewed articles. The results show that hydropower plants can lead to greenhouse gas emissions and that, in some cases, these emissions can be larger than those of alternative energy sources producing the same amount of electricity. PMID:28650968

  14. Improving hydropower choices via an online and open access tool.

    PubMed

    Vilela, Thais; Reid, John

    2017-01-01

    This paper describes and validates the HydroCalculator Tool developed by Conservation Strategy Fund. The HydroCalculator Tool allows researchers, policy-makers and citizens to easily assess hydropower feasibility, by calculating traditional financial indicators, such as the levelized cost of energy, as well as greenhouse gas emissions and the economic net present value including emissions costs. Currently, people other than project developers have limited or no access to such information, which stifles informed public debate on electric energy options. Within this context, the use of the HydroCalculator Tool may contribute to the debate, by facilitating access to information. To validate the tool's greenhouse gas calculations, we replicate two peer-reviewed articles that estimate greenhouse gas emissions from different hydropower plants in the Amazon basin. The estimates calculated by the HydroCalculator Tool are similar to the ones found in both peer-reviewed articles. The results show that hydropower plants can lead to greenhouse gas emissions and that, in some cases, these emissions can be larger than those of alternative energy sources producing the same amount of electricity.
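
    The levelized cost of energy that the tool reports is, in essence, discounted lifetime costs divided by discounted lifetime generation. A minimal sketch follows; all inputs (capital cost, operating cost, generation, discount rate, lifetime) are illustrative and are not the HydroCalculator's defaults.

    ```python
    # Minimal levelized cost of energy (LCOE) sketch: discounted lifetime
    # costs divided by discounted lifetime generation. Inputs are illustrative.
    def lcoe(capex, opex_per_year, mwh_per_year, years=30, rate=0.10):
        disc_costs = capex + sum(opex_per_year / (1 + rate) ** t
                                 for t in range(1, years + 1))
        disc_energy = sum(mwh_per_year / (1 + rate) ** t
                          for t in range(1, years + 1))
        return disc_costs / disc_energy   # currency units per MWh

    print(lcoe(capex=500e6, opex_per_year=10e6, mwh_per_year=1.5e6))  # ~42/MWh
    ```

    The HydroCalculator extends this standard calculation by also pricing greenhouse gas emissions into the economic net present value.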

  15. A Bayesian network for modelling blood glucose concentration and exercise in type 1 diabetes.

    PubMed

    Ewings, Sean M; Sahu, Sujit K; Valletta, John J; Byrne, Christopher D; Chipperfield, Andrew J

    2015-06-01

    This article presents a new statistical approach to analysing the effects of everyday physical activity on blood glucose concentration in people with type 1 diabetes. A physiologically based model of blood glucose dynamics is developed to cope with frequently sampled data on food, insulin and habitual physical activity; the model is then converted to a Bayesian network to account for measurement error and variability in the physiological processes. A simulation study is conducted to determine the feasibility of using Markov chain Monte Carlo methods for simultaneous estimation of all model parameters and prediction of blood glucose concentration. Although there are problems with parameter identification in a minority of cases, most parameters can be estimated without bias. Predictive performance is unaffected by parameter misspecification and is insensitive to misleading prior distributions. This article highlights important practical and theoretical issues not previously addressed in the quest for an artificial pancreas as treatment for type 1 diabetes. The proposed methods represent a new paradigm for analysis of deterministic mathematical models of blood glucose concentration. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  16. An Iterative Maximum a Posteriori Estimation of Proficiency Level to Detect Multiple Local Likelihood Maxima

    ERIC Educational Resources Information Center

    Magis, David; Raiche, Gilles

    2010-01-01

    In this article the authors focus on the issue of the nonuniqueness of the maximum likelihood (ML) estimator of proficiency level in item response theory (with special attention to logistic models). The usual maximum a posteriori (MAP) method offers a good alternative within that framework; however, this article highlights some drawbacks of its…
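
    For orientation, a grid-search MAP estimate in a 2PL logistic model with a standard-normal prior can be sketched as below; evaluating the whole grid also makes any local maxima of the likelihood directly visible, which is the nonuniqueness issue the article addresses. Item parameters and responses are illustrative assumptions, not the article's.

    ```python
    # MAP proficiency estimate in a 2PL IRT model with a N(0,1) prior,
    # found by maximizing the log-posterior on a grid.
    import numpy as np

    a = np.array([1.2, 0.8, 1.5, 1.0])    # item discriminations (illustrative)
    b = np.array([-0.5, 0.0, 0.5, 1.0])   # item difficulties (illustrative)
    u = np.array([1, 1, 0, 0])            # observed item responses

    theta = np.linspace(-4, 4, 801)
    p = 1 / (1 + np.exp(-a[:, None] * (theta[None, :] - b[:, None])))
    log_lik = (u[:, None] * np.log(p) + (1 - u[:, None]) * np.log(1 - p)).sum(0)
    log_post = log_lik - 0.5 * theta**2   # plus log N(0,1) prior, up to a constant

    print(theta[np.argmax(log_post)])     # MAP estimate of proficiency
    ```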

  17. Estimation of pyrethroid pesticide intake using regression ...

    EPA Pesticide Factsheets

    Population-based estimates of pesticide intake are needed to characterize exposure for particular demographic groups based on their dietary behaviors. Regression modeling performed on measurements of selected pesticides in composited duplicate diet samples allowed (1) estimation of pesticide intakes for a defined demographic community, and (2) comparison of dietary pesticide intakes between the composite and individual samples. Extant databases were useful for assigning individual samples to composites, but they could not provide the breadth of information needed to ensure measurable levels in every composite. Composite sample measurements were found to be good predictors of pyrethroid pesticide levels in their individual sample constituents where sufficient measurements are available above the method detection limit. Statistical inference shows little evidence of differences between individual and composite measurements and suggests that regression modeling of food groups based on composite dietary samples may provide an effective tool for estimating dietary pesticide intake for a defined population. The research presented in the journal article will improve the community's ability to determine exposures through the dietary route with a less burdensome and costly method.

  18. The costs of functional gastrointestinal disorders and related signs and symptoms in infants: a systematic literature review and cost calculation for England.

    PubMed

    Mahon, James; Lifschitz, Carlos; Ludwig, Thomas; Thapar, Nikhil; Glanville, Julie; Miqdady, Mohamad; Saps, Miguel; Quak, Seng Hock; Lenoir Wijnkoop, Irene; Edwards, Mary; Wood, Hannah; Szajewska, Hania

    2017-11-14

    To estimate the cost of functional gastrointestinal disorders (FGIDs) and related signs and symptoms in infants to the third party payer and to parents. To estimate the cost of illness (COI) of infant FGIDs, a two-stage process was applied: a systematic literature review and a COI calculation. As no pertinent papers were found in the systematic literature review, a 'de novo' analysis was performed. For the latter, the potential costs for the third party payer (the National Health Service (NHS) in England) and for parents/carers for the treatment of FGIDs in infants were calculated, by using publicly available data. In constructing the calculation, estimates and assumptions (where necessary) were chosen to provide a lower bound (minimum) of the potential overall cost. In doing so, the interpretation of the calculation is that the true COI can be no lower than that estimated. Our calculation estimated that the total costs of treating FGIDs in infants in England were at least £72.3 million per year in 2014/2015 of which £49.1 million was NHS expenditure on prescriptions, community care and hospital treatment. Parents incurred £23.2 million in costs through purchase of over the counter remedies. The total cost presented here is likely to be a significant underestimate as only lower bound estimates were used where applicable, and for example, costs of alternative therapies, inpatient treatments or diagnostic tests, and time off work by parents could not be adequately estimated and were omitted from the calculation. The number and kind of prescribed products and products sold over the counter to treat FGIDs suggest that there are gaps between treatment guidelines, which emphasise parental reassurance and nutritional advice, and their implementation. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    NASA Astrophysics Data System (ADS)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This research paper discusses the method of testing nonlinear hypotheses using the iterative Nonlinear Least Squares (NLLS) estimator, as explained by Takeshi Amemiya [1]. In the present paper, a modified Wald test statistic due to Engle, Robert [6] is proposed to test nonlinear hypotheses using the iterative NLLS estimator. An alternative method based on nonlinear studentized residuals is also proposed, and an innovative method of testing nonlinear hypotheses using the iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained methods of testing nonlinear hypotheses. This paper uses the asymptotic properties of the nonlinear least squares estimator proposed by Jenrich [8]. The main purpose of this paper is to provide innovative methods of testing nonlinear hypotheses using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors, and studied the problem of heteroscedasticity with reference to nonlinear regression models with suitable illustration. William Greene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
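
    A generic delta-method Wald test for a nonlinear restriction g(θ) = 0 can be sketched as follows; the fitted values and covariance matrix are illustrative stand-ins for the output of an iterative NLLS fit, and this sketch is not the paper's modified statistic.

    ```python
    # Generic Wald test of a nonlinear restriction g(theta) = 0 via the delta
    # method: W = g' [G V G']^{-1} g, asymptotically chi-square under H0.
    import numpy as np
    from scipy.stats import chi2

    theta_hat = np.array([1.9, 0.52])             # pretend NLLS estimates
    V = np.array([[0.010, 0.002],                 # their estimated covariance
                  [0.002, 0.004]])

    def g(theta):                                 # H0: theta0 * theta1 = 1
        return np.array([theta[0] * theta[1] - 1.0])

    G = np.array([[theta_hat[1], theta_hat[0]]])  # Jacobian of g at theta_hat
    W = float(g(theta_hat) @ np.linalg.inv(G @ V @ G.T) @ g(theta_hat))
    print(W, chi2.sf(W, df=1))                    # statistic and p-value
    ```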

  20. Parameter estimation of the copernicus decompression model with venous gas emboli in human divers.

    PubMed

    Gutvik, Christian R; Dunford, Richard G; Dujic, Zeljko; Brubakk, Alf O

    2010-07-01

    Decompression sickness (DCS) may occur when divers decompress from a hyperbaric environment. To prevent this, decompression procedures are used to return safely to the surface. The models from which these procedures are calculated are traditionally validated using clinical symptoms as an endpoint. However, DCS is an uncommon phenomenon, the wide variation in individual response to decompression stress is poorly understood, and using clinical examination alone for validation is disadvantageous from a modeling perspective. Currently, the only objective and quantitative measure of decompression stress is venous gas emboli (VGE), measured by either ultrasonic imaging or Doppler. VGE has been shown to be statistically correlated with DCS and is now widely used in science to evaluate decompression stress from a dive. Until recently no mathematical model existed to predict VGE from a dive, which motivated the development of the Copernicus model. The present article compiles a selection of experimental dives and field data containing computer-recorded depth profiles associated with ultrasound measurements of VGE, and describes a parameter estimation problem to fit the model to these data. A total of 185 square bounce dives from DCIEM, Canada, 188 recreational dives with a mix of single, repetitive, and multi-day exposures from DAN USA, and 84 experimentally designed decompression dives from Split, Croatia were used, giving a total of 457 dives. Five selected parameters in the Copernicus bubble model were assigned for estimation, and a nonlinear optimization problem was formalized with a weighted least squares cost function; a bias factor for the DCIEM chamber dives was also included. A quasi-Newton algorithm (BFGS) from the TOMLAB numerical package solved the problem, which was proved to be convex. With the parameter set presented in this article, Copernicus can be implemented in any programming language to estimate VGE from an air dive.
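
    The estimation setup described — minimize a weighted least squares mismatch between model predictions and observed VGE grades with a quasi-Newton (BFGS) solver — can be sketched schematically in Python; the toy prediction function and data below stand in for the Copernicus bubble model, which is not reproduced here.

    ```python
    # Schematic weighted least squares fit with BFGS, mirroring the described
    # estimation setup. predict_vge is an illustrative stand-in model.
    import numpy as np
    from scipy.optimize import minimize

    dives = np.array([[40.0, 20.0], [30.0, 35.0], [18.0, 60.0]])  # depth, time
    observed_vge = np.array([3.0, 2.5, 1.0])                      # bubble grades
    weights = np.array([1.0, 1.0, 2.0])

    def predict_vge(params, dives):
        """Toy stand-in for a bubble model's VGE prediction."""
        k, alpha = params
        return k * dives[:, 0] ** alpha * np.log1p(dives[:, 1]) / 100.0

    def cost(params):
        r = observed_vge - predict_vge(params, dives)
        return np.sum(weights * r ** 2)

    fit = minimize(cost, x0=np.array([1.0, 1.0]), method="BFGS")
    print(fit.x, fit.fun)
    ```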

  1. Short-term international migration trends in England and Wales from 2004 to 2009.

    PubMed

    Whitworth, Simon; Loukas, Konstantinos; McGregor, Ian

    2011-01-01

    Short-term migration estimates for England and Wales are the latest addition to the Office for National Statistics (ONS) migration statistics. This article discusses definitions of short-term migration and the methodology that is used to produce the estimates. Some of the estimates and the changes in the estimates over time are then discussed. The article includes previously unpublished short-term migration statistics and therefore helps to give a more complete picture of the size and characteristics of short-term international migration for England and Wales than has previously been possible. ONS have identified a clear user requirement for short-term migration estimates at local authority (LA) level. Consequently, attention is also paid to the progress that has been made and future work that is planned to distribute England and Wales short-term migration estimates to LA level.

  2. A 2D eye gaze estimation system with low-resolution webcam images

    NASA Astrophysics Data System (ADS)

    Ince, Ibrahim Furkan; Kim, Jin Woo

    2011-12-01

    In this article, a low-cost system for 2D eye gaze estimation with low-resolution webcam images is presented. Two algorithms are proposed for this purpose: one for eyeball detection with a stable approximate pupil center, and one for detecting the direction of eye movements. The eyeball is detected using the deformable angular integral search by minimum intensity (DAISMI) algorithm. The deformable template-based 2D gaze estimation (DTBGE) algorithm is employed as a noise filter for making stable movement decisions. While DTBGE employs binary images, DAISMI employs gray-scale images. Right and left eye estimates are evaluated separately. DAISMI finds the stable approximate pupil-center location by calculating the mass center of eyeball border vertices, which is used for initial deformable template alignment. DTBGE starts with this initial alignment and updates the template alignment with the resulting eye movements and eyeball size frame by frame. The horizontal and vertical deviation of eye movements relative to eyeball size is treated as directly proportional to the deviation of cursor movements for a given screen size and resolution. The core advantage of the system is that it does not employ the real pupil center as a reference point for gaze estimation, which makes it more robust to corneal reflection. Visual angle accuracy is used for the evaluation and benchmarking of the system. The effectiveness of the proposed system is presented and experimental results are shown.

  3. Joint Transmit and Receive Filter Optimization for Sub-Nyquist Delay-Doppler Estimation

    NASA Astrophysics Data System (ADS)

    Lenz, Andreas; Stein, Manuel S.; Swindlehurst, A. Lee

    2018-05-01

    In this article, a framework is presented for the joint optimization of the analog transmit and receive filter with respect to a parameter estimation problem. At the receiver, conventional signal processing systems restrict the two-sided bandwidth of the analog pre-filter $B$ to the rate of the analog-to-digital converter $f_s$ to comply with the well-known Nyquist-Shannon sampling theorem. In contrast, here we consider a transceiver that by design violates the common paradigm $B \leq f_s$. To this end, at the receiver, we allow for a higher pre-filter bandwidth $B > f_s$ and study the achievable parameter estimation accuracy under a fixed sampling rate when the transmit and receive filter are jointly optimized with respect to the Bayesian Cramér-Rao lower bound. For the case of delay-Doppler estimation, we propose to approximate the required Fisher information matrix and solve the transceiver design problem by an alternating optimization algorithm. The presented approach allows us to explore the Pareto-optimal region spanned by transmit and receive filters which are favorable under a weighted mean squared error criterion. We also discuss the computational complexity of the obtained transceiver design by visualizing the resulting ambiguity function. Finally, we verify the performance of the optimized designs by Monte-Carlo simulations of a likelihood-based estimator.

  4. Global Xenon-133 Emission Inventory Caused by Medical Isotope Production and Derived from the Worldwide Technetium-99m Demand

    NASA Astrophysics Data System (ADS)

    Kalinowski, Martin B.; Grosch, Martina; Hebel, Simon

    2014-03-01

    Emissions from medical isotope production are the most important source of background for atmospheric radioxenon measurements, which are an essential part of nuclear explosion monitoring. This article presents a new approach for estimating the global annual radioxenon emission inventory caused by medical isotope production using the amount of Tc-99m applications in hospitals as the basis. Tc-99m is the most commonly used isotope in radiology and dominates the medical isotope production. This paper presents the first estimate of the global production of Tc-99m. Depending on the production and transport scenario, global xenon emissions of 11-45 PBq/year can be derived from the global isotope demand. The lower end of this estimate is in good agreement with other estimations which are making use of reported releases and realistic process simulations. This proves the validity of the complementary assessment method proposed in this paper. It may be of relevance for future emission scenarios and for estimating the contribution to the global source term from countries and operators that do not make sufficient radioxenon release information available. It depends on sound data on medical treatments with radio-pharmaceuticals and on technical information on the production process of the supplier. This might help in understanding the apparent underestimation of the global emission inventory that has been found by atmospheric transport modelling.

  5. Estimating inverse-probability weights for longitudinal data with dropout or truncation: The xtrccipw command.

    PubMed

    Daza, Eric J; Hudgens, Michael G; Herring, Amy H

    Individuals may drop out of a longitudinal study, rendering their outcomes unobserved but still well defined. However, they may also undergo truncation (for example, death), beyond which their outcomes are no longer meaningful. Kurland and Heagerty (2005, Biostatistics 6: 241-258) developed a method to conduct regression conditioning on nontruncation, that is, regression conditioning on continuation (RCC), for longitudinal outcomes that are monotonically missing at random (for example, because of dropout). This method first estimates the probability of dropout among continuing individuals to construct inverse-probability weights (IPWs), then fits generalized estimating equations (GEE) with these IPWs. In this article, we present the xtrccipw command, which can both estimate the IPWs required by RCC and then use these IPWs in a GEE estimator by calling the glm command from within xtrccipw. In the absence of truncation, the xtrccipw command can also be used to run a weighted GEE analysis. We demonstrate the xtrccipw command by analyzing an example dataset and the original Kurland and Heagerty (2005) data. We also use xtrccipw to illustrate some empirical properties of RCC through a simulation study.
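
    The weight construction behind RCC can be sketched in a few lines: model the probability of remaining under observation at each visit among continuing subjects, then weight observed records by the inverse cumulative probability. The Python sketch below mirrors that logic rather than calling Stata, and the dropout model is an illustrative assumption.

    ```python
    # Inverse-probability weights for monotone dropout, mirroring the logic
    # of xtrccipw (illustrative simulation; not the Stata command itself).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(11)
    n, visits = 500, 4
    x = rng.normal(size=n)                       # baseline covariate
    weights = np.ones((n, visits))
    in_study = np.ones(n, dtype=bool)

    for t in range(1, visits):
        # Assumed truth: higher x makes remaining in the study more likely
        p_stay = 1 / (1 + np.exp(-(0.5 + 0.8 * x)))
        stays = rng.uniform(size=n) < p_stay
        # Model retention among subjects still under observation
        model = LogisticRegression().fit(x[in_study, None], stays[in_study])
        p_hat = model.predict_proba(x[:, None])[:, 1]
        in_study &= stays
        weights[:, t] = weights[:, t - 1] / p_hat   # cumulative inverse weights

    # Records observed at visit t would enter a weighted GEE with weights[:, t]
    print(weights[in_study, -1].mean())
    ```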

  6. Security Events and Vulnerability Data for Cybersecurity Risk Estimation.

    PubMed

    Allodi, Luca; Massacci, Fabio

    2017-08-01

    Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in Finance). This article presents a model and methodology to leverage on the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.

  7. Estimating inverse-probability weights for longitudinal data with dropout or truncation: The xtrccipw command

    PubMed Central

    Hudgens, Michael G.; Herring, Amy H.

    2017-01-01

    Individuals may drop out of a longitudinal study, rendering their outcomes unobserved but still well defined. However, they may also undergo truncation (for example, death), beyond which their outcomes are no longer meaningful. Kurland and Heagerty (2005, Biostatistics 6: 241–258) developed a method to conduct regression conditioning on nontruncation, that is, regression conditioning on continuation (RCC), for longitudinal outcomes that are monotonically missing at random (for example, because of dropout). This method first estimates the probability of dropout among continuing individuals to construct inverse-probability weights (IPWs), then fits generalized estimating equations (GEE) with these IPWs. In this article, we present the xtrccipw command, which can both estimate the IPWs required by RCC and then use these IPWs in a GEE estimator by calling the glm command from within xtrccipw. In the absence of truncation, the xtrccipw command can also be used to run a weighted GEE analysis. We demonstrate the xtrccipw command by analyzing an example dataset and the original Kurland and Heagerty (2005) data. We also use xtrccipw to illustrate some empirical properties of RCC through a simulation study. PMID:29755297

  8. A Bayesian approach to multivariate measurement system assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamada, Michael Scott

    This article considers system assessment for multivariate measurements and presents a Bayesian approach to analyzing gauge R&R study data. The evaluation of variances for univariate measurement becomes the evaluation of covariance matrices for multivariate measurements. The Bayesian approach ensures positive definite estimates of the covariance matrices and easily provides their uncertainty. Furthermore, various measurement system assessment criteria are easily evaluated. The approach is illustrated with data from a real gauge R&R study as well as simulated data.

  9. A Bayesian approach to multivariate measurement system assessment

    DOE PAGES

    Hamada, Michael Scott

    2016-07-01

    This article considers system assessment for multivariate measurements and presents a Bayesian approach to analyzing gauge R&R study data. The evaluation of variances for univariate measurement becomes the evaluation of covariance matrices for multivariate measurements. The Bayesian approach ensures positive definite estimates of the covariance matrices and easily provides their uncertainty. Furthermore, various measurement system assessment criteria are easily evaluated. The approach is illustrated with data from a real gauge R&R study as well as simulated data.
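
    The core conjugate computation — an inverse-Wishart prior on a covariance matrix updated with multivariate normal data, with every posterior draw positive definite by construction — can be sketched as follows. A full gauge R&R model additionally separates part, operator, and repeatability components, which this sketch omits; all inputs are illustrative.

    ```python
    # Conjugate Bayesian update for a covariance matrix: inverse-Wishart
    # prior, multivariate normal data with known zero mean (illustrative).
    import numpy as np
    from scipy.stats import invwishart

    rng = np.random.default_rng(2)
    true_cov = np.array([[1.0, 0.3], [0.3, 0.5]])
    y = rng.multivariate_normal([0, 0], true_cov, size=60)

    nu0, S0 = 4, np.eye(2)                 # weak inverse-Wishart prior
    S = S0 + y.T @ y                       # posterior scale (zero-mean data)
    draws = invwishart.rvs(df=nu0 + len(y), scale=S, size=1000, random_state=1)

    print(draws.mean(axis=0))              # posterior mean of the covariance
    print(np.all(np.linalg.eigvalsh(draws) > 0))  # every draw positive definite
    ```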

  10. Mobile Phone Surveys for Collecting Population-Level Estimates in Low- and Middle-Income Countries: A Literature Review.

    PubMed

    Gibson, Dustin G; Pereira, Amanda; Farrenkopf, Brooke A; Labrique, Alain B; Pariyo, George W; Hyder, Adnan A

    2017-05-05

    National and subnational level surveys are important for monitoring disease burden, prioritizing resource allocation, and evaluating public health policies. As mobile phone access and ownership become more common globally, mobile phone surveys (MPSs) offer an opportunity to supplement traditional public health household surveys. The objective of this study was to systematically review the current landscape of MPSs to collect population-level estimates in low- and middle-income countries (LMICs). Primary and gray literature from 7 online databases were systematically searched for studies that deployed MPSs to collect population-level estimates. Titles and abstracts were screened on primary inclusion and exclusion criteria by two research assistants. Articles that met primary screening requirements were read in full and screened for secondary eligibility criteria. Articles included in review were grouped into the following three categories by their survey modality: (1) interactive voice response (IVR), (2) short message service (SMS), and (3) human operator or computer-assisted telephone interviews (CATI). Data were abstracted by two research assistants. The conduct and reporting of the review conformed to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. A total of 6625 articles were identified through the literature review. Overall, 11 articles were identified that contained 19 MPS (CATI, IVR, or SMS) surveys to collect population-level estimates across a range of topics. MPSs were used in Latin America (n=8), the Middle East (n=1), South Asia (n=2), and sub-Saharan Africa (n=8). Nine articles presented results for 10 CATI surveys (10/19, 53%). Two articles discussed the findings of 6 IVR surveys (6/19, 32%). Three SMS surveys were identified from 2 articles (3/19, 16%). Approximately 63% (12/19) of MPS were delivered to mobile phone numbers collected from previously administered household surveys. The majority of MPS (11/19, 58%) were panel surveys where a cohort of participants, who often were provided a mobile phone upon a face-to-face enrollment, were surveyed multiple times. Very few reports of population-level MPS were identified. Of the MPS that were identified, the majority of surveys were conducted using CATI. Due to the limited number of identified IVR and SMS surveys, the relative advantages and disadvantages among the three survey modalities cannot be adequately assessed. The majority of MPS were sent to mobile phone numbers that were collected from a previously administered household survey. There is limited evidence on whether a random digit dialing (RDD) approach or a simple random sample of mobile network provided list of numbers can produce a population representative survey. ©Dustin G Gibson, Amanda Pereira, Brooke A Farrenkopf, Alain B Labrique, George W Pariyo, Adnan A Hyder. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 05.05.2017.

  11. Prevalence and incidence of epilepsy in the Nordic countries.

    PubMed

    Syvertsen, Marte; Koht, Jeanette; Nakken, Karl O

    2015-10-06

    Updated knowledge of the prevalence of epilepsy is valuable for planning health services for this large and complex patient group. Comprehensive epidemiological research on epilepsy has been undertaken, but because of variations in methodology, the results are difficult to compare. The objective of this article is to present evidence-based estimates of the prevalence and incidence of epilepsy in the Nordic countries. The article is based on a search in PubMed with the search terms epilepsy and epidemiology, combined with each of the Nordic countries separately. Altogether 38 original articles reported incidence and/or prevalence rates of epilepsy in a Nordic country. Four studies had investigated the prevalence of active epilepsy in all age groups, with results ranging from 3.4 to 7.6 per 1,000 inhabitants. Only two prospective studies covering all age groups had investigated the incidence of epilepsy; the reported incidences were 33 and 34 per 100,000 person-years respectively. A prospective study that included only adults reported an incidence of 56 per 100,000 person-years. We estimate that approximately 0.6% of the population of the Nordic countries have active epilepsy, i.e. approximately 30,000 persons in Norway. Epilepsy is thus one of the most common neurological disorders. The incidence data are more uncertain, but we may reasonably assume that 30-60 new cases occur per 100,000 person-years.

  12. Do less effective teachers choose professional development, and does it matter?

    PubMed

    Barrett, Nathan; Butler, J S; Toma, Eugenia F

    2012-10-01

    In an ongoing effort to improve teacher quality, most states require continuing education or professional development for their in-service teachers. Studies evaluating the effectiveness of various professional development programs have assumed a normal distribution of quality of teachers participating in the programs. Because participation in many professional development programs is either targeted or voluntary, this article suggests past evaluations of the effectiveness of professional development may be subject to selection bias and policy recommendations may be premature. This article presents an empirical framework for evaluating professional development programs where treatment is potentially nonrandom, and explicitly accounts for the teacher's prior effectiveness in the classroom as a factor that may influence participation in professional development. This article controls for the influence of selection bias on professional development outcomes by generating a matched sample based on propensity scores and then estimating the program's effect. In applying this framework to the professional development program examined in this article, less effective teachers are found to be more likely to participate in the program, and correcting for this selection leads to different conclusions regarding the program's effectiveness than when ignoring teacher selection patterns.

  13. Presentation approaches for enhancing interpretability of patient-reported outcomes (PROs) in meta-analysis: a protocol for a systematic survey of Cochrane reviews.

    PubMed

    Devji, Tahira; Johnston, Bradley C; Patrick, Donald L; Bhandari, Mohit; Thabane, Lehana; Guyatt, Gordon H

    2017-09-27

    Meta-analyses of clinical trials often provide sufficient information for decision-makers to evaluate whether chance can explain apparent differences between interventions. Interpretation of the magnitude and importance of treatment effects beyond statistical significance can, however, be challenging, particularly for patient-reported outcomes (PROs) measured using questionnaires with which clinicians have limited familiarity. The objectives of our study are to systematically evaluate Cochrane systematic review authors' approaches to calculation, reporting and interpretation of pooled estimates of patient-reported outcome measures (PROMs) in meta-analyses. We will conduct a methodological survey of a random sample of Cochrane systematic reviews published from 1 January 2015 to 1 April 2017 that report at least one statistically significant pooled result for at least one PRO in the abstract. Author pairs will independently review all titles, abstracts and full texts identified by the literature search, and they will extract data using a standardised data extraction form. We will extract the following: year of publication, number of included trials, number of included participants, clinical area, type of intervention(s) and control(s), type of meta-analysis and use of the Grading of Recommendations, Assessment, Development and Evaluation approach to rate the quality of evidence, as well as information regarding the characteristics of PROMs, calculation and presentation of PROM effect estimates and interpretation of PROM effect estimates. We will document and summarise the methods used for the analysis, reporting and interpretation of each summary effect measure. We will summarise categorical variables with frequencies and percentages and continuous outcomes as means and/or medians and associated measures of dispersion. Ethics approval for this study is not required. We will disseminate the results of this review in peer-reviewed publications and conference presentations. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  14. Parameter identification of thermophilic anaerobic degradation of valerate.

    PubMed

    Flotats, Xavier; Ahring, Birgitte K; Angelidaki, Irini

    2003-01-01

    The considered mathematical model of the decomposition of valerate presents three unknown kinetic parameters, two unknown stoichiometric coefficients, and three unknown initial biomass concentrations. Applying a structural identifiability study, we concluded that it is necessary to perform simultaneous batch experiments with different initial conditions to estimate these parameters. Four simultaneous batch experiments were conducted at 55 degrees C, characterized by four different initial acetate concentrations. Product inhibition of valerate degradation by acetate was considered. Practical identification was done by optimizing the sum of the multiple determination coefficients over all measured state variables and all experiments simultaneously. The estimated values of the kinetic parameters and stoichiometric coefficients were characterized by the parameter correlation matrix, the confidence interval, and Student's t-test at the 5% significance level, with positive results except for the saturation constant, for which further experiments would be needed to improve identifiability. In this article, we also discuss kinetic parameter estimation methods.

  15. An information measure for class discrimination. [in remote sensing of crop observation]

    NASA Technical Reports Server (NTRS)

    Shen, S. S.; Badhwar, G. D.

    1986-01-01

    This article describes a separability measure for class discrimination. This measure is based on the Fisher information measure for estimating the mixing proportion of two classes. The Fisher information measure not only provides a means to assess quantitatively the information content in the features for separating classes, but also gives the lower bound for the variance of any unbiased estimate of the mixing proportion based on observations of the features. Unlike most commonly used separability measures, this measure is not dependent on the form of the probability distribution of the features and does not imply a specific estimation procedure. This is important because the probability distribution function that describes the data for a given class does not have simple analytic forms, such as a Gaussian. Results of applying this measure to compare the information content provided by three Landsat-derived feature vectors for the purpose of separating small grains from other crops are presented.
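
    For concreteness, the quantity described above can be written out; the following display is a standard form of the Fisher information for a mixing proportion, reconstructed from the abstract's description rather than quoted from the article:

    ```latex
    % Fisher information for the mixing proportion \pi of a two-class mixture
    % f(x;\pi) = \pi f_1(x) + (1-\pi) f_2(x), and the resulting Cramer-Rao
    % lower bound on any unbiased estimate from n observations.
    \[
      I(\pi) \;=\; \int \frac{\bigl(f_1(x) - f_2(x)\bigr)^2}
                            {\pi f_1(x) + (1-\pi) f_2(x)}\, dx,
      \qquad
      \operatorname{Var}(\hat\pi) \;\ge\; \frac{1}{n\, I(\pi)}.
    \]
    ```

    Note that the integral depends only on the two class densities, not on any assumed parametric family, which is why the measure is distribution-form free in the sense the abstract describes.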

  16. A quantile regression model for failure-time data with time-dependent covariates

    PubMed Central

    Gorfine, Malka; Goldberg, Yair; Ritov, Ya’acov

    2017-01-01

    Summary Since survival data occur over time, often important covariates that we wish to consider also change over time. Such covariates are referred to as time-dependent covariates. Quantile regression offers flexible modeling of survival data by allowing the covariates to vary with quantiles. This article provides a novel quantile regression model accommodating time-dependent covariates, for analyzing survival data subject to right censoring. Our simple estimation technique assumes the existence of instrumental variables. In addition, we present a doubly-robust estimator in the sense of Robins and Rotnitzky (1992, Recovery of information and adjustment for dependent censoring using surrogate markers. In: Jewell, N. P., Dietz, K. and Farewell, V. T. (editors), AIDS Epidemiology. Boston: Birkhäuser, pp. 297–331.). The asymptotic properties of the estimators are rigorously studied. Finite-sample properties are demonstrated by a simulation study. The utility of the proposed methodology is demonstrated using the Stanford heart transplant dataset. PMID:27485534

  17. A double hit model for the distribution of time to AIDS onset

    NASA Astrophysics Data System (ADS)

    Chillale, Nagaraja Rao

    2013-09-01

    Incubation time is a key epidemiologic descriptor of an infectious disease. In the case of HIV infection this is a random variable and is probably the longest one. The probability distribution of incubation time is the major determinant of the relation between the incidence of HIV infection and its manifestation as AIDS. This is also one of the key factors used for accurate estimation of AIDS incidence in a region. The present article (i) briefly reviews the work done, points out uncertainties in the estimation of AIDS onset time and stresses the need for its precise estimation, (ii) highlights some of the modelling features of the onset distribution, including the immune failure mechanism, and (iii) proposes a 'Double Hit' model for the distribution of time to AIDS onset in the cases of (a) independent and (b) dependent time variables of the two markers, and examines the applicability of a few standard probability models.

  18. Habitat suitability criteria via parametric distributions: estimation, model selection and uncertainty

    USGS Publications Warehouse

    Som, Nicholas A.; Goodman, Damon H.; Perry, Russell W.; Hardy, Thomas B.

    2016-01-01

    Previous methods for constructing univariate habitat suitability criteria (HSC) curves have ranged from professional judgement to kernel-smoothed density functions or combinations thereof. We present a new method of generating HSC curves that applies probability density functions as the mathematical representation of the curves. Compared with previous approaches, benefits of our method include (1) estimation of probability density function parameters directly from raw data, (2) quantitative methods for selecting among several candidate probability density functions, and (3) concise methods for expressing estimation uncertainty in the HSC curves. We demonstrate our method with a thorough example using data collected on the depth of water used by juvenile Chinook salmon (Oncorhynchus tschawytscha) in the Klamath River of northern California and southern Oregon. All R code needed to implement our example is provided in the appendix. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
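
    A compact sketch of the fit-and-select workflow the abstract describes, fitting several candidate densities by maximum likelihood and comparing them with AIC; the candidate set and synthetic depth data are assumptions for illustration, not the authors' R code from the article's appendix:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    depth = rng.gamma(shape=4.0, scale=0.25, size=300)  # stand-in field data (m)

    candidates = {"gamma": stats.gamma, "lognorm": stats.lognorm,
                  "weibull": stats.weibull_min}
    for name, dist in candidates.items():
        params = dist.fit(depth, floc=0)        # MLE with location fixed at 0
        ll = np.sum(dist.logpdf(depth, *params))
        k = len(params) - 1                     # free parameters (loc is fixed)
        aic = 2 * k - 2 * ll
        print(f"{name:8s} AIC = {aic:.1f}")     # smaller AIC = preferred curve
    ```

    The fitted density with the lowest AIC then serves directly as the HSC curve, and parameter standard errors (e.g., from a bootstrap over the raw data) express the estimation uncertainty in the curve.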

  19. Sieve estimation in semiparametric modeling of longitudinal data with informative observation times.

    PubMed

    Zhao, Xingqiu; Deng, Shirong; Liu, Li; Liu, Lei

    2014-01-01

    Analyzing irregularly spaced longitudinal data often involves modeling possibly correlated response and observation processes. In this article, we propose a new class of semiparametric mean models that allows for the interaction between the observation history and covariates, leaving patterns of the observation process to be arbitrary. For inference on the regression parameters and the baseline mean function, a spline-based least squares estimation approach is proposed. The consistency, rate of convergence, and asymptotic normality of the proposed estimators are established. Our new approach is different from the usual approaches relying on the model specification of the observation scheme, and it can be easily used for predicting the longitudinal response. Simulation studies demonstrate that the proposed inference procedure performs well and is more robust. The analyses of bladder tumor data and medical cost data are presented to illustrate the proposed method.

  20. Use of antiepileptic drugs and risk of falls in old age: A systematic review.

    PubMed

    Haasum, Ylva; Johnell, Kristina

    2017-12-01

    The aim of this study is to systematically review the scientific literature to investigate whether use of antiepileptic drugs (AEDs) is associated with falls and/or recurrent falls in old age. We searched the literature for relevant articles in PubMed and Embase published up until 3rd December 2015. Studies with an observational design on people aged 60 years and over, assessing the risk of falls in people exposed to AEDs compared with people not exposed to AEDs, were included. We found 744 studies by searching Medline and Embase and an additional 9 studies by reviewing relevant reference lists. Of these studies, 13 fulfilled our predefined criteria. The articles were of various study designs, sizes and follow-up times, and presented their results in different ways. Also, confounder adjustment varied considerably between the studies. Ten studies presented results for the association between use of any AED and any fall/injurious fall. Of these studies, 6 presented adjusted estimates, of which all but one showed statistically significant associations between use of any AED and any fall/injurious fall. Six studies investigated the association between use of any AED and recurrent falls. Of these, only 3 studies presented adjusted effect estimates, of which 2 reached statistical significance for the association between use of AEDs and recurrent falls in elderly people. Our results indicate an association between use of AEDs and risk of falls and recurrent falls in older people. This finding may be clinically important given that a substantial number of older people use these drugs. However, further research is needed to increase knowledge about the actual risk of falls when using these drugs in old age. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Signal Conditioning for the Kalman Filter: Application to Satellite Attitude Estimation with Magnetometer and Sun Sensors

    PubMed Central

    Esteban, Segundo; Girón-Sierra, Jose M.; Polo, Óscar R.; Angulo, Manuel

    2016-01-01

    Most satellites use an on-board attitude estimation system, based on available sensors. In the case of low-cost satellites, which are of increasing interest, it is usual to use magnetometers and Sun sensors. A Kalman filter is commonly recommended for the estimation, to simultaneously exploit the information from sensors and from a mathematical model of the satellite motion. It would also be convenient to adhere to a quaternion representation. This article focuses on some problems linked to this context. The state of the system should be represented in observable form. Singularities due to alignment of measured vectors cause estimation problems. Accommodation of the Kalman filter gives rise to convergence difficulties. The article includes a new proposal that solves these problems, without needing changes in the Kalman filter algorithm. In addition, the article includes an assessment of different errors and initialization values for the Kalman filter, and considers the influence of the magnetic dipole moment perturbation, showing how to handle it as part of the Kalman filter framework. PMID:27809250

  2. Signal Conditioning for the Kalman Filter: Application to Satellite Attitude Estimation with Magnetometer and Sun Sensors.

    PubMed

    Esteban, Segundo; Girón-Sierra, Jose M; Polo, Óscar R; Angulo, Manuel

    2016-10-31

    Most satellites use an on-board attitude estimation system, based on available sensors. In the case of low-cost satellites, which are of increasing interest, it is usual to use magnetometers and Sun sensors. A Kalman filter is commonly recommended for the estimation, to simultaneously exploit the information from sensors and from a mathematical model of the satellite motion. It would also be convenient to adhere to a quaternion representation. This article focuses on some problems linked to this context. The state of the system should be represented in observable form. Singularities due to alignment of measured vectors cause estimation problems. Accommodation of the Kalman filter gives rise to convergence difficulties. The article includes a new proposal that solves these problems, without needing changes in the Kalman filter algorithm. In addition, the article includes an assessment of different errors and initialization values for the Kalman filter, and considers the influence of the magnetic dipole moment perturbation, showing how to handle it as part of the Kalman filter framework.

  3. Verification of CFD model of plane jet used for smoke free zone separation in case of fire

    NASA Astrophysics Data System (ADS)

    Krajewski, Grzegorz; Suchy, Przemysław

    2018-01-01

    This paper presents basic information about the use of air curtains in fire safety as a barrier to heat and smoke. The mathematical model of an air curtain presented here allows estimation of the air velocity at various points in space, including the velocity of air from an angled air curtain. The presented equations show how various parameters influence the performance of an air curtain. The authors further present the results of their air curtain performance tests, carried out on a real-scale model. The test results were used to verify the chosen turbulence model and boundary conditions. Results of the new studies are presented with regard to the performance of air curtains in case of fire, and final remarks on their design are given.

  4. The Impact of Obesity on Active Life Expectancy in Older American Men and Women

    ERIC Educational Resources Information Center

    Reynolds, Sandra L.; Saito, Yasuhiko; Crimmins, Eileen M.

    2005-01-01

    Purpose: The purpose of this article is to estimate the effect of obesity on both the length of life and length of nondisabled life for older Americans. Design and Methods: Using data from the first 3 waves of the Asset and Health Dynamics Among the Oldest Old (AHEAD) survey, this article develops estimates of total, active, and disabled life…

  5. The role of benefit transfer in ecosystem service valuation

    USGS Publications Warehouse

    Richardson, Leslie A.; Loomis, John; Kroeger, Timm; Casey, Frank

    2015-01-01

    The demand for timely monetary estimates of the economic value of nonmarket ecosystem goods and services has steadily increased over the last few decades. This article describes the use of benefit transfer to generate monetary value estimates of ecosystem services specifically. The article provides guidance for conducting such benefit transfers and summarizes advancements in benefit transfer methods, databases and analysis tools designed to facilitate its application.

  6. A comparison of three approaches to non-stationary flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Debele, S. E.; Strupczewski, W. G.; Bogdanowicz, E.

    2017-08-01

    Non-stationary flood frequency analysis (FFA) is applied to the statistical analysis of seasonal flow maxima from Polish and Norwegian catchments. Three non-stationary estimation methods, namely, maximum likelihood (ML), two-stage (WLS/TS) and GAMLSS (generalized additive model for location, scale and shape parameters), are compared in the context of capturing the effect of non-stationarity on the estimation of time-dependent moments and design quantiles. A multimodel approach is recommended to reduce errors in the magnitude of quantiles due to model misspecification. The results of calculations based on observed seasonal daily flow maxima and computer simulation experiments showed that GAMLSS gave the best results with respect to the relative bias and root mean square error in the estimates of the trend in the standard deviation and the constant shape parameter, while WLS/TS provided better accuracy in the estimates of the trend in the mean value. Among the three compared methods, WLS/TS is recommended for dealing with non-stationarity in short time series. Some practical aspects of the GAMLSS package application are also presented. A detailed discussion of general issues related to the consequences of climate change in FFA is presented in the second part of the article, entitled "Around and about an application of the GAMLSS package in non-stationary flood frequency analysis".
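
    As a rough sketch of what non-stationary ML estimation involves in the simplest case, the following fits a Gumbel model with a linear trend in the location parameter to synthetic seasonal maxima and evaluates a time-dependent design quantile; the model form and data are illustrative assumptions, not the study's setup:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)
    t = np.arange(60.0)                           # years of record
    x = rng.gumbel(loc=100 + 0.5 * t, scale=20)   # synthetic flow maxima

    def nll(theta):
        # negative log-likelihood of Gumbel with mu(t) = a + b*t
        a, b, log_s = theta
        s = np.exp(log_s)                         # keep scale positive
        z = (x - (a + b * t)) / s
        return np.sum(np.log(s) + z + np.exp(-z))

    fit = minimize(nll, x0=[x.mean(), 0.0, np.log(x.std())],
                   method="Nelder-Mead")
    a, b, s = fit.x[0], fit.x[1], np.exp(fit.x[2])

    # time-dependent design quantile, e.g. the 100-year level in each year
    q100 = (a + b * t) - s * np.log(-np.log(1 - 1 / 100))
    print(round(b, 3), round(q100[-1], 1))
    ```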

  7. Global prevalence of diabetes mellitus in patients with tuberculosis: a systematic review and meta-analysis protocol.

    PubMed

    Tankeu, Aurel T; Bigna, Jean Joël; Nansseu, Jobert Richie; Endomba, Francky Teddy A; Wafeu, Guy Sadeu; Kaze, Arnaud D; Noubiap, Jean Jacques

    2017-06-09

    Diabetes mellitus (DM) is an important risk factor for active tuberculosis (TB), and it also adversely affects TB treatment outcomes. The escalating global DM epidemic is fuelling the burden of TB and should therefore be a major target in the strategy for ending TB. This review aims to estimate the global prevalence of DM in patients with TB. This systematic review will include cross-sectional, case-control or cohort studies of populations including patients diagnosed with TB that have reported the prevalence of DM using one of the four standard recommendations for screening and diagnosis. This protocol is written in accordance with recommendations from the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols 2015 statement. Relevant abstracts published in English/French from inception to 31 December 2016 will be searched in PubMed, Excerpta Medica Database and online journals. Two investigators will independently screen, select studies, extract data and assess the risk of bias in each study. The study-specific estimates will be pooled through a random-effects meta-analysis model to obtain an overall summary estimate of the prevalence of diabetes across the studies. Heterogeneity will be assessed, and we will pool studies judged to be clinically homogeneous. Statistical heterogeneity will be evaluated by the χ² test on Cochrane's Q statistic. Funnel-plot analysis and Egger's test will be used to investigate publication bias. Results will be presented by continent or geographic region. This study is based on published data; ethical approval is therefore not required. This systematic review and meta-analysis is expected to inform healthcare providers as well as the general population on the co-occurrence of DM and TB. The final report will be published as an original article in a peer-reviewed journal, and will also be presented at conferences and submitted to relevant health authorities. We also plan to update the review every 5 years. PROSPERO International Prospective Register of Systematic Reviews (CRD42016049901). © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  8. An inverse finance problem for estimation of the volatility

    NASA Astrophysics Data System (ADS)

    Neisy, A.; Salmani, K.

    2013-01-01

    The Black-Scholes model, as a base model for pricing in derivatives markets, has some deficiencies, such as ignoring market jumps and treating market volatility as a constant. In this article, we introduce a pricing model for European options under a jump-diffusion underlying asset. Using appropriate numerical methods, we then solve this model, which contains an integral term as well as derivative terms. Finally, treating volatility as an unknown parameter, we estimate it using the proposed model. For this purpose we formulate an inverse problem, in which the volatility is estimated by minimizing an objective function with Tikhonov regularization.
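
    A minimal sketch of the regularized minimization step: choose the volatility that minimizes the misfit between model and observed option prices plus a Tikhonov penalty. The stand-in `model_price` below is a plain Black-Scholes call formula, not the article's jump-diffusion solver, and all numbers are illustrative:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import norm

    S0, r, T = 100.0, 0.02, 0.5                      # spot, rate, maturity
    strikes = np.array([90.0, 100.0, 110.0])
    observed = np.array([12.10, 5.25, 1.70])         # illustrative market prices

    def model_price(sigma, K):
        # Black-Scholes call price as a placeholder forward model
        d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
        d2 = d1 - sigma * np.sqrt(T)
        return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

    alpha, sigma_prior = 1e-2, 0.2                   # regularization settings

    def objective(sigma):
        misfit = model_price(sigma, strikes) - observed
        # Tikhonov term stabilizes the ill-posed inversion
        return np.sum(misfit**2) + alpha * (sigma - sigma_prior)**2

    sol = minimize_scalar(objective, bounds=(0.01, 1.0), method="bounded")
    print(round(sol.x, 4))                           # estimated volatility
    ```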

  9. Estimating the Costs of Preventive Interventions

    ERIC Educational Resources Information Center

    Foster, E. Michael; Porter, Michele M.; Ayers, Tim S.; Kaplan, Debra L.; Sandler, Irwin

    2007-01-01

    The goal of this article is to improve the practice and reporting of cost estimates of prevention programs. It reviews the steps in estimating the costs of an intervention and the principles that should guide estimation. The authors then review prior efforts to estimate intervention costs using a sample of well-known but diverse studies. Finally,…

  10. The Improved Estimation of Ratio of Two Population Proportions

    ERIC Educational Resources Information Center

    Solanki, Ramkrishna S.; Singh, Housila P.

    2016-01-01

    In this article, first we obtained the correct mean square error expression of Gupta and Shabbir's linear weighted estimator of the ratio of two population proportions. Later we suggested the general class of ratio estimators of two population proportions. The usual ratio estimator, Wynn-type estimator, Singh, Singh, and Kaur difference-type…

  11. Methodes entropiques appliquees au probleme inverse en magnetoencephalographie

    NASA Astrophysics Data System (ADS)

    Lapalme, Ervig

    2005-07-01

    This thesis is devoted to biomagnetic source localization using magnetoencephalography. This problem is known to have an infinite number of solutions, so methods are required to take into account anatomical and functional information on the solution. The work presented in this thesis uses the maximum entropy on the mean method to constrain the solution. This method originates from statistical mechanics and information theory. This thesis is divided into two main parts containing three chapters each. The first part reviews the magnetoencephalographic inverse problem: the theory needed to understand its context and the hypotheses for simplifying the problem. In the last chapter of this first part, the maximum entropy on the mean method is presented: its origins are explained, and also how it is applied to our problem. The second part is the original work of this thesis, presenting three articles: one already published and two others submitted for publication. In the first article, a biomagnetic source model is developed and applied in a theoretical context, but still demonstrating the efficiency of the method. In the second article, we go one step further towards realistic modelling of the cerebral activation. The main priors are estimated using the magnetoencephalographic data. This method proved to be very efficient in realistic simulations. In the third article, the previous method is extended to deal with time signals, thus exploiting the excellent time resolution offered by magnetoencephalography. Compared with our previous work, the temporal method is applied to real magnetoencephalographic data coming from a somatotopy experiment, and the results agree with previous physiological knowledge about this kind of cognitive process.

  12. Imputing Risk Tolerance From Survey Responses

    PubMed Central

    Kimball, Miles S.; Sahm, Claudia R.; Shapiro, Matthew D.

    2010-01-01

    Economic theory assigns a central role to risk preferences. This article develops a measure of relative risk tolerance using responses to hypothetical income gambles in the Health and Retirement Study. In contrast to most survey measures that produce an ordinal metric, this article shows how to construct a cardinal proxy for the risk tolerance of each survey respondent. The article also shows how to account for measurement error in estimating this proxy and how to obtain consistent regression estimates despite the measurement error. The risk tolerance proxy is shown to explain differences in asset allocation across households. PMID:20407599

  13. Estimation of the biserial correlation and its sampling variance for use in meta-analysis.

    PubMed

    Jacobs, Perke; Viechtbauer, Wolfgang

    2017-06-01

    Meta-analyses are often used to synthesize the findings of studies examining the correlational relationship between two continuous variables. When only dichotomous measurements are available for one of the two variables, the biserial correlation coefficient can be used to estimate the product-moment correlation between the two underlying continuous variables. Unlike the point-biserial correlation coefficient, biserial correlation coefficients can therefore be integrated with product-moment correlation coefficients in the same meta-analysis. The present article describes the estimation of the biserial correlation coefficient for meta-analytic purposes and reports simulation results comparing different methods for estimating the coefficient's sampling variance. The findings indicate that commonly employed methods yield inconsistent estimates of the sampling variance across a broad range of research situations. In contrast, consistent estimates can be obtained using two methods that appear to be unknown in the meta-analytic literature. A variance-stabilizing transformation for the biserial correlation coefficient is described that allows for the construction of confidence intervals for individual coefficients with close to nominal coverage probabilities in most of the examined conditions. Copyright © 2016 John Wiley & Sons, Ltd.
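
    For orientation, a one-line version of the conversion the abstract builds on, estimating the biserial correlation from a point-biserial value via the standard normal ordinate at the dichotomization threshold; the input values here are invented:

    ```python
    import numpy as np
    from scipy.stats import norm

    r_pb = 0.30                        # observed point-biserial correlation
    p = 0.40                           # proportion of cases in the "1" group
    q = 1 - p
    h = norm.pdf(norm.ppf(p))          # normal density at the cut point
    r_bis = r_pb * np.sqrt(p * q) / h  # biserial estimate of the latent
    print(round(r_bis, 4))             # product-moment correlation
    ```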

  14. Applying Thiessen Polygon Catchment Areas and Gridded Population Weights to Estimate Conflict-Driven Population Changes in South Sudan

    NASA Astrophysics Data System (ADS)

    Jordan, L.

    2017-10-01

    Recent violence in South Sudan has produced significant levels of conflict-driven migration, undermining the accuracy and utility of both national and local population forecasts commonly used in demographic estimates, public health metrics and food security proxies. This article explores the use of Thiessen polygons and population grids (Gridded Population of the World, WorldPop and LandScan) as weights for estimating the catchment areas of settlement locations that serve large populations of internally displaced persons (IDPs), in order to estimate the county-level in- and out-migration attributable to conflict-driven displacement between 2014-2015. Acknowledging IDP totals improves the internal population estimates presented by global population databases. Unlike other forecasts, which produce spatially uniform increases in population, accounting for displaced populations reveals that 15 percent of counties (n = 12) increased in population by over 20 percent, and 30 percent of counties (n = 24) experienced zero or declining population growth, due to internal displacement and refugee out-migration. Adopting Thiessen polygon catchment zones for internal migration estimation can be applied to other areas with United Nations IDP settlement data, such as Yemen, Somalia, and Nigeria.
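
    A minimal sketch of the catchment construction: assigning each population grid cell to its nearest settlement is exactly the Thiessen (Voronoi) rule, so catchment populations can be accumulated with a nearest-neighbour query. Coordinates and populations below are synthetic placeholders, not South Sudan data:

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(4)
    settlements = rng.uniform(0, 100, size=(10, 2))      # settlement locations
    gx, gy = np.meshgrid(np.arange(0.5, 100, 1.0), np.arange(0.5, 100, 1.0))
    cells = np.column_stack([gx.ravel(), gy.ravel()])    # grid cell centres
    pop = rng.poisson(50, size=len(cells))               # gridded population

    # nearest settlement per cell = Thiessen polygon membership
    _, nearest = cKDTree(settlements).query(cells)
    catchment_pop = np.bincount(nearest, weights=pop,
                                minlength=len(settlements))
    print(catchment_pop)                                 # population per catchment
    ```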

  15. Directions of arrival estimation with planar antenna arrays in the presence of mutual coupling

    NASA Astrophysics Data System (ADS)

    Akkar, Salem; Harabi, Ferid; Gharsallah, Ali

    2013-06-01

    Directions of arrival (DoAs) estimation of multiple sources using an antenna array is a challenging topic in wireless communication. The DoAs estimation accuracy depends not only on the selected technique and algorithm, but also on the geometrical configuration of the antenna array used during the estimation. In this article the robustness of common planar antenna arrays against unaccounted mutual coupling is examined, and their DoAs estimation capabilities are compared and analysed through computer simulations using the well-known MUltiple SIgnal Classification (MUSIC) algorithm. Our analysis is based on an electromagnetic concept to calculate an approximation of the impedance matrices that define the mutual coupling matrix (MCM). Furthermore, a CRB analysis is presented and used as an asymptotic performance benchmark for the studied antenna arrays. The impact of the studied antenna array geometries on the MCM structure is also investigated. Simulation results show that the UCCA is more robust against unaccounted mutual coupling and yields better results than both the UCA and URA geometries. The simulations also confirm that, although the UCCA achieves better performance under complicated scenarios, the URA shows better asymptotic (CRB) behaviour, which promises more accuracy in DoAs estimation.
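
    For readers unfamiliar with MUSIC, a bare-bones pseudospectrum sketch follows; it uses a uniform linear array and ignores mutual coupling entirely (the article's planar-array and coupling analysis is far richer), with all scenario parameters invented:

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    rng = np.random.default_rng(5)
    M, K, N = 8, 2, 200                    # sensors, sources, snapshots
    d = 0.5                                # element spacing in wavelengths
    doas = np.deg2rad([-10.0, 25.0])       # true directions

    def steer(theta):
        # ULA steering vectors, one column per angle
        return np.exp(-2j * np.pi * d * np.arange(M)[:, None] * np.sin(theta))

    A = steer(doas)
    S = (rng.normal(size=(K, N)) + 1j * rng.normal(size=(K, N))) / np.sqrt(2)
    X = A @ S + 0.1 * (rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N)))

    R = X @ X.conj().T / N                 # sample covariance
    w, V = np.linalg.eigh(R)               # eigenvalues ascending
    En = V[:, : M - K]                     # noise subspace

    grid = np.deg2rad(np.linspace(-90, 90, 1801))
    p = 1.0 / np.sum(np.abs(En.conj().T @ steer(grid)) ** 2, axis=0)
    idx, _ = find_peaks(p)                 # spectrum peaks = DoA estimates
    top = idx[np.argsort(p[idx])[-K:]]
    print(np.sort(np.rad2deg(grid[top])))  # should be near -10 and 25 degrees
    ```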

  16. Estimating power capability of aged lithium-ion batteries in presence of communication delays

    NASA Astrophysics Data System (ADS)

    Fridholm, Björn; Wik, Torsten; Kuusisto, Hannes; Klintberg, Anton

    2018-04-01

    Efficient control of electrified powertrains requires accurate estimation of the power capability of the battery for the next few seconds into the future. When implemented in a vehicle, the power estimation is part of a control loop that may contain several networked controllers which introduces time delays that may jeopardize stability. In this article, we present and evaluate an adaptive power estimation method that robustly can handle uncertain health status and time delays. A theoretical analysis shows that stability of the closed loop system can be lost if the resistance of the model is under-estimated. Stability can, however, be restored by filtering the estimated power at the expense of slightly reduced bandwidth of the signal. The adaptive algorithm is experimentally validated in lab tests using an aged lithium-ion cell subject to a high power load profile in temperatures from -20 to +25 °C. The upper voltage limit was set to 4.15 V and the lower voltage limit to 2.6 V, where significant non-linearities are occurring and the validity of the model is limited. After an initial transient when the model parameters are adapted, the prediction accuracy is within ± 2 % of the actually available power.

  17. Marginalized zero-inflated Poisson models with missing covariates.

    PubMed

    Benecha, Habtamu K; Preisser, John S; Divaris, Kimon; Herring, Amy H; Das, Kalyan

    2018-05-11

    Unlike zero-inflated Poisson regression, marginalized zero-inflated Poisson (MZIP) models for counts with excess zeros provide estimates with direct interpretations for the overall effects of covariates on the marginal mean. In the presence of missing covariates, MZIP and many other count data models are ordinarily fitted using complete case analysis methods due to lack of appropriate statistical methods and software. This article presents an estimation method for MZIP models with missing covariates. The method, which is applicable to other missing data problems, is illustrated and compared with complete case analysis by using simulations and dental data on the caries preventive effects of a school-based fluoride mouthrinse program. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Safety envelope for load tolerance of structural element design based on multi-stage testing

    DOE PAGES

    Park, Chanyoung; Kim, Nam H.

    2016-09-06

    Structural elements, such as stiffened panels and lap joints, are basic components of aircraft structures. For aircraft structural design, designers select predesigned elements satisfying the design load requirement based on their load-carrying capabilities. Therefore, estimation of the safety envelope of structural elements for load tolerances would be a good investment for design purposes. In this article, a method of estimating the safety envelope is presented using probabilistic classification, which can estimate a specific level of failure probability under both aleatory and epistemic uncertainties. An important contribution of this article is that the calculation uncertainty is reflected in building a safety envelope using a Gaussian process, and the effect of element test data on reducing the calculation uncertainty is incorporated by updating the Gaussian process model with the element test data. It is shown that even one element test can significantly reduce the calculation uncertainty due to lacking knowledge of the actual physics, so that conservativeness in a safety envelope is significantly reduced. The proposed approach was demonstrated with a cantilever beam example, which represents a structural element. The example shows that calculation uncertainty provides about 93% conservativeness against the uncertainty due to a few element tests. As a result, it is shown that even a single element test can increase the load tolerance modeled with the safety envelope by 20%.

  19. Global or local construction materials for post-disaster reconstruction? Sustainability assessment of 20 post-disaster shelter designs

    PubMed Central

    Zea Escamilla, E.; Habert, G.

    2015-01-01

    This data article presents the life cycle inventories of 20 transitional shelter solutions. The data were gathered from the reports "8 shelter designs" [1] and "10 post-disaster shelter designs" [2], from work on the environmental impact of brick production outside of Europe [3], and from the optimization of bamboo-based post-disaster housing units for tropical and subtropical regions using LCA methodologies [4]. These reports include bills of quantities, plans, performance analyses, and the lifespans of the studied shelters. The data from these reports were used to develop the Life Cycle Inventories (LCI). All amounts were converted from their original units (length, volume and amount) into mass (kg), and the transport distances into ton×km. These LCIs represent the production phases of each shelter and the transportation distances for the construction materials. Two types of distances were included, local (road) and international (freight ship), which were estimated based on the area of the country of study. Furthermore, a digital visualization is presented for each of the 20 shelter designs. Moreover, this data article presents a summary of the results for the categories Environment, Cost and Risk and the contribution to the environmental impact from the different building components of each shelter. These results are related to the article "Global or local construction materials for post-disaster reconstruction? Sustainability assessment of 20 post-disaster shelter designs" [5]. PMID:26217807

  20. Global or local construction materials for post-disaster reconstruction? Sustainability assessment of 20 post-disaster shelter designs.

    PubMed

    Zea Escamilla, E; Habert, G

    2015-09-01

    This data article presents the life cycle inventories of 20 transitional shelter solutions. The data were gathered from the reports "8 shelter designs" [1] and "10 post-disaster shelter designs" [2], from work on the environmental impact of brick production outside of Europe [3], and from the optimization of bamboo-based post-disaster housing units for tropical and subtropical regions using LCA methodologies [4]. These reports include bills of quantities, plans, performance analyses, and the lifespans of the studied shelters. The data from these reports were used to develop the Life Cycle Inventories (LCI). All amounts were converted from their original units (length, volume and amount) into mass (kg), and the transport distances into ton×km. These LCIs represent the production phases of each shelter and the transportation distances for the construction materials. Two types of distances were included, local (road) and international (freight ship), which were estimated based on the area of the country of study. Furthermore, a digital visualization is presented for each of the 20 shelter designs. Moreover, this data article presents a summary of the results for the categories Environment, Cost and Risk and the contribution to the environmental impact from the different building components of each shelter. These results are related to the article "Global or local construction materials for post-disaster reconstruction? Sustainability assessment of 20 post-disaster shelter designs" [5].

  1. Hourly test reference weather data in the changing climate of Finland for building energy simulations.

    PubMed

    Jylhä, Kirsti; Ruosteenoja, Kimmo; Jokisalo, Juha; Pilli-Sihvola, Karoliina; Kalamees, Targo; Mäkelä, Hanna; Hyvönen, Reijo; Drebs, Achim

    2015-09-01

    Dynamic building energy simulations need hourly weather data as input. The same high temporal resolution is required for assessments of future heating and cooling energy demand. The data presented in this article concern current typical values and estimated future changes in outdoor air temperature, wind speed, relative humidity and global, diffuse and normal solar radiation components. Simulated annual and seasonal delivered energy consumptions for heating of spaces, heating of ventilation supply air and cooling of spaces in the current and future climatic conditions are also presented for an example house, with district heating and a mechanical space cooling system. We provide details on how the synthetic future weather files were created and utilised as input data for dynamic building energy simulations by the IDA Indoor Climate and Energy program and also for calculations of heating and cooling degree-day sums. The information supplied here is related to the research article titled "Energy demand for the heating and cooling of residential houses in Finland in a changing climate" [1].
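
    As a small illustration of one quantity such weather files feed, the sketch below computes a heating degree-day sum from synthetic hourly temperatures; the 17 °C base temperature is a common convention and an assumption here, not a value taken from the article:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    hours = 24 * 365
    # synthetic year of hourly outdoor temperatures, degrees Celsius
    hourly_t = (5 + 10 * np.sin(np.linspace(0, 2 * np.pi, hours))
                + rng.normal(0, 3, hours))

    daily_mean = hourly_t.reshape(365, 24).mean(axis=1)
    hdd = np.sum(np.clip(17.0 - daily_mean, 0, None))  # heating degree-days
    print(round(hdd))
    ```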

  2. Retrospective cost-effectiveness analyses for polio vaccination in the United States.

    PubMed

    Thompson, Kimberly M; Tebbens, Radboud J Duintjer

    2006-12-01

    The history of polio vaccination in the United States spans 50 years and includes different phases of the disease, multiple vaccines, and a sustained significant commitment of resources. We estimated cost-effectiveness ratios and assessed the net benefits of polio vaccination applicable at various points in time from the societal perspective and we discounted these back to appropriate points in time. We reconstructed vaccine price data from available sources and used these to retrospectively estimate the total costs of the U.S. historical polio vaccination strategies (all costs reported in year 2002 dollars). We estimate that the United States invested approximately US dollars 35 billion (1955 net present value, discount rate of 3%) in polio vaccines between 1955 and 2005 and will invest approximately US dollars 1.4 billion (1955 net present value, or US dollars 6.3 billion in 2006 net present value) between 2006 and 2015 assuming a policy of continued use of inactivated poliovirus vaccine (IPV) for routine vaccination. The historical and future investments translate into over 1.7 billion vaccinations that prevent approximately 1.1 million cases of paralytic polio and over 160,000 deaths (1955 net present values of approximately 480,000 cases and 73,000 deaths). Due to treatment cost savings, the investment implies net benefits of approximately US dollars 180 billion (1955 net present value), even without incorporating the intangible costs of suffering and death and of averted fear. Retrospectively, the U.S. investment in polio vaccination represents a highly valuable, cost-saving public health program. Observed changes in the cost-effectiveness ratio estimates over time suggest the need for living economic models for interventions that appropriately change with time. This article also demonstrates that estimates of cost-effectiveness ratios at any single time point may fail to adequately consider the context of the investment made to date and the importance of population and other dynamics, and shows the importance of dynamic modeling.
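
    The discounting arithmetic behind figures like the "1955 net present value" can be sketched in a few lines; the cash-flow stream below is a placeholder, not the study's cost data:

    ```python
    # discount a stream of annual costs back to 1955 at a 3% rate,
    # matching the convention stated in the abstract
    def npv_1955(costs_by_year, rate=0.03):
        return sum(c / (1 + rate) ** (year - 1955) for year, c in costs_by_year)

    stream = [(1955, 1.0e9), (1960, 0.8e9), (1970, 0.6e9)]  # dollars per year
    print(f"{npv_1955(stream):.3e}")
    ```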

  3. Morphological analysis of pore size and connectivity in a thick mixed-culture biofilm.

    PubMed

    Rosenthal, Alex F; Griffin, James S; Wagner, Michael; Packman, Aaron I; Balogun, Oluwaseyi; Wells, George F

    2018-05-19

    Morphological parameters are commonly used to predict transport and metabolic kinetics in biofilms. Yet, quantification of biofilm morphology remains challenging due to imaging technology limitations and lack of robust analytical approaches. We present a novel set of imaging and image analysis techniques to estimate internal porosity, pore size distributions, and pore network connectivity to a depth of 1 mm at a resolution of 10 µm in a biofilm exhibiting both heterotrophic and nitrifying activity. Optical coherence tomography (OCT) scans revealed an extensive pore network with diameters as large as 110 µm directly connected to the biofilm surface and surrounding fluid. Thin section fluorescence in situ hybridization microscopy revealed ammonia oxidizing bacteria (AOB) distributed through the entire thickness of the biofilm. AOB were particularly concentrated in the biofilm around internal pores. Areal porosity values estimated from OCT scans were consistently lower than those estimated from multiphoton laser scanning microscopy, though the two imaging modalities showed a statistically significant correlation (r = 0.49, p<0.0001). Estimates of areal porosity were moderately sensitive to grey level threshold selection, though several automated thresholding algorithms yielded similar values to those obtained by manual thresholding performed by a panel of environmental engineering researchers (±25% relative error). These findings advance our ability to quantitatively describe the geometry of biofilm internal pore networks at length scales relevant to engineered biofilm reactors and suggest that internal pore structures provide crucial habitat for nitrifier growth. This article is protected by copyright. All rights reserved.
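
    A toy version of the automated-thresholding step used for areal porosity: Otsu's method picks a grey-level cutoff, and porosity is the fraction of pixels darker than it. The synthetic image below stands in for an OCT or microscopy slice:

    ```python
    import numpy as np
    from skimage.filters import threshold_otsu

    rng = np.random.default_rng(6)
    img = rng.normal(0.7, 0.1, size=(256, 256))            # bright biomass
    img[60:120, 80:200] = rng.normal(0.3, 0.05, (60, 120)) # a dark "pore"

    t = threshold_otsu(img)               # automated grey-level threshold
    porosity = np.mean(img < t)           # fraction classified as pore
    print(round(porosity, 3))
    ```

    Sensitivity to the threshold, as the abstract notes, can be probed simply by recomputing the porosity over a range of cutoffs around the automated value.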

  4. Alternative Methods for Handling Attrition

    PubMed Central

    Foster, E. Michael; Fang, Grace Y.

    2009-01-01

    Using data from the evaluation of the Fast Track intervention, this article illustrates three methods for handling attrition. Multiple imputation and ignorable maximum likelihood estimation produce estimates that are similar to those based on listwise-deleted data. A panel selection model that allows for selective dropout reveals that highly aggressive boys accumulate in the treatment group over time and produces a larger estimate of treatment effect. In contrast, this model produces a smaller treatment effect for girls. The article's conclusion discusses the strengths and weaknesses of the alternative approaches and outlines ways in which researchers might improve their handling of attrition. PMID:15358906

  5. Using the Human Activity Profile to Assess Functional Performance in Heart Failure.

    PubMed

    Ribeiro-Samora, Giane Amorim; Pereira, Danielle Aparecida Gomes; Vieira, Otávia Alves; de Alencar, Maria Clara Noman; Rodrigues, Roseane Santo; Carvalho, Maria Luiza Vieira; Montemezzo, Dayane; Britto, Raquel Rodrigues

    2016-01-01

    To investigate (1) the validity of using the Human Activity Profile (HAP) in patients with heart failure (HF) to estimate functional capacity; (2) the association between the HAP and 6-Minute Walk Test (6MWT) distance; and (3) the ability of the HAP to differentiate between New York Heart Association (NYHA) functional classes. In a cross-sectional study, we evaluated 62 clinically stable patients with HF (mean age, 47.98 years; NYHA class I-III). Variables included maximal functional capacity as measured by peak oxygen uptake (V̇O2) using a cardiopulmonary exercise test (CPET), peak V̇O2 as estimated by the HAP, and exercise capacity as measured by the 6MWT. The difference between the measured (CPET) and estimated (HAP) peak V̇O2 against the average values showed a bias of 2.18 mL/kg/min (P = .007). No agreement was seen between these measures when applying the Bland-Altman method. Peak V̇O2 estimated by the HAP showed a moderate association with the 6MWT distance (r = 0.62; P < .0001). Peak V̇O2 estimated by the HAP was able to statistically differentiate NYHA functional classes I, II, and III (P < .05). The estimated peak V̇O2 using the HAP was not concordant with the gold standard CPET measure. On the other hand, the HAP was able to differentiate NYHA functional class and was associated with the 6MWT distance; therefore, the HAP is a useful tool for assessing functional performance in patients with HF.

  6. [INFORMATION AWARENESS OF STUDENTS--FUTURE TECHNOLOGY FOR HEALTHY LIFESTYLES TEACHERS AND TRAINING IN THEIR EDUCATIONAL ACTIVITIES IN AREA OF HUMAN HEALTH PRESERVATION].

    PubMed

    Kalinina, I A

    2015-01-01

    In the article, results are presented from a questionnaire survey of students training to become teachers of technology and healthy lifestyles, on issues of the shaping of health and a healthy lifestyle. An estimate is given of the degree to which an orientation toward a healthy lifestyle, including eating behavior and nutrition, has been formed in the students. The basic directions for shaping the health-preserving competence of the school teacher were determined.

  7. Statistical power for nonequivalent pretest-posttest designs. The impact of change-score versus ANCOVA models.

    PubMed

    Oakes, J M; Feldman, H A

    2001-02-01

    Nonequivalent controlled pretest-posttest designs are central to evaluation science, yet no practical and unified approach for estimating power in the two most widely used analytic approaches to these designs exists. This article fills the gap by presenting and comparing useful, unified power formulas for ANCOVA and change-score analyses, indicating the implications of each on sample-size requirements. The authors close with practical recommendations for evaluators. Mathematical details and a simple spreadsheet approach are included in appendices.
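
    A back-of-envelope sketch of why the two models differ in power, using the standard error variances sigma^2 (1 - rho^2) for ANCOVA and 2 sigma^2 (1 - rho) for the change-score analysis, where rho is the pretest-posttest correlation (a textbook result consistent with, but not quoted from, the article); the numbers are illustrative:

    ```python
    import numpy as np
    from scipy.stats import norm

    sigma, rho, delta = 10.0, 0.6, 3.0     # SD, pre-post correlation, effect
    alpha, power = 0.05, 0.80
    z = (norm.ppf(1 - alpha / 2) + norm.ppf(power)) ** 2

    # per-group sample sizes for a two-group comparison of means
    n_ancova = 2 * z * sigma**2 * (1 - rho**2) / delta**2
    n_change = 2 * z * sigma**2 * 2 * (1 - rho) / delta**2
    print(round(n_ancova), round(n_change))  # ANCOVA never needs more subjects
    ```

    Since (1 - rho^2) <= 2(1 - rho) for any rho in [0, 1], the ANCOVA residual variance is never larger, which is the core of the sample-size implication the abstract refers to.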

  8. An overview of coefficient alpha and a reliability matrix for estimating adequacy of internal consistency coefficients with psychological research measures.

    PubMed

    Ponterotto, Joseph G; Ruckdeschel, Daniel E

    2007-12-01

    The present article addresses issues in reliability assessment that are often neglected in psychological research such as acceptable levels of internal consistency for research purposes, factors affecting the magnitude of coefficient alpha (alpha), and considerations for interpreting alpha within the research context. A new reliability matrix anchored in classical test theory is introduced to help researchers judge adequacy of internal consistency coefficients with research measures. Guidelines and cautions in applying the matrix are provided.

  9. [Clinical and immunological assessment of Polyoxidonium and Tantum Verde efficiency by catarrhal gingivitis treatment in children with chronic gastroduodenitis].

    PubMed

    Kazarina, L N; Pursanova, A E

    2014-01-01

    The article presents findings allowing estimation of the effect of local application of Polyoxidonium and Tantum Verde in 101 children aged 12-17 with chronic catarrhal gingivitis and chronic gastroduodenitis. A statistically significant decrease of the PMA index (from 40.1±2.3% to 1.4±0.6%; p<0.001) proved the above treatment scheme to be highly effective for chronic catarrhal gingivitis in children with chronic gastroduodenitis.

  10. Ultrathin fiber poly-3-hydroxybutyrate, modified by silicon carbide nanoparticles

    NASA Astrophysics Data System (ADS)

    Olkhov, A. A.; Krutikova, A. A.; Goldshtrakh, M. A.; Staroverova, O. V.; Iordanskii, A. L.; Ischenko, A. A.

    2016-11-01

    The article presents the results of studies of a composite fibrous material based on poly-3-hydroxybutyrate (PHB) and nano-sized silicon carbide obtained by electrospinning. The size distribution of the silicon carbide nanoparticles in the fiber was estimated by X-ray diffraction. It is shown that immobilization of the SiC nanoparticles in the PHB fibers yields substantially smaller fiber diameters, high physical-mechanical characteristics and increased resistance to degradation in comparison with plain PHB fibers.

  11. The estimation of dynamic contact angle of ultra-hydrophobic surfaces using inclined surface and impinging droplet methods

    NASA Astrophysics Data System (ADS)

    Jasikova, Darina; Kotek, Michal

    2014-03-01

    The development of industrial technology brings with it demands for optimized surface quality, particularly where there is contact with food. Application of an ultra-hydrophobic surface significantly reduces the growth of bacteria and facilitates cleaning processes. Two methods are used for testing and evaluating surface quality: the impinging droplet method and the inclined surface method, optimized with high-speed shadowgraphy, both of which give information about the dynamic contact angle. This article presents the results of research into these methods as applied to a patented ultra-hydrophobic technology.

  12. Application of Synchrophasor Measurements for Improving Situational Awareness of the Power System

    NASA Astrophysics Data System (ADS)

    Obushevs, A.; Mutule, A.

    2018-04-01

    The paper focuses on the application of synchrophasor measurements, which offer unprecedented benefits compared to SCADA systems, in order to facilitate the successful transformation of the Nordic-Baltic and European electric power system to operate with large amounts of renewable energy sources and to improve situational awareness of the power system. The article describes new functionalities of visualisation tools that estimate grid inertia levels in real time, with monitoring results between the Nordic and Baltic power systems.

  13. Within-Tunnel Variations in Pressure Data for Three Transonic Wind Tunnels

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2014-01-01

    This paper compares the results of pressure measurements made on the same test article with the same test matrix in three transonic wind tunnels. A comparison is presented of the unexplained variance associated with polar replicates acquired in each tunnel. The impact of a significant component of systematic (not random) unexplained variance is reviewed, and the results of analyses of variance are presented to assess the degree of significant systematic error in these representative wind tunnel tests. Total uncertainty estimates are reported for 140 samples of pressure data, quantifying the effects of within-polar random errors and between-polar systematic bias errors.
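
    A minimal sketch of the kind of variance decomposition such analyses of variance perform: a balanced one-way ANOVA separating within-polar random error from between-polar systematic variance. The polar data here are simulated; this is not the paper's analysis or data.

```python
import numpy as np

def variance_components(groups):
    """groups: list of equal-length 1-D arrays, one per replicated polar."""
    k = len(groups)
    n = len(groups[0])                       # balanced design assumed
    grand = np.mean([g.mean() for g in groups])
    ms_within = np.mean([g.var(ddof=1) for g in groups])
    ms_between = n * sum((g.mean() - grand) ** 2 for g in groups) / (k - 1)
    var_between = max(0.0, (ms_between - ms_within) / n)
    return ms_within, var_between

rng = np.random.default_rng(1)
# each polar: random error sd 0.02 plus a per-polar systematic offset sd 0.05
polars = [0.02 * rng.standard_normal(20) + rng.normal(scale=0.05)
          for _ in range(6)]
w, b = variance_components(polars)
print(f"within-polar var {w:.5f}, between-polar var {b:.5f}, "
      f"total sd {np.sqrt(w + b):.3f}")
```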

  14. Home care for the disabled elderly: predictors and expected costs.

    PubMed Central

    Coughlin, T A; McBride, T D; Perozek, M; Liu, K

    1992-01-01

    While interest in publicly funded home care for the disabled elderly is keen, basic policy issues need to be addressed before an appropriate program can be adopted and financed. This article presents findings from a study in which the cost implications of anticipated behavioral responses (for example, caregiver substitution) are estimated. Using simulation techniques, the results demonstrate that anticipated behavioral responses would likely add between $1.8 and $2.7 billion (1990 dollars) to the costs of a public home care program. Results from a variety of cost simulations are presented. The data base for the study was the 1982 National Long-Term Care Survey. PMID:1399652

  15. Daily stock index return for the Canadian, UK, and US equity markets, compiled by Morgan Stanley Capital International, obtained from Datastream.

    PubMed

    Li, Leon

    2018-02-01

    The data presented in this article are related to the research article entitled "Testing and comparing the performance of dynamic variance and correlation models in value-at-risk estimation" (Li, 2017) [1], North American Journal of Economics and Finance, 40, 116-135, doi:10.1016/j.najef.2017.02.006. Data on daily stock index returns for the Canadian, UK, and US equity markets, as compiled by Morgan Stanley Capital International, are provided in this paper. The country indices comprise at least 80% of the stock market capitalization of each country. The data cover the period from January 1, 1990, through September 8, 2016, and include 6963 observations. All stock prices are stated in dollars.
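
    As a usage illustration only (not part of the dataset or the related article), a common use of such return series is historical value-at-risk estimation; the simulated price path below merely mimics the series length.

```python
import numpy as np

def historical_var(prices, level=0.99):
    """One-day historical VaR from a price series (positive number = loss)."""
    returns = np.diff(np.log(prices))          # daily log returns
    return -np.quantile(returns, 1 - level)

rng = np.random.default_rng(2)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0002, 0.01, 6963)))
print(f"99% one-day VaR: {historical_var(prices):.3%}")
```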

  16. Gibbs Sampler-Based λ-Dynamics and Rao-Blackwell Estimator for Alchemical Free Energy Calculation.

    PubMed

    Ding, Xinqiang; Vilseck, Jonah Z; Hayes, Ryan L; Brooks, Charles L

    2017-06-13

    λ-dynamics is a generalized ensemble method for alchemical free energy calculations. In traditional λ-dynamics, the alchemical switch variable λ is treated as a continuous variable ranging from 0 to 1 and an empirical estimator is utilized to approximate the free energy. In the present article, we describe an alternative formulation of λ-dynamics that utilizes the Gibbs sampler framework, which we call Gibbs sampler-based λ-dynamics (GSLD). GSLD, like traditional λ-dynamics, can be readily extended to calculate free energy differences between multiple ligands in one simulation. We also introduce a new free energy estimator, the Rao-Blackwell estimator (RBE), for use in conjunction with GSLD. Compared with the current empirical estimator, the RBE is unbiased and usually has smaller variance. We also show that the multistate Bennett acceptance ratio equation or the unbinned weighted histogram analysis method equation can be derived using the RBE. We illustrate the use and performance of this new free energy computational framework by application to a simple harmonic system as well as relevant calculations of small molecule relative free energies of solvation and binding to a protein receptor. Our findings demonstrate consistent and improved performance compared with conventional alchemical free energy methods.
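
    The abstract's central statistical idea, averaging conditional expectations instead of raw samples, can be illustrated on a toy Gibbs sampler for a bivariate normal. This is generic Rao-Blackwellization, not the authors' GSLD/RBE formulation for alchemical free energies.

```python
import numpy as np

rng = np.random.default_rng(3)
rho, T = 0.9, 20000
x = y = 0.0
naive, rb = [], []
for _ in range(T):
    # Gibbs sweep for a zero-mean bivariate normal with correlation rho
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))
    naive.append(x)        # empirical estimator averages the raw samples
    rb.append(rho * y)     # Rao-Blackwell averages E[X | y] = rho * y instead
print(f"naive: mean {np.mean(naive):+.4f}, per-sample sd {np.std(naive):.3f}")
print(f"RB:    mean {np.mean(rb):+.4f}, per-sample sd {np.std(rb):.3f}")
```

    Both streams estimate E[X] = 0, but the Rao-Blackwellized values have spread ρ·sd(Y) < sd(X), which is the variance advantage the abstract describes.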

  17. Auditing of suppliers as the requirement of quality management systems in construction

    NASA Astrophysics Data System (ADS)

    Harasymiuk, Jolanta; Barski, Janusz

    2017-07-01

    The choice of a supplier of construction materials can be an important factor in increasing or reducing the cost of building works. Construction materials account for 40% to 70% of the cost of an investment task, depending on the kind of works being realized. Suppliers must be evaluated both from the point of view of the effectiveness of the construction undertaking and from the point of view of conformity with the quality management systems implemented in the contractors' organizations. The evaluation of suppliers of construction materials and subcontractors of specialist works is a formal requirement in quality management systems that meet the requirements of the ISO 9001 standard. The aim of this paper is to show the possibilities of using an audit to evaluate the credibility and reliability of a supplier of construction materials. The article describes the kinds of audits carried out in quality management systems, with particular attention to so-called second-party audits. The criteria for evaluating qualitative capability and the method of selecting a supplier of construction materials are characterized. The paper also proposes exemplary questions to be evaluated in the audit process, the way this evaluation is conducted, and the factors that condition it.

  18. Modeling good research practices--overview: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force--1.

    PubMed

    Caro, J Jaime; Briggs, Andrew H; Siebert, Uwe; Kuntz, Karen M

    2012-01-01

    Models--mathematical frameworks that facilitate estimation of the consequences of health care decisions--have become essential tools for health technology assessment. Evolution of the methods since the first ISPOR Modeling Task Force reported in 2003 has led to a new Task Force, jointly convened with the Society for Medical Decision Making, and this series of seven articles presents the updated recommendations for best practices in conceptualizing models; implementing state-transition approaches, discrete event simulations, or dynamic transmission models; and dealing with uncertainty and validating and reporting models transparently. This overview article introduces the work of the Task Force, provides all the recommendations, and discusses some quandaries that require further elucidation. The audience for these articles includes those who build models, stakeholders who utilize their results, and, indeed, anyone concerned with the use of models to support decision making. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  19. Quality Improvement of Chrome-Diamond Coatings on Flowing Chrome Plating

    NASA Astrophysics Data System (ADS)

    Belyaev, V. N.; Koslyuk, A. Yu; Lobunets, A. V.; Andreyev, A. S.

    2016-04-01

    Research results are presented on the process of flowing chrome plating of the internal surfaces of long cylindrical articles, using an electrolyte with ultra-dispersed diamonds and continuous rotation of the article during plating. In the experiments, electrolyte temperature and article rotation frequency were chosen as the varied technological parameters, and experimental samples were obtained. The porosity, micro-hardness, thickness, and uniformity of the chrome coatings were estimated, and the precipitation structure was examined by scanning electron microscopy. The results showed that the use of ultra-dispersed diamonds, together with rotation of the detail-cathode during flowing chromium plating, improves the service characteristics of the coating: grain size and porosity of the chrome coating decrease while micro-hardness increases, confirming the efficiency of the suggested coating-application scheme and the given type of ultra-dispersed filler.

  20. Flowfield Analysis of a Small Entry Probe (SPRITE) Tested in an Arc Jet

    NASA Technical Reports Server (NTRS)

    Prabhu, Dinesh K.

    2011-01-01

    Results of simulations of the flow of an arc-heated stream around a 14-inch diameter 45° sphere-cone configuration are presented. Computations are first benchmarked against pressure and heat flux measurements made using copper slug calorimeters of different shapes and sizes. The influence of the catalycity of copper on computed results is investigated. Good agreement between predictions and measurements is obtained by assuming the copper slug to be partially catalytic to atomic recombination. With total enthalpy estimates obtained from these preliminary computations, calculations are then performed for the test article, with the nozzle and test article considered as an integrated whole, the same procedure adopted for the calorimeter simulations. The resulting heat fluxes at select points on the test article (points at which fully instrumented plugs were placed) are used in material thermal response code calculations. Predicted time histories of temperature are compared against thermocouple data from the instrumented plugs, and recession is determined. Good agreement is obtained for in-depth thermocouples.

  1. Refining estimates of public health spending as measured in national health expenditure accounts: the Canadian experience.

    PubMed

    Ballinger, Geoff

    2007-01-01

    The recent focus on public health stemming from, among other things, severe acute respiratory syndrome and avian flu has created an imperative to refine health-spending estimates in the Canadian Health Accounts. This article presents the Canadian experience in attempting to address the challenges associated with developing the needed taxonomies for systematically capturing, measuring, and analyzing the national investment in the Canadian public health system. The first phase of this process, completed in 2005, was a 2-year project to estimate public health spending based on a more classic definition by removing the administration component of the previously combined public health and administration category. Comparing the refined public health estimate with recent data from the Organization for Economic Cooperation and Development still positions Canada with the highest share of total health expenditure devoted to public health of any reporting country. The article also provides an analysis of the comparability of public health estimates across jurisdictions within Canada as well as a discussion of the recommendations for ongoing improvement of public health spending estimates. The Canadian Institute for Health Information is an independent, not-for-profit organization that provides Canadians with essential statistics and analysis on the performance of the Canadian health system, the delivery of healthcare, and the health status of Canadians. The Canadian Institute for Health Information administers more than 20 databases and registries, including Canada's Health Accounts, which historically tracks 40 categories of health spending by 5 sources of finance for 13 provincial and territorial jurisdictions. Until 2005, expenditure on public health services in the Canadian Health Accounts included measures to prevent the spread of communicable disease, food and drug safety, health inspections, health promotion, community mental health programs, public health nursing, as well as all the costs for the general administration of government health departments.

  2. Models for estimating and projecting global, regional and national prevalence and disease burden of asthma: protocol for a systematic review.

    PubMed

    Bhuia, Mohammad Romel; Nwaru, Bright I; Weir, Christopher J; Sheikh, Aziz

    2017-05-17

    Models that have so far been used to estimate and project the prevalence and disease burden of asthma are in most cases inadequately described and irreproducible. We aim systematically to describe and critique the existing models in relation to their strengths, limitations and reproducibility, and to determine the appropriate models for estimating and projecting the prevalence and disease burden of asthma. We will search the following electronic databases to identify relevant literature published from 1980 to 2017: Medline, Embase, WHO Library and Information Services and Web of Science Core Collection. We will identify additional studies by searching the reference list of all the retrieved papers and contacting experts. We will include observational studies that used models for estimating and/or projecting prevalence and disease burden of asthma regarding human population of any age and sex. Two independent reviewers will assess the studies for inclusion and extract data from included papers. Data items will include authors' names, publication year, study aims, data source and time period, study population, asthma outcomes, study methodology, model type, model settings, study variables, methods of model derivation, methods of parameter estimation and/or projection, model fit information, key findings and identified research gaps. A detailed critical narrative synthesis of the models will be undertaken in relation to their strengths, limitations and reproducibility. A quality assessment checklist and scoring framework will be used to determine the appropriate models for estimating and projecting the prevalence and disease burden of asthma. We will not collect any primary data for this review, and hence there is no need for formal National Health Services Research Ethics Committee approval. We will present our findings at scientific conferences and publish the findings in a peer-reviewed scientific journal. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  3. Neural network uncertainty assessment using Bayesian statistics: a remote sensing application

    NASA Technical Reports Server (NTRS)

    Aires, F.; Prigent, C.; Rossow, W. B.

    2004-01-01

    Neural network (NN) techniques have proved successful for many regression problems, in particular for remote sensing; however, uncertainty estimates are rarely provided. In this article, a Bayesian technique to evaluate uncertainties of the NN parameters (i.e., synaptic weights) is first presented. In contrast to more traditional approaches based on point estimation of the NN weights, we assess uncertainties on such estimates to monitor the robustness of the NN model. These theoretical developments are illustrated by applying them to the problem of retrieving surface skin temperature, microwave surface emissivities, and integrated water vapor content from a combined analysis of satellite microwave and infrared observations over land. The weight uncertainty estimates are then used to compute analytically the uncertainties in the network outputs (i.e., error bars and correlation structure of these errors). Such quantities are very important for evaluating any application of an NN model. The uncertainties on the NN Jacobians are then considered in the third part of this article. Used for regression fitting, NN models can be used effectively to represent highly nonlinear, multivariate functions. In this situation, most emphasis is put on estimating the output errors, but almost no attention has been given to errors associated with the internal structure of the regression model. The complex structure of dependency inside the NN is the essence of the model, and assessing its quality, coherency, and physical character makes all the difference between a blackbox model with small output errors and a reliable, robust, and physically coherent model. Such dependency structures are described to the first order by the NN Jacobians: they indicate the sensitivity of one output with respect to the inputs of the model for given input data. We use a Monte Carlo integration procedure to estimate the robustness of the NN Jacobians. A regularization strategy based on principal component analysis is proposed to suppress the multicollinearities in order to make these Jacobians robust and physically meaningful.
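
    A minimal sketch of the general Monte Carlo idea described here: draw weights from an assumed Gaussian posterior and propagate them through the model to obtain output error bars. The toy "network", MAP weights, and covariance are all invented; the authors' Bayesian treatment of a full NN is far more elaborate.

```python
import numpy as np

rng = np.random.default_rng(4)

def f(x, w):
    """Toy 'network': one tanh hidden layer with fixed input weights."""
    h = np.tanh(np.outer(x, [1.0, -0.5, 2.0]))   # hidden activations
    return h @ w                                 # output-layer weights w

w_map = np.array([0.8, -1.2, 0.5])       # fitted weights (assumed)
cov_w = 0.01 * np.eye(3)                 # assumed posterior covariance
L = np.linalg.cholesky(cov_w)

x = np.linspace(-2, 2, 5)
draws = np.stack([f(x, w_map + L @ rng.standard_normal(3))
                  for _ in range(2000)])
print("output mean:", draws.mean(0).round(3))
print("error bars (1 sigma):", draws.std(0).round(3))
```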

  4. Sizing the Problem of Improving Discovery and Access to NIH-Funded Data: A Preliminary Study

    PubMed Central

    2015-01-01

    Objective This study informs efforts to improve the discoverability of and access to biomedical datasets by providing a preliminary estimate of the number and type of datasets generated annually by research funded by the U.S. National Institutes of Health (NIH). It focuses on those datasets that are “invisible” or not deposited in a known repository. Methods We analyzed NIH-funded journal articles that were published in 2011, cited in PubMed and deposited in PubMed Central (PMC) to identify those that indicate data were submitted to a known repository. After excluding those articles, we analyzed a random sample of the remaining articles to estimate how many and what types of invisible datasets were used in each article. Results About 12% of the articles explicitly mention deposition of datasets in recognized repositories, leaving 88% that are invisible datasets. Among articles with invisible datasets, we found an average of 2.9 to 3.4 datasets, suggesting there were approximately 200,000 to 235,000 invisible datasets generated from NIH-funded research published in 2011. Approximately 87% of the invisible datasets consist of data newly collected for the research reported; 13% reflect reuse of existing data. More than 50% of the datasets were derived from live human or non-human animal subjects. Conclusion In addition to providing a rough estimate of the total number of datasets produced per year by NIH-funded researchers, this study identifies additional issues that must be addressed to improve the discoverability of and access to biomedical research data: the definition of a “dataset,” determination of which (if any) data are valuable for archiving and preservation, and better methods for estimating the number of datasets of interest. Lack of consensus amongst annotators about the number of datasets in a given article reinforces the need for a principled way of thinking about how to identify and characterize biomedical datasets. PMID:26207759

  5. Measurement of psychological disorders using cognitive diagnosis models.

    PubMed

    Templin, Jonathan L; Henson, Robert A

    2006-09-01

    Cognitive diagnosis models are constrained (multiple classification) latent class models that characterize the relationship of questionnaire responses to a set of dichotomous latent variables. Having emanated from educational measurement, several aspects of such models seem well suited to use in psychological assessment and diagnosis. This article presents the development of a new cognitive diagnosis model for use in psychological assessment--the DINO (deterministic input; noisy "or" gate) model--which, as an illustrative example, is applied to evaluate and diagnose pathological gamblers. As part of this example, a demonstration of the estimates obtained by cognitive diagnosis models is provided. Such estimates include the probability an individual meets each of a set of dichotomous Diagnostic and Statistical Manual of Mental Disorders (text revision [DSM-IV-TR]; American Psychiatric Association, 2000) criteria, resulting in an estimate of the probability an individual meets the DSM-IV-TR definition for being a pathological gambler. Furthermore, a demonstration of how the hypothesized underlying factors contributing to pathological gambling can be measured with the DINO model is presented, through use of a covariance structure model for the tetrachoric correlation matrix of the dichotomous latent variables representing DSM-IV-TR criteria. Copyright 2006 APA
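
    The DINO item response function itself is compact enough to sketch: with ω indicating whether a respondent possesses at least one attribute the item measures, P(endorse) = (1−s)^ω · g^(1−ω). The Q-matrix and parameters below are invented for illustration and are unrelated to the gambling application.

```python
import numpy as np

def dino_prob(alpha, q, guess, slip):
    """P(item endorsed) under the DINO ('or'-gate) model.

    alpha : (n_persons, K) 0/1 latent attribute profiles
    q     : (n_items, K) Q-matrix linking items to attributes
    guess : (n_items,) endorsement prob. with no relevant attribute
    slip  : (n_items,) prob. of NOT endorsing despite a relevant attribute
    """
    # omega = 1 if the person has at least one attribute the item measures
    omega = 1 - np.prod((1 - alpha[:, None, :]) ** q[None, :, :], axis=2)
    return (1 - slip) ** omega * guess ** (1 - omega)

alpha = np.array([[0, 0], [1, 0], [1, 1]])     # three attribute profiles
q = np.array([[1, 0], [1, 1]])                 # two items
print(dino_prob(alpha, q, guess=np.array([0.1, 0.2]),
                slip=np.array([0.15, 0.1])))
```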

  6. Fault detection using a two-model test for changes in the parameters of an autoregressive time series

    NASA Technical Reports Server (NTRS)

    Scholtz, P.; Smyth, P.

    1992-01-01

    This article describes an investigation of a statistical hypothesis testing method for detecting changes in the characteristics of an observed time series. The work is motivated by the need for practical automated methods for on-line monitoring of Deep Space Network (DSN) equipment to detect failures and changes in behavior. In particular, on-line monitoring of the motor current in a DSN 34-m beam waveguide (BWG) antenna is used as an example. The algorithm is based on a measure of the information theoretic distance between two autoregressive models: one estimated with data from a dynamic reference window and one estimated with data from a sliding reference window. The Hinkley cumulative sum stopping rule is utilized to detect a change in the mean of this distance measure, corresponding to the detection of a change in the underlying process. The basic theory behind this two-model test is presented, and the problem of practical implementation is addressed, examining windowing methods, model estimation, and detection parameter assignment. Results from the five fault-transition simulations are presented to show the possible limitations of the detection method, and suggestions for future implementation are given.
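
    A rough sketch of the two-window scheme under stated simplifications: least-squares AR fits, a prediction-error distance standing in for the article's information-theoretic distance, and a one-sided Hinkley CUSUM on that distance. Window sizes, thresholds, and the simulated fault are all assumptions.

```python
import numpy as np

def fit_ar(x, p=4):
    """Least-squares AR(p) fit; returns coefficients and residual variance."""
    X = np.column_stack([x[p - i - 1:len(x) - i - 1] for i in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a, (y - X @ a).var()

def ar_distance(ref, win, p=4):
    """How badly the reference-window model predicts the sliding window."""
    a_ref, s_ref = fit_ar(ref, p)
    X = np.column_stack([win[p - i - 1:len(win) - i - 1] for i in range(p)])
    cross = win[p:] - X @ a_ref
    return np.log(cross.var() / s_ref)

def hinkley_cusum(d, mu0=0.0, nu=0.3, h=2.0):
    """One-sided Hinkley test for an upward shift of size nu in the mean of d."""
    g, alarms = 0.0, []
    for t, dt in enumerate(d):
        g = max(0.0, g + dt - mu0 - nu / 2)
        if g > h:
            alarms.append(t)
            g = 0.0
    return alarms

rng = np.random.default_rng(5)
x = np.concatenate([rng.standard_normal(500),                             # nominal
                    np.convolve(rng.standard_normal(500), [1, .8])[:500]])  # fault
ref = x[:200]
d = [ar_distance(ref, x[t:t + 100]) for t in range(0, len(x) - 100, 20)]
print("alarms at window indices:", hinkley_cusum(d))
```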

  7. Investigation of electrophysical properties of allotropic modifications of carbon in the range of temperatures 140-400 K

    NASA Astrophysics Data System (ADS)

    Goshev, A. A.; Eseev, M. K.; Volkov, A. S.; Lyah, N. L.

    2017-09-01

    The paper presents the results of an investigation of allotropic modifications of carbon (coal, graphite, fullerenes, CNTs). Dependences of conductivity on the field frequency in the temperature range 140-400 K are presented. The characteristic features associated with the structure and types of hybridization are revealed, and the activation energy of carriers is calculated. The article also presents an experimental study of the electrical properties of polymeric composites reinforced with different types of allotropic modifications of carbon (CNTs, graphite, fullerenes, coal) in an alternating electrical field over the frequency band from 0.01 Hz to 10 MHz. The percolation threshold of polymer composites with various types of additives and its influence on conduction properties were estimated.

  8. Experimental Design for Parameter Estimation of Gene Regulatory Networks

    PubMed Central

    Timmer, Jens

    2012-01-01

    Systems biology aims for building quantitative models to address unresolved issues in molecular biology. In order to describe the behavior of biological cells adequately, gene regulatory networks (GRNs) are intensively investigated. As the validity of models built for GRNs depends crucially on the kinetic rates, various methods have been developed to estimate these parameters from experimental data. For this purpose, it is favorable to choose the experimental conditions yielding maximal information. However, existing experimental design principles often rely on unfulfilled mathematical assumptions or become computationally demanding with growing model complexity. To solve this problem, we combined advanced methods for parameter and uncertainty estimation with experimental design considerations. As a showcase, we optimized three simulated GRNs in one of the challenges from the Dialogue for Reverse Engineering Assessment and Methods (DREAM). This article presents our approach, which was awarded the best performing procedure at the DREAM6 Estimation of Model Parameters challenge. For fast and reliable parameter estimation, local deterministic optimization of the likelihood was applied. We analyzed identifiability and precision of the estimates by calculating the profile likelihood. Furthermore, the profiles provided a way to uncover a selection of most informative experiments, from which the optimal one was chosen using additional criteria at every step of the design process. In conclusion, we provide a strategy for optimal experimental design and show its successful application on three highly nonlinear dynamic models. Although presented in the context of the GRNs to be inferred for the DREAM6 challenge, the approach is generic and applicable to most types of quantitative models in systems biology and other disciplines. PMID:22815723
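
    A minimal profile-likelihood sketch on an invented one-parameter-of-interest problem (exponential decay, with the amplitude profiled out analytically). The DREAM6 models and the authors' optimization pipeline are far larger, but the confidence-interval logic is the same.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(6)
t = np.linspace(0, 4, 25)
y = 2.0 * np.exp(-1.3 * t) + rng.normal(scale=0.1, size=t.size)

def nll(k):
    """Profile out the amplitude a (linear given k); concentrated -2 log L."""
    basis = np.exp(-k * t)
    a = basis @ y / (basis @ basis)        # least-squares amplitude for this k
    rss = ((y - a * basis) ** 2).sum()
    return t.size * np.log(rss)            # Gaussian likelihood, sigma profiled

ks = np.linspace(0.8, 2.0, 200)
prof = np.array([nll(k) for k in ks])
thresh = prof.min() + chi2.ppf(0.95, df=1)   # likelihood-ratio cutoff
inside = ks[prof <= thresh]
print(f"k_hat = {ks[prof.argmin()]:.3f}, "
      f"95% PL interval [{inside[0]:.3f}, {inside[-1]:.3f}]")
```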

  9. Measuring Housework Participation: The Gap between "Stylised" Questionnaire Estimates and Diary-Based Estimates

    ERIC Educational Resources Information Center

    Kan, Man Yee

    2008-01-01

    This article compares stylised (questionnaire-based) estimates and diary-based estimates of housework time collected from the same respondents. Data come from the Home On-line Study (1999-2001), a British national household survey that contains both types of estimates (sample size = 632 men and 666 women). It shows that the gap between the two…

  10. An Evaluation of Empirical Bayes's Estimation of Value-Added Teacher Performance Measures

    ERIC Educational Resources Information Center

    Guarino, Cassandra M.; Maxfield, Michelle; Reckase, Mark D.; Thompson, Paul N.; Wooldridge, Jeffrey M.

    2015-01-01

    Empirical Bayes's (EB) estimation has become a popular procedure used to calculate teacher value added, often as a way to make imprecise estimates more reliable. In this article, we review the theory of EB estimation and use simulated and real student achievement data to study the ability of EB estimators to properly rank teachers. We compare the…

  11. Mortality table construction

    NASA Astrophysics Data System (ADS)

    Sutawanir

    2015-12-01

    Mortality tables play an important role in actuarial studies such as life annuities, premium determination, premium reserves, pension plan valuation, and pension funding. Well-known mortality tables include the CSO mortality table, the Indonesian Mortality Table, the Bowers mortality table, and the Japan Mortality Table. For actuarial applications, tables are constructed for different environments: single decrement, double decrement, and multiple decrement. There are two approaches to mortality table construction: a mathematical approach and a statistical approach. Distribution models and estimation theory are the statistical concepts used in mortality table construction. This article discusses the statistical approach. The distributional assumptions are the uniform distribution of deaths (UDD) and constant force of mortality (exponential). Moment estimation and maximum likelihood are used to estimate the mortality parameter. Moment estimators are easier to manipulate than maximum likelihood estimators (MLE), but they do not use the complete mortality data; maximum likelihood exploits all available information. Some MLE equations are complicated and must be solved numerically. The article focuses on single-decrement estimation using moment and maximum likelihood estimation, and some extensions to double decrement are introduced. A simple dataset is used to illustrate the mortality estimation and the resulting mortality table.
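
    A small numeric sketch of the two estimator families discussed, on invented exposure data for a single age interval: the constant-force MLE μ̂ = deaths / central exposure with q̂ = 1 − e^(−μ̂), and an actuarial (moment-style) estimate based on initial exposure.

```python
import numpy as np

# One age interval [x, x+1): observed time in the interval and death
# indicator for 10 lives (invented data for illustration).
time_obs = np.array([1.0, 1.0, 0.4, 1.0, 0.7, 1.0, 1.0, 0.2, 1.0, 0.9])
died     = np.array([0,   0,   1,   0,   1,   0,   0,   1,   0,   0  ])

# Constant-force (exponential) MLE: mu_hat = deaths / central exposure
mu_hat = died.sum() / time_obs.sum()
q_constant_force = 1 - np.exp(-mu_hat)

# Actuarial (moment-style) estimate: deaths / initial exposure, where
# deaths are credited with exposure to the end of the year of age
initial_exposure = time_obs.sum() + ((1 - time_obs) * died).sum()
q_actuarial = died.sum() / initial_exposure

print(f"q (constant force) = {q_constant_force:.3f}, "
      f"q (actuarial) = {q_actuarial:.3f}")
```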

  12. Performance of Random Effects Model Estimators under Complex Sampling Designs

    ERIC Educational Resources Information Center

    Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan

    2011-01-01

    In this article, we consider estimation of parameters of random effects models from samples collected via complex multistage designs. Incorporation of sampling weights is one way to reduce estimation bias due to unequal probabilities of selection. Several weighting methods have been proposed in the literature for estimating the parameters of…

  13. A mathematical method for verifying the validity of measured information about the flows of energy resources based on the state estimation theory

    NASA Astrophysics Data System (ADS)

    Pazderin, A. V.; Sof'in, V. V.; Samoylenko, V. O.

    2015-11-01

    Efforts aimed at improving energy efficiency in all branches of the fuel and energy complex should begin with setting up a high-tech automated system for monitoring and accounting energy resources. Malfunctions and failures in the measurement and information parts of this system may distort commercial measurements of energy resources and lead to financial risks for power supplying organizations. In addition, measurement errors may be connected with intentional distortion of measurements to reduce payment for energy resources on the consumer's side, which leads to commercial loss of energy resources. The article presents a universal mathematical method, based on state estimation theory, for verifying the validity of measurement information in networks for transporting energy resources such as electricity and heat, petroleum, gas, etc. The energy resource transportation network is represented by a graph whose nodes correspond to producers and consumers and whose branches stand for transportation mains (power lines, pipelines, and heat network elements). The main idea of state estimation is to obtain calculated analogs of all available measurements. Unlike "raw" measurements, which contain inaccuracies, the calculated flows of energy resources, called estimates, fully satisfy all the state equations describing the energy resource transportation network, with no residuals. The difference between a measurement and its calculated analog (estimate) is known in estimation theory as an estimation remainder. Large values of the estimation remainders indicate large errors in particular energy resource measurements. Using the presented method, it is possible to improve the validity of energy resource measurements, to estimate the observability of the transportation network, to eliminate imbalances in the measured energy resource flows, and to filter out invalid measurements at the data acquisition and processing stage of an automated energy resource monitoring and accounting system.
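
    A toy version of the described scheme, under heavy simplification: a linear measurement model for a two-branch transport chain, a weighted least-squares state estimate, and inspection of the estimation remainders to spot a gross meter error. The network, noise levels, and injected error are invented.

```python
import numpy as np

# State x = [flow A->B, flow B->C] on a two-branch chain A -> B -> C.
# Redundant measurements: both branch flows, the offtake at B, input at A.
H = np.array([[1.0,  0.0],    # meter on branch A->B
              [0.0,  1.0],    # meter on branch B->C
              [1.0, -1.0],    # consumption at node B = inflow - outflow
              [1.0,  0.0]])   # input meter at A (duplicates branch A->B)
sigma = np.array([1.0, 1.0, 1.0, 1.0])
x_true = np.array([100.0, 60.0])

rng = np.random.default_rng(7)
z = H @ x_true + rng.normal(scale=sigma)
z[2] += 15.0                  # gross error injected on the node-B meter

W = np.diag(1 / sigma**2)
x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)  # WLS state estimate
r = z - H @ x_hat                                  # estimation remainders
print("estimated flows:", x_hat.round(1))
print("remainders:", r.round(1), "-> the largest one flags the bad meter")
```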

  14. Propagation of uncertainty in nasal spray in vitro performance models using Monte Carlo simulation: Part II. Error propagation during product performance modeling.

    PubMed

    Guo, Changning; Doub, William H; Kauffman, John F

    2010-08-01

    Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiment (DOE) models in the first part of this study, with an initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard the initial assumption and extend the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that the error estimates based on Monte Carlo simulation result in smaller model coefficient standard deviations than those from regression methods. This suggests that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution to understand the propagation of uncertainty in complex DOE models so that design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
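
    A minimal sketch of the error-propagation idea: jitter both the factor settings and the responses of an invented 3² factorial DOE, refit the model many times, and read coefficient uncertainties off the Monte Carlo draws. This is not the authors' nasal-spray model.

```python
import numpy as np

rng = np.random.default_rng(8)
F = np.array([[a, b] for a in (-1, 0, 1) for b in (-1, 0, 1)], float)
beta_true = np.array([10.0, 2.0, -1.5, 0.5])   # intercept, x1, x2, x1*x2

def design(F):
    return np.column_stack([np.ones(len(F)), F[:, 0], F[:, 1],
                            F[:, 0] * F[:, 1]])

def refit(input_sd=0.05, response_sd=0.3):
    """One draw: jitter factor settings and responses, refit the DOE model."""
    y = design(F) @ beta_true + rng.normal(scale=response_sd, size=len(F))
    F_seen = F + rng.normal(scale=input_sd, size=F.shape)  # recorded settings
    return np.linalg.lstsq(design(F_seen), y, rcond=None)[0]

draws = np.stack([refit() for _ in range(5000)])
print("Monte Carlo coefficient SDs:", draws.std(axis=0).round(3))
```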

  15. Why don't people buy long-term-care insurance?

    PubMed

    Cramer, Anne Theisen; Jensen, Gail A

    2006-07-01

    The objective of this article was to assess the determinants of an individual's decision to purchase long-term-care (LTC) insurance. This article focuses on the decision to purchase a new policy as opposed to renewing an existing policy. This study gave special consideration to the role of policy price, the savings associated with buying a policy now as opposed to later, the purchaser's education, and the purchaser's income. Using data from the 2002 Health and Retirement Survey, we estimated logistic regressions to model consumer decisions to purchase LTC insurance. We explored several alternative measures of the price of a policy. Price was a significant determinant in decisions to purchase coverage. The demand for coverage, however, was price inelastic, with elasticities ranging from -0.23 to -0.87, depending on the specification of the model. The education level and income of the purchaser were also important. This analysis provides the first estimates of price elasticity of demand for LTC insurance. The finding that demand is very price inelastic suggests that state initiatives that effectively subsidize premiums as a way of stimulating purchases are likely to meet with very limited success in the present environment.

  16. What Is the Return on Investment for Implementation of a Crew Resource Management Program at an Academic Medical Center?

    PubMed

    Moffatt-Bruce, Susan D; Hefner, Jennifer L; Mekhjian, Hagop; McAlearney, John S; Latimer, Tina; Ellison, Chris; McAlearney, Ann Scheck

    Crew Resource Management (CRM) training has been used successfully within hospital units to improve quality and safety. This article presents a description of a health system-wide implementation of CRM focusing on the return on investment (ROI). The costs included training, programmatic fixed costs, time away from work, and leadership time. Cost savings were calculated based on the reduction in avoidable adverse events and cost estimates from the literature. Between July 2010 and July 2013, roughly 3000 health system employees across 12 areas were trained, costing $3.6 million. The total number of adverse events avoided was 735-a 25.7% reduction in observed relative to expected events. Savings ranged from a conservative estimate of $12.6 million to as much as $28.0 million. Therefore, the overall ROI for CRM training was in the range of $9.1 to $24.4 million. CRM presents a financially viable way to systematically organize for quality improvement.

  17. Disk storage management for LHCb based on Data Popularity estimator

    NASA Astrophysics Data System (ADS)

    Hushchyn, Mikhail; Charpentier, Philippe; Ustyuzhanin, Andrey

    2015-12-01

    This paper presents an algorithm providing recommendations for optimizing the LHCb data storage. The LHCb data storage system is a hybrid system. All datasets are kept as archives on magnetic tapes. The most popular datasets are kept on disks. The algorithm takes the dataset usage history and metadata (size, type, configuration etc.) to generate a recommendation report. This article presents how we use machine learning algorithms to predict future data popularity. Using these predictions it is possible to estimate which datasets should be removed from disk. We use regression algorithms and time series analysis to find the optimal number of replicas for datasets that are kept on disk. Based on the data popularity and the number of replicas optimization, the algorithm minimizes a loss function to find the optimal data distribution. The loss function represents all requirements for data distribution in the data storage system. We demonstrate how our algorithm helps to save disk space and to reduce waiting times for jobs using this data.

  18. New best estimates for radionuclide solid-liquid distribution coefficients in soils. Part 2: naturally occurring radionuclides.

    PubMed

    Vandenhove, H; Gil-García, C; Rigol, A; Vidal, M

    2009-09-01

    Predicting the transfer of radionuclides in the environment for normal release, accidental, disposal or remediation scenarios in order to assess exposure requires the availability of an important number of generic parameter values. One of the key parameters in environmental assessment is the solid liquid distribution coefficient, K(d), which is used to predict radionuclide-soil interaction and subsequent radionuclide transport in the soil column. This article presents a review of K(d) values for uranium, radium, lead, polonium and thorium based on an extensive literature survey, including recent publications. The K(d) estimates were presented per soil groups defined by their texture and organic matter content (Sand, Loam, Clay and Organic), although the texture class seemed not to significantly affect K(d). Where relevant, other K(d) classification systems are proposed and correlations with soil parameters are highlighted. The K(d) values obtained in this compilation are compared with earlier review data.

  19. A re-examination of the mere exposure effect: The influence of repeated exposure on recognition, familiarity, and liking.

    PubMed

    Montoya, R Matthew; Horton, Robert S; Vevea, Jack L; Citkowicz, Martyna; Lauber, Elissa A

    2017-05-01

    To evaluate the veracity of models of the mere exposure effect and to understand the processes that moderate the effect, we conducted a meta-analysis of the influence of repeated exposure on liking, familiarity, and recognition, among other evaluations. We estimated parameters from 268 curve estimates drawn from 81 articles and revealed that the mere exposure effect was characterized by a positive slope and negative quadratic effect consistent with an inverted-U shaped curve. In fact, such curves were associated with (a) all visual, but not auditory stimuli; (b) exposure durations shorter than 10 s and longer than 1 min; (c) both homogeneous and heterogeneous presentation types; and (d) ratings that were taken after all stimuli were presented. We conclude that existing models for the mere exposure effect do not adequately account for the findings, and we provide a framework to help guide future research. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. A systematic review of causes of sudden and severe headache (Thunderclap Headache): should lists be evidence based?

    PubMed Central

    2014-01-01

    Background There are many potential causes of sudden and severe headache (thunderclap headache), the most important of which is aneurysmal subarachnoid haemorrhage. Published academic reviews report a wide range of causes. We sought to create a definitive list of causes, other than aneurysmal subarachnoid haemorrhage, using a systematic review. Methods Systematic Review of EMBASE and MEDLINE databases using pre-defined search criteria up to September 2009. We extracted data from any original research paper or case report describing a case of someone presenting with a sudden and severe headache, and summarized the published causes. Results Our search identified over 21,000 titles, of which 1224 articles were scrutinized in full. 213 articles described 2345 people with sudden and severe headache, and we identified 6 English language academic review articles. A total of 119 causes were identified, of which 46 (38%) were not mentioned in published academic review articles. Using capture-recapture analysis, we estimate that our search was 98% complete. There is only one population-based estimate of the incidence of sudden and severe headache at 43 cases per 100,000. In cohort studies, the most common causes identified were primary headaches or headaches of uncertain cause. Vasoconstriction syndromes are commonly mentioned in case reports or case series. The most common cause not mentioned in academic reviews was pneumocephalus. 70 non-English language articles were identified but these did not contain additional causes. Conclusions There are over 100 different published causes of sudden and severe headache, other than aneurysmal subarachnoid haemorrhage. We have now made a definitive list of causes for future reference which we intend to maintain. There is a need for an up to date population based description of cause of sudden and severe headache as the modern epidemiology of thunderclap headache may require updating in the light of research on cerebral vasoconstriction syndromes. PMID:25123846
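
    For readers unfamiliar with the capture-recapture step, the sketch below applies the Chapman two-sample estimator. The second list size and overlap are invented so as to reproduce the reported 119 observed causes and roughly 98% completeness; they are not taken from the paper.

```python
# Chapman's capture-recapture estimate of the total number of causes from
# two overlapping 'captures' (e.g., causes found in academic reviews vs.
# causes found in the systematic search). Counts are assumed.
n1, n2, m = 73, 115, 69                    # list sizes and their overlap
N_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1  # bias-corrected total estimate
observed = n1 + n2 - m                     # distinct causes actually seen
print(f"estimated total {N_hat:.0f}, completeness {observed / N_hat:.0%}")
```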

  1. How to use bibliometric methods in evaluation of scientific research? An example from Finnish schizophrenia research.

    PubMed

    Koskinen, Johanna; Isohanni, Matti; Paajala, Henna; Jääskeläinen, Erika; Nieminen, Pentti; Koponen, Hannu; Tienari, Pekka; Miettunen, Jouko

    2008-01-01

    We present bibliometric methods that can be utilized in evaluation processes of scientific work. In this paper, we present some practical clues, using Finnish schizophrenia research as an example and comparing the research output of different institutions. Bibliometric data and indicators including publication counts, impact factors and received citations were used as tools for evaluating research performance in Finnish schizophrenia research. The articles and citations were searched from the Web of Science database. We used schizophrenia as a keyword, restricted the address field to Finland, and limited the years to 1996-2005. When we analysed Finnish schizophrenia research, altogether 265 articles met our criteria. There were differences in impact factors and received citations between institutions. The number of annually published Finnish schizophrenia articles has tripled since the mid-1990s. International co-operation was common (43%). Bibliometric methods revealed differences between institutions, indicating that the methods can be applied in research evaluation. The coverage of databases as well as the precision of their search engines can be seen as limitations. Bibliometric methods offer a practical and impartial way to estimate publication profiles of researchers and research groups. According to our experience, these methods can be used as an evaluation instrument in research together with other methods, such as expert opinions and panels.

  2. Constrained low-rank matrix estimation: phase transitions, approximate message passing and applications

    NASA Astrophysics Data System (ADS)

    Lesieur, Thibault; Krzakala, Florent; Zdeborová, Lenka

    2017-07-01

    This article is an extended version of previous work of Lesieur et al (2015 IEEE Int. Symp. on Information Theory Proc. pp 1635-9 and 2015 53rd Annual Allerton Conf. on Communication, Control and Computing (IEEE) pp 680-7) on low-rank matrix estimation in the presence of constraints on the factors into which the matrix is factorized. Low-rank matrix factorization is one of the basic methods used in data analysis for unsupervised learning of relevant features and other types of dimensionality reduction. We present a framework to study the constrained low-rank matrix estimation for a general prior on the factors, and a general output channel through which the matrix is observed. We draw a parallel with the study of vector-spin glass models—presenting a unifying way to study a number of problems considered previously in separate statistical physics works. We present a number of applications for the problem in data analysis. We derive in detail a general form of the low-rank approximate message passing (Low-RAMP) algorithm, that is known in statistical physics as the TAP equations. We thus unify the derivation of the TAP equations for models as different as the Sherrington-Kirkpatrick model, the restricted Boltzmann machine, the Hopfield model or vector (xy, Heisenberg and other) spin glasses. The state evolution of the Low-RAMP algorithm is also derived, and is equivalent to the replica symmetric solution for the large class of vector-spin glass models. In the section devoted to results, we study in detail phase diagrams and phase transitions for the Bayes-optimal inference in low-rank matrix estimation. We present a typology of phase transitions and their relation to performance of algorithms such as the Low-RAMP or commonly used spectral methods.

  3. The social costs of dangerous products: an empirical investigation.

    PubMed

    Shapiro, Sidney; Ruttenberg, Ruth; Leigh, Paul

    2009-01-01

    Defective consumer products impose significant costs on consumers and third parties when they cause fatalities and injuries. This Article develops a novel approach to measuring the true extent of such costs, which may not be accurately captured under current methods of estimating the cost of dangerous products. Current analysis rests on a narrowly defined set of costs, excluding certain types of costs. The cost-of-injury estimates utilized in this Article address this omission by quantifying and incorporating these costs to provide a more complete picture of the true impact of defective consumer products. The new estimates help to gauge the true value of the civil liability system.

  4. Estimation of rates-across-sites distributions in phylogenetic substitution models.

    PubMed

    Susko, Edward; Field, Chris; Blouin, Christian; Roger, Andrew J

    2003-10-01

    Previous work has shown that it is often essential to account for the variation in rates at different sites in phylogenetic models in order to avoid phylogenetic artifacts such as long branch attraction. In most current models, the gamma distribution is used for the rates-across-sites distributions and is implemented as an equal-probability discrete gamma. In this article, we introduce discrete distribution estimates with large numbers of equally spaced rate categories allowing us to investigate the appropriateness of the gamma model. With large numbers of rate categories, these discrete estimates are flexible enough to approximate the shape of almost any distribution. Likelihood ratio statistical tests and a nonparametric bootstrap confidence-bound estimation procedure based on the discrete estimates are presented that can be used to test the fit of a parametric family. We applied the methodology to several different protein data sets, and found that although the gamma model often provides a good parametric model for this type of data, rate estimates from an equal-probability discrete gamma model with a small number of categories will tend to underestimate the largest rates. In cases when the gamma model assumption is in doubt, rate estimates coming from the discrete rate distribution estimate with a large number of rate categories provide a robust alternative to gamma estimates. An alternative implementation of the gamma distribution is proposed that, for equal numbers of rate categories, is computationally more efficient during optimization than the standard gamma implementation and can provide more accurate estimates of site rates.
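
    The equal-probability discrete gamma that the authors compare against can be computed directly. This sketch returns the exact mean rate in each of K categories of a mean-one gamma, using the standard incomplete-gamma identity; it is a generic implementation, not the authors' code.

```python
import numpy as np
from scipy.stats import gamma

def discrete_gamma_rates(alpha, K):
    """Mean rate in each of K equal-probability categories of a
    Gamma(alpha) rates-across-sites distribution with mean 1."""
    edges = gamma.ppf(np.linspace(0, 1, K + 1), alpha, scale=1 / alpha)
    # E[X; l<X<u] for a mean-one gamma equals F_{alpha+1}(u) - F_{alpha+1}(l)
    F1 = gamma.cdf(edges, alpha + 1, scale=1 / alpha)
    return K * np.diff(F1)        # divide each partial mean by prob 1/K

print(discrete_gamma_rates(0.5, 4).round(3))   # mean rate per category
```

    Note how, for a small shape parameter, the largest category mean dwarfs the others; this is the regime where a few-category discrete gamma tends to underestimate the largest site rates, as the abstract reports.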

  5. The Positioning Accuracy of BAUV Using Fusion of Data from USBL System and Movement Parameters Measurements

    PubMed Central

    Krzysztof, Naus; Aleksander, Nowak

    2016-01-01

    The article presents a study of the accuracy of estimating the position coordinates of BAUV (Biomimetic Autonomous Underwater Vehicle) by the extended Kalman filter (EKF) method. The fusion of movement parameters measurements and position coordinates fixes was applied. The movement parameters measurements are carried out by on-board navigation devices, while the position coordinates fixes are done by the USBL (Ultra Short Base Line) system. The problem of underwater positioning and the conceptual design of the BAUV navigation system constructed at the Naval Academy (Polish Naval Academy—PNA) are presented in the first part of the paper. The second part consists of description of the evaluation results of positioning accuracy, the genesis of the problem of selecting method for underwater positioning, and the mathematical description of the method of estimating the position coordinates using the EKF method by the fusion of measurements with on-board navigation and measurements obtained with the USBL system. The main part contains a description of experimental research. It consists of a simulation program of navigational parameter measurements carried out during the BAUV passage along the test section. Next, the article covers the determination of position coordinates on the basis of simulated parameters, using EKF and DR methods and the USBL system, which are then subjected to a comparative analysis of accuracy. The final part contains systemic conclusions justifying the desirability of applying the proposed fusion method of navigation parameters for the BAUV positioning. PMID:27537884

  6. The Positioning Accuracy of BAUV Using Fusion of Data from USBL System and Movement Parameters Measurements.

    PubMed

    Krzysztof, Naus; Aleksander, Nowak

    2016-08-15

    The article presents a study of the accuracy of estimating the position coordinates of BAUV (Biomimetic Autonomous Underwater Vehicle) by the extended Kalman filter (EKF) method. The fusion of movement parameters measurements and position coordinates fixes was applied. The movement parameters measurements are carried out by on-board navigation devices, while the position coordinates fixes are done by the USBL (Ultra Short Base Line) system. The problem of underwater positioning and the conceptual design of the BAUV navigation system constructed at the Naval Academy (Polish Naval Academy-PNA) are presented in the first part of the paper. The second part consists of description of the evaluation results of positioning accuracy, the genesis of the problem of selecting method for underwater positioning, and the mathematical description of the method of estimating the position coordinates using the EKF method by the fusion of measurements with on-board navigation and measurements obtained with the USBL system. The main part contains a description of experimental research. It consists of a simulation program of navigational parameter measurements carried out during the BAUV passage along the test section. Next, the article covers the determination of position coordinates on the basis of simulated parameters, using EKF and DR methods and the USBL system, which are then subjected to a comparative analysis of accuracy. The final part contains systemic conclusions justifying the desirability of applying the proposed fusion method of navigation parameters for the BAUV positioning.
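
    A stripped-down sketch of the fusion idea common to both records above: dead-reckoning prediction from measured speed and heading, corrected by sparse USBL position fixes through a Kalman update. Since heading enters only through the additive step, the state Jacobian is the identity and the EKF reduces to a linear filter here; all noise levels, fix rates, and the trajectory are invented.

```python
import numpy as np

rng = np.random.default_rng(9)
dt, T = 1.0, 120
Q = 0.04 * np.eye(2)            # DR process noise (speed/heading errors)
R = 4.0 * np.eye(2)             # USBL fix noise, metres^2

x_true = np.zeros(2)
x_est, P = np.zeros(2), np.eye(2)
for k in range(T):
    v, psi = 1.0, 0.02 * k                       # onboard speed and heading
    step = v * dt * np.array([np.cos(psi), np.sin(psi)])
    x_true = x_true + step + rng.multivariate_normal(np.zeros(2), Q)
    # prediction from dead reckoning (state Jacobian is identity here)
    x_est = x_est + step
    P = P + Q
    if k % 10 == 0:                              # sparse USBL position fixes
        z = x_true + rng.multivariate_normal(np.zeros(2), R)
        S = P + R
        K = P @ np.linalg.inv(S)                 # Kalman gain, H = I
        x_est = x_est + K @ (z - x_est)
        P = (np.eye(2) - K) @ P
print("final position error (m): %.2f" % np.linalg.norm(x_true - x_est))
```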

  7. Maintaining and Enhancing Diversity of Sampled Protein Conformations in Robotics-Inspired Methods.

    PubMed

    Abella, Jayvee R; Moll, Mark; Kavraki, Lydia E

    2018-01-01

    The ability to efficiently sample structurally diverse protein conformations allows one to gain a high-level view of a protein's energy landscape. Algorithms from robot motion planning have been used for conformational sampling, and several of these algorithms promote diversity by keeping track of "coverage" in conformational space based on the local sampling density. However, large proteins present special challenges. In particular, larger systems require running many concurrent instances of these algorithms, but these algorithms can quickly become memory intensive because they typically keep previously sampled conformations in memory to maintain coverage estimates. In addition, robotics-inspired algorithms depend on defining useful perturbation strategies for exploring the conformational space, which is a difficult task for large proteins because such systems are typically more constrained and exhibit complex motions. In this article, we introduce two methodologies for maintaining and enhancing diversity in robotics-inspired conformational sampling. The first method addresses algorithms based on coverage estimates and leverages the use of a low-dimensional projection to define a global coverage grid that maintains coverage across concurrent runs of sampling. The second method is an automatic definition of a perturbation strategy through readily available flexibility information derived from B-factors, secondary structure, and rigidity analysis. Our results show a significant increase in the diversity of the conformations sampled for proteins consisting of up to 500 residues when applied to a specific robotics-inspired algorithm for conformational sampling. The methodologies presented in this article may be vital components for the scalability of robotics-inspired approaches.
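
    A toy sketch of the first methodology's core data structure: a global coverage grid over a low-dimensional projection, shareable across concurrent runs. A fixed random projection stands in for the article's projection, and the "protein" is just a 300-dimensional coordinate vector.

```python
import numpy as np

class CoverageGrid:
    """Low-dimensional coverage grid: conformations are projected to 2-D
    and hashed into cells, so sampling can be biased toward empty cells
    without storing the conformations themselves."""

    def __init__(self, proj, cell=0.5):
        self.proj, self.cell = proj, cell
        self.counts = {}

    def add(self, conformation):
        key = tuple(np.floor(self.proj @ conformation / self.cell).astype(int))
        self.counts[key] = self.counts.get(key, 0) + 1
        return self.counts[key]     # low count -> poorly covered region

rng = np.random.default_rng(10)
proj = rng.standard_normal((2, 300))    # 300-dof toy system, 2-D projection
grid = CoverageGrid(proj)
visits = [grid.add(rng.standard_normal(300)) for _ in range(1000)]
print("cells visited:", len(grid.counts), "max cell density:", max(visits))
```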

  8. A Systematic Review of the Frequency of Neurocyticercosis with a Focus on People with Epilepsy

    PubMed Central

    Ndimubanzi, Patrick C.; Carabin, Hélène; Budke, Christine M.; Nguyen, Hai; Qian, Ying-Jun; Rainwater, Elizabeth; Dickey, Mary; Reynolds, Stephanie; Stoner, Julie A.

    2010-01-01

    Background The objective of this study is to conduct a systematic review of studies reporting the frequency of neurocysticercosis (NCC) worldwide. Methods/Principal Findings PubMed, Commonwealth Agricultural Bureau (CAB) abstracts and 23 international databases were systematically searched for articles published from January 1, 1990 to June 1, 2008. Articles were evaluated for inclusion by at least two researchers focusing on study design and methods. Data were extracted independently using standardized forms. A random-effects binomial model was used to estimate the proportion of NCC among people with epilepsy (PWE). Overall, 565 articles were retrieved and 290 (51%) selected for further analysis. After a second analytic phase, only 4.5% of articles, all of which used neuroimaging for the diagnosis of NCC, were reviewed. Only two studies, both from the US, estimated an incidence rate of NCC using hospital discharge data. The prevalence of NCC in a random sample of village residents was reported from one study where 9.1% of the population harboured brain lesions of NCC. The proportion of NCC among different study populations varied widely. However, the proportion of NCC in PWE was a lot more consistent. The pooled estimate for this population was 29.0% (95%CI: 22.9%–35.5%). These results were not sensitive to the inclusion or exclusion of any particular study. Conclusion/Significance Only one study has estimated the prevalence of NCC in a random sample of all residents. Hence, the prevalence of NCC worldwide remains unknown. However, the pooled estimate for the proportion of NCC among PWE was very robust and could be used, in conjunction with estimates of the prevalence and incidence of epilepsy, to estimate this component of the burden of NCC in endemic areas. The previously recommended guidelines for the diagnostic process and for declaring NCC an international reportable disease would improve the knowledge on the global frequency of NCC. PMID:21072231

  9. A systematic review of the frequency of neurocyticercosis with a focus on people with epilepsy.

    PubMed

    Ndimubanzi, Patrick C; Carabin, Hélène; Budke, Christine M; Nguyen, Hai; Qian, Ying-Jun; Rainwater, Elizabeth; Dickey, Mary; Reynolds, Stephanie; Stoner, Julie A

    2010-11-02

    The objective of this study is to conduct a systematic review of studies reporting the frequency of neurocysticercosis (NCC) worldwide. PubMed, Commonwealth Agricultural Bureau (CAB) abstracts and 23 international databases were systematically searched for articles published from January 1, 1990 to June 1, 2008. Articles were evaluated for inclusion by at least two researchers focusing on study design and methods. Data were extracted independently using standardized forms. A random-effects binomial model was used to estimate the proportion of NCC among people with epilepsy (PWE). Overall, 565 articles were retrieved and 290 (51%) selected for further analysis. After a second analytic phase, only 4.5% of articles, all of which used neuroimaging for the diagnosis of NCC, were reviewed. Only two studies, both from the US, estimated an incidence rate of NCC using hospital discharge data. The prevalence of NCC in a random sample of village residents was reported from one study where 9.1% of the population harboured brain lesions of NCC. The proportion of NCC among different study populations varied widely. However, the proportion of NCC in PWE was a lot more consistent. The pooled estimate for this population was 29.0% (95%CI: 22.9%-35.5%). These results were not sensitive to the inclusion or exclusion of any particular study. Only one study has estimated the prevalence of NCC in a random sample of all residents. Hence, the prevalence of NCC worldwide remains unknown. However, the pooled estimate for the proportion of NCC among PWE was very robust and could be used, in conjunction with estimates of the prevalence and incidence of epilepsy, to estimate this component of the burden of NCC in endemic areas. The previously recommended guidelines for the diagnostic process and for declaring NCC an international reportable disease would improve the knowledge on the global frequency of NCC.
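
    A generic sketch of random-effects pooling of proportions (DerSimonian-Laird on the logit scale), the family of model behind pooled estimates like the 29.0% figure. The study counts below are invented and do not come from the review.

```python
import numpy as np

def dl_pooled_proportion(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions (logit scale)."""
    p = events / totals
    y = np.log(p / (1 - p))                    # logit proportions
    v = 1 / events + 1 / (totals - events)     # approx. within-study variance
    w = 1 / v
    q = np.sum(w * (y - np.sum(w * y) / w.sum()) ** 2)   # heterogeneity Q
    tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
    ws = 1 / (v + tau2)                        # random-effects weights
    mu = np.sum(ws * y) / ws.sum()
    se = np.sqrt(1 / ws.sum())
    inv = lambda t: 1 / (1 + np.exp(-t))       # back to the proportion scale
    return inv(mu), inv(mu - 1.96 * se), inv(mu + 1.96 * se)

events = np.array([30, 55, 12, 80, 25])        # invented study counts
totals = np.array([100, 210, 40, 260, 90])
print([round(x, 3) for x in dl_pooled_proportion(events, totals)])
```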

  10. A torque estimator for a traveling wave ultrasonic motor--application to an active claw.

    PubMed

    Giraud, Frédéric; Semail, Betty

    2006-08-01

    Owing to its electrical-to-mechanical energy conversion process, the torque on a traveling wave ultrasonic motor (TWUM) shaft is not directly proportional to a measurable electrical variable, such as current or voltage; instead, it results from a complicated process at the stator/rotor interface. The load torque is therefore largely unknown, which can be a disadvantage in applications in which a torque limitation is required or a torque measurement is needed. The aim of this article is to develop a straightforward torque estimator for a TWUM. For that purpose, the motor is modeled, and this modeling leads to different estimator strategies. More specifically, we chose a strategy that requires no speed sensor, relying only on the stator's resonant behavior. The parameters of the motor needed for the estimator are measured afterward, and some nonlinearities are identified and taken into account. Several experimental trials are then carried out to check the performance of the estimator. A claw actuated by a TWUM is presented because this is a typical application in which knowledge of the torque helps guarantee the safety of the device.

  11. Reduced rank models for travel time estimation of low order mode pulses.

    PubMed

    Chandrayadula, Tarun K; Wage, Kathleen E; Worcester, Peter F; Dzieciuch, Matthew A; Mercer, James A; Andrew, Rex K; Howe, Bruce M

    2013-10-01

    Mode travel time estimation in the presence of internal waves (IWs) is a challenging problem. IWs perturb the sound speed, which results in travel time wander and mode scattering. A standard approach to travel time estimation is to pulse compress the broadband signal, pick the peak of the compressed time series, and average the peak time over multiple receptions to reduce variance. The peak-picking approach implicitly assumes there is a single strong arrival and does not perform well when there are multiple arrivals due to scattering. This article presents a statistical model for the scattered mode arrivals and uses the model to design improved travel time estimators. The model is based on an Empirical Orthogonal Function (EOF) analysis of the mode time series. Range-dependent simulations and data from the Long-range Ocean Acoustic Propagation Experiment (LOAPEX) indicate that the modes are represented by a small number of EOFs. The reduced-rank EOF model is used to construct a travel time estimator based on the Matched Subspace Detector (MSD). Analysis of simulation and experimental data shows that the MSDs are more robust to IW scattering than peak picking. The simulation analysis also highlights how IWs affect the mode excitation by the source.
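
    As a rough illustration of the reduced-rank idea, the sketch below (Python/NumPy; all names and the energy-ratio statistic are illustrative assumptions, not the authors' code) estimates the leading EOFs from an ensemble of pulse-compressed mode time series and scores candidate travel-time shifts by how much of the signal energy falls in the EOF subspace, in the spirit of a matched subspace detector.

      import numpy as np

      def leading_eofs(ensemble, rank):
          # ensemble: (n_receptions, n_samples) pulse-compressed mode
          # arrivals aligned on a common time axis. The right singular
          # vectors of the demeaned ensemble are the EOFs.
          demeaned = ensemble - ensemble.mean(axis=0)
          _, _, vt = np.linalg.svd(demeaned, full_matrices=False)
          return vt[:rank]                      # (rank, n_samples)

      def msd_travel_time(reception, eofs, candidate_shifts):
          # Score each candidate shift by the fraction of signal energy
          # captured by the rank-r EOF subspace; pick the best match.
          stats = []
          for s in candidate_shifts:
              x = np.roll(reception, -s)
              coeff = eofs @ x
              stats.append(np.sum(coeff ** 2) / np.sum(x ** 2))
          return candidate_shifts[int(np.argmax(stats))]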

  12. Estimating the benefits of public health policies that reduce harmful consumption.

    PubMed

    Ashley, Elizabeth M; Nardinelli, Clark; Lavaty, Rosemarie A

    2015-05-01

    For products such as tobacco and junk food, where policy interventions are often designed to decrease consumption, affected consumers gain utility from improvements in lifetime health and longevity but also lose utility associated with the activity of consuming the product. In the case of anti-smoking policies, even though published estimates of gross health and longevity benefits are up to 900 times higher than the net consumer benefits suggested by a more direct willingness-to-pay estimation approach, there is little recognition in the cost-benefit and cost-effectiveness literature that gross estimates will overstate intrapersonal welfare improvements when utility losses are not netted out. This paper presents a general framework for analyzing policies that are designed to reduce inefficiently high consumption and provides a rule of thumb for the relationship between net and gross consumer welfare effects: where there exists a plausible estimate of the tax that would allow consumers to fully internalize health costs, the ratio of the tax to the per-unit long-term cost can provide an upper bound on the ratio of net to gross benefits. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
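
    In symbols (notation ours, not the authors'): if τ is the per-unit tax that would let consumers fully internalize health costs and c is the per-unit long-term cost, the suggested rule of thumb bounds the ratio of net to gross consumer benefits:

      \[ \frac{B_{\text{net}}}{B_{\text{gross}}} \;\le\; \frac{\tau}{c} \]

    For instance, with an internalizing tax of $2 per pack and a long-term cost of $10 per pack, net benefits would be at most 20% of gross benefits.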

  13. A novel measure of effect size for mediation analysis.

    PubMed

    Lachowicz, Mark J; Preacher, Kristopher J; Kelley, Ken

    2018-06-01

    Mediation analysis has become one of the most popular statistical methods in the social sciences. However, many currently available effect size measures for mediation have limitations that restrict their use to specific mediation models. In this article, we develop a measure of effect size that addresses these limitations. We show how modification of a currently existing effect size measure results in a novel effect size measure with many desirable properties. We also derive an expression for the bias of the sample estimator for the proposed effect size measure and propose an adjusted version of the estimator. We present a Monte Carlo simulation study conducted to examine the finite sampling properties of the adjusted and unadjusted estimators, which shows that the adjusted estimator is effective at recovering the true value it estimates. Finally, we demonstrate the use of the effect size measure with an empirical example. We provide freely available software so that researchers can immediately implement the methods we discuss. Our developments here extend the existing literature on effect sizes and mediation by developing a potentially useful method of communicating the magnitude of mediation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  14. A different approach to estimate nonlinear regression model using numerical methods

    NASA Astrophysics Data System (ADS)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This research paper concerns computational methods, namely the Gauss-Newton method and gradient algorithm methods (the Newton-Raphson method, the steepest descent or steepest ascent method, the method of scoring, and the method of quadratic hill-climbing), based on numerical analysis, for estimating the parameters of a nonlinear regression model. Principles of matrix calculus are used to discuss the gradient algorithm methods. Yonathan Bard [1] discussed a comparison of gradient methods for the solution of nonlinear parameter estimation problems; this article, however, treats the gradient algorithm methods from a different analytical perspective. It also describes an iterative Gauss-Newton technique that differs from the iterative technique proposed by Gordon K. Smyth [2]. Hans Georg Bock et al. [10] proposed numerical methods for parameter estimation in DAEs (differential algebraic equations). Isabel Reis Dos Santos et al. [11] introduced a weighted least squares procedure for estimating the unknown parameters of a nonlinear regression metamodel. For large-scale nonsmooth convex minimization, the Hager and Zhang (HZ) conjugate gradient method and the modified HZ (MHZ) method were presented by Gonglin Yuan et al. [12].
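
    A minimal sketch of the Gauss-Newton iteration discussed here (Python/NumPy; the exponential model and all parameter values are illustrative, not taken from the paper):

      import numpy as np

      def gauss_newton(residual, model_jacobian, beta0, tol=1e-8, max_iter=50):
          # residual(beta) -> y - f(x, beta); model_jacobian(beta) -> df/dbeta.
          beta = np.asarray(beta0, dtype=float)
          for _ in range(max_iter):
              r = residual(beta)
              J = model_jacobian(beta)
              # Gauss-Newton step from the normal equations J'J d = J'r.
              step = np.linalg.solve(J.T @ J, J.T @ r)
              beta = beta + step
              if np.linalg.norm(step) < tol:
                  break
          return beta

      # Fit y = a*exp(b*x) to noisy data (all values illustrative).
      rng = np.random.default_rng(0)
      x = np.linspace(0.0, 1.0, 50)
      y = 2.0 * np.exp(1.5 * x) + rng.normal(0.0, 0.05, x.size)
      res = lambda b: y - b[0] * np.exp(b[1] * x)
      jac = lambda b: np.column_stack([np.exp(b[1] * x),
                                       b[0] * x * np.exp(b[1] * x)])
      print(gauss_newton(res, jac, [1.0, 1.0]))  # converges near [2.0, 1.5]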

  15. Global CO2 emissions from cement production

    NASA Astrophysics Data System (ADS)

    Andrew, Robbie M.

    2018-01-01

    The global production of cement has grown very rapidly in recent years, and after fossil fuels and land-use change, it is the third-largest source of anthropogenic emissions of carbon dioxide. The required data for estimating emissions from global cement production are poor, and it has been recognised that some global estimates are significantly inflated. Here we assemble a large variety of available datasets and prioritise official data and emission factors, including estimates submitted to the UNFCCC plus new estimates for China and India, to present a new analysis of global process emissions from cement production. We show that global process emissions in 2016 were 1.45±0.20 Gt CO2, equivalent to about 4 % of emissions from fossil fuels. Cumulative emissions from 1928 to 2016 were 39.3±2.4 Gt CO2, 66 % of which have occurred since 1990. Emissions in 2015 were 30 % lower than those recently reported by the Global Carbon Project. The data associated with this article can be found at https://doi.org/10.5281/zenodo.831455.

  16. Childbearing in adolescents aged 12-15 years in low resource countries: a neglected issue. New estimates from demographic and household surveys in 42 countries.

    PubMed

    Neal, Sarah; Matthews, Zoë; Frost, Melanie; Fogstad, Helga; Camacho, Alma V; Laski, Laura

    2012-09-01

    There is strong evidence that the health risks associated with adolescent pregnancy are concentrated among the youngest girls (e.g. those under 16 years). Fertility rates in this age group have not previously been comprehensively estimated and published. By drawing data from 42 large, nationally representative household surveys in low resource countries carried out since 2003, this article presents estimates of age-specific birth rates for girls aged 12-15, and the percentage of girls who give birth at age 15 or younger. From these we estimate that approximately 2.5 million births occur to girls aged under 16 in low resource countries each year. The highest rates are found in Sub-Saharan Africa, where in Chad, Guinea, Mali, Mozambique, Niger and Sierra Leone more than 10% of girls become mothers before they are 16. Strategies to reduce these high levels are vital if we are to alleviate poor reproductive health. © 2012 The Authors. Acta Obstetricia et Gynecologica Scandinavica © 2012 Nordic Federation of Societies of Obstetrics and Gynecology.

  17. Multirate state and parameter estimation in an antibiotic fermentation with delayed measurements.

    PubMed

    Gudi, R D; Shah, S L; Gray, M R

    1994-12-01

    This article discusses issues related to estimation and monitoring of fermentation processes that exhibit endogenous metabolism and time-varying maintenance activity. Such culture-related activities hamper the use of traditional, software sensor-based algorithms, such as the extended Kalman filter (EKF). In the approach presented here, the individual effects of the endogenous decay and the true maintenance processes have been lumped to represent a modified maintenance coefficient, m_c. Model equations that relate measurable process outputs, such as the carbon dioxide evolution rate (CER) and biomass, to the observable process parameters (such as the net specific growth rate and the modified maintenance coefficient) are proposed. These model equations are used in an estimator that can formally accommodate delayed, infrequent measurements of the culture states (such as the biomass) as well as frequent, culture-related secondary measurements (such as the CER). The resulting multirate software sensor-based estimation strategy is used to monitor biomass profiles as well as profiles of critical fermentation parameters, such as the specific growth rate, for a fed-batch fermentation of Streptomyces clavuligerus.

  18. State-of-charge estimation in lithium-ion batteries: A particle filter approach

    NASA Astrophysics Data System (ADS)

    Tulsyan, Aditya; Tsai, Yiting; Gopaluni, R. Bhushan; Braatz, Richard D.

    2016-11-01

    The dynamics of lithium-ion batteries are complex and are often approximated by models consisting of partial differential equations (PDEs) relating the internal ionic concentrations and potentials. The pseudo-two-dimensional (P2D) model is one model that performs sufficiently accurately under various operating conditions and battery chemistries. Despite its widespread use for prediction, this model is too complex for standard estimation and control applications. This article presents an original algorithm for state-of-charge estimation using the P2D model. Partial differential equations are discretized using implicit stable algorithms and reformulated into a nonlinear state-space model. This discrete, high-dimensional model (consisting of tens to hundreds of states) contains implicit, nonlinear algebraic equations. The uncertainty in the model is characterized by additive Gaussian noise. By exploiting the special structure of the pseudo-two-dimensional model, a novel particle filter algorithm that sweeps in time and spatial coordinates independently is developed. This algorithm circumvents the degeneracy problems associated with high-dimensional state estimation and avoids the repetitive solution of implicit equations by defining a 'tether' particle. The approach is illustrated through extensive simulations.
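
    For readers unfamiliar with the basic machinery, here is a minimal bootstrap particle filter for a toy one-state SOC model (Python; the open-circuit-voltage curve, noise levels, and all names are illustrative assumptions; the article's algorithm additionally sweeps the spatial coordinate of the P2D model and uses a 'tether' particle):

      import numpy as np

      rng = np.random.default_rng(1)

      def particle_filter_soc(currents, z_obs, dt=1.0, n_particles=500,
                              capacity=3600.0, q_std=1e-3, r_std=0.02):
          # Toy open-circuit-voltage curve (illustrative assumption).
          h = lambda soc: 3.0 + 1.2 * soc
          particles = rng.uniform(0.8, 1.0, n_particles)
          estimates = []
          for i_k, z_k in zip(currents, z_obs):
              # Propagate: coulomb counting plus process noise.
              particles = (particles - i_k * dt / capacity
                           + rng.normal(0.0, q_std, n_particles))
              particles = np.clip(particles, 0.0, 1.0)
              # Weight by Gaussian measurement likelihood, then resample.
              w = np.exp(-0.5 * ((z_k - h(particles)) / r_std) ** 2)
              w /= w.sum()
              particles = particles[rng.choice(n_particles, n_particles, p=w)]
              estimates.append(particles.mean())
          return np.array(estimates)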

  19. Estimation technique of corrective effects for forecasting of reliability of the designed and operated objects of the generating systems

    NASA Astrophysics Data System (ADS)

    Truhanov, V. N.; Sultanov, M. M.

    2017-11-01

    The present article analyzes statistical data on the failures and malfunctions that affect the operability of heat and power installations. It presents a mathematical model of the change in the turbine's output characteristics as a function of the number of failures revealed in service. The mathematical model is based on methods of mathematical statistics, probability theory and matrix calculus. Its novelty is that it predicts the change of the output characteristic over time, with the control actions represented in explicit form. The Weibull distribution is adopted as the desired dynamics of the output characteristic (the reliability function), since it is universal: for particular parameter values it reduces to other distributions (for example, the exponential or normal). It should be noted that the choice of the desired control law allows the necessary control parameters to be determined from the accumulated change of the output characteristic. The output characteristic can be adjusted both through the rate of change of the control parameters and through their acceleration. The article details a technique for estimating the pseudo-inverse matrix by the method of least squares using standard Microsoft Excel functions. It also considers how to find the control actions under constraints on both the output characteristic and the control parameters, and sets out the order and sequence for determining the control parameters. A concrete example of finding the control actions during long-term operation of turbines is given.

  20. Recov'Heat: An estimation tool of urban waste heat recovery potential in sustainable cities

    NASA Astrophysics Data System (ADS)

    Goumba, Alain; Chiche, Samuel; Guo, Xiaofeng; Colombert, Morgane; Bonneau, Patricia

    2017-02-01

    Waste heat recovery is considered an efficient way to increase carbon-free green energy utilization and to reduce greenhouse gas emissions. In urban areas especially, several sources, such as sewage water, industrial processes, and waste incineration plants, are still rarely exploited. Their integration into a district heating system providing heating and/or domestic hot water could be beneficial for both energy companies and local governments. EFFICACITY, a French research institute focused on urban energy transition, has developed an estimation tool for the different waste heat sources potentially exploitable in a sustainable city. This article presents the development method of this decision-making tool which, by providing both energy and economic analyses, helps local communities and energy service companies carry out preliminary studies of heat recovery projects.

  1. Estimation of velocities via optical flow

    NASA Astrophysics Data System (ADS)

    Popov, A.; Miller, A.; Miller, B.; Stepanyan, K.

    2017-02-01

    This article presents an approach to using optical flow (OF) as a general navigation means that provides information about a vehicle's linear and angular velocities. The term "OF" comes from opto-electronic devices, where it corresponds to a video sequence of images related to the camera motion over static surfaces or sets of objects. Even if the positions of these objects are unknown in advance, one can estimate the camera motion from the video sequence itself together with some metric information, such as the distance between the objects or the range to the surface. This approach is applicable to any passive observation system able to produce a sequence of images, such as a radio locator or sonar. Here the UAV application of the OF is considered since it is historically

  2. Cost-benefit analysis of the 55-mph speed limit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forester, T.H.; McNown, R.F.; Singell, L.D.

    1984-01-01

    This article presents the results of an empirical study which estimates the number of reduced fatalities as a result of the imposed 55-mph speed limit. Time series data for the US from 1952 to 1979 are employed in a regression model capturing the relation between fatalities, average speed, variability of speed, and the speed limit. Also discussed are alternative approaches to valuing human life and the value of time. A series of benefit-cost ratios based on alternative measures of the benefits and costs of life saving is provided. The paper concludes that the 55-mph speed limit is not cost efficient unless additional time on the highway is valued significantly below levels estimated in the best research on the value of time. 12 references, 1 table.

  3. Reliability verification of vehicle speed estimate method in forensic videos.

    PubMed

    Kim, Jong-Hyuk; Oh, Won-Taek; Choi, Ji-Hun; Park, Jong-Chan

    2018-06-01

    In various types of traffic accidents, including car-to-car crashes, vehicle-pedestrian collisions, and hit-and-run accidents, driver overspeed is one of the critical issues of traffic accident analysis. Hence, analysis of vehicle speed at the moment of the accident is necessary. The present article proposes a vehicle speed estimate method (VSEM) applying a virtual plane and a virtual reference line to a forensic video. The reliability of the VSEM was verified by comparing its results for videos from a test vehicle against speed from a global positioning system (GPS)-based Vbox. The VSEM verified by these procedures was then applied to real traffic accident examples to evaluate its usability. Copyright © 2018 Elsevier B.V. All rights reserved.
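
    The simplest version of the underlying calculation, once the video has been rectified so that two reference lines a known ground distance apart can be identified, looks like this (Python; a hypothetical helper for illustration only; the actual VSEM constructs the virtual plane and virtual reference line first):

      def speed_from_frames(frame_enter, frame_exit, fps, distance_m):
          # frame_enter/frame_exit: frame indices at the two line crossings.
          # fps: video frame rate; distance_m: known ground distance (m).
          # Returns the average speed over the gap in km/h.
          dt = (frame_exit - frame_enter) / fps
          return distance_m / dt * 3.6

      # e.g., 18 frames at 30 fps over a 10 m gap -> 60 km/h
      print(speed_from_frames(100, 118, 30.0, 10.0))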

  4. Estimation of Coal Reserves for UCG in the Upper Silesian Coal Basin, Poland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bialecka, Barbara

    One of the prospective methods of coal utilization, especially for coal resources that are not mineable by conventional methods, is underground coal gasification (UCG). This technology allows recovery of coal energy 'in situ' and thus avoids the health and safety risks to people that are inseparable from traditional coal extraction techniques. In Poland, most mining areas are characterized by numerous coal beds where extraction was ceased on account of technical and economic reasons or safety issues. This article presents estimates of Polish hard coal resources, broken down into individual mines, that can constitute the basis of raw materials for the gasification process. Five mines, representing more than 4 thousand tons, appear to be UCG candidates.

  5. Estimating interaction on an additive scale between continuous determinants in a logistic regression model.

    PubMed

    Knol, Mirjam J; van der Tweel, Ingeborg; Grobbee, Diederick E; Numans, Mattijs E; Geerlings, Mirjam I

    2007-10-01

    To determine the presence of interaction in epidemiologic research, typically a product term is added to the regression model. In linear regression, the regression coefficient of the product term reflects interaction as departure from additivity. However, in logistic regression it refers to interaction as departure from multiplicativity. Rothman has argued that interaction estimated as departure from additivity better reflects biologic interaction. So far, literature on estimating interaction on an additive scale using logistic regression has only focused on dichotomous determinants. The objective of the present study was to provide methods to estimate interaction between continuous determinants and to illustrate these methods with a clinical example. From the existing literature we derived the formulas to quantify interaction as departure from additivity between one continuous and one dichotomous determinant and between two continuous determinants using logistic regression. Bootstrapping was used to calculate the corresponding confidence intervals. To illustrate the theory with an empirical example, data from the Utrecht Health Project were used, with age and body mass index as risk factors for elevated diastolic blood pressure. The methods and formulas presented in this article are intended to assist epidemiologists to calculate interaction on an additive scale between two variables on a certain outcome. The proposed methods are included in a spreadsheet which is freely available at: http://www.juliuscenter.nl/additive-interaction.xls.
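
    A minimal sketch of the kind of calculation involved (Python; the coefficient values and reference points are illustrative, and RERI denotes the relative excess risk due to interaction, the usual measure of departure from additivity; the article's own formulas and bootstrap intervals should be consulted for actual analyses):

      import numpy as np

      def reri_continuous(b1, b2, b3, x0, z0, dx, dz):
          # For logit(p) = b0 + b1*x + b2*z + b3*x*z, compare increments
          # dx, dz against the reference point (x0, z0):
          # RERI = OR(both raised) - OR(x raised) - OR(z raised) + 1.
          lp = lambda x, z: b1 * x + b2 * z + b3 * x * z
          or11 = np.exp(lp(x0 + dx, z0 + dz) - lp(x0, z0))
          or10 = np.exp(lp(x0 + dx, z0) - lp(x0, z0))
          or01 = np.exp(lp(x0, z0 + dz) - lp(x0, z0))
          return or11 - or10 - or01 + 1.0

      # e.g., age +10 years and BMI +5 kg/m^2 from a reference of (40, 25)
      print(reri_continuous(0.03, 0.08, 0.002, 40, 25, 10, 5))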

  6. Benchmarking real-time RGBD odometry for light-duty UAVs

    NASA Astrophysics Data System (ADS)

    Willis, Andrew R.; Sahawneh, Laith R.; Brink, Kevin M.

    2016-06-01

    This article describes the theoretical and implementation challenges associated with generating 3D odometry estimates (delta-pose) from RGBD sensor data in real time to facilitate navigation in cluttered indoor environments. The underlying odometry algorithm applies to general 6DoF motion; however, the computational platforms, trajectories, and scene content are motivated by their intended use on indoor, light-duty UAVs. The discussion outlines the overall software pipeline for sensor processing and details how algorithm choices for the underlying feature detection and correspondence computation impact the real-time performance and accuracy of the estimated odometry and associated covariance. This article also explores the consistency of odometry covariance estimates and the correlation between successive odometry estimates. The analysis is intended to provide users the information needed to better leverage RGBD odometry within the constraints of their systems.

  7. Alternative Statistical Frameworks for Student Growth Percentile Estimation

    ERIC Educational Resources Information Center

    Lockwood, J. R.; Castellano, Katherine E.

    2015-01-01

    This article suggests two alternative statistical approaches for estimating student growth percentiles (SGP). The first is to estimate percentile ranks of current test scores conditional on past test scores directly, by modeling the conditional cumulative distribution functions, rather than indirectly through quantile regressions. This would…

  8. Penalized Nonlinear Least Squares Estimation of Time-Varying Parameters in Ordinary Differential Equations

    PubMed Central

    Cao, Jiguo; Huang, Jianhua Z.; Wu, Hulin

    2012-01-01

    Ordinary differential equations (ODEs) are widely used in biomedical research and other scientific areas to model complex dynamic systems. It is an important statistical problem to estimate parameters in ODEs from noisy observations. In this article we propose a method for estimating the time-varying coefficients in an ODE. Our method is a variation of nonlinear least squares where penalized splines are used to model the functional parameters and the ODE solutions are also approximated using splines. We resort to the implicit function theorem to deal with the nonlinear least squares objective function that is only defined implicitly. The proposed penalized nonlinear least squares method is applied to estimate an HIV dynamic model from a real dataset. Monte Carlo simulations show that the new method can provide much more accurate estimates of functional parameters than the existing two-step local polynomial method which relies on estimation of the derivatives of the state function. Supplemental materials for the article are available online. PMID:23155351

  9. Smooth time-dependent receiver operating characteristic curve estimators.

    PubMed

    Martínez-Camblor, Pablo; Pardo-Fernández, Juan Carlos

    2018-03-01

    The receiver operating characteristic curve is a popular graphical method often used to study the diagnostic capacity of continuous (bio)markers. When the considered outcome is a time-dependent variable, two main extensions have been proposed: the cumulative/dynamic receiver operating characteristic curve and the incident/dynamic receiver operating characteristic curve. In both cases, the main problem in developing appropriate estimators is the estimation of the joint distribution of the variables time-to-event and marker. As usual, different approximations lead to different estimators. In this article, the authors explore the use of a bivariate kernel density estimator which accounts for censored observations in the sample and produces smooth estimators of the time-dependent receiver operating characteristic curves. The performance of the resulting cumulative/dynamic and incident/dynamic receiver operating characteristic curves is studied by means of Monte Carlo simulations. Additionally, the influence of the choice of the required smoothing parameters is explored. Finally, two real applications are considered. An R package is also provided as a complement to this article.

  10. Estimating the Cost to do a Cost Estimate

    NASA Technical Reports Server (NTRS)

    Remer, D. S.; Buchanan, H. R.

    1998-01-01

    This article provides a model for estimating the cost required to do a cost estimate. Overruns may lead to cancellation of a project. In 1991, we completed a study on the cost of doing cost estimates for the class of projects normally encountered in the development and implementation of equipment at the network of tracking stations operated by the Jet Propulsion Laboratory (JPL) for NASA.

  11. Assessing the Performance of the "Counterfactual as Self-Estimated by Program Participants": Results from a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Mueller, Christoph Emanuel; Gaus, Hansjoerg

    2015-01-01

    In this article, we test an alternative approach to creating a counterfactual basis for estimating individual and average treatment effects. Instead of using control/comparison groups or before-measures, the so-called Counterfactual as Self-Estimated by Program Participants (CSEPP) relies on program participants' self-estimations of their own…

  12. Users' Guide to USDA Estimates of the Cost of Raising a Child.

    ERIC Educational Resources Information Center

    Edwards, Carolyn S.

    In this article, estimates of the cost of raising a child, that are available from the U.S. Department of Agriculture, are described; the most widely requested estimates updated to current price levels are provided; and the most frequently asked questions about the use and interpretation of these estimates are answered. Information on additional…

  13. A Generalized DIF Effect Variance Estimator for Measuring Unsigned Differential Test Functioning in Mixed Format Tests

    ERIC Educational Resources Information Center

    Penfield, Randall D.; Algina, James

    2006-01-01

    One approach to measuring unsigned differential test functioning is to estimate the variance of the differential item functioning (DIF) effect across the items of the test. This article proposes two estimators of the DIF effect variance for tests containing dichotomous and polytomous items. The proposed estimators are direct extensions of the…

  14. Lord's Wald Test for Detecting DIF in Multidimensional IRT Models: A Comparison of Two Estimation Approaches

    ERIC Educational Resources Information Center

    Lee, Soo; Suh, Youngsuk

    2018-01-01

    Lord's Wald test for differential item functioning (DIF) has not been studied extensively in the context of the multidimensional item response theory (MIRT) framework. In this article, Lord's Wald test was implemented using two estimation approaches, marginal maximum likelihood estimation and Bayesian Markov chain Monte Carlo estimation, to detect…

  15. Crude mortality and loss of life expectancy of patients diagnosed with urothelial carcinoma of the urinary bladder in Norway.

    PubMed

    Andreassen, Bettina K; Myklebust, Tor Å; Haug, Erik S

    2017-02-01

    Reports from cancer registries often lack clinically relevant information, which would be useful in estimating the prognosis of individual patients with urothelial carcinoma of the urinary bladder (UCB). This article presents estimates of crude probabilities of death due to UCB and the expected loss of lifetime stratified by patient characteristics. In Norway, 10,332 patients were diagnosed with UCB between 2001 and 2010. The crude probabilities of death due to UCB were estimated, stratified by gender, age and T stage, using flexible parametric survival models. Based on these models, the loss in expectation of lifetime due to UCB was also estimated for the different strata. There is large variation in the estimated crude probabilities of death due to UCB (from 0.03 to 0.76 within 10 years since diagnosis) depending on age, gender and T stage. Furthermore, the expected loss of life expectancy is more than a decade for younger patients with muscle-invasive UCB and between a few months and 5 years for non-muscle-invasive UCB. The suggested framework yields clinically relevant prognostic risk estimates for individual patients diagnosed with UCB, together with the consequences in terms of lost life expectancy. The published probability tables can be used in clinical practice for risk communication.

  16. Cost-benefit analysis involving addictive goods: contingent valuation to estimate willingness-to-pay for smoking cessation.

    PubMed

    Weimer, David L; Vining, Aidan R; Thomas, Randall K

    2009-02-01

    The valuation of changes in consumption of addictive goods resulting from policy interventions presents a challenge for cost-benefit analysts. Consumer surplus losses from reduced consumption of addictive goods that are measured relative to market demand schedules overestimate the social cost of cessation interventions. This article seeks to show that consumer surplus losses measured using a non-addicted demand schedule provide a better assessment of social cost. Specifically, (1) it develops an addiction model that permits an estimate of the smoker's compensating variation for the elimination of addiction; (2) it employs a contingent valuation survey of current smokers to estimate their willingness-to-pay (WTP) for a treatment that would eliminate addiction; (3) it uses the estimate of WTP from the survey to calculate the fraction of consumer surplus that should be viewed as consumer value; and (4) it provides an estimate of this fraction. The exercise suggests that, as a tentative first and rough rule-of-thumb, only about 75% of the loss of the conventionally measured consumer surplus should be counted as social cost for policies that reduce the consumption of cigarettes. Additional research to estimate this important rule-of-thumb is desirable to address the various caveats relevant to this study. Copyright (c) 2008 John Wiley & Sons, Ltd.

  17. SPATIO-TEMPORAL MODELING OF AGRICULTURAL YIELD DATA WITH AN APPLICATION TO PRICING CROP INSURANCE CONTRACTS

    PubMed Central

    Ozaki, Vitor A.; Ghosh, Sujit K.; Goodwin, Barry K.; Shirota, Ricardo

    2009-01-01

    This article presents a statistical model of agricultural yield data based on a set of hierarchical Bayesian models that allows joint modeling of temporal and spatial autocorrelation. This method captures a comprehensive range of the various uncertainties involved in predicting crop insurance premium rates as opposed to the more traditional ad hoc, two-stage methods that are typically based on independent estimation and prediction. A panel data set of county-average yield data was analyzed for 290 counties in the State of Paraná (Brazil) for the period of 1990 through 2002. Posterior predictive criteria are used to evaluate different model specifications. This article provides substantial improvements in the statistical and actuarial methods often applied to the calculation of insurance premium rates. These improvements are especially relevant to situations where data are limited. PMID:19890450

  18. Implementing a method of screening one-well hydraulic barrier design alternatives.

    PubMed

    Rubin, Hillel; Shoemaker, Christine A; Köngeter, Jürgen

    2009-01-01

    This article provides details of applying the method developed by the authors (Rubin et al. 2008b) for screening one-well hydraulic barrier design alternatives. The present article, with its supporting information (a manual and electronic spreadsheets with a case history example), provides the reader with complete details and examples of solving the set of nonlinear equations developed by Rubin et al. (2008b). It allows proper use of the analytical solutions and reproduction of the various charts given by Rubin et al. (2008b). The final outputs of the calculations are the required position and discharge of the pumping well. If the contaminant source is nonaqueous phase liquid (NAPL) entrapped within the aquifer, the method also provides, as a by-product, an estimate of the progress of aquifer remediation due to operating the hydraulic barrier.

  19. A place-based model of local activity spaces: individual place exposure and characteristics

    NASA Astrophysics Data System (ADS)

    Hasanzadeh, Kamyar; Laatikainen, Tiina; Kyttä, Marketta

    2018-01-01

    Researchers have long hypothesized relationships between mobility, urban context, and health. Despite ample discussion, the empirical findings corroborating such associations remain marginal in the literature. It is increasingly believed that the weakness of the observed associations can be largely explained by the common misspecification of the geographical context. Researchers from different fields have developed a wide range of methods for estimating the extents of these geographical contexts. In this article, we argue that no single approach has yet sufficiently captured the complexity of human mobility patterns. Subsequently, we discuss that a better understanding of individual activity spaces is possible through a spatially sensitive estimation of place exposure. Following this discussion, we take an integrative person- and place-based approach to create an individualized residential exposure model (IREM) to estimate the local activity spaces (LAS) of individuals. The model is built using data collected through public participation GIS. Following a brief comparison of IREM with other commonly used LAS models, the article presents an empirical study of aging citizens in the Helsinki area to demonstrate the usability of the proposed framework. In this study, we identify the main dimensions of LASs and seek their associations with the socio-demographic characteristics of individuals and their location in the region. The promising results from the comparisons and the interesting findings from the empirical part suggest both a methodological and a conceptual improvement in capturing the complexity of local activity spaces.

  20. Estimating Gender Wage Gaps: A Data Update

    ERIC Educational Resources Information Center

    McDonald, Judith A.; Thornton, Robert J.

    2016-01-01

    In the authors' 2011 "JEE" article, "Estimating Gender Wage Gaps," they described an interesting class project that allowed students to estimate the current gender earnings gap for recent college graduates using data from the National Association of Colleges and Employers (NACE). Unfortunately, since 2012, NACE no longer…

  1. Manual and automatic locomotion scoring systems in dairy cows: a review.

    PubMed

    Schlageter-Tello, Andrés; Bokkers, Eddie A M; Koerkamp, Peter W G Groot; Van Hertem, Tom; Viazzi, Stefano; Romanini, Carlos E B; Halachmi, Ilan; Bahr, Claudia; Berckmans, Daniël; Lokhorst, Kees

    2014-09-01

    The objective of this review was to describe, compare and evaluate agreement, reliability, and validity of manual and automatic locomotion scoring systems (MLSSs and ALSSs, respectively) used in dairy cattle lameness research. There are many different types of MLSSs and ALSSs. Twenty-five MLSSs were found in 244 articles. MLSSs use different types of scale (ordinal or continuous) and require different gait and posture traits to be observed. The most used MLSS (used in 28% of the references) is based on asymmetric gait, reluctance to bear weight, and arched back, and is scored on a five-level scale. Fifteen ALSSs were found that could be categorized according to three approaches: (a) the kinetic approach measures forces involved in locomotion, (b) the kinematic approach measures time and distance variables associated with limb movement and some specific posture variables, and (c) the indirect approach uses behavioural or production variables as indicators of impaired locomotion. Agreement and reliability estimates were scarcely reported in articles related to MLSSs. When reported, inappropriate statistical methods such as PABAK and Pearson and Spearman correlation coefficients were commonly used. Some of the most frequently used MLSSs were poorly evaluated for agreement and reliability. Agreement and reliability estimates for the original four-, five- or nine-level MLSSs, expressed in percentage of agreement, kappa and weighted kappa, showed large ranges among and sometimes also within articles. After transformation into a two-level scale, agreement and reliability estimates reached acceptable levels (percentage of agreement ≥ 75%; kappa and weighted kappa ≥ 0.6), but estimates still showed large variation between articles. Agreement and reliability estimates for ALSSs were not reported in any article. Several ALSSs use MLSSs as a reference for model calibration and validation. However, varying agreement and reliability estimates of MLSSs make a clear definition of a lameness case difficult, and thus affect the validity of ALSSs. MLSSs and ALSSs showed limited validity for hoof lesion detection and pain assessment. The utilization of MLSSs and ALSSs should aim at the prevention and efficient management of conditions that induce impaired locomotion. Long-term studies comparing MLSSs and ALSSs while applying various strategies to detect and control unfavourable conditions leading to impaired locomotion are required to determine the usefulness of MLSSs and ALSSs for securing optimal production and animal welfare in practice. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Production scheduling with ant colony optimization

    NASA Astrophysics Data System (ADS)

    Chernigovskiy, A. S.; Kapulin, D. V.; Noskova, E. E.; Yamskikh, T. N.; Tsarev, R. Yu

    2017-10-01

    The optimum solution of the production scheduling problem for manufacturing processes at an enterprise is crucial, as it allows one to obtain the required amount of production within a specified time frame. An optimum production schedule can be found using a variety of optimization or scheduling algorithms. Ant colony optimization is one of the well-known techniques for solving global multi-objective optimization problems. In the article, the authors present a solution of the production scheduling problem by means of an ant colony optimization algorithm. A case study estimating the algorithm's efficiency against some other production scheduling algorithms is presented. Advantages of the ant colony optimization algorithm and its beneficial effect on the manufacturing process are provided.
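
    A toy version of the approach (Python; the single-machine total-tardiness objective, parameter values, and pheromone scheme are illustrative simplifications, not the authors' formulation):

      import numpy as np

      rng = np.random.default_rng(2)

      def aco_schedule(proc, due, n_ants=20, n_iter=100,
                       rho=0.1, alpha=1.0, beta=2.0):
          # Pheromone tau[position, job]; heuristic favours earlier due dates.
          n = len(proc)
          tau = np.ones((n, n))
          eta = 1.0 / (np.asarray(due, float) + 1.0)
          best_seq, best_cost = None, np.inf
          for _ in range(n_iter):
              for _ in range(n_ants):
                  left = list(range(n))
                  seq = []
                  for pos in range(n):
                      w = tau[pos, left] ** alpha * eta[left] ** beta
                      j = rng.choice(left, p=w / w.sum())
                      seq.append(int(j))
                      left.remove(j)
                  t = 0.0
                  cost = 0.0
                  for j in seq:
                      t += proc[j]
                      cost += max(0.0, t - due[j])  # total tardiness
                  if cost < best_cost:
                      best_seq, best_cost = seq, cost
              tau *= 1.0 - rho                      # evaporation
              for pos, j in enumerate(best_seq):    # reinforce best-so-far
                  tau[pos, j] += 1.0 / (1.0 + best_cost)
          return best_seq, best_cost

      print(aco_schedule([4, 2, 6, 3], [6, 4, 14, 7]))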

  3. CEM-Consumer Exposure Model Download and Install Instructions

    EPA Pesticide Factsheets

    CEM contains a combination of models and default parameters which are used to estimate inhalation, dermal, and oral exposures to consumer products and articles for a wide variety of product and article use categories.

  4. Repository Planning, Design, and Engineering: Part II-Equipment and Costing.

    PubMed

    Baird, Phillip M; Gunter, Elaine W

    2016-08-01

    Part II of this article discusses and provides guidance on the equipment and systems necessary to operate a repository. The various types of storage equipment and monitoring and support systems are presented in detail. While the material focuses on the large repository, the requirements for a small-scale startup are also presented. Cost estimates and a cost model for establishing a repository are presented. The cost model presents an expected range of acquisition costs for the large capital items in developing a repository. A range of 5,000-7,000 ft² constructed has been assumed, with 50 frozen storage units, to reflect a successful operation with growth potential. No design or engineering costs, permit or regulatory costs, or smaller items such as the computers, software, furniture, phones, and barcode readers required for operations have been included.

  5. Suspect Screening Analysis of Chemicals in Consumer Products.

    PubMed

    Phillips, Katherine A; Yau, Alice; Favela, Kristin A; Isaacs, Kristin K; McEachran, Andrew; Grulke, Christopher; Richard, Ann M; Williams, Antony J; Sobus, Jon R; Thomas, Russell S; Wambaugh, John F

    2018-03-06

    A two-dimensional gas chromatography-time-of-flight/mass spectrometry (GC×GC-TOF/MS) suspect screening analysis method was used to rapidly characterize chemicals in 100 consumer products, which included formulations (e.g., shampoos, paints), articles (e.g., upholsteries, shower curtains), and foods (cereals), and therefore supports broader efforts to prioritize chemicals based on potential human health risks. Analyses yielded 4270 unique chemical signatures across the products, with 1602 signatures tentatively identified using the National Institute of Standards and Technology 2008 spectral database. Chemical standards confirmed the presence of 119 compounds. Of the 1602 tentatively identified chemicals, 1404 were not present in a public database of known consumer product chemicals. Reported data and model predictions of chemical functional use were applied to evaluate the tentative chemical identifications. Estimated chemical concentrations were compared to manufacturer-reported values and other measured data. Chemical presence and concentration data can now be used to improve estimates of chemical exposure, and refine estimates of risk posed to human health and the environment.

  6. Estimates of electricity requirements for the recovery of mineral commodities, with examples applied to sub-Saharan Africa

    USGS Publications Warehouse

    Bleiwas, Donald I.

    2011-01-01

    To produce materials from mine to market it is necessary to overcome obstacles that include the force of gravity, the strength of molecular bonds, and technological inefficiencies. These challenges are met by the application of energy to accomplish the work that includes the direct use of electricity, fossil fuel, and manual labor. The tables and analyses presented in this study contain estimates of electricity consumption for the mining and processing of ores, concentrates, intermediate products, and industrial and refined metallic commodities on a kilowatt-hour per unit basis, primarily the metric ton or troy ounce. Data contained in tables pertaining to specific currently operating facilities are static, as the amount of electricity consumed to process or produce a unit of material changes over time for a great number of reasons. Estimates were developed from diverse sources that included feasibility studies, company-produced annual and sustainability reports, conference proceedings, discussions with government and industry experts, journal articles, reference texts, and studies by nongovernmental organizations.

  7. [A method for obtaining redshifts of quasars based on wavelet multi-scaling feature matching].

    PubMed

    Liu, Zhong-Tian; Li, Xiang-Ru; Wu, Fu-Chao; Zhao, Yong-Heng

    2006-09-01

    The LAMOST project, the world's largest sky survey project being implemented in China, is expected to obtain 10^5 quasar spectra. The main objective of the present article is to explore methods that can be used to estimate the redshifts of quasar spectra from LAMOST. Firstly, the features of the broad emission lines are extracted from the quasar spectra to overcome the disadvantage of low signal-to-noise ratio. Then the redshifts of quasar spectra can be estimated by using the multi-scaling feature matching. The experiment with the 15,715 quasars from the SDSS DR2 shows that the correct rate of redshift estimated by the method is 95.13% within an error range of 0.02. This method was designed to obtain the redshifts of quasar spectra with relative flux and a low signal-to-noise ratio, which is applicable to the LAMOST data and helps to study quasars and the large-scale structure of the universe etc.
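
    A greatly simplified sketch of emission-line redshift estimation (Python; the article's method matches wavelet features across multiple scales rather than raw peak wavelengths, so this is only the underlying idea):

      import numpy as np

      def redshift_from_lines(observed_peaks, rest_lines):
          # observed_peaks: wavelengths (Angstrom) of detected broad lines.
          # rest_lines: candidate rest wavelengths, e.g. Lya 1216, CIV 1549,
          # CIII] 1909. Each observed/rest pairing implies z = obs/rest - 1;
          # the redshift supported by the most pairings wins.
          candidates = []
          for obs in observed_peaks:
              for rest in rest_lines:
                  z = obs / rest - 1.0
                  if z > 0:
                      candidates.append(z)
          candidates = np.array(candidates)
          return max(candidates,
                     key=lambda z: np.sum(np.abs(candidates - z) < 0.01))

      # Three lines consistent with z ~ 1.5 (values illustrative).
      print(redshift_from_lines([3038, 3872, 4772], [1216, 1549, 1909]))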

  8. Sizing gaseous emboli using Doppler embolic signal intensity.

    PubMed

    Banahan, Caroline; Hague, James P; Evans, David H; Patel, Rizwan; Ramnarine, Kumar V; Chung, Emma M L

    2012-05-01

    Extension of transcranial Doppler embolus detection to estimation of bubble size has historically been hindered by difficulties in applying scattering theory to the interpretation of clinical data. This article presents a simplified approach to the sizing of air emboli based on analysis of Doppler embolic signal intensity, by using an approximation to the full scattering theory that can be solved to estimate embolus size. Tests using simulated emboli show that our algorithm is theoretically capable of sizing 90% of "emboli" to within 10% of their true radius. In vitro tests show that 69% of emboli can be sized to within 20% of their true value under ideal conditions, which reduces to 30% of emboli if the beam and vessel are severely misaligned. Our results demonstrate that estimation of bubble size during clinical monitoring could be used to distinguish benign microbubbles from potentially harmful macrobubbles during intraoperative clinical monitoring. Copyright © 2012 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  9. Meta-Analysis With Complex Research Designs: Dealing With Dependence From Multiple Measures and Multiple Group Comparisons

    PubMed Central

    Scammacca, Nancy; Roberts, Greg; Stuebing, Karla K.

    2013-01-01

    Previous research has shown that treating dependent effect sizes as independent inflates the variance of the mean effect size and introduces bias by giving studies with more effect sizes more weight in the meta-analysis. This article summarizes the different approaches to handling dependence that have been advocated by methodologists, some of which are more feasible to implement with education research studies than others. A case study using effect sizes from a recent meta-analysis of reading interventions is presented to compare the results obtained from different approaches to dealing with dependence. Overall, mean effect sizes and variance estimates were found to be similar, but estimates of indexes of heterogeneity varied. Meta-analysts are advised to explore the effect of the method of handling dependence on the heterogeneity estimates before conducting moderator analyses and to choose the approach to dependence that is best suited to their research question and their data set. PMID:25309002
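
    One of the simpler strategies reviewed, averaging dependent effect sizes within a study while approximating the variance of their mean with an assumed within-study correlation, can be sketched as follows (Python; the correlation r = 0.8 and all data values are illustrative assumptions):

      import numpy as np

      def aggregate_study(es, var, r=0.8):
          # Mean of m dependent effect sizes; variance of that mean is
          # (sum var_i + sum_{i!=j} r*sqrt(var_i*var_j)) / m^2.
          es = np.asarray(es, float)
          var = np.asarray(var, float)
          m = es.size
          cov_sum = var.sum() + r * sum(
              np.sqrt(var[i] * var[j])
              for i in range(m) for j in range(m) if i != j)
          return es.mean(), cov_sum / m ** 2

      # Two studies; the first contributes three dependent effect sizes.
      studies = [aggregate_study([0.4, 0.5, 0.3], [0.02, 0.02, 0.03]),
                 (0.25, 0.04)]
      means = np.array([e for e, _ in studies])
      weights = np.array([1.0 / v for _, v in studies])
      print("fixed-effect mean:", (weights * means).sum() / weights.sum())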

  10. Comments on new classification, treatment algorithm and prognosis-estimating systems for sigmoid volvulus and ileosigmoid knotting: necessity and utility.

    PubMed

    Aksungur, N; Korkut, E

    2018-05-24

    We read the Atamanalp classification, treatment algorithm and prognosis-estimating systems for sigmoid volvulus (SV) and ileosigmoid knotting (ISK) in Colorectal Disease [1,2]. Our comments relate to the necessity and utility of these new classification systems. Classification or staging systems are generally used in malignant or premalignant pathologies such as colorectal cancers [3] or polyps [4]. This article is protected by copyright. All rights reserved.

  11. Cost analysis and facility reimbursement in the long-term health care industry.

    PubMed Central

    Ullmann, S G

    1984-01-01

    This article examines costs and develops a system of prospective reimbursement for the long-term health care industry. Together with estimates of average cost functions, used to determine the factors affecting the costs of long-term health care, the author examines in depth the cost effects of patient mix and facility quality. Policy implications are indicated. The article estimates cost savings and predicted improvements in facility performance resulting from adoption of a prospective reimbursement system. PMID:6427138

  12. Estimating the effects of harmonic voltage fluctuations on the temperature rise of squirrel-cage motors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emanuel, A.E.

    1991-03-01

    This article presents a preliminary analysis of the effect of randomly varying harmonic voltages on the temperature rise of squirrel-cage motors. The stochastic process of random variation of the harmonic voltages is defined by means of simple statistics (mean, standard deviation, type of distribution). Computational models based on a first-order approximation of the motor losses and on the Monte Carlo method yield results showing that equipment with a large thermal time constant can withstand, for short periods, distortion levels greater than THD = 5%.
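
    A minimal Monte Carlo sketch in the spirit of the analysis (Python; the THD² loss scaling, the first-order thermal model, and all parameter values are illustrative assumptions, not the article's motor model):

      import numpy as np

      rng = np.random.default_rng(3)

      def temperature_rise_mc(n_trials=10000, thd_mean=0.05, thd_std=0.02,
                              k_loss=2.0, dt_rated=80.0):
          # Sample random distortion levels, assume extra losses (and, in
          # steady state, extra temperature rise) scale with THD squared.
          thd = np.clip(rng.normal(thd_mean, thd_std, n_trials), 0.0, None)
          extra_rise = dt_rated * k_loss * thd ** 2
          return extra_rise.mean(), np.percentile(extra_rise, 95)

      mean_rise, p95 = temperature_rise_mc()
      print(f"mean extra rise {mean_rise:.2f} K, 95th pct {p95:.2f} K")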

  13. [REM sleep behavior disorders in Parkinson's disease].

    PubMed

    Liashenko, E A; Poluéktov, M G; Levin, O S

    2014-01-01

    The article presents a literature review on REM sleep behavior disorder (RBD). The loss of REM sleep atonia, such that patients act out the contents of their dreams, is described. The most important implication of research in this area is that patients with idiopathic RBD are at very high risk of developing a synuclein-mediated neurodegenerative disease (Parkinson's disease, dementia with Lewy bodies or multiple system atrophy), with risk estimates that approximate 40-65% at 10 years. Thus, RBD is a reliable marker of prodromal synucleinopathy, opening possibilities for neuroprotective therapy.

  14. Is There a Doctor Onboard? Medical Emergencies at 40,000 Feet.

    PubMed

    Donner, Howard J

    2017-05-01

    It is estimated that 2.75 billion people travel aboard commercial airlines every year and that 44,000 in-flight medical emergencies occur worldwide each year. Wilderness medicine requires a commonsense and improvisational approach to medical issues. A sudden call for assistance in the austere and unfamiliar surroundings of an airliner cabin may present the responding medical professional with a "wilderness medicine" experience. From resource management to equipment, this article sheds light on the unique conditions, challenges, and constraints of the flight environment. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Designing a New Payment Model for Oral Care in Seniors.

    PubMed

    Jones, Judith A; Monopoli, Michael

    2017-10-01

    With 10,000 baby boomers turning 65 every day, many will be on fixed incomes and will lose dental insurance upon retirement. This article presents why a dental benefit in Medicare might save the US government money, and who would likely benefit. It details an approach to estimating costs of inclusion of a dental benefit in Medicare, and compares the proposed approach to existing proposals. Additionally, the ensuing steps needed to advance the conversation to include oral health in healthcare for the aged will be discussed.

  16. Magnetic transmission gear finite element simulation with iron pole hysteresis

    NASA Astrophysics Data System (ADS)

    Filippini, Mattia; Alotto, Piergiorgio; Glehn, Gregor; Hameyer, Kay

    2018-04-01

    Ferromagnetic poles in a magnetic transmission gear require particular attention during their design process. Usually, during the numerical simulation of these devices the effects of hysteresis for loss estimation are neglected and considered only during post-processing calculations. Since the literature lacks hysteresis models, this paper adopts a homogenized hysteretic model able to include eddy current and hysteresis losses in 2D laminated materials for iron poles. In this article the results related to the hysteresis in a magnetic gear are presented and compared to the non-hysteretic approach.

  17. Simulation of the usage of Gaussian mixture models for the purpose of modelling virtual mass spectrometry data.

    PubMed

    Plechawska, Małgorzata; Polańska, Joanna

    2009-01-01

    This article presents a method for processing mass spectrometry data. Mass spectra are modelled with Gaussian mixture models: every peak of the spectrum is represented by a single Gaussian whose parameters describe the location, height and width of the corresponding peak. The authors' own implementation of the expectation-maximisation (EM) algorithm was used to perform all calculations. Errors were estimated with a virtual mass spectrometer. The discussed tool was originally designed to generate a set of spectra within defined parameters.
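
    A compact sketch of fitting such a mixture by EM, treating the spectrum as an intensity-weighted sample of m/z values (Python; the initialization and weighting scheme are illustrative choices, not the authors' implementation):

      import numpy as np

      def weighted_gmm_em(mz, intensity, k, n_iter=200):
          # Treat the spectrum as a weighted sample: m/z bins weighted by
          # intensity. Each Gaussian's mean/width/weight describes one
          # peak's location, width and height share.
          w = intensity / intensity.sum()
          mu = np.quantile(mz, np.linspace(0.1, 0.9, k))  # crude init
          sigma = np.full(k, (mz.max() - mz.min()) / (4.0 * k))
          pi = np.full(k, 1.0 / k)
          for _ in range(n_iter):
              # E-step: responsibilities of each component for each bin.
              dens = (pi * np.exp(-0.5 * ((mz[:, None] - mu) / sigma) ** 2)
                      / sigma)
              resp = dens / (dens.sum(axis=1, keepdims=True) + 1e-300)
              # M-step: intensity-weighted parameter updates.
              rw = resp * w[:, None]
              nk = rw.sum(axis=0) + 1e-12
              mu = (rw * mz[:, None]).sum(axis=0) / nk
              sigma = np.sqrt((rw * (mz[:, None] - mu) ** 2).sum(axis=0) / nk)
              pi = nk
          return mu, sigma, pi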

  18. Analysis of factors influencing organic fruit and vegetable purchasing in Istanbul, Turkey.

    PubMed

    Oraman, Yasemin; Unakitan, Gökhan

    2010-01-01

    This article examines the influences on the purchasing decisions of fruit and vegetable consumers and presents findings from a survey conducted with 385 respondents living in urban areas in Istanbul, Turkey. It uses a binary logistic model to estimate factor effects in organic fruit and vegetable purchasing in Turkey. The results indicate that concern for human health and safety is a key factor that influences consumer preferences for organic food. Findings will help organic product suppliers understand the key factors influencing consumer purchasing and consumption behaviors.

  19. UAV Control on the Basis of 3D Landmark Bearing-Only Observations.

    PubMed

    Karpenko, Simon; Konovalenko, Ivan; Miller, Alexander; Miller, Boris; Nikolaev, Dmitry

    2015-11-27

    The article presents an approach to the control of a UAV on the basis of 3D landmark observations. The novelty of the work is the use of a 3D RANSAC algorithm developed on the basis of predicting the landmarks' positions with the aid of a modified Kalman-type filter. Modification of the filter based on the pseudo-measurements approach permits unbiased UAV position estimation with quadratic error characteristics. Modeling of UAV flight on the basis of the suggested algorithm shows good performance, even under significant external perturbations.

  20. Unbiased Estimates of Variance Components with Bootstrap Procedures

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2007-01-01

    This article provides general procedures for obtaining unbiased estimates of variance components for any random-model balanced design under any bootstrap sampling plan, with the focus on designs of the type typically used in generalizability theory. The results reported here are particularly helpful when the bootstrap is used to estimate standard…
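
    For a persons x items (p x i) random-model design, the flavor of the computation can be sketched as follows (Python; the boot-p plan shown is the naive version, which the article's procedures adjust with correction terms to remove bias, so the estimates below are illustrative only):

      import numpy as np

      rng = np.random.default_rng(4)

      def var_components_pxi(X):
          # ANOVA (expected-mean-squares) estimates of variance components
          # for a persons x items random-model design.
          n_p, n_i = X.shape
          p_mean = X.mean(axis=1, keepdims=True)
          i_mean = X.mean(axis=0, keepdims=True)
          grand = X.mean()
          ms_p = n_i * ((p_mean - grand) ** 2).sum() / (n_p - 1)
          ms_i = n_p * ((i_mean - grand) ** 2).sum() / (n_i - 1)
          resid = X - p_mean - i_mean + grand
          ms_pi = (resid ** 2).sum() / ((n_p - 1) * (n_i - 1))
          return ((ms_p - ms_pi) / n_i,   # persons
                  (ms_i - ms_pi) / n_p,   # items
                  ms_pi)                  # interaction/residual

      # Naive boot-p plan: resample persons with replacement.
      X = rng.normal(size=(50, 10)) + rng.normal(size=(50, 1))
      boots = [var_components_pxi(X[rng.integers(0, 50, 50)])[0]
               for _ in range(500)]
      print("bootstrap SE of person component:", np.std(boots))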

  1. IRT Item Parameter Recovery with Marginal Maximum Likelihood Estimation Using Loglinear Smoothing Models

    ERIC Educational Resources Information Center

    Casabianca, Jodi M.; Lewis, Charles

    2015-01-01

    Loglinear smoothing (LLS) estimates the latent trait distribution while making fewer assumptions about its form and maintaining parsimony, thus leading to more precise item response theory (IRT) item parameter estimates than standard marginal maximum likelihood (MML). This article provides the expectation-maximization algorithm for MML estimation…

  2. ON ASYMPTOTIC DISTRIBUTION AND ASYMPTOTIC EFFICIENCY OF LEAST SQUARES ESTIMATORS OF SPATIAL VARIOGRAM PARAMETERS. (R827257)

    EPA Science Inventory

    Abstract

    In this article, we consider the least-squares approach for estimating parameters of a spatial variogram and establish consistency and asymptotic normality of these estimators under general conditions. Large-sample distributions are also established under a sp...

  3. Why Might Relative Fit Indices Differ between Estimators?

    ERIC Educational Resources Information Center

    Weng, Li-Jen; Cheng, Chung-Ping

    1997-01-01

    Relative fit indices using the null model as the reference point in computation may differ across estimation methods, as this article illustrates by comparing maximum likelihood, ordinary least squares, and generalized least squares estimation in structural equation modeling. The illustration uses a covariance matrix for six observed variables…

  4. Using Google Scholar to Estimate the Impact of Journal Articles in Education

    ERIC Educational Resources Information Center

    van Aalst, Jan

    2010-01-01

    This article discusses the potential of Google Scholar as an alternative or complement to the Web of Science and Scopus for measuring the impact of journal articles in education. Three handbooks on research in science education, language education, and educational technology were used to identify a sample of 112 accomplished scholars. Google…

  5. Correction of odds ratios in case-control studies for exposure misclassification with partial knowledge of the degree of agreement among experts who assessed exposures.

    PubMed

    Burstyn, Igor; Gustafson, Paul; Pintos, Javier; Lavoué, Jérôme; Siemiatycki, Jack

    2018-02-01

    Estimates of association between exposures and diseases are often distorted by error in exposure classification. When the validity of exposure assessment is known, this can be used to adjust these estimates. When exposure is assessed by experts, even if validity is not known, we sometimes have information about interrater reliability. We present a Bayesian method for translating the knowledge of interrater reliability, which is often available, into knowledge about validity, which is often needed but not directly available, and applying this to correct odds ratios (OR). The method allows for inclusion of observed potential confounders in the analysis, as is common in regression-based control for confounding. Our method uses a novel type of prior on sensitivity and specificity. The approach is illustrated with data from a case-control study of lung cancer risk and occupational exposure to diesel engine emissions, in which exposure assessment was made by detailed job history interviews with study subjects followed by expert judgement. Using interrater agreement measured by kappas (κ), we estimate sensitivity and specificity of exposure assessment and derive misclassification-corrected confounder-adjusted OR. Misclassification-corrected and confounder-adjusted OR obtained with the most defensible prior had a posterior distribution centre of 1.6 with 95% credible interval (CrI) 1.1 to 2.6. This was on average greater in magnitude than the frequentist point estimate of 1.3 (95% CI 1.0 to 1.7). The method yields insights into the degree of exposure misclassification and appears to reduce attenuation bias due to misclassification of exposure, albeit with increased estimated uncertainty. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
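
    For contrast with the article's Bayesian approach, the classical frequentist back-correction of a 2x2 table under nondifferential misclassification with known sensitivity and specificity looks as follows; the counts and validity values are hypothetical, not from the study.

      import numpy as np

      def corrected_or(a, b, c, d, se, sp):
          # a, b = exposed/unexposed cases; c, d = exposed/unexposed controls.
          # Matrix-method back-correction; requires se + sp - 1 > 0.
          denom = se + sp - 1.0
          a_t = (a - (1 - sp) * (a + b)) / denom   # corrected exposed cases
          c_t = (c - (1 - sp) * (c + d)) / denom   # corrected exposed controls
          b_t, d_t = (a + b) - a_t, (c + d) - c_t
          return (a_t * d_t) / (b_t * c_t)

      # Hypothetical counts and validity values
      print(corrected_or(a=120, b=380, c=80, d=420, se=0.8, sp=0.9))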

  6. An Embedded Sensor Node Microcontroller with Crypto-Processors.

    PubMed

    Panić, Goran; Stecklina, Oliver; Stamenković, Zoran

    2016-04-27

    Wireless sensor network applications range from industrial automation and control, agricultural and environmental protection, to surveillance and medicine. In most applications, data are highly sensitive and must be protected from any type of attack and abuse. Security challenges in wireless sensor networks are mainly defined by the power and computing resources of sensor devices, memory size, quality of radio channels and susceptibility to physical capture. In this article, an embedded sensor node microcontroller designed to support sensor network applications with severe security demands is presented. It features a low power 16-bit processor core supported by a number of hardware accelerators designed to perform complex operations required by advanced crypto algorithms. The microcontroller integrates an embedded Flash and an 8-channel 12-bit analog-to-digital converter making it a good solution for low-power sensor nodes. The article discusses the most important security topics in wireless sensor networks and presents the architecture of the proposed hardware solution. Furthermore, it gives details on the chip implementation, verification and hardware evaluation. Finally, the chip power dissipation and performance figures are estimated and analyzed.

  8. A systematic review and meta-analysis of the proportion of dogs surrendered for dog-related and owner-related reasons.

    PubMed

    Lambert, Kim; Coe, Jason; Niel, Lee; Dewey, Cate; Sargeant, Jan M

    2015-01-01

    Companion-animal relinquishment is a worldwide phenomenon that leaves companion animals homeless. Knowing why humans make the decision to end their relationship with a companion animal can help in our understanding of this complex societal issue and can help to develop preventive strategies. A systematic review and meta-analysis was conducted to summarize reasons why dogs are surrendered, and to determine if certain study characteristics were associated with the reported proportions of reasons for surrender. Articles investigating one or more reasons for dog surrender were selected from the references of a published scoping review. Two reviewers assessed the titles and abstracts of these articles, identifying 39 relevant articles. From these, 21 articles were further excluded because of ineligible study design, insufficient data available for calculating a proportion, or no data available for dogs. Data were extracted from 18 articles and meta-analysis was conducted on articles investigating reasons for dog surrender to a shelter (n=9) or dog surrender for euthanasia (n=5). Three studies were excluded from meta-analysis because they were duplicate populations. Other reasons for excluding studies from meta-analysis were (1) the study only investigated reasons for dog re-relinquishment (n=2) and (2) the study sample size was <10 (n=1). Two articles investigated reasons for both dog surrender to a shelter and dog surrender for euthanasia. Meta-analysis found that owner health/illness as a reason for dog surrender to a shelter had an overall estimate of 4.6% (95% CI: 4.1%, 5.2%). For all other identified reasons for surrender there was significant variation in methodology among studies, preventing further meta-analysis. Univariable meta-regression was conducted to explore sources of variation among these studies. Country was identified as a significant source of variation (p<0.01) among studies reporting behavioural problems as a reason for dog surrender for euthanasia. The overall estimate for studies from Australia was 10% (95% CI: 8.0%, 12.0%; I²=15.5%), compared to 16% (95% CI: 15.0%, 18.0%; I²=20.2%) for studies from other countries. The present systematic review and meta-analysis highlights the need for further research and standardization of data collection to improve understanding of the reasons for dog relinquishment. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Intraclass correlation estimates for cancer screening outcomes: estimates and applications in the design of group-randomized cancer screening studies.

    PubMed

    Hade, Erinn M; Murray, David M; Pennell, Michael L; Rhoda, Dale; Paskett, Electra D; Champion, Victoria L; Crabtree, Benjamin F; Dietrich, Allen; Dignan, Mark B; Farmer, Melissa; Fenton, Joshua J; Flocke, Susan; Hiatt, Robert A; Hudson, Shawna V; Mitchell, Michael; Monahan, Patrick; Shariff-Marco, Salma; Slone, Stacey L; Stange, Kurt; Stewart, Susan L; Strickland, Pamela A Ohman

    2010-01-01

    Screening has become one of our best tools for early detection and prevention of cancer. The group-randomized trial is the most rigorous experimental design for evaluating multilevel interventions. However, identifying the proper sample size for a group-randomized trial requires reliable estimates of intraclass correlation (ICC) for screening outcomes, which are not available to researchers. We present crude and adjusted ICC estimates for cancer screening outcomes for various levels of aggregation (physician, clinic, and county) and provide an example of how these ICC estimates may be used in the design of a future trial. Investigators working in the area of cancer screening were contacted and asked to provide crude and adjusted ICC estimates using the analysis of variance method estimator. Of the 29 investigators identified, estimates were obtained from 10 investigators who had relevant data. ICC estimates were calculated from 13 different studies, with more than half of the studies collecting information on colorectal screening. In the majority of cases, ICC estimates could be adjusted for age, education, and other demographic characteristics, leading to a reduction in the ICC. ICC estimates varied considerably by cancer site and level of aggregation of the groups. Previously, only two articles had published ICCs for cancer screening outcomes. We have compiled more than 130 crude and adjusted ICC estimates covering breast, cervical, colon, and prostate screening and have detailed them by level of aggregation, screening measure, and study characteristics. We have also demonstrated their use in planning a future trial and the need for the evaluation of the proposed interval estimator for binary outcomes under conditions typically seen in GRTs.
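
    A minimal sketch of the analysis of variance method estimator referred to above, applied to hypothetical binary screening outcomes clustered by clinic; the cited studies additionally adjust for covariates.

      import numpy as np

      def anova_icc(groups):
          # One-way ANOVA estimator for balanced groups of size m:
          # ICC = (MSB - MSW) / (MSB + (m - 1) * MSW)
          k, m = len(groups), len(groups[0])
          grand = np.mean([x for g in groups for x in g])
          msb = m * sum((np.mean(g) - grand) ** 2 for g in groups) / (k - 1)
          msw = sum((x - np.mean(g)) ** 2 for g in groups for x in g) / (k * (m - 1))
          return (msb - msw) / (msb + (m - 1) * msw)

      # Hypothetical screening indicator (0/1) clustered by clinic
      clinics = [[1, 0, 1, 1], [0, 0, 1, 0], [1, 1, 1, 0]]
      print(anova_icc(clinics))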

  10. The Problem With Estimating Public Health Spending.

    PubMed

    Leider, Jonathon P

    2016-01-01

    Accurate information on how much the United States spends on public health is critical. These estimates affect planning efforts; reflect the value society places on the public health enterprise; and allow for the demonstration of cost-effectiveness of programs, policies, and services aimed at increasing population health. Yet, at present, there are a limited number of sources of systematic public health finance data. Each of these sources is collected in different ways, for different reasons, and so yields strikingly different results. This article aims to compare and contrast all 4 current national public health finance data sets, including data compiled by Trust for America's Health, the Association of State and Territorial Health Officials (ASTHO), the National Association of County and City Health Officials (NACCHO), and the Census, which underlie the oft-cited National Health Expenditure Account estimates of public health activity. In FY2008, ASTHO estimates that state health agencies spent $24 billion ($94 per capita on average, median $79), while the Census estimated all state governmental agencies including state health agencies spent $60 billion on public health ($200 per capita on average, median $166). Census public health data suggest that local governments spent an average of $87 per capita (median $57), whereas NACCHO estimates that reporting LHDs spent $64 per capita on average (median $36) in FY2008. We conclude that these estimates differ because the various organizations collect data using different means, data definitions, and inclusion/exclusion criteria--most notably around whether to include spending by all agencies versus a state/local health department, and whether behavioral health, disability, and some clinical care spending are included in estimates. Alongside deeper analysis of presently underutilized Census administrative data, we see harmonization efforts and the creation of a standardized expenditure reporting system as a way to meaningfully systematize reporting of public health spending and revenue.

  11. Application for internal dosimetry using biokinetic distribution of photons based on nuclear medicine images.

    PubMed

    Leal Neto, Viriato; Vieira, José Wilson; Lima, Fernando Roberto de Andrade

    2014-01-01

    This article presents a way to obtain dose estimates for patients undergoing radiotherapy based on the analysis of regions of interest in nuclear medicine images. A software tool called DoRadIo (Dosimetria das Radiações Ionizantes [Ionizing Radiation Dosimetry]) was developed to receive information about source organs and target organs, generating graphical and numerical results. The nuclear medicine images used in the present study were obtained from catalogs provided by medical physicists. The simulations were performed with computational exposure models consisting of voxel phantoms coupled with the EGSnrc Monte Carlo code. The software was developed with the Microsoft Visual Studio 2010 Service Pack and the Windows Presentation Foundation project template for the C# programming language. With these tools, the authors obtained the file for optimization of the Monte Carlo simulations using EGSnrc; the organization and compaction of dosimetry results for all radioactive sources; the selection of regions of interest; the evaluation of grayscale intensity in the regions of interest; the file of weighted sources; and, finally, the charts and numerical results. The user interface may be adapted for use in clinical nuclear medicine as a computer-aided tool to estimate the administered activity.

  12. 3D indoor modeling using a hand-held embedded system with multiple laser range scanners

    NASA Astrophysics Data System (ADS)

    Hu, Shaoxing; Wang, Duhu; Xu, Shike

    2016-10-01

    Accurate three-dimensional perception is a key technology for many engineering applications, including mobile mapping, obstacle detection and virtual reality. In this article, we present a hand-held embedded system designed for constructing 3D representations of structured indoor environments. Different from traditional vehicle-borne mobile mapping methods, the system presented here is capable of efficiently acquiring 3D data while an operator carrying the device traverses the site. It consists of a simultaneous localization and mapping (SLAM) module, a 3D attitude estimation module and a point cloud processing module. The SLAM is based on a scan matching approach using a modern LIDAR system, and the 3D attitude estimate is generated by a navigation filter using inertial sensors. The hardware comprises three 2D time-of-flight laser range finders and an inertial measurement unit (IMU). All the sensors are rigidly mounted on a body frame. The algorithms are developed within the framework of the Robot Operating System (ROS). The 3D model is constructed using the Point Cloud Library (PCL). Multiple datasets have shown robust performance of the presented system in indoor scenarios.

  13. Assessing the fit of site-occupancy models

    USGS Publications Warehouse

    MacKenzie, D.I.; Bailey, L.L.

    2004-01-01

    Few species are likely to be so evident that they will always be detected at a site when present. Recently a model has been developed that enables estimation of the proportion of area occupied when the target species is not detected with certainty. Here we apply this modeling approach to data collected on terrestrial salamanders in the Plethodon glutinosus complex in the Great Smoky Mountains National Park, USA, and wish to address the question 'how accurately does the fitted model represent the data?' The goodness-of-fit of the model needs to be assessed in order to make accurate inferences. This article presents a method where a simple Pearson chi-square statistic is calculated and a parametric bootstrap procedure is used to determine whether the observed statistic is unusually large. We found evidence that the most global model considered provides a poor fit to the data, hence we estimated an overdispersion factor to adjust model selection procedures and inflate standard errors. Two hypothetical datasets with known assumption violations are also analyzed, illustrating that the method may be used to guide researchers in making appropriate inferences. The results of a simulation study are presented to provide a broader view of the method's properties.
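
    The parametric bootstrap procedure can be illustrated generically: simulate data from the fitted model, recompute the Pearson statistic for each simulated dataset, and locate the observed statistic in the resulting distribution. The sketch below uses a simple binomial stand-in rather than a full site-occupancy model.

      import numpy as np
      from scipy.stats import binom

      rng = np.random.default_rng(2)

      def pearson_chi2(obs, exp):
          return np.sum((obs - exp) ** 2 / exp)

      # Hypothetical detection counts: 50 sites, 4 visits each; the fitted
      # "model" here is just a single pooled detection probability.
      n_sites, n_visits = 50, 4
      counts = rng.binomial(n_visits, 0.3, n_sites)
      p_hat = counts.sum() / (n_sites * n_visits)
      support = np.arange(n_visits + 1)
      obs_hist = np.bincount(counts, minlength=n_visits + 1)
      t_obs = pearson_chi2(obs_hist, n_sites * binom.pmf(support, n_visits, p_hat))

      # Parametric bootstrap: simulate from the fitted model, refit, recompute
      t_boot = []
      for _ in range(1000):
          sim = rng.binomial(n_visits, p_hat, n_sites)
          p_b = sim.sum() / (n_sites * n_visits)
          hist = np.bincount(sim, minlength=n_visits + 1)
          t_boot.append(pearson_chi2(hist, n_sites * binom.pmf(support, n_visits, p_b)))
      print("bootstrap p-value:", np.mean(np.array(t_boot) >= t_obs))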

  14. [Evaluation of occupational medicine service tasks in the context of the Occupational Medicine Service Act, article 12, on the basis of statistical indicators in the Pomorskie voivodship].

    PubMed

    Parszuto, Jacek; Jaremin, Bogdan; Tukalska-Parszuto, Maria

    2009-01-01

    Occupational health service is based on legal regulations. We have attempted to assess the implementation of the tasks resulting from article 12 of the Occupational Medicine Service Act introduced in 1998. In this paper we analyzed statistical data concerning the number of prophylactic health contracts, economic entities and health insurance payers. The data come from the Nofer Institute of Occupational Medicine, the Central Statistical Office and the Social Insurance Institution. A Contract Coverage Rate (CCR) has been introduced for the purpose of this research. The data show that in 2007 the CCR for the Pomorskie voivodeship (province) was 45.7%, with a median value of 14.4% across all voivodeships in Poland. On the basis of the gathered statistical data, it should be concluded that the implementation of article 12 is insufficient. The amendment to the Act introducing the provision on written contracts is an opportunity to provide an effective mechanism by which the present situation can be improved and the rates raised to a satisfactory level.

  15. Time, frequency, and time-varying Granger-causality measures in neuroscience.

    PubMed

    Cekic, Sezen; Grandjean, Didier; Renaud, Olivier

    2018-05-20

    This article proposes a systematic methodological review and an objective criticism of existing methods enabling the derivation of time, frequency, and time-varying Granger-causality statistics in neuroscience. The capacity to describe the causal links between signals recorded at different brain locations during a neuroscience experiment is indeed of primary interest for neuroscientists, who often have very precise prior hypotheses about the relationships between recorded brain signals. The increasing interest and the huge number of publications related to this topic call for this systematic review, which describes the very complex methodological aspects underlying the derivation of these statistics. In this article, we first present a general framework that allows us to review and compare Granger-causality statistics in the time domain, and the link with transfer entropy. Then, the spectral and the time-varying extensions are exposed and discussed together with their estimation and distributional properties. Although not the focus of this article, partial and conditional Granger causality, dynamical causal modelling, directed transfer function, directed coherence, partial directed coherence, and their variants are also mentioned. Copyright © 2018 John Wiley & Sons, Ltd.
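
    A minimal time-domain sketch of the Granger-causality statistic discussed above: an F-test comparing an autoregression of y with a model that also includes lags of x. Synthetic data; packaged implementations add lag selection and diagnostics.

      import numpy as np

      def granger_f(x, y, p=2):
          # F-statistic for "lags of x help predict y" at lag order p,
          # via least squares on restricted vs. unrestricted regressions.
          n = len(y)
          Y = y[p:]
          lags_y = np.column_stack([y[p - i:n - i] for i in range(1, p + 1)])
          lags_x = np.column_stack([x[p - i:n - i] for i in range(1, p + 1)])
          X_r = np.column_stack([np.ones(n - p), lags_y])   # restricted model
          X_u = np.column_stack([X_r, lags_x])              # unrestricted model
          rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
          rss_r, rss_u = rss(X_r), rss(X_u)
          df2 = (n - p) - X_u.shape[1]
          return ((rss_r - rss_u) / p) / (rss_u / df2)

      rng = np.random.default_rng(3)
      x = rng.normal(size=500)
      y = np.convolve(x, [0.0, 0.8, 0.4], mode="full")[:500] + rng.normal(size=500)
      print(granger_f(x, y, p=2))   # large F suggests x Granger-causes y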

  16. Prevalence, awareness, and associated risk factors of hypertension in older adults in Africa: a systematic review and meta-analysis protocol.

    PubMed

    Bosu, William K; Aheto, Justice M K; Zucchelli, Eugenio; Reilly, Siobhan

    2017-10-04

    The health of older persons has not been a major priority in many African countries. Hypertension is one of the common health problems of older persons. However, there is little information on the prevalence of hypertension in older adults in Africa, even though Africa has the highest age-standardized prevalence of hypertension in the world. We therefore present this protocol to conduct a systematic review and meta-analysis of the prevalence of hypertension and the level of its awareness among older persons living in Africa. Major databases (EMBASE, MEDLINE, Academic Search Complete, CINAHL, PsycINFO) and unpublished literature will be searched to identify population-based studies on hypertension in adults aged 50 years and older living in Africa. Eligible articles are those which use the 140/90-mmHg cutoff to diagnose hypertension and were published from 1980 to present. We will exclude subjects in restricted environments such as patients and refugees. Articles will be independently evaluated by two reviewers to determine if they meet the inclusion criteria. They will also evaluate the quality of included studies using a validated tool by Hoy and colleagues for prevalence studies. The main outcome is the prevalence of hypertension, while the explanatory variables include demographic, socio-economic, dietary, lifestyle and behavioural factors. Effect sizes in bivariate and multivariate analyses will be presented as odds or prevalence ratios. We will explore heterogeneity across the studies, and if appropriate, we will perform a meta-analysis using a random-effects model to present a summary estimate of the prevalence of hypertension in this population. The estimates of the prevalence, the risk factors and the level of awareness of hypertension could help in galvanizing efforts at prioritizing the cardiovascular health of older persons in Africa. PROSPERO CRD42017056474.

  17. Point cloud modeling using the homogeneous transformation for non-cooperative pose estimation

    NASA Astrophysics Data System (ADS)

    Lim, Tae W.

    2015-06-01

    A modeling process to simulate point cloud range data that a lidar (light detection and ranging) sensor produces is presented in this paper in order to support the development of non-cooperative pose (relative attitude and position) estimation approaches which will help improve proximity operation capabilities between two adjacent vehicles. The algorithms in the modeling process were based on the homogeneous transformation, which has been employed extensively in robotics and computer graphics, as well as in recently developed pose estimation algorithms. Using a flash lidar in a laboratory testing environment, point cloud data of a test article was simulated and compared against the measured point cloud data. The simulated and measured data sets match closely, validating the modeling process. The modeling capability enables close examination of the characteristics of point cloud images of an object as it undergoes various translational and rotational motions. Relevant characteristics that will be crucial in non-cooperative pose estimation were identified such as shift, shadowing, perspective projection, jagged edges, and differential point cloud density. These characteristics will have to be considered in developing effective non-cooperative pose estimation algorithms. The modeling capability will allow extensive non-cooperative pose estimation performance simulations prior to field testing, saving development cost and providing performance metrics of the pose estimation concepts and algorithms under evaluation. The modeling process also provides "truth" pose of the test objects with respect to the sensor frame so that the pose estimation error can be quantified.
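
    The homogeneous-transformation building block itself is compact; the sketch below applies a 4x4 transform to a toy point cloud and does not reproduce the paper's sensor model (shadowing, perspective projection, edge effects).

      import numpy as np

      def homogeneous_transform(points, R, t):
          # Apply a 4x4 homogeneous transform T = [R t; 0 1] to (N, 3) points.
          T = np.eye(4)
          T[:3, :3], T[:3, 3] = R, t
          homo = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coords
          return (homo @ T.T)[:, :3]

      # Rotate a toy cloud 30 degrees about z and translate along x
      a = np.deg2rad(30)
      R = np.array([[np.cos(a), -np.sin(a), 0],
                    [np.sin(a),  np.cos(a), 0],
                    [0,          0,         1]])
      cloud = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 2.0]])
      print(homogeneous_transform(cloud, R, t=np.array([0.5, 0.0, 0.0])))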

  18. Quantifying patient preferences for symptomatic breast clinic referral: a decision analysis study.

    PubMed

    Quinlan, Aisling; O'Brien, Kirsty K; Galvin, Rose; Hardy, Colin; McDonnell, Ronan; Joyce, Doireann; McDowell, Ronald D; Aherne, Emma; Keogh, Claire; O'Sullivan, Katriona; Fahey, Tom

    2018-05-31

    Decision analysis study that incorporates patient preferences and probability estimates to investigate the impact of women's preferences for referral or an alternative strategy of watchful waiting if faced with symptoms that could be due to breast cancer. Community-based study. Asymptomatic women aged 30-60 years. Participants were presented with 11 health scenarios that represent the possible consequences of symptomatic breast problems. Participants were asked the risk of death that they were willing to take in order to avoid each health scenario, using the standard gamble utility method. This process was repeated for all 11 health scenarios. A formal decision analysis for the preferred individual decision was then estimated for each participant. The preferred diagnostic strategy was either watchful waiting or referral to a breast clinic. Sensitivity analysis was used to examine how each varied according to changes in the probabilities of the health scenarios. A total of 35 participants completed the interviews, with a median age of 41 years (IQR 35-47 years). The majority of the study sample was employed (n=32, 91.4%), with a third-level (university) education (n=32, 91.4%) and with knowledge of someone with breast cancer (n=30, 85.7%). When individual preferences were accounted for, 25 (71.4%) participants preferred watchful waiting to referral for triple assessment as their initial diagnostic strategy. Sensitivity analysis shows that referral for triple assessment becomes the dominant strategy at the upper probability estimate (18%) of breast cancer in the community. Watchful waiting is an acceptable strategy for most women who present to their general practitioner (GP) with breast symptoms. These findings suggest that current referral guidelines should take more explicit account of women's preferences in relation to their GP's initial management strategy. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. Performance analysis of cross-layer design with average PER constraint over MIMO fading channels

    NASA Astrophysics Data System (ADS)

    Dang, Xiaoyu; Liu, Yan; Yu, Xiangbin

    2015-12-01

    In this article, a cross-layer design (CLD) scheme for a multiple-input and multiple-output system under the dual constraints of imperfect feedback and average packet error rate (PER) is presented, based on the combination of adaptive modulation and automatic repeat request protocols. The design performance is evaluated over a wireless Rayleigh fading channel. Under the constraints of target PER and average PER, the optimum switching thresholds (STs) for attaining maximum spectral efficiency (SE) are developed. An effective iterative algorithm for finding the optimal STs is proposed via Lagrange multiplier optimisation. With the different thresholds available, analytical expressions of the average SE and PER are provided for performance evaluation. To avoid the performance loss caused by the conventional single estimate, a multiple outdated estimates (MOE) method, which utilises information from multiple previous channel estimates, is presented for the CLD to improve system performance. Numerical simulations of average PER and SE are shown to be consistent with the theoretical analysis, and the developed CLD with average PER constraint meets the target PER requirement and performs better than the conventional CLD with instantaneous PER constraint. In particular, the CLD based on the MOE method can significantly increase the system SE and greatly reduce the impact of feedback delay.

  20. Modeling In Vivo Interactions of Engineered Nanoparticles in the Pulmonary Alveolar Lining Fluid

    PubMed Central

    Mukherjee, Dwaipayan; Porter, Alexandra; Ryan, Mary; Schwander, Stephan; Chung, Kian Fan; Tetley, Teresa; Zhang, Junfeng; Georgopoulos, Panos

    2015-01-01

    Increasing use of engineered nanomaterials (ENMs) in consumer products may result in widespread human inhalation exposures. Due to their high surface area per unit mass, inhaled ENMs interact with multiple components of the pulmonary system, and these interactions affect their ultimate fate in the body. Modeling of ENM transport and clearance in vivo has traditionally treated tissues as well-mixed compartments, without consideration of nanoscale interaction and transformation mechanisms. ENM agglomeration, dissolution and transport, along with adsorption of biomolecules, such as surfactant lipids and proteins, cause irreversible changes to ENM morphology and surface properties. The model presented in this article quantifies ENM transformation and transport in the alveolar air to liquid interface and estimates eventual alveolar cell dosimetry. This formulation brings together established concepts from colloidal and surface science, physics, and biochemistry to provide a stochastic framework capable of capturing essential in vivo processes in the pulmonary alveolar lining layer. The model has been implemented for in vitro solutions with parameters estimated from relevant published in vitro measurements and has been extended here to in vivo systems simulating human inhalation exposures. Applications are presented for four different ENMs, and relevant kinetic rates are estimated, demonstrating an approach for improving human in vivo pulmonary dosimetry. PMID:26240755

  1. Reconstructive dosimetry for cutaneous radiation syndrome

    PubMed Central

    Lima, C.M.A.; Lima, A.R.; Degenhardt, Ä.L.; Valverde, N.J.; Da Silva, F.C.A.

    2015-01-01

    According to the International Atomic Energy Agency (IAEA), a relatively significant number of radiological accidents have occurred in recent years, mainly in the practices referred to as potentially high-risk activities, such as radiotherapy, large irradiators and industrial radiography, especially in gammagraphy assays. In some instances, severe injuries have occurred in exposed persons due to high radiation doses. In industrial radiography, 80 cases involving a total of 120 radiation workers and 110 members of the public, including 12 deaths, had been recorded up to 2014. Radiological accidents in industrial practices in Brazil have mainly resulted in the development of cutaneous radiation syndrome (CRS) in hands and fingers. Brazilian data include 5 serious cases related to industrial gammagraphy, affecting 7 radiation workers and 19 members of the public; however, none of them were fatal. Some methods of reconstructive dosimetry have been used to estimate the radiation dose to assist in prescribing medical treatment. The type and development of cutaneous manifestations in the exposed areas of a person provide a first rough estimate of the dose. This review article presents the state-of-the-art reconstructive dosimetry methods enabling estimation of local radiation doses and provides guidelines for medical handling of the exposed individuals. The review also presents Chilean and Brazilian radiological accident cases to highlight the importance of reconstructive dosimetry. PMID:26445332

  2. Estimating the influence of parameter uncertainties in the planning and evaluation of tracer tests using ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Klotzsch, Stephan; Binder, Martin; Händel, Falk

    2017-06-01

    While planning tracer tests, uncertainties in geohydraulic parameters should be considered as an important factor; neglecting them can lead, for example, to missing the tracer breakthrough. One way to account for uncertainties during tracer test design is the so-called ensemble forecast. The applicability of this method to geohydrological problems is demonstrated by coupling it with two analytical solute transport models. The algorithm presented in this article is suitable for prediction as well as parameter estimation. The parameter estimation function can be used during a tracer test to reduce the uncertainties in the measured data, which can improve the initial prediction. The algorithm was implemented in a software tool that is freely downloadable from the website of the Institute for Groundwater Management at TU Dresden, Germany.

  3. Incorporating sign-dependence in health-related social welfare functions.

    PubMed

    Attema, Arthur E

    2015-04-01

    It is important to measure people's preferences regarding the trade-off between efficiency and equity in health in order to make public decisions that are in a society's best interests. This article demonstrates the usefulness of social welfare functions for obtaining these measurements. Insights from individual decision making, in particular prospect theory, turn out to be helpful for estimating societal preferences more accurately. The author shows how one can disentangle the effects of loss aversion in this estimation. The presented approach also allows for sign-dependent societal utility and equity-weighting functions. Recent empirical studies that used this approach with choices concerning the quality of life of other people reported the presence of substantial inequity aversion both for gains and for losses, as well as loss aversion. Several examples demonstrate the relevance of these insights for preference elicitations and health economic evaluations.

  4. Review Article "Valuating the intangible effects of natural hazards - review and analysis of the costing methods"

    NASA Astrophysics Data System (ADS)

    Markantonis, V.; Meyer, V.; Schwarze, R.

    2012-05-01

    The "intangible" or "non-market" effects are those costs of natural hazards which are not, or at least not easily measurable in monetary terms, as for example, impacts on health, cultural heritage or the environment. The intangible effects are often not included in costs assessments of natural hazards leading to an incomplete and biased cost assessment. However, several methods exist which try to estimate these effects in a non-monetary or monetary form. The objective of the present paper is to review and evaluate methods for estimating the intangible effects of natural hazards, specifically related to health and environmental effects. Existing methods are analyzed and compared using various criteria, research gaps are identified, application recommendations are provided, and valuation issues that should be addressed by the scientific community are highlighted.

  5. Savings estimate for a Medicare insured group

    PubMed Central

    Birnbaum, Howard; Holland, Stephen K.; Lenhart, Gregory; Reilly, Helena L.; Hoffman, Kevin; Pardo, Dennis P.

    1991-01-01

    Estimates of the savings potential of a managed-care program for a Medicare retiree population in Michigan under a hypothetical Medicare insured group (MIG) are presented in this article. In return for receiving an experience-rated capitation payment, a MIG would administer all Medicare and employer complementary benefits for its enrollees. A study of the financial and operational feasibility of implementing a MIG for retirees of a national corporation involving an analysis of 1986 claims data finds that selected managed-care initiatives implemented by a MIG would generate an annual savings of 3.8 percent of total (Medicare plus complementary) expenditures. Although savings are less than the 5 percent to be retained by Medicare, this finding illustrates the potential for savings from managed-care initiatives to Medicare generally and to MIGs elsewhere, where savings may be greater if constraints are less restrictive. PMID:10113700

  6. Sensor fault detection and isolation system for a condensation process.

    PubMed

    Castro, M A López; Escobar, R F; Torres, L; Aguilar, J F Gómez; Hernández, J A; Olivares-Peregrino, V H

    2016-11-01

    This article presents the design of a sensor fault detection and isolation (FDI) system for a condensation process based on a nonlinear model. The condenser is modeled by dynamic and thermodynamic equations. In this work, the dynamic equations are described by three pairs of differential equations which represent the energy balance between the fluids. The thermodynamic equations consist of algebraic heat transfer equations and empirical equations that allow for the estimation of heat transfer coefficients. The FDI system consists of a bank of two nonlinear high-gain observers that detect, estimate and isolate a fault in either of the two outlet temperature sensors. The main contributions of this work are the experimental validation of the condenser nonlinear model and of the FDI system. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
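
    The observer-based residual principle can be sketched on a toy linear plant: an observer tracks the process, and a jump in the output residual flags an additive sensor fault. The gains, threshold and fault size below are illustrative; the article's nonlinear high-gain observers for the condenser are not reproduced here.

      import numpy as np

      # Toy plant x' = A x, y = C x, with a Luenberger observer for residual-based
      # sensor fault detection (Euler-discretised for simplicity).
      A = np.array([[-0.5, 0.2], [0.0, -0.8]])
      C = np.array([[1.0, 0.0]])
      L = np.array([[2.0], [0.5]])     # observer gain (stabilises A - L C)
      dt = 0.01

      x = np.array([1.0, -1.0])        # true state
      xh = np.zeros(2)                 # observer state
      for k in range(2000):
          y = C @ x + (0.5 if k > 1000 else 0.0)   # sensor fault injected at k=1000
          r = y - C @ xh                           # output residual
          x = x + dt * (A @ x)
          xh = xh + dt * (A @ xh + L @ r)
          if k > 500 and abs(r[0]) > 0.2:          # threshold after observer settles
              print("sensor fault flagged at step", k)
              break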

  7. Electron induced inelastic and ionization cross section for plasma modeling

    NASA Astrophysics Data System (ADS)

    Verma, Pankaj; Mahato, Dibyendu; Kaur, Jaspreet; Antony, Bobby

    2016-09-01

    The present paper reports electron impact total inelastic and ionization cross sections for silicon, germanium, and tin tetrahalides at energies varying from the ionization threshold of the target to 5000 eV. These cross section data over a wide energy domain are essential for understanding the physico-chemical processes involved in various environments such as plasma modeling, semiconductor etching, atmospheric sciences, biological sciences, and radiation physics. However, cross section data on the above-mentioned molecules are scarce. In the present article, we report the computation of the total inelastic cross section using the spherical complex optical potential formalism and the estimation of the ionization cross section through a semi-empirical method. The present ionization cross section result obtained for SiCl4 shows excellent agreement with previous measurements, while the other molecules have not yet been investigated experimentally. The present results show more consistent behaviour than previous theoretical estimates. Besides cross sections, we have also studied the correlation of the maximum ionization cross section with the square root of the ratio of polarizability to ionization potential for the molecules with known polarizabilities. A linear relation is observed between these quantities. This correlation is used to obtain approximate polarizability volumes for SiBr4, SiI4, GeCl4, GeBr4, and GeI4 molecules.

  8. Comparison of estimation techniques for a forest inventory in which double sampling for stratification is used

    Treesearch

    Michael S. Williams

    2001-01-01

    A number of different estimators can be used when forest inventory plots cover two or more distinctly different condition classes. In this article the properties of two approximate Horvitz- Thompson (HT) estimators, a ratio of means (RM), and a mean of ratios (MR) estimator are explored in the framework of double sampling for stratification. Relevant theoretical...

  9. Model-assisted estimation of forest resources with generalized additive models

    Treesearch

    Jean D. Opsomer; F. Jay Breidt; Gretchen G. Moisen; Goran Kauermann

    2007-01-01

    Multiphase surveys are often conducted in forest inventories, with the goal of estimating forested area and tree characteristics over large regions. This article describes how design-based estimation of such quantities, based on information gathered during ground visits of sampled plots, can be made more precise by incorporating auxiliary information available from...

  10. Five Methods for Estimating Angoff Cut Scores with IRT

    ERIC Educational Resources Information Center

    Wyse, Adam E.

    2017-01-01

    This article illustrates five different methods for estimating Angoff cut scores using item response theory (IRT) models. These include maximum likelihood (ML), expected a priori (EAP), modal a priori (MAP), and weighted maximum likelihood (WML) estimators, as well as the most commonly used approach based on translating ratings through the test…

  11. A Nonparametric Approach to Estimate Classification Accuracy and Consistency

    ERIC Educational Resources Information Center

    Lathrop, Quinn N.; Cheng, Ying

    2014-01-01

    When cut scores for classifications occur on the total score scale, popular methods for estimating classification accuracy (CA) and classification consistency (CC) require assumptions about a parametric form of the test scores or about a parametric response model, such as item response theory (IRT). This article develops an approach to estimate CA…

  12. Accuracy Rates of Sex Estimation by Forensic Anthropologists through Comparison with DNA Typing Results in Forensic Casework.

    PubMed

    Thomas, Richard M; Parks, Connie L; Richard, Adam H

    2016-09-01

    A common task in forensic anthropology involves the estimation of the biological sex of a decedent by exploiting the sexual dimorphism between males and females. Estimation methods are often based on analysis of skeletal collections of known sex and most include a research-based accuracy rate. However, the accuracy rates of sex estimation methods in actual forensic casework have rarely been studied. This article uses sex determinations based on DNA results from 360 forensic cases to develop accuracy rates for sex estimations conducted by forensic anthropologists. The overall rate of correct sex estimation from these cases is 94.7% with increasing accuracy rates as more skeletal material is available for analysis and as the education level and certification of the examiner increases. Nine of 19 incorrect assessments resulted from cases in which one skeletal element was available, suggesting that the use of an "undetermined" result may be more appropriate for these cases. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  13. Measuring the impact of major life events upon happiness.

    PubMed

    Ballas, Dimitris; Dorling, Danny

    2007-12-01

    In recent years there have been numerous attempts to define and measure happiness in various contexts and pertaining to a wide range of disciplines, ranging from neuroscience and psychology to philosophy, economics and social policy. This article builds on recent work by economists who attempt to estimate happiness regressions using large random samples of individuals in order to calculate monetary 'compensating amounts' for different life 'events'. We estimate happiness regressions using the 'major life event' and 'happiness' data from the British Household Panel Survey. The data and methods used in this article suggest that, in contrast to living states such as 'being married', it is events such as 'starting a new relationship' that have the highest positive effect on happiness. This is closely followed by 'employment-related gains' (in contrast to employment status). Also, women who become pregnant on average report higher than average levels of subjective happiness (in contrast to 'being a parent'). Other events that appear to be associated with happiness according to our analysis include 'personal education-related events' (e.g. starting a new course, graduating from University, passing exams) and 'finance/house related events' (e.g. buying a new house). On the other hand, the event that has the highest negative impact upon happiness according to our analysis is 'the end of my relationship', closely followed by 'death of a parent'. Adverse health events pertaining to the parents of the respondents also have a high negative coefficient, and so does an employment-related loss. The analysis presented in this article suggests that what matters the most in people's lives in Britain is to have good dynamic interpersonal relationships and to be respected at work, with that respect being constantly renewed. These 'goods' are as much reflected through dynamic events as static situations. Relationships at work appear to be of a similar order of importance to those at home. Other factors that contribute to higher than average levels of subjective happiness, at least at a superficial level, include delaying death and keeping illness at bay, having babies, buying homes and cars and passing exams. The analysis presented here also suggests that people should not expect too much from their holidays and wider families. The findings presented in this article may help us to understand a little better the propensity for groups to be more or less happy and may help us to begin to better understand the importance of the dynamics of social context: the context in which we come to terms with reward and loss.
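
    The 'compensating amounts' logic can be sketched with a toy regression: the monetary equivalent of an event is the income change whose estimated effect on happiness matches the event's coefficient. The data below are synthetic, not the British Household Panel Survey.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      n = 5000
      # Synthetic data: happiness ~ log income + event dummy (illustrative only)
      log_income = rng.normal(10, 0.5, n)
      new_relationship = rng.binomial(1, 0.1, n)
      happiness = 0.3 * log_income + 0.8 * new_relationship + rng.normal(0, 1, n)

      X = sm.add_constant(np.column_stack([log_income, new_relationship]))
      fit = sm.OLS(happiness, X).fit()
      b_inc, b_event = fit.params[1], fit.params[2]
      # Compensating amount: extra (log) income matching the event's effect
      print("equivalent income multiplier:", np.exp(b_event / b_inc))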

  14. Missing data handling in non-inferiority and equivalence trials: A systematic review.

    PubMed

    Rabe, Brooke A; Day, Simon; Fiero, Mallorie H; Bell, Melanie L

    2018-05-25

    Non-inferiority (NI) and equivalence clinical trials test whether a new treatment is therapeutically no worse than, or equivalent to, an existing standard of care. Missing data in clinical trials have been shown to reduce statistical power and potentially bias estimates of effect size; however, in NI and equivalence trials, they present additional issues. For instance, they may decrease sensitivity to differences between treatment groups and bias toward the alternative hypothesis of NI (or equivalence). Our primary aim was to review the extent of and methods for handling missing data (model-based methods, single imputation, multiple imputation, complete case), the analysis sets used (Intention-To-Treat, Per-Protocol, or both), and whether sensitivity analyses were used to explore departures from assumptions about the missing data. We conducted a systematic review of NI and equivalence trials published between May 2015 and April 2016 by searching the PubMed database. Articles were reviewed primarily by 2 reviewers, with 6 articles reviewed by both reviewers to establish consensus. Of 109 selected articles, 93% reported some missing data in the primary outcome. Among those, 50% reported complete case analysis, and 28% reported single imputation approaches for handling missing data. Only 32% reported conducting analyses of both intention-to-treat and per-protocol populations. Only 11% conducted any sensitivity analyses to test assumptions with respect to missing data. Missing data are common in NI and equivalence trials, and they are often handled by methods which may bias estimates and lead to incorrect conclusions. Copyright © 2018 John Wiley & Sons, Ltd.

  15. Clinical course of untreated cerebral cavernous malformations: a meta-analysis of individual patient data.

    PubMed

    Horne, Margaret A; Flemming, Kelly D; Su, I-Chang; Stapf, Christian; Jeon, Jin Pyeong; Li, Da; Maxwell, Susanne S; White, Philip; Christianson, Teresa J; Agid, Ronit; Cho, Won-Sang; Oh, Chang Wan; Wu, Zhen; Zhang, Jun-Ting; Kim, Jeong Eun; Ter Brugge, Karel; Willinsky, Robert; Brown, Robert D; Murray, Gordon D; Al-Shahi Salman, Rustam

    2016-02-01

    Cerebral cavernous malformations (CCMs) can cause symptomatic intracranial haemorrhage (ICH), but the estimated risks are imprecise and predictors remain uncertain. We aimed to obtain precise estimates and predictors of the risk of ICH during untreated follow-up in an individual patient data meta-analysis. We invited investigators of published cohorts of people aged at least 16 years, identified by a systematic review of Ovid MEDLINE and Embase from inception to April 30, 2015, to provide individual patient data on clinical course from CCM diagnosis until first CCM treatment or last available follow-up. We used survival analysis to estimate the 5-year risk of symptomatic ICH due to CCMs (primary outcome), multivariable Cox regression to identify baseline predictors of outcome, and random-effects models to pool estimates in a meta-analysis. Among 1620 people in seven cohorts from six studies, 204 experienced ICH during 5197 person-years of follow-up (Kaplan-Meier estimated 5-year risk 15·8%, 95% CI 13·7-17·9). The primary outcome of ICH within 5 years of CCM diagnosis was associated with clinical presentation with ICH or new focal neurological deficit (FND) without brain imaging evidence of recent haemorrhage versus other modes of presentation (hazard ratio 5·6, 95% CI 3·2-9·7) and with brainstem CCM location versus other locations (4·4, 2·3-8·6), but age, sex, and CCM multiplicity did not add independent prognostic information. The 5-year estimated risk of ICH during untreated follow-up was 3·8% (95% CI 2·1-5·5) for 718 people with non-brainstem CCM presenting without ICH or FND, 8·0% (0·1-15·9) for 80 people with brainstem CCM presenting without ICH or FND, 18·4% (13·3-23·5) for 327 people with non-brainstem CCM presenting with ICH or FND, and 30·8% (26·3-35·2) for 495 people with brainstem CCM presenting with ICH or FND. Mode of clinical presentation and CCM location are independently associated with ICH within 5 years of CCM diagnosis. These findings can inform decisions about CCM treatment. UK Medical Research Council, Chief Scientist Office of the Scottish Government, and UK Stroke Association. Copyright © 2016 Horne et al. Open Access article distributed under the terms of CC BY. Published by Elsevier Ltd. All rights reserved.

  16. Estimating the Number of Paediatric Fevers Associated with Malaria Infection Presenting to Africa's Public Health Sector in 2007

    PubMed Central

    Gething, Peter W.; Kirui, Viola C.; Alegana, Victor A.; Okiro, Emelda A.; Noor, Abdisalan M.; Snow, Robert W.

    2010-01-01

    Background As international efforts to increase the coverage of artemisinin-based combination therapy in public health sectors gather pace, concerns have been raised regarding their continued indiscriminate presumptive use for treating all childhood fevers. The availability of rapid-diagnostic tests to support practical and reliable parasitological diagnosis provides an opportunity to improve the rational treatment of febrile children across Africa. However, the cost effectiveness of diagnosis-based treatment polices will depend on the presumed numbers of fevers harbouring infection. Here we compute the number of fevers likely to present to public health facilities in Africa and the estimated number of these fevers likely to be infected with Plasmodium falciparum malaria parasites. Methods and Findings We assembled first administrative-unit level data on paediatric fever prevalence, treatment-seeking rates, and child populations. These data were combined in a geographical information system model that also incorporated an adjustment procedure for urban versus rural areas to produce spatially distributed estimates of fever burden amongst African children and the subset likely to present to public sector clinics. A second data assembly was used to estimate plausible ranges for the proportion of paediatric fevers seen at clinics positive for P. falciparum in different endemicity settings. We estimated that, of the 656 million fevers in African 0–4 y olds in 2007, 182 million (28%) were likely to have sought treatment in a public sector clinic of which 78 million (43%) were likely to have been infected with P. falciparum (range 60–103 million). Conclusions Spatial estimates of childhood fevers and care-seeking rates can be combined with a relational risk model of infection prevalence in the community to estimate the degree of parasitemia in those fevers reaching public health facilities. This quantification provides an important baseline comparison of malarial and nonmalarial fevers in different endemicity settings that can contribute to ongoing scientific and policy debates about optimum clinical and financial strategies for the introduction of new diagnostics. These models are made publicly available with the publication of this paper. Please see later in the article for the Editors' Summary PMID:20625548

  17. Questionnaire-based Prevalence of Food Insecurity in Iran: A Review Article.

    PubMed

    Daneshi-Maskooni, Milad; Shab-Bidar, Sakineh; Badri-Fariman, Mahtab; Aubi, Erfan; Mohammadi, Younes; Jafarnejad, Sadegh; Djafarian, Kurosh

    2017-11-01

    Data on the questionnaire-based prevalence of food insecurity are needed to develop food and nutrition security studies and policies. The present study aimed to assess the questionnaire-based prevalence of food insecurity in Iran. A systematic search of cross-sectional studies was conducted in databases including PubMed, Google Scholar, Scopus, Magiran, Iranmedex, SID and Medlib up to 29 Oct 2015. Estimation of food insecurity prevalence was based on instruments including the 9-item HFIAS, the 18- and 6-item USDA (US-HFSSM) and the Radimer/Cernel food security questionnaires. The pooled effect was estimated using a random-effects model, and heterogeneity was assessed by Cochran's Q and I² tests. Thirteen articles were included in the study after screening and assessment of eligibility. The questionnaire-based prevalence of food insecurity was 49.2% (CI95%: 43.8-54.6). According to the subgroup analysis, food insecurity without and with hunger was 29.6% (CI95%: 25.7-33.6) and 19.2% (CI95%: 16-22.3), respectively. About half of the population was food insecure, and food insecurity without hunger was more prevalent than food insecurity with hunger. An ongoing food insecurity assessment system is needed to support evidence-informed policy and to plan interventions to increase food security in different areas.
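
    A sketch of random-effects pooling with Cochran's Q and I², via the DerSimonian-Laird estimator on hypothetical prevalences; published analyses usually transform proportions (e.g., logit or arcsine) before pooling.

      import numpy as np

      def dersimonian_laird(p, n):
          # Pool study prevalences p with sample sizes n; returns the pooled
          # random-effects estimate, Cochran's Q and I^2 (plain DL sketch).
          y = np.asarray(p, float)
          v = y * (1 - y) / np.asarray(n, float)   # within-study variance
          w = 1 / v
          y_fix = np.sum(w * y) / np.sum(w)        # fixed-effect estimate
          Q = np.sum(w * (y - y_fix) ** 2)
          k = len(y)
          tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
          w_re = 1 / (v + tau2)                    # random-effects weights
          pooled = np.sum(w_re * y) / np.sum(w_re)
          I2 = max(0.0, (Q - (k - 1)) / Q) * 100 if Q > 0 else 0.0
          return pooled, Q, I2

      # Hypothetical study prevalences and sample sizes
      print(dersimonian_laird([0.45, 0.52, 0.60, 0.38], [400, 350, 500, 300]))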

  18. The hormesis database: the occurrence of hormetic dose responses in the toxicological literature.

    PubMed

    Calabrese, Edward J; Blain, Robyn B

    2011-10-01

    In 2005 we published an assessment of dose responses that satisfied a priori evaluative criteria for inclusion within the relational retrieval hormesis database (Calabrese and Blain, 2005). The database included information on study characteristics (e.g., biological model, gender, age and other relevant aspects, number of doses, dose distribution/range, quantitative features of the dose response, temporal features/repeat measures, and physical/chemical properties of the agents). The 2005 article covered information for about 5000 dose responses; the present article has been expanded to cover approximately 9000 dose responses. This assessment extends and strengthens the conclusion of the 2005 paper that the hormesis concept is broadly generalizable, being independent of biological model, endpoint measured and chemical class/physical agent. It also confirmed the definable quantitative features of hormetic dose responses in which the strong majority of dose responses display maximum stimulation less than twice that of the control group and a stimulatory width that is within approximately 10-20-fold of the estimated toxicological or pharmacological threshold. The remarkable consistency of the quantitative features of the hormetic dose response suggests that hormesis may provide an estimate of biological plasticity that is broadly generalized across plant, microbial and animal (invertebrate and vertebrate) models. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Poor Man's Virtual Camera: Real-Time Simultaneous Matting and Camera Pose Estimation.

    PubMed

    Szentandrasi, Istvan; Dubska, Marketa; Zacharias, Michal; Herout, Adam

    2016-03-18

    Today's film and advertisement production heavily uses computer graphics combined with live actors by chromakeying. The matchmoving process typically takes considerable manual effort. Semi-automatic matchmoving tools exist as well, but they still work offline and require manual checking and correction. In this article, we propose an instant matchmoving solution for green screen. It uses a recent technique of planar uniform marker fields. Our technique can be used in indie and professional filmmaking as a cheap and ultramobile virtual camera, and for shot prototyping and storyboard creation. The matchmoving technique based on marker fields of shades of green is very computationally efficient: we developed, and present in the article, a mobile application running at 33 FPS. Our technique is thus available to anyone with a smartphone at low cost and with easy setup, opening space for new levels of filmmakers' creative expression.

  20. International Network of Passive Correlation Ranging for Orbit Determination of a Geostationary Satellite

    NASA Astrophysics Data System (ADS)

    Kaliuzhnyi, Mykola; Bushuev, Felix; Shulga, Oleksandr; Sybiryakova, Yevgeniya; Shakun, Leonid; Bezrukovs, Vladislavs; Moskalenko, Sergiy; Kulishenko, Vladislav; Malynovskyi, Yevgen

    2016-12-01

    An international network of passive correlation ranging of a geostationary telecommunication satellite is considered in the article. The network is developed by the RI "MAO". The network consists of five spatially separated stations of synchronized reception of DVB-S signals of digital satellite TV. The stations are located in Ukraine and Latvia. The time difference of arrival (TDOA) on the network stations of the DVB-S signals, radiated by the satellite, is a measured parameter. The results of TDOA estimation obtained by the network in May-August 2016 are presented in the article. Orbital parameters of the tracked satellite are determined using measured values of the TDOA and two models of satellite motion: the analytical model SGP4/SDP4 and the model of numerical integration of the equations of satellite motion. Both models are realized using the free low-level space dynamics library OREKIT (ORbit Extrapolation KIT).
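
    The TDOA measurement principle can be illustrated with a toy cross-correlation: the sketch below recovers the delay between two noisy recordings of a common signal. The sample rate, delay, and noise level are hypothetical; the real network correlates synchronized DVB-S streams from spatially separated stations.

```python
import numpy as np
from scipy.signal import correlate

fs = 1_000_000                      # sample rate, Hz (hypothetical)
rng = np.random.default_rng(1)

true_delay = 137                    # delay in samples; ~137 us at this rate
s = rng.standard_normal(200_000)    # common satellite signal
a = s + 0.5 * rng.standard_normal(s.size)                        # station A
b = np.roll(s, true_delay) + 0.5 * rng.standard_normal(s.size)   # station B

# The lag of the cross-correlation peak is the TDOA estimate.
corr = correlate(b, a, mode="full", method="fft")
lag = np.argmax(corr) - (a.size - 1)
print(f"estimated TDOA = {lag / fs * 1e6:.1f} us "
      f"(true {true_delay / fs * 1e6:.1f} us)")
```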

  1. Land Use and Land Cover Change in Forest Frontiers: The Role of Household Life Cycles

    NASA Technical Reports Server (NTRS)

    Walker, Robert

    2002-01-01

    Tropical deforestation remains a critical issue given its present rate and a widespread consensus regarding its implications for the global carbon cycle and biodiversity. Nowhere is the problem more pronounced than in the Amazon basin, home to the world's largest intact tropical forest. This article addresses land cover change processes at the household level in the Amazon basin, and to this end adapts a concept of domestic life cycle to the current institutional environment of tropical frontiers. In particular, it proposes a risk minimization model that integrates demography with market-based factors such as transportation costs and accessibility. In essence, the article merges the theory of Chayanov with the household economy framework, in which markets exist for inputs (including labor), outputs, and capital. The risk model is specified and estimated using survey data for 261 small producers along the Transamazon Highway in the eastern sector of the Brazilian Amazon.

  2. The total occlusal convergence of the abutment of a partial fixed dental prosthesis: A definition and a clinical technique for its assessment

    PubMed Central

    Mamoun, John S.

    2013-01-01

    The abutment(s) of a partial fixed dental prosthesis (PFDP) should have a minimal total occlusal convergence (TOC), also called a taper, in order to ensure adequate retention of a PFDP that will be made for the abutment(s), given the height of the abutment(s). This article reviews the concept of PFDP abutment TOC and presents an alternative definition of TOC as the extent to which the shape of an abutment differs from an ideal cylindrical abutment shape. The article also reviews experimental results concerning the ideal TOC in degrees and explores clinical techniques for estimating the TOC of a crown abutment. The author suggests that dentists use high-magnification loupes (×6-8 or greater) or a surgical operating microscope when preparing crown abutments, to facilitate creating a minimal abutment TOC. PMID:24932130

  3. Analysis of mortality data from the former USSR: age-period-cohort analysis.

    PubMed

    Willekens, F; Scherbov, S

    1992-01-01

    The objective of this article is to review research on age-period-cohort (APC) analysis of mortality and to trace the effects of contemporary and historical factors on mortality change in the former USSR. Several events in USSR history have exerted a lasting influence on its people. These influences may be captured by an APC model in which the period effects measure the impact of contemporary factors and the cohort effects the past history of individuals which cannot be attributed to age or stage in the life cycle. APC models are extensively applied in the study of mortality. This article presents the statistical theory of the APC models and shows that they belong to the family of generalized linear models. The parameters of the APC model may therefore be estimated by any package of loglinear analysis that allows for hybrid loglinear models.
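
    To make the loglinear connection concrete, here is a minimal sketch of fitting an APC model as a Poisson GLM with a log-exposure offset on simulated Lexis-grid data. Because cohort = period - age, the design is rank-deficient; statsmodels' pinv-based fit (which may warn about the singular design) returns only one member of an equivalence class of parameterizations, so only invariant contrasts should be interpreted. All names and values are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Synthetic deaths/exposure counts on a 5-year Lexis grid (hypothetical data).
ages = np.arange(40, 85, 5)
periods = np.arange(1960, 1990, 5)
grid = pd.DataFrame([(a, p) for a in ages for p in periods],
                    columns=["age", "period"])
grid["cohort"] = grid["period"] - grid["age"]
grid["exposure"] = rng.uniform(5e4, 2e5, len(grid))
rate = 1e-4 * np.exp(0.08 * (grid["age"] - 40) - 0.01 * (grid["period"] - 1960))
grid["deaths"] = rng.poisson(rate * grid["exposure"])

# Hybrid loglinear (Poisson) APC model; exposure enters as a log offset.
apc = smf.glm("deaths ~ C(age) + C(period) + C(cohort)", data=grid,
              family=sm.families.Poisson(),
              exposure=grid["exposure"]).fit()
print(apc.params.head())
```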

  4. Predictive Models and Tools for Screening Chemicals under TSCA: Consumer Exposure Models 1.5

    EPA Pesticide Factsheets

    CEM contains a combination of models and default parameters which are used to estimate inhalation, dermal, and oral exposures to consumer products and articles for a wide variety of product and article use categories.

  5. Application of geostatistics to risk assessment.

    PubMed

    Thayer, William C; Griffith, Daniel A; Goodrum, Philip E; Diamond, Gary L; Hassett, James M

    2003-10-01

    Geostatistics offers two fundamental contributions to environmental contaminant exposure assessment: (1) a group of methods to quantitatively describe the spatial distribution of a pollutant and (2) the ability to improve estimates of the exposure point concentration by exploiting the geospatial information present in the data. The second contribution is particularly valuable when exposure estimates must be derived from small data sets, which is often the case in environmental risk assessment. This article addresses two topics related to the use of geostatistics in human and ecological risk assessments performed at hazardous waste sites: (1) the importance of assessing model assumptions when using geostatistics and (2) the use of geostatistics to improve estimates of the exposure point concentration (EPC) in the limited data scenario. The latter topic is approached here by comparing design-based estimators that are familiar to environmental risk assessors (e.g., Land's method) with geostatistics, a model-based estimator. In this report, we summarize the basics of spatial weighting of sample data, kriging, and geostatistical simulation. We then explore the two topics identified above in a case study, using soil lead concentration data from a Superfund site (a skeet and trap range). We also describe several areas where research is needed to advance the use of geostatistics in environmental risk assessment.
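
    As a sketch of the model-based estimation discussed above, the following minimal ordinary-kriging implementation (exponential covariance, variogram parameters assumed known rather than fitted) predicts concentrations and kriging variances at unsampled locations; the soil-lead values are simulated, not the Superfund site's data.

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, corr_len=50.0, nugget=1e-6):
    """Minimal ordinary kriging with an exponential covariance model.

    xy : (n, 2) sample coordinates, z : (n,) values, xy0 : (m, 2) targets.
    """
    def cov(h):
        return sill * np.exp(-h / corr_len)

    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = cov(d) + nugget * np.eye(n)   # sample-to-sample covariances
    K[n, :n] = K[:n, n] = 1.0                 # Lagrange row/col (unbiasedness)
    K[n, n] = 0.0

    d0 = np.linalg.norm(xy[:, None, :] - xy0[None, :, :], axis=-1)  # (n, m)
    rhs = np.vstack([cov(d0), np.ones((1, len(xy0)))])
    w = np.linalg.solve(K, rhs)               # kriging weights + multiplier
    zhat = w[:n].T @ z                        # BLUE prediction at xy0
    var = sill - np.sum(w[:n] * cov(d0), axis=0) - w[n]  # kriging variance
    return zhat, var

# Hypothetical soil-lead samples on a 100 m x 100 m site; predict on a grid.
rng = np.random.default_rng(3)
xy = rng.uniform(0, 100, (30, 2))
z = 200 + 50 * np.sin(xy[:, 0] / 20) + rng.normal(0, 10, 30)
gx, gy = np.meshgrid(np.linspace(0, 100, 5), np.linspace(0, 100, 5))
xy0 = np.column_stack([gx.ravel(), gy.ravel()])
zhat, kvar = ordinary_kriging(xy, z, xy0)
print(zhat.reshape(5, 5).round(1))
```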

  6. MoisturEC: A New R Program for Moisture Content Estimation from Electrical Conductivity Data.

    PubMed

    Terry, Neil; Day-Lewis, Frederick D; Werkema, Dale; Lane, John W

    2018-03-06

    Noninvasive geophysical estimation of soil moisture has potential to improve understanding of flow in the unsaturated zone for problems involving agricultural management, aquifer recharge, and optimization of landfill design and operations. In principle, several geophysical techniques (e.g., electrical resistivity, electromagnetic induction, and nuclear magnetic resonance) offer insight into soil moisture, but data-analysis tools are needed to "translate" geophysical results into estimates of soil moisture, consistent with (1) the uncertainty of this translation and (2) direct measurements of moisture. Although geostatistical frameworks exist for this purpose, straightforward and user-friendly tools are required to fully capitalize on the potential of geophysical information for soil-moisture estimation. Here, we present MoisturEC, a simple R program with a graphical user interface to convert measurements or images of electrical conductivity (EC) to soil moisture. Input includes EC values, point moisture estimates, and definition of either Archie parameters (based on experimental or literature values) or empirical data of moisture vs. EC. The program produces two- and three-dimensional images of moisture based on available EC and direct measurements of moisture, interpolating between measurement locations using a Tikhonov regularization approach.
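
    The EC-to-moisture translation via Archie's law can be sketched in a few lines. Note this is a generic Python rendering under assumed Archie parameters (a, m, n) and porosity, not the MoisturEC R code itself, which additionally blends in direct moisture measurements and regularized interpolation.

```python
import numpy as np

def moisture_from_ec(sigma_bulk, sigma_w, phi=0.40, a=1.0, m=1.5, n=2.0):
    """Volumetric moisture from bulk EC via Archie's law.

    sigma_bulk : bulk soil EC (S/m), sigma_w : pore-water EC (S/m),
    phi : porosity; (a, m, n) are Archie parameters from experiments or
    the literature.  Archie: sigma_bulk = sigma_w * phi**m * Sw**n / a,
    so Sw = (a * sigma_bulk / (sigma_w * phi**m))**(1/n), theta = phi * Sw.
    """
    sw = (a * np.asarray(sigma_bulk) / (sigma_w * phi ** m)) ** (1.0 / n)
    return phi * np.clip(sw, 0.0, 1.0)   # clip to the physical range

print(moisture_from_ec(sigma_bulk=[0.005, 0.02, 0.05], sigma_w=0.1))
```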

  7. Energy use, entropy and extra-terrestrial civilizations

    NASA Astrophysics Data System (ADS)

    Hetesi, Zsolt

    2010-03-01

    The possible number of extra-terrestrial civilizations is estimated by the Drake equation. Many articles have pointed out that there are missing factors and over-estimations in the original equation. In this article we point out that, assuming some axioms, there may be several limits on a technical civilization. The key role of energy use and the problem of centres and periphery strongly influence the value of the lifetime L of a civilization. Our approach has several implications for investigations of the growth of an alien civilization.
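
    For readers unfamiliar with it, the Drake equation is a simple product of factors; the toy calculation below uses illustrative guesses for every parameter (they are not the article's figures), with the article's argument bearing on the lifetime term L.

```python
# Classic Drake equation: N = R* fp ne fl fi fc L.  All values below are
# arbitrary illustrations; energy-use constraints of the kind the article
# discusses would act mainly to shrink the civilization lifetime L.
def drake(r_star, f_p, n_e, f_l, f_i, f_c, L):
    return r_star * f_p * n_e * f_l * f_i * f_c * L

print(drake(r_star=7, f_p=0.5, n_e=2, f_l=0.33, f_i=0.01, f_c=0.01, L=10_000))
```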

  8. On the estimability of parameters in undifferenced, uncombined GNSS network and PPP-RTK user models by means of S-system theory

    NASA Astrophysics Data System (ADS)

    Odijk, Dennis; Zhang, Baocheng; Khodabandeh, Amir; Odolinski, Robert; Teunissen, Peter J. G.

    2016-01-01

    The concept of integer ambiguity resolution-enabled Precise Point Positioning (PPP-RTK) relies on appropriate network information for the parameters that are common to the single-receiver user that applies this information and the network that provides it. Most current methods for PPP-RTK are based on forming the ionosphere-free combination of dual-frequency Global Navigation Satellite System (GNSS) observations. These methods are therefore restrictive in light of the development of new multi-frequency GNSS constellations, as well as from the point of view that the PPP-RTK user requires ionospheric corrections to obtain integer ambiguity resolution based on short observation time spans. The method for PPP-RTK presented in this article does not have the above limitations, as it is based on the undifferenced, uncombined GNSS observation equations, thereby keeping all parameters in the model. Working with the undifferenced observation equations implies that the models are rank-deficient; not all parameters are unbiasedly estimable, but only combinations of them. By application of S-system theory, the model is made full rank by constraining a minimum set of parameters, or S-basis. The choice of this S-basis determines the estimability and the interpretation of the parameters that are transmitted to the PPP-RTK users. As this choice is not unique, one has to be very careful when comparing network solutions in different S-systems; in that case the S-transformation, which is provided by the S-system method, should be used to make the comparison. Knowing the estimability and interpretation of the parameters estimated by the network is shown to be crucial for a correct interpretation of the estimable PPP-RTK user parameters, among them the essential ambiguity parameters, whose integer property follows directly from the interpretation of the satellite phase biases provided by the network. The flexibility of the S-system method is furthermore demonstrated by the fact that all models in this article are derived in multi-epoch mode, allowing dynamic model constraints to be incorporated on all parameters or subsets of them.

  9. The direct and indirect costs of both overweight and obesity: a systematic review

    PubMed Central

    2014-01-01

    Background: The rising prevalence of overweight and obesity places a financial burden on health services and on the wider economy. Health service and societal costs of overweight and obesity are typically estimated by top-down approaches, which derive population attributable fractions for a range of conditions associated with increased body fat, or bottom-up methods based on analyses of cross-sectional or longitudinal datasets. The evidence base of cost of obesity studies is continually expanding; however, the scope of these studies varies widely, and a lack of standardised methods limits comparisons nationally and internationally. The objective of this review is to contribute to this knowledge pool by examining direct costs and indirect (lost productivity) costs of both overweight and obesity to provide comparable estimates. This review was undertaken as part of the introductory work for the Irish cost of overweight and obesity study and examines inconsistencies in the methodologies of cost of overweight and obesity studies. Studies which evaluated the direct costs and indirect costs of both overweight and obesity were included. Methods: A computerised search of English language studies addressing direct and indirect costs of overweight and obesity in adults between 2001 and 2011 was conducted. Reference lists of reports, articles and earlier reviews were scanned to identify additional studies. Results: Five published articles were deemed eligible for inclusion. Despite the limited scope of this review there was considerable heterogeneity in methodological approaches and findings. In the four studies which presented separate estimates for direct and indirect costs of overweight and obesity, the indirect costs were higher, accounting for between 54% and 59% of the estimated total costs. Conclusion: A gradient exists between increasing BMI and direct healthcare costs and indirect costs due to reduced productivity and early premature mortality. Determining precise estimates of these increases is hampered by the substantial heterogeneity of the available cost estimation literature. To improve the availability of quality evidence an international consensus on standardised methods for cost of obesity studies is warranted. Analyses of nationally representative cross-sectional datasets augmented by data from primary care are likely to provide the best data for international comparisons. PMID:24739239

  10. The direct and indirect costs of both overweight and obesity: a systematic review.

    PubMed

    Dee, Anne; Kearns, Karen; O'Neill, Ciaran; Sharp, Linda; Staines, Anthony; O'Dwyer, Victoria; Fitzgerald, Sarah; Perry, Ivan J

    2014-04-16

    The rising prevalence of overweight and obesity places a financial burden on health services and on the wider economy. Health service and societal costs of overweight and obesity are typically estimated by top-down approaches, which derive population attributable fractions for a range of conditions associated with increased body fat, or bottom-up methods based on analyses of cross-sectional or longitudinal datasets. The evidence base of cost of obesity studies is continually expanding; however, the scope of these studies varies widely, and a lack of standardised methods limits comparisons nationally and internationally. The objective of this review is to contribute to this knowledge pool by examining direct costs and indirect (lost productivity) costs of both overweight and obesity to provide comparable estimates. This review was undertaken as part of the introductory work for the Irish cost of overweight and obesity study and examines inconsistencies in the methodologies of cost of overweight and obesity studies. Studies which evaluated the direct costs and indirect costs of both overweight and obesity were included. A computerised search of English language studies addressing direct and indirect costs of overweight and obesity in adults between 2001 and 2011 was conducted. Reference lists of reports, articles and earlier reviews were scanned to identify additional studies. Five published articles were deemed eligible for inclusion. Despite the limited scope of this review there was considerable heterogeneity in methodological approaches and findings. In the four studies which presented separate estimates for direct and indirect costs of overweight and obesity, the indirect costs were higher, accounting for between 54% and 59% of the estimated total costs. A gradient exists between increasing BMI and direct healthcare costs and indirect costs due to reduced productivity and early premature mortality. Determining precise estimates of these increases is hampered by the substantial heterogeneity of the available cost estimation literature. To improve the availability of quality evidence an international consensus on standardised methods for cost of obesity studies is warranted. Analyses of nationally representative cross-sectional datasets augmented by data from primary care are likely to provide the best data for international comparisons.

  11. A statistical framework for neuroimaging data analysis based on mutual information estimated via a gaussian copula

    PubMed Central

    Giordano, Bruno L.; Kayser, Christoph; Rousselet, Guillaume A.; Gross, Joachim; Schyns, Philippe G.

    2016-01-01

    We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open-source Matlab and Python code implementing the new methods accompanies this article. PMID:27860095
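
    A minimal sketch of the core estimator, assuming it amounts to rank-based copula normalization followed by closed-form Gaussian entropies; this is an independent toy rendering, not the authors' released Matlab/Python code.

```python
import numpy as np
from scipy.stats import norm, rankdata

def copnorm(x):
    """Map each row of x to standard-normal copula scores (rank -> uniform -> z)."""
    x = np.atleast_2d(x)
    u = np.apply_along_axis(rankdata, 1, x) / (x.shape[1] + 1)
    return norm.ppf(u)

def gcmi(x, y):
    """Gaussian-copula estimate of mutual information I(X;Y) in bits.

    x : (dx, n), y : (dy, n).  After copula normalization, MI follows in
    closed form from determinants of the joint and marginal covariances.
    """
    cx, cy = copnorm(x), copnorm(y)
    c = np.cov(np.vstack([cx, cy]))
    dx = cx.shape[0]
    ldet = lambda a: np.linalg.slogdet(np.atleast_2d(a))[1]
    # I = 0.5 * (ln|Cx| + ln|Cy| - ln|Cxy|), converted from nats to bits
    return 0.5 * (ldet(c[:dx, :dx]) + ldet(c[dx:, dx:]) - ldet(c)) / np.log(2)

rng = np.random.default_rng(4)
x = rng.standard_normal((1, 1000))
y = 0.6 * x + 0.8 * rng.standard_normal((1, 1000))   # correlated response
print(f"I(X;Y) ~ {gcmi(x, y):.3f} bits")              # theory: -0.5*log2(1-0.36)
```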

  12. Accounting for imperfect detection in ecology: a quantitative review.

    PubMed

    Kellner, Kenneth F; Swihart, Robert K

    2014-01-01

    Detection in studies of species abundance and distribution is often imperfect. Assuming perfect detection introduces bias into estimation that can weaken inference upon which understanding and policy are based. Despite availability of numerous methods designed to address this assumption, many refereed papers in ecology fail to account for non-detection error. We conducted a quantitative literature review of 537 ecological articles to measure the degree to which studies of different taxa, at various scales, and over time have accounted for imperfect detection. Overall, just 23% of articles accounted for imperfect detection. The probability that an article incorporated imperfect detection increased with time and varied among taxa studied; studies of vertebrates were more likely to incorporate imperfect detection. Among articles that reported detection probability, 70% contained per-survey estimates of detection that were less than 0.5. For articles in which constancy of detection was tested, 86% reported significant variation. We hope that our findings prompt more ecologists to consider carefully the detection process when designing studies and analyzing results, especially for sub-disciplines where incorporation of imperfect detection in study design and analysis so far has been lacking.

  13. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages.

    PubMed

    Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry

    2013-08-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages-SAS GLIMMIX Laplace and SuperMix Gaussian quadrature-perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.
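
    To show what the Gauss-Hermite method actually computes, the sketch below evaluates the marginal likelihood of one cluster in a random-intercept logistic model by quadrature; the data, coefficients, and number of nodes are arbitrary illustrations, and real packages optimize this quantity summed over clusters.

```python
import numpy as np

def cluster_loglik(y, x, beta, sigma, n_quad=15):
    """Marginal log-likelihood of one cluster in a random-intercept logistic
    model, integrating the random effect out by Gauss-Hermite quadrature:
    int g(b) N(b; 0, sigma^2) db ~ (1/sqrt(pi)) sum_i w_i g(sqrt(2) sigma x_i).
    """
    nodes, weights = np.polynomial.hermite.hermgauss(n_quad)
    total = 0.0
    for b, w in zip(np.sqrt(2.0) * sigma * nodes, weights / np.sqrt(np.pi)):
        eta = x @ beta + b                          # linear predictor given b
        p = 1.0 / (1.0 + np.exp(-eta))
        lik = np.prod(p ** y * (1 - p) ** (1 - y))  # conditional likelihood
        total += w * lik
    return np.log(total)

rng = np.random.default_rng(5)
x = rng.standard_normal((20, 2))
b_true = rng.normal(0, 1.0)                         # cluster random intercept
p = 1 / (1 + np.exp(-(x @ np.array([0.5, -1.0]) + b_true)))
y = rng.binomial(1, p)
print(cluster_loglik(y, x, beta=np.array([0.5, -1.0]), sigma=1.0))
```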

  14. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages

    PubMed Central

    Kim, Yoonsang; Emery, Sherry

    2013-01-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415

  15. A robust design mark-resight abundance estimator allowing heterogeneity in resighting probabilities

    USGS Publications Warehouse

    McClintock, B.T.; White, Gary C.; Burnham, K.P.

    2006-01-01

    This article introduces the beta-binomial estimator (BBE), a closed-population abundance mark-resight model combining the favorable qualities of maximum likelihood theory and the allowance of individual heterogeneity in sighting probability (p). The model may be parameterized for a robust sampling design consisting of multiple primary sampling occasions where closure need not be met between primary occasions. We applied the model to brown bear data from three study areas in Alaska and compared its performance to the joint hypergeometric estimator (JHE) and Bowden's estimator (BOWE). BBE estimates suggest heterogeneity levels were non-negligible and discourage the use of JHE for these data. Compared to JHE and BOWE, confidence intervals were considerably shorter for the AICc model-averaged BBE. To evaluate the properties of BBE relative to JHE and BOWE when sample sizes are small, simulations were performed with data from three primary occasions generated under both individual heterogeneity and temporal variation in p. All models remained consistent regardless of levels of variation in p. In terms of precision, the AICc model-averaged BBE showed advantages over JHE and BOWE when heterogeneity was present and mean sighting probabilities were similar between primary occasions. Based on the conditions examined, BBE is a reliable alternative to JHE or BOWE and provides a framework for further advances in mark-resight abundance estimation.
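
    The heterogeneity mechanism at the heart of the BBE can be sketched via the beta-binomial distribution itself: sighting counts over n occasions when individual sighting probabilities follow a Beta(a, b) law. The (a, b) values below are arbitrary; the full estimator additionally maximizes a likelihood over abundance.

```python
import numpy as np
from scipy.special import betaln, gammaln

def betabinom_logpmf(k, n, a, b):
    """log PMF of the beta-binomial: k sightings out of n occasions when
    each individual's sighting probability p is drawn from Beta(a, b)."""
    logcomb = gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)
    return logcomb + betaln(k + a, n - k + b) - betaln(a, b)

a, b = 2.0, 3.0                                     # hypothetical shape values
print("mean sighting probability =", a / (a + b))
print(np.exp(betabinom_logpmf(np.arange(6), 5, a, b)).round(3))  # sums to 1
```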

  16. Parameter Estimation of the Thermal Network Model of a Machine Tool Spindle by Self-made Bluetooth Temperature Sensor Module

    PubMed Central

    Lo, Yuan-Chieh; Hu, Yuh-Chung; Chang, Pei-Zen

    2018-01-01

    Thermal characteristic analysis is essential for machine tool spindles because sudden failures may occur due to unexpected thermal issues. This article presents a lumped-parameter Thermal Network Model (TNM) and its parameter estimation scheme, including hardware and software, in order to characterize both the steady-state and transient thermal behavior of machine tool spindles. For the hardware, the authors developed a Bluetooth Temperature Sensor Module (BTSM) accompanied by three types of temperature-sensing probes (magnetic, screw, and probe). Experimental tests show that it achieves a precision of ±(0.1 + 0.0029|t|) °C, a resolution of 0.00489 °C, a power consumption of 7 mW, and a size of Ø40 mm × 27 mm. For the software, the heat transfer characteristics of the machine tool spindle as functions of rotating speed are derived based on heat transfer theory and empirical formulas. The predictive TNM of the spindle was developed by grey-box estimation and experimental results. Even under such complicated operating conditions as various speeds and different initial conditions, the experiments validate that the present modeling methodology provides a robust and reliable tool for temperature prediction, with a normalized-mean-square-error agreement of 99.5%, and the present approach is transferable to other spindles with a similar structure. For realizing edge computing in smart manufacturing, a reduced-order TNM is constructed by the Model Order Reduction (MOR) technique and implemented in a real-time embedded system. PMID:29473877

  17. Parameter Estimation of the Thermal Network Model of a Machine Tool Spindle by Self-made Bluetooth Temperature Sensor Module.

    PubMed

    Lo, Yuan-Chieh; Hu, Yuh-Chung; Chang, Pei-Zen

    2018-02-23

    Thermal characteristic analysis is essential for machine tool spindles because sudden failures may occur due to unexpected thermal issues. This article presents a lumped-parameter Thermal Network Model (TNM) and its parameter estimation scheme, including hardware and software, in order to characterize both the steady-state and transient thermal behavior of machine tool spindles. For the hardware, the authors developed a Bluetooth Temperature Sensor Module (BTSM) accompanied by three types of temperature-sensing probes (magnetic, screw, and probe). Experimental tests show that it achieves a precision of ±(0.1 + 0.0029|t|) °C, a resolution of 0.00489 °C, a power consumption of 7 mW, and a size of Ø40 mm × 27 mm. For the software, the heat transfer characteristics of the machine tool spindle as functions of rotating speed are derived based on heat transfer theory and empirical formulas. The predictive TNM of the spindle was developed by grey-box estimation and experimental results. Even under such complicated operating conditions as various speeds and different initial conditions, the experiments validate that the present modeling methodology provides a robust and reliable tool for temperature prediction, with a normalized-mean-square-error agreement of 99.5%, and the present approach is transferable to other spindles with a similar structure. For realizing edge computing in smart manufacturing, a reduced-order TNM is constructed by the Model Order Reduction (MOR) technique and implemented in a real-time embedded system.

  18. Population entropies estimates of proteins

    NASA Astrophysics Data System (ADS)

    Low, Wai Yee

    2017-05-01

    The Shannon entropy equation provides a way to estimate the variability of amino acid sequences in a multiple sequence alignment of proteins. Knowledge of protein variability is useful in many areas such as vaccine design, identification of antibody binding sites, and exploration of protein 3D structural properties. In cases where the population entropies of a protein are of interest but only a small sample can be obtained, a method based on linear regression and random subsampling can be used to estimate the population entropy. This method is useful for comparisons of entropies where the actual sequence counts differ and thus correction for alignment size bias is needed. In the current work, an R-based package named EntropyCorrect that enables estimation of population entropy is presented, and an empirical study of how well this new algorithm performs on simulated datasets with various combinations of population and sample sizes is discussed. The package is available at https://github.com/lloydlow/EntropyCorrect. This article, which was originally published online on 12 May 2017, contained an error in Eq. (1), where the summation sign was missing. The corrected equation appears in the Corrigendum attached to the pdf.
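
    A toy version of the correction idea, assuming it amounts to regressing subsample entropies on inverse sample size and extrapolating to infinite n (the package's exact algorithm may differ): the plug-in entropy is biased low by roughly (K-1)/(2n ln 2) bits for K symbol types, so the intercept at 1/n -> 0 approximates the population entropy.

```python
import numpy as np

def shannon_entropy(column):
    """Plug-in Shannon entropy (bits) of one alignment column: H = -sum p log2 p."""
    _, counts = np.unique(column, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(6)
# Hypothetical "population" of residues at one alignment position.
population = rng.choice(list("ACDEFGHIKL"), size=100_000,
                        p=np.r_[[0.3], [0.7 / 9] * 9])
sizes = np.array([50, 100, 200, 400, 800])
h = [np.mean([shannon_entropy(rng.choice(population, n)) for _ in range(50)])
     for n in sizes]
slope, intercept = np.polyfit(1.0 / sizes, h, 1)   # extrapolate to 1/n = 0
print(f"plug-in at n=50: {h[0]:.3f} bits; extrapolated population H: {intercept:.3f} bits")
```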

  19. Automated quantification of surface water inundation in wetlands using optical satellite imagery

    USGS Publications Warehouse

    DeVries, Ben; Huang, Chengquan; Lang, Megan W.; Jones, John W.; Huang, Wenli; Creed, Irena F.; Carroll, Mark L.

    2017-01-01

    We present a fully automated and scalable algorithm for quantifying surface water inundation in wetlands. Requiring no external training data, our algorithm estimates sub-pixel water fraction (SWF) over large areas and long time periods using Landsat data. We tested our SWF algorithm over three wetland sites across North America, including the Prairie Pothole Region, the Delmarva Peninsula and the Everglades, representing a gradient of inundation and vegetation conditions. We estimated SWF at 30-m resolution with accuracies ranging from a normalized root-mean-square-error of 0.11 to 0.19 when compared with various high-resolution ground and airborne datasets. SWF estimates were more sensitive to subtle inundated features compared to previously published surface water datasets, accurately depicting water bodies, large heterogeneously inundated surfaces, narrow water courses and canopy-covered water features. Despite this enhanced sensitivity, several sources of errors affected SWF estimates, including emergent or floating vegetation and forest canopies, shadows from topographic features, urban structures and unmasked clouds. The automated algorithm described in this article allows for the production of high temporal resolution wetland inundation data products to support a broad range of applications.

  20. Chair rise transfer detection and analysis using a pendant sensor: an algorithm for fall risk assessment in older people.

    PubMed

    Zhang, Wei; Regterschot, G Ruben H; Wahle, Fabian; Geraedts, Hilde; Baldus, Heribert; Zijlstra, Wiebren

    2014-01-01

    Falls result in substantial disability, morbidity, and mortality among older people. Early detection of fall risks and timely intervention can prevent falls and injuries due to falls. Simple field tests, such as repeated chair rises, are used in the clinical assessment of fall risks in older people. The development of on-body sensors introduces potentially beneficial alternatives to traditional clinical methods. In this article, we present a pendant-sensor-based chair rise detection and analysis algorithm for fall risk assessment in older people. The recall and precision of transfer detection were 85% and 87% in the standard protocol, and 61% and 89% in daily life activities. Estimation errors of chair rise performance indicators (duration, maximum acceleration, peak power, and maximum jerk) were tested in over 800 transfers. The median estimation error in transfer peak power ranged from 1.9% to 4.6% across tests. Among all the performance indicators, maximum acceleration had the lowest median estimation error (0%) and duration had the highest (24%) over all tests. The developed algorithm may be feasible for continuous fall risk assessment in older people.

  1. One-shot estimate of MRMC variance: AUC.

    PubMed

    Gallas, Brandon D

    2006-03-01

    One popular study design for estimating the area under the receiver operating characteristic curve (AUC) is the one in which a set of readers reads a set of cases: a fully crossed design in which every reader reads every case. The variability of the subsequent reader-averaged AUC has two sources: the multiple readers and the multiple cases (MRMC). In this article, we present a nonparametric estimate for the variance of the reader-averaged AUC that is unbiased and does not use resampling tools. The one-shot estimate is based on the MRMC variance derived by the mechanistic approach of Barrett et al. (2005), as well as the nonparametric variance of a single-reader AUC derived in the literature on U statistics. We investigate the bias and variance properties of the one-shot estimate through a set of Monte Carlo simulations with simulated model observers and images. The different simulation configurations vary numbers of readers and cases, amounts of image noise and internal noise, as well as how the readers are constructed. We compare the one-shot estimate to a method that uses the jackknife resampling technique with an analysis of variance model at its foundation (Dorfman et al. 1992). The name one-shot highlights that resampling is not used. The one-shot and jackknife estimators behave similarly, with the one-shot being marginally more efficient when the number of cases is small. We have derived a one-shot estimate of the MRMC variance of AUC that is based on a probabilistic foundation with limited assumptions, is unbiased, and compares favorably to an established estimate.
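
    The single-reader building block mentioned above (the nonparametric U-statistic variance of an AUC) can be sketched as follows, using placement values in the style of DeLong; the ratings are simulated, and the full MRMC estimate adds reader and reader-by-case variance components on top of this.

```python
import numpy as np

def auc_and_variance(neg, pos):
    """Nonparametric (U-statistic) AUC for one reader and its variance,
    computed from placement values."""
    neg, pos = np.asarray(neg, float), np.asarray(pos, float)
    m, n = len(pos), len(neg)
    # Pairwise success kernel: 1 if diseased rating exceeds non-diseased,
    # 0.5 for ties.
    psi = (pos[:, None] > neg[None, :]) + 0.5 * (pos[:, None] == neg[None, :])
    auc = psi.mean()
    v10 = psi.mean(axis=1)            # placement value of each diseased case
    v01 = psi.mean(axis=0)            # placement value of each healthy case
    var = v10.var(ddof=1) / m + v01.var(ddof=1) / n
    return auc, var

rng = np.random.default_rng(7)
neg = rng.normal(0.0, 1.0, 60)        # ratings for non-diseased cases
pos = rng.normal(1.0, 1.0, 40)        # ratings for diseased cases
auc, var = auc_and_variance(neg, pos)
print(f"AUC = {auc:.3f} +/- {np.sqrt(var):.3f}")
```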

  2. Looking at 3,000,000 References Without Growing Grey Hair

    NASA Astrophysics Data System (ADS)

    Demleitner, M.; Accomazzi, A.; Eichhorn, G.; Grant, C. S.; Kurtz, M. J.; Murray, S. S.

    1999-12-01

    The article service of the Astrophysics Data System (ADS, http://adswww.harvard.edu) currently holds about 500,000 pages scanned from astronomical journals and conference proceedings. This data set not only facilitates easy and convenient access to the majority of the astronomical literature from anywhere on the Internet, but also allows highly automated extraction of the information contained in the articles. As first steps towards processing and indexing the full texts of the articles, the ADS has been extracting abstracts and references from the bitmap images of the articles since May 1999. In this poster we describe the procedures and strategies to (a) automatically identify the regions within a paper containing the abstract or the references, (b) spot and correct errors in the database or in the identification of the regions, (c) resolve references obtained by optical character recognition (OCR), with its inherent uncertainties, to parsed references (i.e., bibcodes), and (d) incorporate the data collected in this way into the ADS abstract service. We also give an overview of the extent of additional bibliographical material from this source. We estimate that by January 2000, these procedures will have yielded about 14,000 abstracts and 1,000,000 citation pairs (out of a total of 3,000,000 references) not previously present in the ADS.

  3. A Bayesian Approach to Model Selection in Hierarchical Mixtures-of-Experts Architectures.

    PubMed

    Tanner, Martin A.; Peng, Fengchun; Jacobs, Robert A.

    1997-03-01

    There does not exist a statistical model that shows good performance on all tasks. Consequently, the model selection problem is unavoidable; investigators must decide which model is best at summarizing the data for each task of interest. This article presents an approach to the model selection problem in hierarchical mixtures-of-experts architectures. These architectures combine aspects of generalized linear models with those of finite mixture models in order to perform tasks via a recursive "divide-and-conquer" strategy. Markov chain Monte Carlo methodology is used to estimate the distribution of the architectures' parameters. One part of our approach to model selection attempts to estimate the worth of each component of an architecture so that relatively unused components can be pruned from the architecture's structure. A second part of this approach uses a Bayesian hypothesis testing procedure in order to differentiate inputs that carry useful information from nuisance inputs. Simulation results suggest that the approach presented here adheres to the dictum of Occam's razor; simple architectures that are adequate for summarizing the data are favored over more complex structures.

  4. Estimating the production, consumption and export of cannabis: The Dutch case.

    PubMed

    van der Giessen, Mark; van Ooyen-Houben, Marianne M J; Moolenaar, Debora E G

    2016-05-01

    Quantifying an illegal phenomenon like a drug market is inherently complex due to its hidden nature and the limited availability of reliable information. This article presents findings from a recent estimate of the production, consumption and export of Dutch cannabis and discusses the opportunities provided by, and limitations of, mathematical models for estimating the illegal cannabis market. The data collection consisted of a comprehensive literature study, secondary analyses on data from available registrations (2012-2014) and previous studies, and expert opinion. The cannabis market was quantified with several mathematical models. The data analysis included a Monte Carlo simulation to arrive at a 95% interval estimate (IE) and a sensitivity analysis to identify the most influential indicators. The annual production of Dutch cannabis was estimated to be between 171 and 965 tons (95% IE of 271-613 tons). The consumption was estimated to be between 28 and 119 tons, depending on the inclusion or exclusion of non-residents (95% IE of 51-78 tons or 32-49 tons, respectively). The export was estimated to be between 53 and 937 tons (95% IE of 206-549 tons or 231-573 tons, respectively). Mathematical models are valuable tools for the systematic assessment of the size of illegal markets and for determining the uncertainty inherent in the estimates. The estimates required the use of many assumptions, and the availability of reliable indicators was limited. This uncertainty is reflected in the wide ranges of the estimates. The estimates are sensitive to 10 of the 45 indicators; these 10 account for 86-93% of the variation found. Further research should focus on improving the variables and the independence of the mathematical models.
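
    The Monte Carlo step can be illustrated generically: draw each indicator from an assumed distribution, push the draws through the market equation, and read off percentile bounds. All distributions and values below are invented for illustration and are not the study's indicators.

```python
import numpy as np

rng = np.random.default_rng(8)
draws = 100_000

# Hypothetical indicator distributions: dismantled plantations observed,
# detection rate, plants per plantation, harvests per year, grams per plant.
detected = 5_000
detection_rate = rng.uniform(0.2, 0.6, draws)       # share of sites found
plants = rng.triangular(200, 400, 800, draws)       # plants per plantation
crops = rng.uniform(2.5, 3.5, draws)                # harvest cycles per year
grams = rng.normal(30, 5, draws).clip(10, None)     # yield per plant, g

# Push each draw through the production equation; grams -> metric tons.
production_tons = detected / detection_rate * plants * crops * grams / 1e6
lo, hi = np.percentile(production_tons, [2.5, 97.5])
print(f"production: median {np.median(production_tons):,.0f} t, "
      f"95% IE ({lo:,.0f}, {hi:,.0f}) t")
```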

  5. Magnetic nanofluid flow and convective heat transfer in a porous cavity considering Brownian motion effects

    NASA Astrophysics Data System (ADS)

    Sheikholeslami, M.; Rokni, Houman B.

    2018-01-01

    In the present article, the improvement of nanofluid heat transfer inside a porous cavity by means of a non-equilibrium model in the presence of Lorentz forces is investigated by employing the control-volume-based finite element method. Nanofluid properties are estimated by means of the Koo-Kleinstreuer-Li correlation. The Darcy-Boussinesq approximation is utilized for the nanofluid flow. The roles of the solid-nanofluid interface heat transfer parameter (Nhs), Hartmann number (Ha), porosity (ε), and Rayleigh number (Ra) are presented. The outputs demonstrate that the convective flow decreases with the rise of Nhs, but is enhanced with the rise of Ra. Porosity has an opposite relationship with the temperature gradient.

  6. Data characterizing tensile behavior of cenosphere/HDPE syntactic foam.

    PubMed

    Kumar, B R Bharath; Doddamani, Mrityunjay; Zeltmann, Steven E; Gupta, Nikhil; Ramakrishna, Seeram

    2016-03-01

    The data set presented is related to the tensile behavior of cenosphere reinforced high density polyethylene syntactic foam composites "Processing of cenosphere/HDPE syntactic foams using an industrial scale polymer injection molding machine" (Bharath et al., 2016) [1]. The focus of the work is on determining the feasibility of using an industrial scale polymer injection molding (PIM) machine for fabricating syntactic foams. The fabricated syntactic foams are investigated for microstructure and tensile properties. The data presented in this article is related to optimization of the PIM process for syntactic foam manufacture, equations and procedures to develop theoretical estimates for properties of cenospheres, and microstructure of syntactic foams before and after failure. Included dataset contains values obtained from the theoretical model.

  7. Conditions Presenting with Symptoms of Peripheral Arterial Disease

    PubMed Central

    Sharma, Aditya M.; Norton, Patrick T.; Zhu, Daisy

    2014-01-01

    Peripheral artery disease (PAD) is estimated to affect more than 20% of people older than 65 years. The vast majority of patients with symptoms suggestive of PAD have atherosclerosis often associated with conventional vascular risk factors such as smoking, diabetes, dyslipidemia, and inflammation. A minority of people presenting with symptoms suggesting PAD have an alternative etiology. These groups of disorders are often underdiagnosed, and if diagnosed correctly the diagnosis may be delayed. Understanding these pathologies well is important, as they can be very debilitating and optimal treatment may vary significantly. Inappropriate treatment of these disorders can lead to worsening morbidity and mortality. This article discusses the underlying causes of nonatherosclerotic PAD, including the diagnosis and treatment of these disorders. PMID:25435652

  8. Data characterizing tensile behavior of cenosphere/HDPE syntactic foam

    PubMed Central

    Kumar, B.R. Bharath; Doddamani, Mrityunjay; Zeltmann, Steven E.; Gupta, Nikhil; Ramakrishna, Seeram

    2016-01-01

    The data set presented is related to the tensile behavior of cenosphere reinforced high density polyethylene syntactic foam composites “Processing of cenosphere/HDPE syntactic foams using an industrial scale polymer injection molding machine” (Bharath et al., 2016) [1]. The focus of the work is on determining the feasibility of using an industrial scale polymer injection molding (PIM) machine for fabricating syntactic foams. The fabricated syntactic foams are investigated for microstructure and tensile properties. The data presented in this article is related to optimization of the PIM process for syntactic foam manufacture, equations and procedures to develop theoretical estimates for properties of cenospheres, and microstructure of syntactic foams before and after failure. Included dataset contains values obtained from the theoretical model. PMID:26937472

  9. On the nullspace of TLS multi-station adjustment

    NASA Astrophysics Data System (ADS)

    Sterle, Oskar; Kogoj, Dušan; Stopar, Bojan; Kregar, Klemen

    2018-07-01

    In the article we present an analytic aspect of TLS multi-station least-squares adjustment, with the main focus on the datum problem. Compared with previously published research, the datum problem is theoretically analyzed and solved, where the solution is based on the nullspace derivation of the mathematical model. The importance of solving the datum problem lies in a complete description of TLS multi-station adjustment solutions as a set of all minimally constrained least-squares solutions. On the basis of the known nullspace, estimable parameters are described and the geometric interpretation of all minimally constrained least-squares solutions is presented. At the end, a simulated example is used to analyze the results of TLS multi-station minimally constrained and inner-constrained least-squares adjustment solutions.
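
    The nullspace-based reasoning can be made concrete with a toy rank-deficient adjustment: below, three points on a line observed only through pairwise differences leave the datum (the absolute origin) undefined, the nullspace is the vector of ones, and a linear function of the parameters is estimable exactly when it is orthogonal to that nullspace. This is a generic illustration, not the article's TLS model.

```python
import numpy as np
from scipy.linalg import null_space

# Design matrix of pairwise-difference observations of three 1-D positions.
A = np.array([[-1.0,  1.0, 0.0],
              [ 0.0, -1.0, 1.0],
              [-1.0,  0.0, 1.0]])
N = null_space(A)                  # basis of the nullspace (the 1-vector)
print(N.round(3))

# Any two minimally constrained solutions differ by a nullspace vector;
# c @ x is estimable iff c is orthogonal to the nullspace.
c_est, c_not = np.array([-1.0, 1.0, 0.0]), np.array([1.0, 0.0, 0.0])
print(c_est @ N, c_not @ N)        # ~0 -> estimable; nonzero -> datum-dependent
```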

  10. Continuous stacking computational approach based automated microscope slide scanner

    NASA Astrophysics Data System (ADS)

    Murali, Swetha; Adhikari, Jayesh Vasudeva; Jagannadh, Veerendra Kalyan; Gorthi, Sai Siva

    2018-02-01

    Cost-effective and automated acquisition of whole slide images is a bottleneck for wide-scale deployment of digital pathology. In this article, a computation augmented approach for the development of an automated microscope slide scanner is presented. The realization of a prototype device built using inexpensive off-the-shelf optical components and motors is detailed. The applicability of the developed prototype to clinical diagnostic testing is demonstrated by generating good quality digital images of malaria-infected blood smears. Further, the acquired slide images have been processed to identify and count the number of malaria-infected red blood cells and thereby perform quantitative parasitemia level estimation. The presented prototype would enable cost-effective deployment of slide-based cyto-diagnostic testing in endemic areas.

  11. Statistical inferences with jointly type-II censored samples from two Pareto distributions

    NASA Astrophysics Data System (ADS)

    Abu-Zinadah, Hanaa H.

    2017-08-01

    In several industries, products come from more than one production line, which calls for comparative life tests. Such tests require sampling from the different production lines, which gives rise to a joint censoring scheme. In this article we consider the Pareto lifetime distribution under a joint type-II censoring scheme. The maximum likelihood estimators (MLE) and the corresponding approximate confidence intervals, as well as the bootstrap confidence intervals, of the model parameters are obtained. Bayesian point estimates and credible intervals of the model parameters are also presented. A lifetime data set is analyzed for illustrative purposes. Monte Carlo results from simulation studies are presented to assess the performance of our proposed method.

  12. Valuation of National Park System Visitation: The Efficient Use of Count Data Models, Meta-Analysis, and Secondary Visitor Survey Data

    NASA Astrophysics Data System (ADS)

    Neher, Christopher; Duffield, John; Patterson, David

    2013-09-01

    The National Park Service (NPS) currently manages a large and diverse system of park units nationwide which received an estimated 279 million recreational visits in 2011. This article uses park visitor data collected by the NPS Visitor Services Project to estimate a consistent set of count data travel cost models of park visitor willingness to pay (WTP). Models were estimated using 58 different park unit survey datasets. WTP estimates for these 58 park surveys were used within a meta-regression analysis model to predict average and total WTP for NPS recreational visitation system-wide. Estimated WTP per NPS visit in 2011 averaged $102 system-wide, and ranged across park units from $67 to $288. Total 2011 visitor WTP for the NPS system is estimated at $28.5 billion with a 95% confidence interval of $19.7-43.1 billion. The estimation of a meta-regression model using consistently collected data and identical specification of visitor WTP models greatly reduces problems common to meta-regression models, including sample selection bias, primary data heterogeneity, and heteroskedasticity, as well as some aspects of panel effects. The article provides the first estimate of total annual NPS visitor WTP within the literature directly based on NPS visitor survey data.

  13. Personalized State-space Modeling of Glucose Dynamics for Type 1 Diabetes Using Continuously Monitored Glucose, Insulin Dose, and Meal Intake: An Extended Kalman Filter Approach.

    PubMed

    Wang, Qian; Molenaar, Peter; Harsh, Saurabh; Freeman, Kenneth; Xie, Jinyu; Gold, Carol; Rovine, Mike; Ulbrecht, Jan

    2014-03-01

    An essential component of any artificial pancreas is the prediction of blood glucose levels as a function of exogenous and endogenous perturbations such as insulin dose, meal intake, and physical activity and emotional tone under natural living conditions. In this article, we present a new data-driven state-space dynamic model with time-varying coefficients that are used to explicitly quantify the time-varying patient-specific effects of insulin dose and meal intake on blood glucose fluctuations. Using the 3-variate time series of glucose level, insulin dose, and meal intake of an individual type 1 diabetic subject, we apply an extended Kalman filter (EKF) to estimate the time-varying coefficients of the patient-specific state-space model. We evaluate our empirical modeling using (1) the FDA-approved UVa/Padova simulator with 30 virtual patients and (2) clinical data of 5 type 1 diabetic patients under natural living conditions. Compared to a forgetting-factor-based recursive ARX model of the same order, the EKF model predictions have higher fit and significantly better temporal gain and J index, and thus are superior in the early detection of upward and downward trends in glucose. The EKF-based state-space model developed in this article is particularly suitable for model-based state-feedback control designs, since the Kalman filter estimates the state variable of the glucose dynamics based on the measured glucose time series. In addition, since the model parameters are estimated in real time, this model is also suitable for adaptive control.
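
    A minimal EKF sketch in the same spirit: the state stacks glucose with time-varying coefficients for insulin and meal inputs, the coefficients follow random walks, and the bilinear transition is linearized at each step. Model structure, units, and noise levels are all illustrative assumptions, not the article's identified model.

```python
import numpy as np

def ekf_step(xhat, P, u, meal, y, q, r):
    """One EKF predict/update for a toy glucose model with time-varying
    coefficients.  State xhat = [g, a, bu, bm]: glucose g, AR coefficient a,
    insulin gain bu, and meal gain bm (random-walk coefficients)."""
    g, a, bu, bm = xhat
    # Nonlinear (bilinear) state transition and its Jacobian.
    f = np.array([a * g + bu * u + bm * meal, a, bu, bm])
    F = np.array([[a, g, u, meal],
                  [0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]])
    P = F @ P @ F.T + q                      # covariance prediction
    H = np.array([[1.0, 0.0, 0.0, 0.0]])     # only glucose is measured
    S = H @ P @ H.T + r                      # innovation variance (1x1)
    K = P @ H.T / S                          # Kalman gain
    xhat = f + (K * (y - f[0])).ravel()      # measurement update
    P = (np.eye(4) - K @ H) @ P
    return xhat, P

# Filter a short synthetic CGM trace (hypothetical units and parameters).
rng = np.random.default_rng(9)
xhat, P = np.array([120.0, 0.9, -0.5, 0.3]), np.eye(4)
q, r = np.diag([4.0, 1e-4, 1e-4, 1e-4]), 25.0
for k in range(60):
    u, meal = (1.0 if k % 20 == 0 else 0.0), (30.0 if k == 10 else 0.0)
    y = 120 + 10 * np.sin(k / 10) + rng.normal(0, 5)   # stand-in CGM reading
    xhat, P = ekf_step(xhat, P, u, meal, y, q, r)
print(xhat.round(3))
```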

  14. Measurement Uncertainty of Dew-Point Temperature in a Two-Pressure Humidity Generator

    NASA Astrophysics Data System (ADS)

    Martins, L. Lages; Ribeiro, A. Silva; Alves e Sousa, J.; Forbes, Alistair B.

    2012-09-01

    This article describes the measurement uncertainty evaluation of the dew-point temperature when using a two-pressure humidity generator as a reference standard. The estimation of the dew-point temperature involves the solution of a non-linear equation for which iterative solution techniques, such as the Newton-Raphson method, are required. Previous studies have already been carried out using the GUM method and the Monte Carlo method but have not discussed the impact of the approximate numerical method used to provide the temperature estimation. One of the aims of this article is to take this approximation into account. Following the guidelines presented in the GUM Supplement 1, two alternative approaches can be developed: the forward measurement uncertainty propagation by the Monte Carlo method when using the Newton-Raphson numerical procedure; and the inverse measurement uncertainty propagation by Bayesian inference, based on prior available information regarding the usual dispersion of values obtained by the calibration process. The measurement uncertainties obtained using these two methods can be compared with previous results. Other relevant issues concerning this research are the broad application to measurements that require hygrometric conditions obtained from two-pressure humidity generators and, also, the ability to provide a solution that can be applied to similar iterative models. The research also studied the factors influencing both the use of the Monte Carlo method (such as the seed value and the convergence parameter) and the inverse uncertainty propagation using Bayesian inference (such as the pre-assigned tolerance, prior estimate, and standard deviation) in terms of their accuracy and adequacy.
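
    To show where the iterative solver enters, the sketch below inverts an assumed Magnus saturation-vapour-pressure formula for the dew point by Newton-Raphson and propagates an input uncertainty through it by Monte Carlo, in the spirit of GUM Supplement 1; the generator's actual equations and uncertainty budget differ.

```python
import numpy as np

def es_hpa(t):
    """Magnus saturation vapour pressure over water (hPa), t in deg C.
    An assumed formulation, standing in for the generator's equations."""
    return 6.112 * np.exp(17.62 * t / (243.12 + t))

def dew_point(e, t0=10.0, tol=1e-10, itmax=50):
    """Solve es(Td) = e for Td by Newton-Raphson (numerical derivative)."""
    t = t0
    for _ in range(itmax):
        f = es_hpa(t) - e
        df = (es_hpa(t + 1e-6) - es_hpa(t - 1e-6)) / 2e-6
        step = f / df
        t -= step
        if abs(step) < tol:
            break
    return t

# Monte Carlo propagation: draw the input vapour pressure, push each draw
# through the iterative solver, and summarize the dew-point distribution.
rng = np.random.default_rng(10)
e_draws = rng.normal(12.0, 0.05, 20_000)          # hPa, hypothetical u(e)
td = np.array([dew_point(e) for e in e_draws])
print(f"Td = {td.mean():.4f} degC, u(Td) = {td.std(ddof=1):.4f} degC")
```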

  15. Estimating Time-to-Collision with Retinitis Pigmentosa

    ERIC Educational Resources Information Center

    Jones, Tim

    2006-01-01

    This article reports on the ability of observers who are sighted and those with low vision to make time-to-collision (TTC) estimations using video. The TTC estimations made by the observers with low vision were comparable to those made by the sighted observers, and both groups made underestimation errors that were similar to those that were…

  16. Why is "S" a Biased Estimate of σ?

    ERIC Educational Resources Information Center

    Sanqui, Jose Almer T.; Arnholt, Alan T.

    2011-01-01

    This article describes a simulation activity that can be used to help students see that the estimator "S" is a biased estimator of σ. The activity can be implemented using either a statistical package such as R or Minitab, or a Web applet. In the activity, the students investigate and compare the bias of "S" when sampling from different…
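
    The activity can be reproduced in a few lines of Python (in place of R or Minitab): simulate many normal samples, average S, and compare with the known factor c4(n) = sqrt(2/(n-1)) Γ(n/2)/Γ((n-1)/2) that makes E[S] = c4 σ < σ.

```python
import numpy as np
from math import gamma, sqrt

# Monte Carlo demonstration that S (sample SD with ddof=1) underestimates
# sigma for normal samples: E[S] = c4(n) * sigma with c4(n) < 1.
rng = np.random.default_rng(11)
sigma, n, reps = 2.0, 5, 200_000
s = rng.normal(0.0, sigma, (reps, n)).std(axis=1, ddof=1)

c4 = sqrt(2.0 / (n - 1)) * gamma(n / 2) / gamma((n - 1) / 2)
print(f"mean of S = {s.mean():.4f}; theory c4*sigma = {c4 * sigma:.4f}; "
      f"sigma = {sigma}")
```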

  17. WITHDRAWN: Development of EMC-based empirical model for estimating spatial distribution of pollutant loads and its application in rural areas of Korea.

    PubMed

    Yi, Qitao; Li, Hui; Lee, Jin-Woo; Kim, Youngchul

    2015-09-01

    The Publisher regrets that this article is an accidental duplication of an article that has already been published in Desalination Water Treat., 27:1-3, 175-188, http://dx.doi.org/10.5004/dwt.2011.2736. The duplicate article has therefore been withdrawn. The full Elsevier Policy on Article Withdrawal can be found at http://www.elsevier.com/locate/withdrawalpolicy. Copyright © 2015. Published by Elsevier B.V.

  18. Parameter Estimation of Partial Differential Equation Models.

    PubMed

    Xun, Xiaolei; Cao, Jiguo; Mallick, Bani; Carroll, Raymond J; Maity, Arnab

    2013-01-01

    Partial differential equation (PDE) models are commonly used to model complex dynamic systems in applied sciences such as biology and finance. The forms of these PDE models are usually proposed by experts based on their prior knowledge and understanding of the dynamic system. Parameters in PDE models often have interesting scientific interpretations, but their values are often unknown and need to be estimated from measurements of the dynamic system in the presence of measurement errors. Most PDEs used in practice have no analytic solution and can only be solved with numerical methods. Currently, methods for estimating PDE parameters require repeatedly solving the PDE numerically under thousands of candidate parameter values, and thus the computational load is high. In this article, we propose two methods to estimate parameters in PDE models: a parameter cascading method and a Bayesian approach. In both methods, the underlying dynamic process modeled by the PDE is represented via basis function expansion. For the parameter cascading method, we develop two nested levels of optimization to estimate the PDE parameters. For the Bayesian method, we develop a joint model for the data and the PDE, and a novel hierarchical model that allows us to employ Markov chain Monte Carlo (MCMC) techniques for posterior inference. Simulation studies show that the Bayesian method and the parameter cascading method are comparable, and both outperform other available methods in terms of estimation accuracy. The two methods are demonstrated by estimating parameters in a PDE model from LIDAR data.
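
    The computational burden the authors aim to avoid is easy to demonstrate: the brute-force approach below repeatedly solves a heat equation numerically inside an optimizer to recover the diffusivity. Grid, noise, and bounds are arbitrary; the article's parameter cascading and Bayesian methods replace this repeated inner PDE solve with basis-function representations.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def solve_heat(D, u0, dx, dt, steps):
    """Explicit finite-difference solution of u_t = D u_xx with fixed ends."""
    u = u0.copy()
    for _ in range(steps):
        u[1:-1] += D * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return u

# Generate noisy "measurements" from a known diffusivity, then recover it
# by minimizing the squared misfit, re-solving the PDE at every candidate D.
x = np.linspace(0, 1, 51)
u0 = np.sin(np.pi * x)
dx, dt, steps = x[1] - x[0], 1e-4, 2000
rng = np.random.default_rng(12)
data = solve_heat(0.05, u0, dx, dt, steps) + rng.normal(0, 0.005, x.size)

loss = lambda D: np.sum((solve_heat(D, u0, dx, dt, steps) - data) ** 2)
fit = minimize_scalar(loss, bounds=(0.01, 0.2), method="bounded")
print(f"true D = 0.05, estimated D = {fit.x:.4f}")
```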

  19. Estimations of the lethal and exposure doses for representative methanol symptoms in humans.

    PubMed

    Moon, Chan-Seok

    2017-01-01

    The aim of this review was to estimate the lethal and exposure doses for a representative symptom (blindness) of methanol exposure in humans by reviewing data from previous articles. Available articles published from 1970 to 2016 that investigated the dose-response relationship for methanol exposure (i.e., the exposure concentration and the biological markers/clinical symptoms) were evaluated; the MEDLINE and RISS (Korean search engine) databases were searched. The available data from these articles were carefully selected to estimate the range and median of a lethal human dose. The regression equation and correlation coefficient (between the exposure level and the urinary methanol concentration as a biological exposure marker) were estimated from the previous data. The lethal human dose of pure methanol was estimated at 15.8-474 g/person as a range, with 56.2 g/person as the median. Methanol vapor concentrations in ambient air and urinary methanol concentrations were thought to be correlated. An oral intake of 3.16-11.85 g/person of pure methanol could cause blindness. The lethal dose from respiratory intake was reported to be 4000-13,000 mg/l. The initial concentrations for optic neuritis and blindness were shown to be 228.5 and 1103 mg/l, respectively, for a 12-h exposure. According to previous articles, the concentrations of biological exposure indices and the clinical symptoms of methanol exposure might follow a dose-response relationship. Even a low dose of pure methanol through oral or respiratory exposure might be lethal or result in blindness as a clinical symptom.

  20. Using routine surveillance data to estimate the epidemic potential of emerging zoonoses: application to the emergence of US swine origin influenza A H3N2v virus.

    PubMed

    Cauchemez, Simon; Epperson, Scott; Biggerstaff, Matthew; Swerdlow, David; Finelli, Lyn; Ferguson, Neil M

    2013-01-01

    Prior to emergence in human populations, zoonoses such as SARS cause occasional infections in humans exposed to reservoir species. The risk of widespread epidemics in humans can be assessed by monitoring the reproduction number R (the average number of persons infected by a human case). However, until now, estimating R required detailed outbreak investigations of human clusters, for which resources and expertise are not always available. Additionally, existing methods do not correct for important selection and under-ascertainment biases. Here, we present simple estimation methods that overcome many of these limitations. Our approach is based on a parsimonious mathematical model of disease transmission and only requires data collected through routine surveillance and standard case investigations. We apply it to assess the transmissibility of swine-origin influenza A H3N2v-M virus in the US, Nipah virus in Malaysia and Bangladesh, and also present a non-zoonotic example (cholera in the Dominican Republic). Estimation is based on two simple summary statistics: the proportion infected by the natural reservoir among detected cases (G) and among the subset of the first detected cases in each cluster (F). If detection of a case does not affect detection of other cases from the same cluster, we find that R can be estimated by 1-G; otherwise, R can be estimated by 1-F when the case detection rate is low. In more general cases, bounds on R can still be derived. We have developed a simple approach with limited data requirements that enables robust assessment of the risks posed by emerging zoonoses. We illustrate this by deriving transmissibility estimates for the H3N2v-M virus, an important step in evaluating the possible pandemic threat posed by this virus.
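    The headline estimators are simple enough to state in a few lines. A minimal sketch, with hypothetical counts rather than data from the study:

    ```python
    # A minimal sketch of the 1-G / 1-F estimators; the counts are hypothetical.

    def estimate_R(n_from_reservoir: int, n_cases: int) -> float:
        """Estimate the reproduction number R as 1 minus the reservoir proportion.

        Use counts over all detected cases for G (valid when detecting one case
        does not affect detection of other cases in its cluster), or counts over
        the first detected case of each cluster for F (valid when the case
        detection rate is low).
        """
        return 1.0 - n_from_reservoir / n_cases

    # Example: 38 of 50 detected cases were infected by the animal reservoir,
    # so G = 0.76 and the point estimate is R = 0.24 (subcritical transmission).
    print(estimate_R(38, 50))
    ```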

  1. A systematic review and synthesis of the strengths and limitations of measuring malaria mortality through verbal autopsy.

    PubMed

    Herrera, Samantha; Enuameh, Yeetey; Adjei, George; Ae-Ngibise, Kenneth Ayuurebobi; Asante, Kwaku Poku; Sankoh, Osman; Owusu-Agyei, Seth; Yé, Yazoume

    2017-10-23

    Lack of valid and reliable data on malaria deaths continues to be a problem that plagues the global health community. To address this gap, the verbal autopsy (VA) method was developed to ascertain cause of death at the population level. Despite the adoption and wide use of VA, there are many recognized limitations of VA tools and methods, especially for measuring malaria mortality. This study synthesizes the strengths and limitations of existing VA tools and methods for measuring malaria mortality (MM) in low- and middle-income countries through a systematic literature review. The authors searched PubMed, Cochrane Library, Popline, WHOLIS, Google Scholar, and INDEPTH Network Health and Demographic Surveillance System sites' websites from 1 January 1990 to 15 January 2016 for articles and reports on MM measurement through VA. Articles were included if they presented results from a VA study in which malaria was a cause of death, or discussed limitations/challenges related to measurement of MM through VA. Two authors independently searched the databases and websites and conducted a synthesis of articles using a standard matrix. The authors identified 828 publications; 88 were included in the final review. Most publications were VA studies; others were systematic reviews discussing VA tools or methods, editorials or commentaries, and studies using VA data to develop MM estimates. The main limitations were the low sensitivity and specificity of VA tools for measuring MM. Other limitations included the lack of standardized VA tools and methods and the lack of a 'true' gold standard against which to assess the accuracy of VA-based malaria mortality measurement. Existing VA tools and methods for measuring MM have limitations. Given the need for data to measure progress toward the World Health Organization's Global Technical Strategy for Malaria 2016-2030 goals, the malaria community should define strategies for improving MM estimates, including exploring whether VA tools and methods could be further improved. Longer term strategies should focus on improving countries' vital registration systems for more robust and timely cause of death data.

  3. Stroke Prevalence in Children With Sickle Cell Disease in Sub-Saharan Africa: A Systematic Review and Meta-Analysis.

    PubMed

    Marks, Lianna J; Munube, Deogratias; Kasirye, Philip; Mupere, Ezekiel; Jin, Zhezhen; LaRussa, Philip; Idro, Richard; Green, Nancy S

    2018-01-01

    Objectives. The prevalence of stroke among children with sickle cell disease (SCD) in sub-Saharan Africa was systematically reviewed. Methods. Comprehensive searches of PubMed, Embase, and Web of Science were performed for articles published between 1980 and 2016 (English or French) reporting stroke prevalence. Using preselected inclusion criteria, titles and abstracts were screened and full-text articles were reviewed. Results. Ten full-text articles met selection criteria. Cross-sectional clinic-based data reported 2.9% to 16.9% stroke prevalence among children with SCD. Using available sickle gene frequencies by country, estimated pediatric mortality, and fixed- and random-effects models, the number of affected individuals is projected as 29 800 (95% confidence interval = 25 571-34 027) and 59 732 (37 004-82 460), respectively. Conclusion. Systematic review enabled the estimation of the number of children with SCD stroke in sub-Saharan Africa. High disease mortality, inaccurate diagnosis, and regional variability of risk hamper more precise estimates. Adopting standardized stroke assessments may provide more accurate determination of numbers affected to inform preventive interventions.
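    The fixed- and random-effects pooling behind such projections can be sketched compactly. The following uses the standard inverse-variance and DerSimonian-Laird estimators with hypothetical per-study prevalences and sample sizes in the reported 2.9%-16.9% range; it is an illustration, not the authors' analysis code.

    ```python
    # Fixed-effect and DerSimonian-Laird random-effects pooling of prevalence
    # estimates. The per-study prevalences and sample sizes are hypothetical
    # placeholders, not the reviewed studies.
    import numpy as np

    def pool_prevalence(p, n):
        """Return (fixed-effect, random-effects) pooled prevalence estimates."""
        p, n = np.asarray(p, float), np.asarray(n, float)
        v = p * (1 - p) / n                       # within-study variances
        w = 1 / v                                 # inverse-variance (fixed) weights
        p_fixed = np.sum(w * p) / np.sum(w)
        Q = np.sum(w * (p - p_fixed) ** 2)        # heterogeneity statistic
        # DerSimonian-Laird estimate of the between-study variance tau^2
        tau2 = max(0.0, (Q - (len(p) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_re = 1 / (v + tau2)                     # random-effects weights
        return p_fixed, np.sum(w_re * p) / np.sum(w_re)

    fixed, random_ = pool_prevalence([0.029, 0.067, 0.110, 0.169], [310, 220, 150, 95])
    print(f"pooled prevalence: fixed {fixed:.3f}, random {random_:.3f}")
    ```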

  4. Reply to Steele & Ferrer: Modeling Oscillation, Approximately or Exactly?

    ERIC Educational Resources Information Center

    Oud, Johan H. L.; Folmer, Henk

    2011-01-01

    This article addresses modeling oscillation in continuous time. It criticizes Steele and Ferrer's article "Latent Differential Equation Modeling of Self-Regulatory and Coregulatory Affective Processes" (2011), particularly the approximate estimation procedure applied. This procedure is the latent version of the local linear approximation procedure…

  5. [Russian oxygen generation system "Elektron-VM": hydrogen content in electrolytically produced oxygen for breathing by International Space Station crews].

    PubMed

    Proshkin, V Yu; Kurmazenko, E A

    2014-01-01

    The article presents the particulars of hydrogen content in electrolysis oxygen produced aboard the ISS Russian segment by the oxygen generator "Elektron-VM" (SGK) for crew breathing. Hydrogen content was estimated both during SGK operation in the ISS Russian segment and during ground life tests. Investigation of the hydrogen sources showed that the primary path by which H2 appears in the oxygen is diffusion through the porous diaphragm separating the cathode and anode chambers of the electrolytic cell. The effectiveness of hydrogen oxidation in the SGK reheating unit was also evaluated.

  6. Future Directions for the National Health Accounts

    PubMed Central

    Huskamp, Haiden A.; Newhouse, Joseph P.

    1999-01-01

    Over the past 15 years, the Health Care Financing Administration (HCFA) has engaged in ongoing efforts to improve the methodology and data collection processes used to develop the national health accounts (NHA) estimates of national health expenditures (NHE). In March 1998, HCFA initiated a third conference to explore possible improvements or useful extensions to the current NHA projects. This article summarizes the issues discussed at the conference, provides an overview of three commissioned papers on future directions for the NHA that were presented, and summarizes suggestions made by participants regarding future directions for the accounts. PMID:11481786

  7. Remote sensing in agriculture. [using Earth Resources Technology Satellite photography]

    NASA Technical Reports Server (NTRS)

    Downs, S. W., Jr.

    1974-01-01

    Some examples are presented of the use of remote sensing in cultivated crops, forestry, and range management. Areas of concern include: the determination of crop areas and types, prediction of yield, and detection of disease; the determination of forest areas and types, timber volume estimation, detection of insect and disease attack, and forest fires; and the determination of range conditions and inventory, and livestock inventory. Articles in the literature are summarized and specific examples of work being performed at the Marshall Space Flight Center are given. Primarily, aerial photographs and photo-like ERTS images are considered.

  8. Unconventional tail configurations for transport aircraft

    NASA Astrophysics Data System (ADS)

    Sánchez-Carmona, A.; Cuerno-Rejado, C.; García-Hernández, L.

    2017-06-01

    This article presents the basis of a methodology for sizing unconventional tail configurations for transport aircraft. The case study of this paper is a V-tail configuration. First, an aerodynamic study is developed to determine stability derivatives and aerodynamic forces. The objective is to size a tail such that it develops at least the same static stability derivatives as a conventional reference aircraft. The optimum is obtained by minimizing tail weight. The weight is estimated through two methods: an adapted Farrar's method and a statistical method. The solution reached is heavier than the reference, but it reduces the wetted area.

  9. Review of running injuries of the foot and ankle: clinical presentation and SPECT-CT imaging patterns

    PubMed Central

    Pelletier-Galarneau, Matthieu; Martineau, Patrick; Gaudreault, Maxime; Pham, Xuan

    2015-01-01

    Distance running is among the fastest growing sports, with record numbers registering for marathons worldwide. It is estimated that more than half of recreational runners will experience injuries related to the practice of their sport. Three-phase bone scintigraphy is a very sensitive tool for identifying sports injuries, allowing imaging of hyperemia, stress reactions, enthesopathy, and fractures, often before abnormalities can be detected on conventional anatomical modalities. In this article, we review the most common running-related injuries and their imaging findings on bone scintigraphy with SPECT-CT. PMID:26269770

  10. Screening for Intimate Partner Violence During Pregnancy

    PubMed Central

    Deshpande, Neha A; Lewis-O’Connor, Annie

    2013-01-01

    Intimate partner violence (IPV) is defined as an actual or threatened abuse by an intimate partner that may be physical, sexual, psychological, or emotional in nature. Each year approximately 1.5 million women in the United States report some form of sexual or physical assault by an intimate partner; it is estimated that approximately 324,000 women are pregnant when violence occurs. Pregnancy may present a unique opportunity to identify and screen for patients experiencing IPV. This article provides health care practitioners and clinicians with the most current valid assessment and screening tools for evaluating pregnant women for IPV. PMID:24920977

  11. UAV Control on the Basis of 3D Landmark Bearing-Only Observations

    PubMed Central

    Karpenko, Simon; Konovalenko, Ivan; Miller, Alexander; Miller, Boris; Nikolaev, Dmitry

    2015-01-01

    The article presents an approach to the control of a UAV on the basis of 3D landmark observations. The novelty of the work is the usage of the 3D RANSAC algorithm developed on the basis of the landmarks’ position prediction with the aid of a modified Kalman-type filter. Modification of the filter based on the pseudo-measurements approach permits obtaining unbiased UAV position estimation with quadratic error characteristics. Modeling of UAV flight on the basis of the suggested algorithm shows good performance, even under significant external perturbations. PMID:26633394
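    The pseudo-measurements approach mentioned above can be illustrated in a simplified 2D setting: a bearing beta to a landmark at known position (lx, ly) implies the linear constraint sin(beta)*x - cos(beta)*y = sin(beta)*lx - cos(beta)*ly on the position (x, y), which a standard Kalman update can absorb without linearizing the measurement model. The sketch below assumes a static position state and known landmarks; it illustrates the idea only and is not the authors' filter.

    ```python
    # A simplified 2D illustration of the pseudo-measurement idea: the bearing
    # beta to a landmark at (lx, ly) gives the linear constraint
    #   sin(beta)*x - cos(beta)*y = sin(beta)*lx - cos(beta)*ly,
    # which a standard Kalman update absorbs. Static state and known landmarks
    # are simplifying assumptions; this is not the authors' full filter.
    import numpy as np

    def pseudo_measurement_update(mean, cov, landmark, bearing, noise_var=1e-4):
        """One Kalman update of a 2D position estimate from a single bearing."""
        s, c = np.sin(bearing), np.cos(bearing)
        H = np.array([[s, -c]])                   # linear pseudo-measurement row
        z = s * landmark[0] - c * landmark[1]     # pseudo-measurement value
        S = H @ cov @ H.T + noise_var             # innovation variance (1x1)
        K = cov @ H.T / S                         # Kalman gain (2x1)
        mean = mean + (K * (z - H @ mean)).ravel()
        cov = (np.eye(2) - K @ H) @ cov
        return mean, cov

    true_pos = np.array([2.0, 1.0])               # position to be recovered
    landmarks = [np.array([5.0, 1.0]), np.array([2.0, 6.0])]
    mean, cov = np.zeros(2), 10.0 * np.eye(2)     # diffuse prior
    for lm in landmarks:
        beta = np.arctan2(lm[1] - true_pos[1], lm[0] - true_pos[0])
        mean, cov = pseudo_measurement_update(mean, cov, lm, beta)
    print(mean)                                   # close to (2, 1)
    ```

    Because the pseudo-measurement is exactly linear in the state, the update avoids the linearization bias of an extended Kalman filter on the raw bearing, which is the property the abstract credits for unbiased position estimation.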

  12. Provably trustworthy systems.

    PubMed

    Klein, Gerwin; Andronick, June; Keller, Gabriele; Matichuk, Daniel; Murray, Toby; O'Connor, Liam

    2017-10-13

    We present recent work on building and scaling trustworthy systems with formal, machine-checkable proof from the ground up, including the operating system kernel, at the level of binary machine code. We first give a brief overview of the seL4 microkernel verification and how it can be used to build verified systems. We then show two complementary techniques for scaling these methods to larger systems: proof engineering, to estimate verification effort; and code/proof co-generation, for scalable development of provably trustworthy applications. This article is part of the themed issue 'Verified trustworthy software systems'.

  13. Earthquakes in Tuhinj Valley (Slovenia) In 1840

    NASA Astrophysics Data System (ADS)

    Cecić, Ina

    2015-04-01

    A lesser-known damaging earthquake in the southern part of the Kamnik-Savinja Alps, Slovenia, in 1840 is described. The main shock was on 27 August 1840, with the epicentre in Tuhinj Valley. The maximum intensity was VII EMS-98 in Ljubljana, Slovenia, and in Eisenkappel, Austria. It was felt as far as Venice, Italy, 200 km away. The macroseismic magnitude of the main shock, estimated from the area of intensity VI EMS-98, was 5.0. The effects of the main shock and its aftershocks are described, and an earthquake catalogue for Slovenia in 1840 is provided. Available primary sources (newspaper articles) are presented.

  14. [Transsexualism: from diagnosis to management].

    PubMed

    De Bonnecaze, G; Pessey, J J; Chaput, B; Al Hawat, A; Vairel, B

    2013-01-01

    Transsexualism, or gender dysphoria, is a condition in which an individual does not identify with his or her sexual identity and wishes to change it; in this it must be differentiated from sexual ambiguities (hermaphroditism, pseudohermaphroditism), in which the sexual phenotype is not clearly established. In France, the number of transsexuals is estimated at approximately 50,000 people. Since 2009, transsexualism is no longer considered a mental illness, though it remains classified as a long-term illness. The objective of this article is to present recent developments in the management of transsexual patients seeking feminization.

  15. Intricate Puzzle of Oil and Gas Reserves Growth

    EIA Publications

    1997-01-01

    This article begins with a background discussion of the methods used to estimate proved oil and gas reserves and ultimate recovery, which is followed by a discussion of the factors that affect the ultimate recovery estimates of a field or reservoir.

  16. Continuous estimates on the earthquake early warning magnitude by use of the near-field acceleration records

    NASA Astrophysics Data System (ADS)

    Li, Jun; Jin, Xing; Wei, Yongxiang; Zhang, Hongcai

    2013-10-01

    In this article, seismic records from Japan's KiK-net are selected to measure the acceleration, displacement, and effective peak acceleration of each record within a certain time after the P-wave arrival; a continuous estimate of the earthquake early warning magnitude is then obtained through statistical analysis, and the method is checked against records of the Wenchuan earthquake. The results show that the reliability of the early warning magnitude increases continuously as more seismic information arrives. The largest residuals occur when raw acceleration is used to fit magnitude, likely because acceleration records are rich in high-frequency components and their peak values are widely dispersed. Using effective peak acceleration or peak displacement effectively reduces the influence of the high-frequency components and markedly reduces the dispersion of the magnitude estimates, although peak displacement is easily affected by long-period drift. Among the components, residual enlargement is almost absent in the vertical direction, so this article recommends the vertical effective peak acceleration for estimating the early warning magnitude. Applying the method to the Wenchuan strong-motion records shows that it quickly, stably, and accurately estimates the early warning magnitude of that earthquake, indicating that the method is well suited for earthquake early warning.
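    The statistical step can be sketched as a log-linear regression of catalog magnitude on the vertical effective peak acceleration (EPA) measured in a short window after the P arrival, re-evaluated as the window grows. The arrays below are hypothetical placeholders, not KiK-net data, and the single-predictor form is a simplifying assumption.

    ```python
    # Fit M = a*log10(EPA) + b on per-event pairs of vertical effective peak
    # acceleration (measured in a fixed window after the P arrival) and catalog
    # magnitude, then re-evaluate as EPA grows. All values are hypothetical.
    import numpy as np

    epa_z = np.array([1.2, 3.5, 8.1, 22.0, 60.0, 150.0])   # gal, placeholder data
    mag = np.array([4.1, 4.8, 5.4, 6.0, 6.7, 7.3])         # catalog magnitudes

    a, b = np.polyfit(np.log10(epa_z), mag, 1)             # least-squares fit

    def warning_magnitude(epa_now):
        """Re-estimate magnitude continuously as more of the record arrives."""
        return a * np.log10(epa_now) + b

    for epa in (2.0, 10.0, 40.0):
        print(f"EPA {epa:6.1f} gal -> M {warning_magnitude(epa):.1f}")
    ```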

  17. ESTIMATING THE COST OF AGRICULTURAL MORBIDITY IN MAINE AND NEW HAMPSHIRE.

    PubMed

    Jones, Nathan M; Scott, Erika E; Krupa, Nicole; Jenkins, Paul L

    2018-01-29

    This article provides an estimate of the economic costs of agricultural injuries sustained in the states of Maine and New Hampshire between the years 2008 and 2010. The authors used a novel dataset of 562 agriculturally related occupational injuries, and cost estimates were generated using the CDC's Web-based Injury Statistics Query and Reporting System (WISQARS). Individual cases from the dataset that did not match the query options for WISQARS were excluded. Of the 562 agricultural injuries identified in the dataset, 361 met the WISQARS criteria. The remaining 201 cases were judged to be incompatible with the WISQARS query criteria. Significant differences (p < 0.0001) were found between the median costs of eight types of injury. Amputations (median = $70,077) and fractures (median = $13,365) were found to be the most expensive types of injury. The total cost of the 361 injuries for which estimates were available was $6,342,270. Injuries that reportedly involved machinery were found to be more expensive than injuries caused by animals. This article highlights the difference in total cost between types of injuries and demonstrates that agricultural injuries were a significant economic burden for Maine and New Hampshire for the years 2008-2010. These data can be used to direct future preventive efforts. Finally, this article suggests that WISQARS is a powerful tool for estimating injury costs without requiring access to treatment or billing records.
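    The comparison of median costs across injury types can be sketched as a group-wise median plus a nonparametric test. The Kruskal-Wallis test here is an assumption (the abstract does not name its test), and the inline records are hypothetical placeholders rather than the Maine/New Hampshire dataset.

    ```python
    # Median cost by injury type plus a Kruskal-Wallis test across types. The
    # records are hypothetical placeholders; the test choice is an assumption.
    import pandas as pd
    from scipy.stats import kruskal

    df = pd.DataFrame({
        "injury_type": ["amputation", "amputation", "fracture", "fracture",
                        "laceration", "laceration"],
        "cost": [70077, 68500, 13365, 14200, 2100, 1850],
    })

    print(df.groupby("injury_type")["cost"].median())

    groups = [g["cost"].to_numpy() for _, g in df.groupby("injury_type")]
    stat, p = kruskal(*groups)
    print(f"Kruskal-Wallis H = {stat:.2f}, p = {p:.4f}")
    ```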

  18. Hail frequency estimation across Europe based on a combination of overshooting top detections and the ERA-INTERIM reanalysis

    NASA Astrophysics Data System (ADS)

    Punge, H. J.; Bedka, K. M.; Kunz, M.; Reinbold, A.

    2017-12-01

    This article presents a hail frequency estimation based on the detection of cold overshooting cloud tops (OTs) from the Meteosat Second Generation (MSG) operational weather satellites, in combination with a hail-specific filter derived from the ERA-INTERIM reanalysis. This filter has been designed based on the atmospheric properties in the vicinity of hail reports registered in the European Severe Weather Database (ESWD). These include Convective Available Potential Energy (CAPE), 0-6-km bulk wind shear, and freezing level height, evaluated at the nearest time step and interpolated from the reanalysis grid to the location of the hail report. Regions highly exposed to hail events include Northern Italy, followed by South-Eastern Austria and Eastern Spain. Pronounced hail frequency is also found in large parts of Eastern Europe, around the Alps, the Czech Republic, Southern Germany, Southern and Eastern France, and in the Iberian and Apennine mountain ranges.
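    The filtering step can be sketched as threshold conditions applied to OT detections collocated with reanalysis fields. Column names and threshold values below are illustrative assumptions; the article derives its filter empirically from the environments around ESWD hail reports.

    ```python
    # Keep only overshooting-top detections whose collocated reanalysis
    # environment looks hail-prone. Columns and thresholds are assumptions.
    import pandas as pd

    def hail_filter(ot, min_cape=400.0, min_shear=5.0, min_freezing_level=1500.0):
        """Filter OT detections on CAPE (J/kg), 0-6 km bulk shear (m/s), and
        freezing level height (m)."""
        mask = ((ot["cape"] >= min_cape)
                & (ot["shear_0_6km"] >= min_shear)
                & (ot["freezing_level"] >= min_freezing_level))
        return ot[mask]

    ot = pd.DataFrame({                      # hypothetical collocated detections
        "lat": [45.5, 47.1, 52.3], "lon": [9.2, 15.4, 21.0],
        "cape": [1200.0, 250.0, 900.0],
        "shear_0_6km": [14.0, 8.0, 3.0],
        "freezing_level": [3400.0, 2900.0, 3100.0],
    })
    print(hail_filter(ot))                   # only the first detection survives
    ```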

  19. Test-based age-of-acquisition norms for 44 thousand English word meanings.

    PubMed

    Brysbaert, Marc; Biemiller, Andrew

    2017-08-01

    Age of acquisition (AoA) is an important variable in word recognition research. Up to now, nearly all psychology researchers examining the AoA effect have used ratings obtained from adult participants. An alternative basis for determining AoA is directly testing children's knowledge of word meanings at various ages. In educational research, scholars and teachers have tried to establish the grade at which particular words should be taught by examining the ages at which children know various word meanings. Such a list is available from Dale and O'Rourke's (1981) Living Word Vocabulary for nearly 44 thousand meanings coming from over 31 thousand unique word forms and multiword expressions. The present article relates these test-based AoA estimates to lexical decision times as well as to AoA adult ratings, and reports strong correlations between all of the measures. Therefore, test-based estimates of AoA can be used as an alternative measure.
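    The reported relationships can be checked with a few lines of rank-correlation code. The inline values are hypothetical placeholders, not the Living Word Vocabulary norms or lexical decision megastudy data.

    ```python
    # Rank correlations between test-based AoA, adult AoA ratings, and lexical
    # decision times. All values are hypothetical placeholders.
    import numpy as np
    from scipy.stats import spearmanr

    test_aoa = np.array([4.0, 6.0, 8.0, 10.0, 12.0, 14.0])    # test-based AoA
    rated_aoa = np.array([3.5, 6.2, 7.9, 10.5, 11.8, 14.6])   # adult ratings
    lexical_rt = np.array([520, 555, 570, 600, 615, 650])     # decision RT, ms

    for name, other in [("adult ratings", rated_aoa),
                        ("lexical decision RT", lexical_rt)]:
        rho, p = spearmanr(test_aoa, other)
        print(f"test-based AoA vs {name}: rho = {rho:.2f} (p = {p:.3f})")
    ```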

  20. Sensor Network Localization by Eigenvector Synchronization Over the Euclidean Group

    PubMed Central

    CUCURINGU, MIHAI; LIPMAN, YARON; SINGER, AMIT

    2013-01-01

    We present a new approach to localization of sensors from noisy measurements of a subset of their Euclidean distances. Our algorithm starts by finding, embedding, and aligning uniquely realizable subsets of neighboring sensors called patches. In the noise-free case, each patch agrees with its global positioning up to an unknown rigid motion of translation, rotation, and possibly reflection. The reflections and rotations are estimated using the recently developed eigenvector synchronization algorithm, while the translations are estimated by solving an overdetermined linear system. The algorithm is scalable as the number of nodes increases and can be implemented in a distributed fashion. Extensive numerical experiments show that it compares favorably to other existing algorithms in terms of robustness to noise, sparse connectivity, and running time. While our approach is applicable to higher dimensions, in the current article, we focus on the two-dimensional case. PMID:23946700
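    The rotation step of eigenvector synchronization has a compact 2D illustration: given noisy relative angles theta_ij ~ theta_i - theta_j between overlapping patches, the top eigenvector of the Hermitian matrix with entries exp(1j*theta_ij) recovers the patch rotations up to a global rotation. The sketch below covers only this step, with synthetic angles; the reflection (Z2) synchronization and the overdetermined translation solve are omitted.

    ```python
    # Angle synchronization in 2D: recover patch rotations from noisy relative
    # angles via the top eigenvector of a Hermitian measurement matrix.
    # Synthetic angles; reflections and the translation solve are omitted.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 6
    true = rng.uniform(0, 2 * np.pi, n)           # unknown patch rotations

    H = np.zeros((n, n), dtype=complex)
    for i in range(n):
        for j in range(i + 1, n):
            theta_ij = true[i] - true[j] + 0.05 * rng.standard_normal()
            H[i, j] = np.exp(1j * theta_ij)       # noisy relative rotation
            H[j, i] = np.conj(H[i, j])            # Hermitian symmetry

    vals, vecs = np.linalg.eigh(H)                # eigenvalues ascending
    est = np.angle(vecs[:, -1])                   # top eigenvector -> angles

    # Remove the global rotation ambiguity before comparing with the truth
    offset = np.angle(np.exp(1j * (true[0] - est[0])))
    err = np.angle(np.exp(1j * (true - est - offset)))
    print(np.max(np.abs(err)))                    # small residual misalignment
    ```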
