Sample records for auto-regressive moving average

  1. [Comparison of predictive effect between the single auto regressive integrated moving average (ARIMA) model and the ARIMA-generalized regression neural network (GRNN) combination model on the incidence of scarlet fever].

    PubMed

    Zhu, Yu; Xia, Jie-lai; Wang, Jing

    2009-09-01

    To compare the 'single auto-regressive integrated moving average (ARIMA) model' and the 'ARIMA-generalized regression neural network (GRNN) combination model' in research on the incidence of scarlet fever, an ARIMA model was established from the monthly incidence of scarlet fever in one city from 2000 to 2006. The fitted values of the ARIMA model were used as the input of the GRNN, and the actual values were used as its output. After training the GRNN, the performance of the single ARIMA model and the ARIMA-GRNN combination model was compared. The mean error rates (MER) of the single ARIMA model and the ARIMA-GRNN combination model were 31.6% and 28.7%, respectively, and the coefficients of determination (R(2)) of the two models were 0.801 and 0.872, respectively. The fitting performance of the ARIMA-GRNN combination model was better than that of the single ARIMA model, which gives the combination model practical value for research on time series data such as the incidence of scarlet fever.

  2. Monthly streamflow forecasting with auto-regressive integrated moving average

    NASA Astrophysics Data System (ADS)

    Nasir, Najah; Samsudin, Ruhaidah; Shabri, Ani

    2017-09-01

    Forecasting of streamflow is one of the many ways that can contribute to better decision making for water resource management. The auto-regressive integrated moving average (ARIMA) model was selected in this research for monthly streamflow forecasting, with an enhancement made by pre-processing the data using singular spectrum analysis (SSA). This study also proposed an extension of the SSA technique that adds a step in which clustering is performed on the eigenvector pairs before reconstruction of the time series. The monthly streamflow data of Sungai Muda at Jeniang, Sungai Muda at Jambatan Syed Omar and Sungai Ketil at Kuala Pegang were gathered from the Department of Irrigation and Drainage Malaysia. A ratio of 9:1 was used to divide the data into training and testing sets. The ARIMA, SSA-ARIMA and Clustered SSA-ARIMA models were all developed in R software. Results from the proposed model were then compared to those of a conventional auto-regressive integrated moving average model using the root-mean-square error and mean absolute error values. It was found that the proposed model can outperform the conventional model.
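
    A minimal sketch of the evaluation loop described above (9:1 train/test split, ARIMA fit, RMSE and MAE scoring), written in Python for illustration although the study used R; the file name and ARIMA order are assumptions, and the SSA pre-processing and clustering steps are omitted.

    ```python
    # Hedged sketch: plain ARIMA on a 9:1 train/test split, scored with RMSE and MAE.
    # File name and model order are placeholders; the SSA steps are omitted.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    flow = np.loadtxt("monthly_streamflow.csv")   # hypothetical monthly series
    split = int(len(flow) * 0.9)                  # 9:1 train/test ratio
    train, test = flow[:split], flow[split:]

    fit = ARIMA(train, order=(1, 1, 1)).fit()     # order chosen for illustration
    forecast = fit.forecast(steps=len(test))

    rmse = np.sqrt(np.mean((forecast - test) ** 2))
    mae = np.mean(np.abs(forecast - test))
    print(f"RMSE={rmse:.3f}  MAE={mae:.3f}")
    ```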

  3. Beyond long memory in heart rate variability: An approach based on fractionally integrated autoregressive moving average time series models with conditional heteroscedasticity

    NASA Astrophysics Data System (ADS)

    Leite, Argentina; Paula Rocha, Ana; Eduarda Silva, Maria

    2013-06-01

    Heart Rate Variability (HRV) series exhibit long memory and time-varying conditional variance. This work considers the Fractionally Integrated AutoRegressive Moving Average (ARFIMA) models with Generalized AutoRegressive Conditional Heteroscedastic (GARCH) errors. ARFIMA-GARCH models may be used to capture and remove long memory and estimate the conditional volatility in 24 h HRV recordings. The ARFIMA-GARCH approach is applied to fifteen long term HRV series available at Physionet, leading to the discrimination among normal individuals, heart failure patients, and patients with atrial fibrillation.

  4. Robust Semi-Active Ride Control under Stochastic Excitation

    DTIC Science & Technology

    2014-01-01

    …broad classes of time-series models which are of practical importance: the Auto-Regressive (AR) models, the Integrated (I) models, and the Moving Average (MA) models [12]. Combinations of these models result in autoregressive moving average (ARMA) and autoregressive integrated moving average (ARIMA) models… [The remaining excerpt lists four up/down switching cases written in compact form using the Heaviside step function, Eq. (20).]

  5. Time Series ARIMA Models of Undergraduate Grade Point Average.

    ERIC Educational Resources Information Center

    Rogers, Bruce G.

    The Auto-Regressive Integrated Moving Average (ARIMA) models, often referred to as Box-Jenkins models, are regression methods for analyzing sequentially dependent observations with large amounts of data. The Box-Jenkins approach, a three-stage procedure consisting of identification, estimation and diagnosis, was used to select the most appropriate…
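
    A hedged sketch of the three Box-Jenkins stages named above, written in Python for illustration; the data file, candidate order and lag settings are assumptions, not details from the study.

    ```python
    # Hedged sketch of the Box-Jenkins stages: identification, estimation, diagnosis.
    import numpy as np
    from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.stats.diagnostic import acorr_ljungbox

    gpa = np.loadtxt("term_gpa.csv")              # hypothetical sequential GPA series

    # 1. Identification: inspect ACF/PACF to suggest candidate (p, d, q) orders.
    plot_acf(gpa, lags=20)
    plot_pacf(gpa, lags=20)

    # 2. Estimation: fit a candidate ARIMA model.
    fit = ARIMA(gpa, order=(1, 0, 1)).fit()

    # 3. Diagnosis: residuals should resemble white noise (Ljung-Box p > 0.05).
    print(acorr_ljungbox(fit.resid, lags=[12]))
    ```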

  6. Forecasting daily meteorological time series using ARIMA and regression models

    NASA Astrophysics Data System (ADS)

    Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir

    2018-04-01

    The daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 at four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. In our forecasting we used the Box-Jenkins and Holt-Winters seasonal auto-regressive integrated moving-average methods, the autoregressive integrated moving-average with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, implemented with R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.
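
    A hedged sketch of the Fourier-regressor variant mentioned above, written in Python for illustration (the study used R); the file and column names, the number of harmonics K, and the ARIMA order are assumptions, not values from the paper.

    ```python
    # Hedged sketch: ARIMA with Fourier-term external regressors for a daily series.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    temp = pd.read_csv("daily_temperature.csv")["t_mean"].to_numpy()  # hypothetical column
    t = np.arange(len(temp))
    period = 365.25                               # annual cycle for daily data

    K = 2                                         # number of Fourier harmonic pairs
    fourier = np.column_stack(
        [f(2 * np.pi * k * t / period) for k in range(1, K + 1) for f in (np.sin, np.cos)]
    )

    fit = SARIMAX(temp, exog=fourier, order=(2, 0, 1)).fit(disp=False)
    print(fit.summary())
    ```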

  7. Large signal-to-noise ratio quantification in MLE for ARARMAX models

    NASA Astrophysics Data System (ADS)

    Zou, Yiqun; Tang, Xiafei

    2014-06-01

    It has been shown that closed-loop linear system identification by the indirect method can generally be transferred to open-loop ARARMAX (AutoRegressive AutoRegressive Moving Average with eXogenous input) estimation. For such models, gradient-related optimisation with a large enough signal-to-noise ratio (SNR) can avoid the potential local convergence in maximum likelihood estimation. To ease the application of this condition, the threshold SNR needs to be quantified. In this paper, we construct an amplitude coefficient that is equivalent to the SNR and prove the finiteness of the threshold amplitude coefficient within the stability region. The quantification of the threshold is achieved by the minimisation of an elaborately designed multi-variable cost function which unifies all the restrictions on the amplitude coefficient. The corresponding algorithm, based on two sets of physically realisable system input-output data, details the minimisation and also points out how to use the gradient-related method to estimate ARARMAX parameters when a local minimum is present because the SNR is small. The algorithm is then tested on a theoretical AutoRegressive Moving Average with eXogenous input model for the derivation of the threshold, and on a real gas turbine engine system for model identification, respectively. Finally, the graphical validation of the threshold on a two-dimensional plot is discussed.

  8. An Intelligent Decision Support System for Workforce Forecast

    DTIC Science & Technology

    2011-01-01

    …ARIMA) model to forecast the demand for construction skills in Hong Kong. This model was based… [The excerpt also lists the forecasting techniques covered: decision trees, ARIMA, rule-based forecasting, segmentation forecasting, regression analysis, simulation modeling, input-output models, LP and NLP, and Markovian models, noting that rule-based methods suit cases where results are needed as a set of easily interpretable rules.] …4.1.4 ARIMA: Auto-regressive, integrated, moving-average (ARIMA) models…

  9. Application of a new hybrid model with seasonal auto-regressive integrated moving average (ARIMA) and nonlinear auto-regressive neural network (NARNN) in forecasting incidence cases of HFMD in Shenzhen, China.

    PubMed

    Yu, Lijing; Zhou, Lingling; Tan, Li; Jiang, Hongbo; Wang, Ying; Wei, Sheng; Nie, Shaofa

    2014-01-01

    Outbreaks of hand-foot-mouth disease (HFMD) have been reported many times in Asia during the last decades. This emerging disease has drawn worldwide attention and vigilance. Nowadays, the prevention and control of HFMD has become an imperative issue in China. Early detection and response, supported by modern information technology, can be helpful before an epidemic develops. In this paper, a hybrid model combining a seasonal auto-regressive integrated moving average (ARIMA) model and a nonlinear auto-regressive neural network (NARNN) is proposed to predict the expected incidence cases from December 2012 to May 2013, using retrospective observations obtained from the China Information System for Disease Control and Prevention from January 2008 to November 2012. The best-fitted hybrid model combined a seasonal ARIMA model [Formula: see text] with a NARNN having 15 hidden units and 5 delays. The hybrid model achieved good forecasting performance; the expected incidence cases from December 2012 to May 2013 are, respectively, -965.03, -1879.58, 4138.26, 1858.17, 4061.86 and 6163.16, with an obviously increasing trend. The model proposed in this paper can predict the incidence trend of HFMD effectively, which could be helpful to policy makers. The expected HFMD case numbers are useful not only for detecting outbreaks or providing probability statements, but also for giving decision makers a probable trend of the variability of future observations that contains both historical and recent information.

  10. Use of Time-Series, ARIMA Designs to Assess Program Efficacy.

    ERIC Educational Resources Information Center

    Braden, Jeffery P.; And Others

    1990-01-01

    Illustrates use of time-series designs for determining efficacy of interventions with fictitious data describing a drug-abuse prevention program. Discusses problems and procedures associated with time-series data analysis using Auto Regressive Integrated Moving Average (ARIMA) models. Example illustrates application of ARIMA analysis for…

  11. Developing a predictive tropospheric ozone model for Tabriz

    NASA Astrophysics Data System (ADS)

    Khatibi, Rahman; Naghipour, Leila; Ghorbani, Mohammad A.; Smith, Michael S.; Karimi, Vahid; Farhoudi, Reza; Delafrouz, Hadi; Arvanaghi, Hadi

    2013-04-01

    Predictive ozone models are becoming indispensable tools by providing a capability for pollution alerts to serve people who are vulnerable to the risks. We have developed a tropospheric ozone prediction capability for Tabriz, Iran, by using the following five modeling strategies: three regression-type methods: Multiple Linear Regression (MLR), Artificial Neural Networks (ANNs), and Gene Expression Programming (GEP); and two auto-regression-type models: Nonlinear Local Prediction (NLP) to implement chaos theory and Auto-Regressive Integrated Moving Average (ARIMA) models. The regression-type modeling strategies explain the data in terms of temperature, solar radiation, dew point temperature, and wind speed, whereas the auto-regression-type strategies regress present ozone values on their past values. The ozone time series are available at various time intervals, including hourly intervals, from August 2010 to March 2011. The results for the MLR, ANN and GEP models are not overly good, but those produced by NLP and ARIMA are promising for establishing a forecasting capability.

  12. Two models for identification and predicting behaviour of an induction motor system

    NASA Astrophysics Data System (ADS)

    Kuo, Chien-Hsun

    2018-01-01

    System identification or modelling is the process of building mathematical models of dynamical systems based on the available input and output data from the systems. This paper introduces system identification using ARX (Auto Regressive with eXogenous input) and ARMAX (Auto Regressive Moving Average with eXogenous input) models. Through the identified system model, the predicted output can be compared with the measured one to help prevent motor faults from developing into a catastrophic machine failure and to avoid unnecessary costs and delays caused by the need to carry out unscheduled repairs. The induction motor system is illustrated as an example. Numerical and experimental results are shown for the identified induction motor system.

  13. A Comparison of Alternative Approaches to the Analysis of Interrupted Time-Series.

    ERIC Educational Resources Information Center

    Harrop, John W.; Velicer, Wayne F.

    1985-01-01

    Computer generated data representative of 16 Auto Regressive Integrated Moving Average (ARIMA) models were used to compare the results of interrupted time-series analysis using: (1) the known model identification, (2) an assumed (1,0,0) model, and (3) an assumed (3,0,0) model as an approximation to the General Transformation approach. (Author/BW)

  14. Impact of the Illinois Seat Belt Use Law on Accidents, Deaths, and Injuries.

    ERIC Educational Resources Information Center

    Rock, Steven M.

    1992-01-01

    The impact of the 1985 Illinois seat belt law is explored using Box-Jenkins Auto-Regressive Integrated Moving Average (ARIMA) techniques and monthly accident statistical data from the state department of transportation for January-July 1990. A conservative estimate is that the law provides benefits of $15 million per month in Illinois. (SLD)

  15. Large deviation probabilities for correlated Gaussian stochastic processes and daily temperature anomalies

    NASA Astrophysics Data System (ADS)

    Massah, Mozhdeh; Kantz, Holger

    2016-04-01

    As we have one and only one earth and no replicas, climate characteristics are usually computed as time averages from a single time series. For understanding climate variability, it is essential to understand how close a single time average will typically be to an ensemble average. To answer this question, we study large deviation probabilities (LDP) of stochastic processes and characterize them by their dependence on the time window. In contrast to iid variables, for which an analytical expression for the rate function exists, correlated variables such as auto-regressive (short memory) and auto-regressive fractionally integrated moving average (long memory) processes do not have an analytical LDP. We study the LDP for these processes in order to see how correlation affects this probability in comparison to iid data. Although short range correlations lead to a simple correction of the sample size, long range correlations lead to a sub-exponential decay of the LDP and hence to a very slow convergence of time averages. This effect is demonstrated for a 120 year long time series of daily temperature anomalies measured in Potsdam (Germany).

  16. Modeling Geodetic Processes with Levy α-Stable Distribution and FARIMA

    NASA Astrophysics Data System (ADS)

    Montillet, Jean-Philippe; Yu, Kegen

    2015-04-01

    Over recent years the scientific community has been using the auto regressive moving average (ARMA) model to model the noise in global positioning system (GPS) time series (daily solutions). This work starts with an investigation of the limits of the ARMA model, which is widely used in signal processing when the measurement noise is white. Since a typical GPS time series consists of geophysical signals (e.g., seasonal signals) and stochastic processes (e.g., coloured and white noise), the ARMA model may be inappropriate. Therefore, the application of the fractional auto-regressive integrated moving average (FARIMA) model is investigated. The simulation results using simulated time series as well as real GPS time series from a few selected stations around Australia show that the FARIMA model fits the time series better than other models when the coloured noise is larger than the white noise. The second part of this work focuses on fitting the GPS time series with the family of Levy α-stable distributions. Using this distribution, a hypothesis test is developed to effectively eliminate coarse outliers from GPS time series, achieving better performance than using the rule of thumb of n standard deviations (with n chosen empirically).

  17. Distributed parameter system coupled ARMA expansion identification and adaptive parallel IIR filtering - A unified problem statement. [Auto Regressive Moving-Average]

    NASA Technical Reports Server (NTRS)

    Johnson, C. R., Jr.; Balas, M. J.

    1980-01-01

    A novel interconnection of distributed parameter system (DPS) identification and adaptive filtering is presented, which culminates in a common statement of coupled autoregressive, moving-average expansion or parallel infinite impulse response configuration adaptive parameterization. The common restricted complexity filter objectives are seen as similar to the reduced-order requirements of the DPS expansion description. The interconnection presents the possibility of an exchange of problem formulations and solution approaches not yet easily addressed in the common finite dimensional lumped-parameter system context. It is concluded that the shared problems raised are nevertheless many and difficult.

  18. Modelling and Closed-Loop System Identification of a Quadrotor-Based Aerial Manipulator

    NASA Astrophysics Data System (ADS)

    Dube, Chioniso; Pedro, Jimoh O.

    2018-05-01

    This paper presents the modelling and system identification of a quadrotor-based aerial manipulator. The aerial manipulator model is first derived analytically using the Newton-Euler formulation for the quadrotor and the Recursive Newton-Euler formulation for the manipulator. The aerial manipulator is then simulated with the quadrotor under Proportional Derivative (PD) control, with the manipulator in motion. The simulation data is then used for system identification of the aerial manipulator. Auto Regressive with eXogenous inputs (ARX) models are obtained from the system identification for the linear accelerations \ddot{X} and \ddot{Y} and the yaw angular acceleration \ddot{\psi}. For the linear acceleration \ddot{Z} and the pitch and roll angular accelerations \ddot{\theta} and \ddot{\phi}, Auto Regressive Moving Average with eXogenous inputs (ARMAX) models are identified.

  19. [The trial of business data analysis at the Department of Radiology by constructing the auto-regressive integrated moving-average (ARIMA) model].

    PubMed

    Tani, Yuji; Ogasawara, Katsuhiko

    2012-01-01

    This study aimed to contribute to the management of a healthcare organization by providing management information through time-series analysis of business data accumulated in the hospital information system, which had not been utilized thus far. In this study, we examined the performance of a prediction method using the auto-regressive integrated moving-average (ARIMA) model, applied to business data obtained at the Radiology Department. We built the model using the number of radiological examinations over the past 9 years and predicted the number of radiological examinations for the most recent year, then compared the actual values with the forecast values. We were able to establish that the prediction method was simple and cost-effective, using free software. In addition, we were able to build a simple model by pre-processing the data to remove trend components. The difference between predicted and actual values was 10%; however, it was more important to understand the chronological change than the individual time-series values. Furthermore, our method is highly versatile and adaptable to general time-series data. Therefore, different healthcare organizations can use our method for the analysis and forecasting of their business data.

  20. Simultaneous Estimation of Electromechanical Modes and Forced Oscillations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Follum, Jim; Pierre, John W.; Martin, Russell

    Over the past several years, great strides have been made in the effort to monitor the small-signal stability of power systems. These efforts focus on estimating electromechanical modes, which are a property of the system that dictate how generators in different parts of the system exchange energy. Though the algorithms designed for this task are powerful and important for reliable operation of the power system, they are susceptible to severe bias when forced oscillations are present in the system. Forced oscillations are fundamentally different from electromechanical oscillations in that they are the result of a rogue input to the system, rather than a property of the system itself. To address the presence of forced oscillations, the frequently used AutoRegressive Moving Average (ARMA) model is adapted to include sinusoidal inputs, resulting in the AutoRegressive Moving Average plus Sinusoid (ARMA+S) model. From this model, a new Two-Stage Least Squares algorithm is derived to incorporate the forced oscillations, thereby enabling the simultaneous estimation of the electromechanical modes and the amplitude and phase of the forced oscillations. The method is validated using simulated power system data as well as data obtained from the western North American power system (wNAPS) and Eastern Interconnection (EI).

  1. Forecasting the incidence of tuberculosis in China using the seasonal auto-regressive integrated moving average (SARIMA) model.

    PubMed

    Mao, Qiang; Zhang, Kai; Yan, Wu; Cheng, Chaonan

    2018-05-02

    The aims of this study were to develop a forecasting model for the incidence of tuberculosis (TB) and analyze the seasonality of infections in China, and to provide a useful tool for formulating intervention programs and allocating medical resources. Data for the monthly incidence of TB from January 2004 to December 2015 were obtained from the National Scientific Data Sharing Platform for Population and Health (China). The Box-Jenkins method was applied to fit a seasonal auto-regressive integrated moving average (SARIMA) model to forecast the incidence of TB over the subsequent six months. During the study period of 144 months, 12,321,559 TB cases were reported in China, with an average monthly incidence of 6.4426 per 100,000 of the population. The monthly incidence of TB showed a clear 12-month cycle, and a seasonality with two peaks occurring in January and March and a trough in December. The best-fit model was SARIMA(1,0,0)(0,1,1)12, which demonstrated adequate information extraction (white noise test, p>0.05). Based on the analysis, the forecast incidences of TB from January to June 2016 were 6.6335, 4.7208, 5.8193, 5.5474, 5.2202 and 4.9156 per 100,000 of the population, respectively. Given the seasonal pattern of TB incidence in China, the SARIMA model is proposed as a useful tool for monitoring epidemics.
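
    A minimal sketch, in Python for illustration, of the SARIMA(1,0,0)(0,1,1)12 specification reported above, fitted to a generic monthly incidence series and forecast six months ahead; the file name is a placeholder and the published coefficients are not reproduced.

    ```python
    # Hedged sketch of a SARIMA(1,0,0)(0,1,1)12 fit and 6-month forecast.
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    tb = pd.read_csv("tb_monthly_incidence.csv", index_col=0, parse_dates=True)["rate"]

    fit = SARIMAX(tb, order=(1, 0, 0), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    print(fit.forecast(steps=6))              # incidence for the next six months
    ```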

  2. Kepler AutoRegressive Planet Search: Motivation & Methodology

    NASA Astrophysics Data System (ADS)

    Caceres, Gabriel; Feigelson, Eric; Jogesh Babu, G.; Bahamonde, Natalia; Bertin, Karine; Christen, Alejandra; Curé, Michel; Meza, Cristian

    2015-08-01

    The Kepler AutoRegressive Planet Search (KARPS) project uses statistical methodology associated with autoregressive (AR) processes to model Kepler lightcurves in order to improve exoplanet transit detection in systems with high stellar variability. We also introduce a planet-search algorithm to detect transits in time-series residuals after application of the AR models. One of the main obstacles in detecting faint planetary transits is the intrinsic stellar variability of the host star. The variability displayed by many stars may have autoregressive properties, wherein later flux values are correlated with previous ones in some manner. Auto-Regressive Moving-Average (ARMA) models, Generalized Auto-Regressive Conditional Heteroskedasticity (GARCH), and related models are flexible, phenomenological methods used with great success to model stochastic temporal behaviors in many fields of study, particularly econometrics. Powerful statistical methods are implemented in the public statistical software environment R and its many packages. Modeling involves maximum likelihood fitting, model selection, and residual analysis. These techniques provide a useful framework to model stellar variability and are used in KARPS with the objective of reducing stellar noise to enhance opportunities to find as-yet-undiscovered planets. Our analysis procedure consists of three steps: pre-processing of the data to remove discontinuities, gaps and outliers; ARMA-type model selection and fitting; and transit signal search of the residuals using a new Transit Comb Filter (TCF) that replaces traditional box-finding algorithms. We apply the procedures to simulated Kepler-like time series with known stellar and planetary signals to evaluate the effectiveness of the KARPS procedures. The ARMA-type modeling is effective at reducing stellar noise, but also reduces and transforms the transit signal into ingress/egress spikes. A periodogram based on the TCF is constructed to concentrate the signal of these periodic spikes. When a periodic transit is found, the model is displayed on a standard period-folded averaged light curve. We also illustrate the efficient coding in R.

  3. Polymer Coatings Degradation Properties

    DTIC Science & Technology

    1985-02-01

    …undertaken. The Box-Jenkins approach first evaluates the partial auto-correlation function and determines the order of the moving average memory function… Tables 15 and 16 show the results of the partial auto-correlation plots; second-order moving average models with the appropriate lags were… coated films. [Citation fragment: Kaempf, Guenter; Papenroth, Wolfgang; Kunststoffe, 1982, Vol. 72, No. 7, pp. 424-429: parameters influencing the accelerated…]

  4. Statistical description of turbulent transport for flux driven toroidal plasmas

    NASA Astrophysics Data System (ADS)

    Anderson, J.; Imadera, K.; Kishimoto, Y.; Li, J. Q.; Nordman, H.

    2017-06-01

    A novel methodology to analyze non-Gaussian probability distribution functions (PDFs) of intermittent turbulent transport in global full-f gyrokinetic simulations is presented. In this work, the auto-regressive integrated moving average (ARIMA) model is applied to time series data of intermittent turbulent heat transport to separate noise and oscillatory trends, allowing for the extraction of non-Gaussian features of the PDFs. It was shown that non-Gaussian tails of the PDFs from first principles based gyrokinetic simulations agree with an analytical estimation based on a two fluid model.

  5. Weather variability and the incidence of cryptosporidiosis: comparison of time series poisson regression and SARIMA models.

    PubMed

    Hu, Wenbiao; Tong, Shilu; Mengersen, Kerrie; Connell, Des

    2007-09-01

    Few studies have examined the relationship between weather variables and cryptosporidiosis in Australia. This paper examines the potential impact of weather variability on the transmission of cryptosporidiosis and explores the possibility of developing an empirical forecast system. Data on weather variables, notified cryptosporidiosis cases, and population size in Brisbane were supplied by the Australian Bureau of Meteorology, Queensland Department of Health, and Australian Bureau of Statistics for the period of January 1, 1996-December 31, 2004, respectively. Time series Poisson regression and seasonal auto-regressive integrated moving average (SARIMA) models were used to examine the potential impact of weather variability on the transmission of cryptosporidiosis. Both the time series Poisson regression and SARIMA models show that seasonal and monthly maximum temperature at a prior moving average of 1 and 3 months were significantly associated with cryptosporidiosis disease. This suggests that there may be 50 more cases a year for an average increase of 1 °C in maximum temperature in Brisbane. Model assessments indicated that the SARIMA model had better predictive ability than the Poisson regression model (SARIMA: root mean square error (RMSE): 0.40, Akaike information criterion (AIC): -12.53; Poisson regression: RMSE: 0.54, AIC: -2.84). Furthermore, the analysis of residuals shows that the time series Poisson regression appeared to violate a modeling assumption, in that residual autocorrelation persisted. The results of this study suggest that weather variability (particularly maximum temperature) may have played a significant role in the transmission of cryptosporidiosis. A SARIMA model may be a better predictive model than a Poisson regression model in the assessment of the relationship between weather variability and the incidence of cryptosporidiosis.
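
    A hedged sketch of the Poisson-regression side of the comparison above, written in Python for illustration; the data file, column names, and the prior 3-month moving-average lag construction are assumptions, not details from the study.

    ```python
    # Hedged sketch: monthly case counts regressed on a prior 3-month moving
    # average of maximum temperature using a Poisson GLM.
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("brisbane_monthly.csv")                 # hypothetical data set
    df["tmax_ma3"] = df["tmax"].rolling(3).mean().shift(1)   # prior 3-month average
    df = df.dropna()

    X = sm.add_constant(df[["tmax_ma3"]])
    poisson_fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
    print(poisson_fit.summary())
    ```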

  6. Performance evaluation of ionospheric time delay forecasting models using GPS observations at a low-latitude station

    NASA Astrophysics Data System (ADS)

    Sivavaraprasad, G.; Venkata Ratnam, D.

    2017-07-01

    Ionospheric delay is one of the major atmospheric effects on the performance of satellite-based radio navigation systems. It limits the accuracy and availability of Global Positioning System (GPS) measurements related to critical societal and safety applications. The temporal and spatial gradients of ionospheric total electron content (TEC) are driven by several unknown a priori geophysical conditions and solar-terrestrial phenomena. The prediction of ionospheric delay is therefore challenging, especially over the Indian sub-continent, and an appropriate short/long-term ionospheric delay forecasting model is necessary. Hence, the intent of this paper is to forecast ionospheric delays by considering day to day, monthly and seasonal ionospheric TEC variations. GPS-TEC data (January 2013-December 2013) were extracted from a multi-frequency GPS receiver established at K L University, Vaddeswaram, Guntur station (geographic: 16.37°N, 80.37°E; geomagnetic: 7.44°N, 153.75°E), India. An evaluation, in terms of forecasting capabilities, of three ionospheric time delay models - an Auto Regressive Moving Average (ARMA) model, an Auto Regressive Integrated Moving Average (ARIMA) model, and a Holt-Winters model - is presented. The performances of these models are evaluated through error measurement analysis during both geomagnetically quiet and disturbed days. It is found that the ARMA model effectively forecasts the ionospheric delay with an accuracy of 82-94%, about 10% better than the ARIMA and Holt-Winters models. Moreover, the modeled VTEC derived from the International Reference Ionosphere (IRI-2012) model and the new global TEC model, the Neustrelitz TEC Model (NTCM-GL), was compared with the forecasted VTEC values of the ARMA, ARIMA and Holt-Winters models during geomagnetically quiet days. The forecast results indicate that the ARMA model would be useful for setting up an early warning system for ionospheric disturbances in low latitude regions.
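
    A hedged sketch, in Python for illustration, of the three-way comparison described above (ARMA, ARIMA, Holt-Winters); the data file, model orders, seasonal period, and hold-out length are assumptions, not values from the study.

    ```python
    # Hedged sketch: compare ARMA, ARIMA and Holt-Winters forecasts of a TEC-like series.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    tec = np.loadtxt("vtec_hourly.csv")           # hypothetical hourly VTEC series
    train, test = tec[:-24], tec[-24:]            # hold out the last day

    forecasts = {
        "ARMA": ARIMA(train, order=(2, 0, 2)).fit().forecast(len(test)),
        "ARIMA": ARIMA(train, order=(2, 1, 2)).fit().forecast(len(test)),
        "Holt-Winters": ExponentialSmoothing(
            train, trend="add", seasonal="add", seasonal_periods=24
        ).fit().forecast(len(test)),
    }
    for name, f in forecasts.items():
        mape = np.mean(np.abs((test - f) / test)) * 100
        print(f"{name}: approx. accuracy {100 - mape:.1f}%")
    ```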

  7. Application of seasonal auto-regressive integrated moving average model in forecasting the incidence of hand-foot-mouth disease in Wuhan, China.

    PubMed

    Peng, Ying; Yu, Bin; Wang, Peng; Kong, De-Guang; Chen, Bang-Hua; Yang, Xiao-Bing

    2017-12-01

    Outbreaks of hand-foot-mouth disease (HFMD) have occurred many times and caused a serious health burden in China since 2008. Application of modern information technology to prediction and early response can be helpful for efficient HFMD prevention and control. A seasonal auto-regressive integrated moving average (ARIMA) model for time series analysis was designed in this study. Eighty-four months (January 2009 to December 2015) of retrospective data obtained from the Chinese Information System for Disease Prevention and Control were subjected to ARIMA modeling. The coefficient of determination (R2), normalized Bayesian Information Criterion (BIC) and Q-test P value were used to evaluate the goodness-of-fit of the constructed models. Subsequently, the best-fitted ARIMA model was applied to predict the expected incidence of HFMD from January 2016 to December 2016. The best-fitted seasonal ARIMA model was identified as (1,0,1)(0,1,1)12, with the largest coefficient of determination (R2=0.743) and lowest normalized BIC (BIC=3.645) value. The residuals of the model also showed non-significant autocorrelations (Ljung-Box Q test, P=0.299). The predictions by the optimum ARIMA model adequately captured the pattern in the data and exhibited two peaks of activity over the forecast interval, including a major peak during April to June and a lighter peak from September to November. The ARIMA model proposed in this study can forecast the HFMD incidence trend effectively, which could provide useful support for future HFMD prevention and control in the study area. In addition, further observations should be added continually to the modeling data set, and parameters of the models should be adjusted accordingly.

  8. Determination of baroreflex gain using auto-regressive moving-average analysis during spontaneous breathing.

    PubMed

    O'Leary, D D; Lin, D C; Hughson, R L

    1999-09-01

    The heart rate component of the arterial baroreflex gain (BRG) was determined with auto-regressive moving-average (ARMA) analysis during each of spontaneous (SB) and random breathing (RB) protocols. Ten healthy subjects completed each breathing pattern on two different days in each of two different body positions, supine (SUP) and head-up tilt (HUT). The R-R interval, systolic arterial pressure (SAP) and instantaneous lung volume were recorded continuously. BRG was estimated from the ARMA impulse response relationship of R-R interval to SAP and from the spontaneous sequence method. The results indicated that both the ARMA and spontaneous sequence methods were reproducible (r = 0.76 and r = 0.85, respectively). As expected, BRG was significantly less in the HUT compared to SUP position for both ARMA (mean +/- SEM; 3.5 +/- 0.3 versus 11.2 +/- 1.4 ms mmHg-1; P < 0.01) and spontaneous sequence analysis (10.3 +/- 0.8 versus 31.5 +/- 2.3 ms mmHg-1; P < 0.001). However, no significant difference was found between BRG during RB and SB protocols for either ARMA (7.9 +/- 1.4 versus 6.7 +/- 0.8 ms mmHg-1; P = 0.27) or spontaneous sequence methods (21.8 +/- 2.7 versus 20.0 +/- 2.1 ms mmHg-1; P = 0.24). BRG was correlated during RB and SB protocols (r = 0.80; P < 0.0001). ARMA and spontaneous BRG estimates were correlated (r = 0.79; P < 0.0001), with spontaneous sequence values being consistently larger (P < 0.0001). In conclusion, we have shown that ARMA-derived BRG values are reproducible and that they can be determined during SB conditions, making the ARMA method appropriate for use in a wider range of patients.

  9. An asymptotic theory for cross-correlation between auto-correlated sequences and its application on neuroimaging data.

    PubMed

    Zhou, Yunyi; Tao, Chenyang; Lu, Wenlian; Feng, Jianfeng

    2018-04-20

    Functional connectivity is among the most important tools to study the brain. The correlation coefficient between time series of different brain areas is the most popular way to quantify functional connectivity. In practical use, the correlation coefficient assumes the data to be temporally independent. However, brain time series data can manifest significant temporal auto-correlation. A widely applicable method is proposed for correcting temporal auto-correlation. We considered two types of time series models: (1) the auto-regressive-moving-average model, and (2) a nonlinear dynamical system model with noisy fluctuations, and derived their respective asymptotic distributions of the correlation coefficient. These two types of models are most commonly used in neuroscience studies. We show that the respective asymptotic distributions share a unified expression. We have verified the validity of our method and shown that it exhibits sufficient statistical power for detecting true correlation in numerical experiments. Employing our method on a real dataset yields a more robust functional network and higher classification accuracy than conventional methods. Our method robustly controls the type I error while maintaining sufficient statistical power for detecting true correlation in numerical experiments, where existing methods measuring association (linear and nonlinear) fail. In this work, we proposed a widely applicable approach for correcting the effect of temporal auto-correlation on functional connectivity. Empirical results favor the use of our method in functional network analysis.

  10. NARMAX model identification of a palm oil biodiesel engine using multi-objective optimization differential evolution

    NASA Astrophysics Data System (ADS)

    Mansor, Zakwan; Zakaria, Mohd Zakimi; Nor, Azuwir Mohd; Saad, Mohd Sazli; Ahmad, Robiah; Jamaluddin, Hishamuddin

    2017-09-01

    This paper presents the black-box modelling of a palm oil biodiesel engine (POB) using the multi-objective optimization differential evolution (MOODE) algorithm. Two objective functions are considered in the algorithm for optimization: minimizing the number of terms of a model structure and minimizing the mean square error between actual and predicted outputs. The mathematical model used in this study to represent the POB system is the nonlinear auto-regressive moving average with exogenous input (NARMAX) model. Finally, model validity tests are applied in order to validate the possible models obtained from the MOODE algorithm, leading to the selection of an optimal model.

  11. Photonic single nonlinear-delay dynamical node for information processing

    NASA Astrophysics Data System (ADS)

    Ortín, Silvia; San-Martín, Daniel; Pesquera, Luis; Gutiérrez, José Manuel

    2012-06-01

    An electro-optical system with a delay loop based on semiconductor lasers is investigated for information processing by performing numerical simulations. This system can replace a complex network of many nonlinear elements for the implementation of Reservoir Computing. We show that a single nonlinear-delay dynamical system has the basic properties required to perform as a reservoir: short-term memory and the separation property. The computing performance of this system is evaluated for two prediction tasks: the Lorenz chaotic time series and the nonlinear auto-regressive moving average (NARMA) model. We sweep the parameters of the system to find the best performance. The results achieved for the Lorenz and the NARMA-10 tasks are comparable to those obtained by other machine learning methods.
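
    For reference, a hedged sketch of the NARMA-10 benchmark series mentioned above, using the recursion commonly quoted for this task; this is an illustrative reconstruction in Python, not code or parameter values from the paper.

    ```python
    # Hedged sketch: generate a NARMA-10 benchmark series (commonly used recursion).
    import numpy as np

    rng = np.random.default_rng(0)
    T = 2000
    u = rng.uniform(0.0, 0.5, size=T)             # random input sequence
    y = np.zeros(T)
    for t in range(9, T - 1):
        y[t + 1] = (0.3 * y[t]
                    + 0.05 * y[t] * np.sum(y[t - 9:t + 1])
                    + 1.5 * u[t - 9] * u[t]
                    + 0.1)
    # A reservoir (here, the delay-based photonic system) is then trained to map u to y.
    ```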

  12. Forecasting of Water Consumptions Expenditure Using Holt-Winter’s and ARIMA

    NASA Astrophysics Data System (ADS)

    Razali, S. N. A. M.; Rusiman, M. S.; Zawawi, N. I.; Arbin, N.

    2018-04-01

    This study was carried out to forecast the water consumption expenditure of a Malaysian university, specifically University Tun Hussein Onn Malaysia (UTHM). The proposed Holt-Winters and Auto-Regressive Integrated Moving Average (ARIMA) models were applied to forecast the water consumption expenditure in Ringgit Malaysia from 2006 until 2014. The two models were compared using the performance measures of Mean Absolute Percentage Error (MAPE) and Mean Absolute Deviation (MAD). It was found that the ARIMA model showed better forecast accuracy, with lower values of MAPE and MAD. The analysis showed that an ARIMA (2,1,4) model provided a reasonable forecasting tool for university campus water usage.
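
    A short sketch of the two accuracy measures used above, MAPE and MAD, written in Python for illustration; the variable names are placeholders for the observed and forecast expenditure series.

    ```python
    # Hedged sketch of the MAPE and MAD accuracy measures.
    import numpy as np

    def mape(y_true, y_pred):
        """Mean absolute percentage error, in percent."""
        y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
        return np.mean(np.abs((y_true - y_pred) / y_true)) * 100

    def mad(y_true, y_pred):
        """Mean absolute deviation of the forecast errors."""
        y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
        return np.mean(np.abs(y_true - y_pred))

    # Lower MAPE/MAD indicates the better forecasting model (ARIMA vs Holt-Winters).
    ```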

  13. A frequency domain global parameter estimation method for multiple reference frequency response measurements

    NASA Astrophysics Data System (ADS)

    Shih, C. Y.; Tsuei, Y. G.; Allemang, R. J.; Brown, D. L.

    1988-10-01

    A method of using the matrix Auto-Regressive Moving Average (ARMA) model in the Laplace domain for multiple-reference global parameter identification is presented. This method is particularly applicable to the area of modal analysis where high modal density exists. The method is also applicable when multiple reference frequency response functions are used to characterise linear systems. In order to facilitate the mathematical solution, the Forsythe orthogonal polynomial is used to reduce the ill-conditioning of the formulated equations and to decouple the normal matrix into two reduced matrix blocks. A Complex Mode Indicator Function (CMIF) is introduced, which can be used to determine the proper order of the rational polynomials.

  14. Short-term forecasts gain in accuracy. [Regression technique using 'Box-Jenkins' analysis]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Box-Jenkins time-series models offer accuracy for short-term forecasts that compare with large-scale macroeconomic forecasts. Utilities need to be able to forecast peak demand in order to plan their generating, transmitting, and distribution systems. This new method differs from conventional models by not assuming specific data patterns, but by fitting available data into a tentative pattern on the basis of auto-correlations. Three types of models (autoregressive, moving average, or mixed autoregressive/moving average) can be used according to which provides the most appropriate combination of autocorrelations and related derivatives. Major steps in choosing a model are identifying potential models, estimating the parameters of the problem, and running a diagnostic check to see if the model fits the parameters. The Box-Jenkins technique is well suited for seasonal patterns, which makes it possible to have as short as hourly forecasts of load demand. With accuracy up to two years, the method will allow electricity price-elasticity forecasting that can be applied to facility planning and rate design. (DCK)

  15. Nonlinear System Identification for Aeroelastic Systems with Application to Experimental Data

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2008-01-01

    Representation and identification of a nonlinear aeroelastic pitch-plunge system as a model of the Nonlinear AutoRegressive, Moving Average eXogenous (NARMAX) class is considered. A nonlinear difference equation describing this aircraft model is derived theoretically and shown to be of the NARMAX form. Identification methods for NARMAX models are applied to aeroelastic dynamics and its properties demonstrated via continuous-time simulations of experimental conditions. Simulation results show that (1) the outputs of the NARMAX model closely match those generated using continuous-time methods, and (2) NARMAX identification methods applied to aeroelastic dynamics provide accurate discrete-time parameter estimates. Application of NARMAX identification to experimental pitch-plunge dynamics data gives a high percent fit for cross-validated data.

  16. System for monitoring an industrial process and determining sensor status

    DOEpatents

    Gross, K.C.; Hoyer, K.K.; Humenik, K.E.

    1995-10-17

    A method and system for monitoring an industrial process and a sensor are disclosed. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two pairs of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test. 17 figs.

  17. System for monitoring an industrial process and determining sensor status

    DOEpatents

    Gross, K.C.; Hoyer, K.K.; Humenik, K.E.

    1997-05-13

    A method and system are disclosed for monitoring an industrial process and a sensor. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two pairs of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test. 17 figs.

  18. System for monitoring an industrial process and determining sensor status

    DOEpatents

    Gross, Kenneth C.; Hoyer, Kristin K.; Humenik, Keith E.

    1995-01-01

    A method and system for monitoring an industrial process and a sensor. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two pairs of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test.

  19. System for monitoring an industrial process and determining sensor status

    DOEpatents

    Gross, Kenneth C.; Hoyer, Kristin K.; Humenik, Keith E.

    1997-01-01

    A method and system for monitoring an industrial process and a sensor. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two pairs of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test.

  20. Parameter estimation of an ARMA model for river flow forecasting using goal programming

    NASA Astrophysics Data System (ADS)

    Mohammadi, Kourosh; Eslami, H. R.; Kahawita, Rene

    2006-11-01

    River flow forecasting constitutes one of the most important applications in hydrology. Several methods have been developed for this purpose, and one of the best known techniques is the auto-regressive moving average (ARMA) model. In the research reported here, the goal was to minimize the error for a specific season of the year as well as for the complete series. Goal programming (GP) was used to estimate the ARMA model parameters. Shaloo Bridge station on the Karun River, with 68 years of observed stream flow data, was selected to evaluate the performance of the proposed method. When compared with the usual method of maximum likelihood estimation, the results favored the proposed algorithm.

  1. Relationship research between meteorological disasters and stock markets based on a multifractal detrending moving average algorithm

    NASA Astrophysics Data System (ADS)

    Li, Qingchen; Cao, Guangxi; Xu, Wei

    2018-01-01

    Based on a multifractal detrending moving average algorithm (MFDMA), this study uses the fractionally autoregressive integrated moving average process (ARFIMA) to demonstrate the effectiveness of MFDMA in the detection of auto-correlation at different sample lengths and to simulate some artificial time series with the same length as the actual sample interval. We analyze the effect of predictable and unpredictable meteorological disasters on the US and Chinese stock markets and the degree of long memory in different sectors. Furthermore, we conduct a preliminary investigation to determine whether the fluctuations of financial markets caused by meteorological disasters are derived from the normal evolution of the financial system itself or not. We also propose several reasonable recommendations.

  2. Univariate Time Series Prediction of Solar Power Using a Hybrid Wavelet-ARMA-NARX Prediction Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nazaripouya, Hamidreza; Wang, Yubo; Chu, Chi-Cheng

    This paper proposes a new hybrid method for super short-term solar power prediction. Solar output power usually has complex, nonstationary, and nonlinear characteristics due to the intermittent and time varying behavior of solar radiance. In addition, solar power dynamics are fast and essentially inertia-free. An accurate super short-term prediction is required to compensate for the fluctuations and reduce the impact of solar power penetration on the power system. The objective is to predict one-step-ahead solar power generation based only on historical solar power time series data. The proposed method incorporates the discrete wavelet transform (DWT), Auto-Regressive Moving Average (ARMA) models, and Recurrent Neural Networks (RNN), where the RNN architecture is based on Nonlinear Auto-Regressive models with eXogenous inputs (NARX). The wavelet transform is utilized to decompose the solar power time series into a set of better-behaved constituent series for prediction. The ARMA model is employed as a linear predictor, while NARX is used as a nonlinear pattern recognition tool to estimate and compensate for the error of the wavelet-ARMA prediction. The proposed method is applied to data captured from UCLA solar PV panels, and the results are compared with some of the common and most recent solar power prediction methods. The results validate the effectiveness of the proposed approach and show a considerable improvement in prediction precision.

  3. Time Series Analysis and Forecasting of Wastewater Inflow into Bandar Tun Razak Sewage Treatment Plant in Selangor, Malaysia

    NASA Astrophysics Data System (ADS)

    Abunama, Taher; Othman, Faridah

    2017-06-01

    Analysing the fluctuations of wastewater inflow rates in sewage treatment plants (STPs) is essential to guarantee sufficient treatment of wastewater before discharging it to the environment. The main objectives of this study are to statistically analyze and forecast the wastewater inflow rates into the Bandar Tun Razak STP in Kuala Lumpur, Malaysia. A time series analysis of three years of weekly influent data (156 weeks) was conducted using the Auto-Regressive Integrated Moving Average (ARIMA) model. Various combinations of ARIMA orders (p, d, q) were tried to select the best-fitting model, which was then used to forecast the wastewater inflow rates. Linear regression analysis was applied to test the correlation between the observed and predicted influents. The ARIMA (3, 1, 3) model was selected, with the highest R-square and lowest normalized Bayesian Information Criterion (BIC) value, and accordingly the wastewater inflow rates were forecast for an additional 52 weeks. The linear regression analysis between the observed and predicted values of the wastewater inflow rates showed a positive linear correlation with a coefficient of 0.831.
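
    A minimal sketch, in Python for illustration, of the workflow described above: fit ARIMA(3,1,3) to the 156 weekly inflow values, forecast an additional 52 weeks, and regress observed against model-predicted inflows; the file name and the exact regression setup are assumptions.

    ```python
    # Hedged sketch: ARIMA(3,1,3) fit, 52-week forecast, observed-vs-predicted regression.
    import numpy as np
    from scipy.stats import linregress
    from statsmodels.tsa.arima.model import ARIMA

    inflow = np.loadtxt("weekly_inflow.csv")      # 156 weekly observations (hypothetical file)
    fit = ARIMA(inflow, order=(3, 1, 3)).fit()
    future = fit.forecast(steps=52)               # forecast an additional 52 weeks

    reg = linregress(fit.fittedvalues[1:], inflow[1:])   # observed vs predicted
    print(f"correlation coefficient r = {reg.rvalue:.3f}")
    ```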

  4. Auto Regressive Moving Average (ARMA) Modeling Method for Gyro Random Noise Using a Robust Kalman Filter

    PubMed Central

    Huang, Lei

    2015-01-01

    To address the problem that conventional ARMA modeling methods for gyro random noise require a large number of samples and converge slowly, an ARMA modeling method using robust Kalman filtering is developed. The ARMA model parameters are employed as state arguments. Unknown time-varying estimators of the observation noise are used to obtain the estimated mean and variance of the observation noise. Using the robust Kalman filter, the ARMA model parameters are estimated accurately. The developed ARMA modeling method has the advantages of rapid convergence and high accuracy, so the required sample size is reduced. It can be applied to modeling applications for gyro random noise in which a fast and accurate ARMA modeling method is required. PMID:26437409

  5. GPC-Based Stable Reconfigurable Control

    NASA Technical Reports Server (NTRS)

    Soloway, Don; Shi, Jian-Jun; Kelkar, Atul

    2004-01-01

    This paper presents development of multi-input multi-output (MIMO) Generalized Predictive Control (GPC) law and its application to reconfigurable control design in the event of actuator saturation. A Controlled Auto-Regressive Integrating Moving Average (CARIMA) model is used to describe the plant dynamics. The control law is derived using input-output description of the system and is also related to the state-space form of the model. The stability of the GPC control law without reconfiguration is first established using Riccati-based approach and state-space formulation. A novel reconfiguration strategy is developed for the systems which have actuator redundancy and are faced with actuator saturation type failure. An elegant reconfigurable control design is presented with stability proof. Several numerical examples are presented to demonstrate the application of various results.

  6. Forecasting seeing and parameters of long-exposure images by means of ARIMA

    NASA Astrophysics Data System (ADS)

    Kornilov, Matwey V.

    2016-02-01

    Atmospheric turbulence is one of the major limiting factors for ground-based astronomical observations. In this paper, the problem of short-term forecasting of seeing is discussed. Real data obtained from atmospheric optical turbulence (OT) measurements above Mount Shatdzhatmaz in 2007-2013 have been analysed. Linear auto-regressive integrated moving average (ARIMA) models are used for the forecasting. A new procedure for forecasting the image characteristics of direct astronomical observations (central image intensity, full width at half maximum, radius encircling 80% of the energy) has been proposed. Probability density functions of the forecasts of these quantities are 1.5-2 times narrower than the respective unconditional probability density functions. Overall, this study found that the described technique can adequately describe temporal stochastic variations of the OT power.

  7. Forecasting conditional climate-change using a hybrid approach

    USGS Publications Warehouse

    Esfahani, Akbar Akbari; Friedel, Michael J.

    2014-01-01

    A novel approach is proposed to forecast the likelihood of climate-change across spatial landscape gradients. This hybrid approach involves reconstructing past precipitation and temperature using the self-organizing map technique; determining quantile trends in the climate-change variables by quantile regression modeling; and computing conditional forecasts of climate-change variables based on self-similarity in quantile trends using the fractionally differenced auto-regressive integrated moving average technique. The proposed modeling approach is applied to states (Arizona, California, Colorado, Nevada, New Mexico, and Utah) in the southwestern U.S., where conditional forecasts of climate-change variables are evaluated against recent (2012) observations, evaluated at a future time period (2030), and evaluated as future trends (2009–2059). These results have broad economic, political, and social implications because they quantify uncertainty in climate-change forecasts affecting various sectors of society. Another benefit of the proposed hybrid approach is that it can be extended to any spatiotemporal scale providing self-similarity exists.

  8. Deep learning architecture for air quality predictions.

    PubMed

    Li, Xiang; Peng, Ling; Hu, Yuan; Shao, Jing; Chi, Tianhe

    2016-11-01

    With the rapid development of urbanization and industrialization, many developing countries are suffering from heavy air pollution. Governments and citizens have expressed increasing concern regarding air pollution because it affects human health and sustainable development worldwide. Current air quality prediction methods mainly use shallow models; however, these methods produce unsatisfactory results, which inspired us to investigate methods of predicting air quality based on deep architecture models. In this paper, a novel spatiotemporal deep learning (STDL)-based air quality prediction method that inherently considers spatial and temporal correlations is proposed. A stacked autoencoder (SAE) model is used to extract inherent air quality features, and it is trained in a greedy layer-wise manner. Compared with traditional time series prediction models, our model can predict the air quality of all stations simultaneously and shows the temporal stability in all seasons. Moreover, a comparison with the spatiotemporal artificial neural network (STANN), auto regression moving average (ARMA), and support vector regression (SVR) models demonstrates that the proposed method of performing air quality predictions has a superior performance.

  9. A study on industrial accident rate forecasting and program development of estimated zero accident time in Korea.

    PubMed

    Kim, Tae-gu; Kang, Young-sig; Lee, Hyung-won

    2011-01-01

    To begin a zero accident campaign for industry, the first thing is to estimate the industrial accident rate and the zero accident time systematically. This paper considers the social and technical change of the business environment after beginning the zero accident campaign through quantitative time series analysis methods. These methods include sum of squared errors (SSE), regression analysis method (RAM), exponential smoothing method (ESM), double exponential smoothing method (DESM), auto-regressive integrated moving average (ARIMA) model, and the proposed analytic function method (AFM). The program is developed to estimate the accident rate, zero accident time and achievement probability of an efficient industrial environment. In this paper, MFC (Microsoft Foundation Class) software of Visual Studio 2008 was used to develop a zero accident program. The results of this paper will provide major information for industrial accident prevention and be an important part of stimulating the zero accident campaign within all industrial environments.

  10. Selection of Hidden Layer Neurons and Best Training Method for FFNN in Application of Long Term Load Forecasting

    NASA Astrophysics Data System (ADS)

    Singh, Navneet K.; Singh, Asheesh K.; Tripathy, Manoj

    2012-05-01

    For power industries, electricity load forecasting plays an important role in real-time control, security, optimal unit commitment, economic scheduling, maintenance, energy management, and plant structure planning. A new technique for long term load forecasting (LTLF) using an optimized feed forward artificial neural network (FFNN) architecture is presented in this paper, which selects the optimal number of neurons in the hidden layer as well as the best training method for the case study. The prediction performance of the proposed technique is evaluated using the mean absolute percentage error (MAPE) between Thailand private electricity consumption and the forecasted data. The results obtained are compared with the results of classical auto-regressive (AR) and moving average (MA) methods. It is, in general, observed that the proposed method is more accurate in prediction.

  11. [Prediction of schistosomiasis infection rates of population based on ARIMA-NARNN model].

    PubMed

    Ke-Wei, Wang; Yu, Wu; Jin-Ping, Li; Yu-Yu, Jiang

    2016-07-12

    To explore the effect of the autoregressive integrated moving average model-nonlinear auto-regressive neural network (ARIMA-NARNN) model on predicting schistosomiasis infection rates of the population. The ARIMA model, the NARNN model and the ARIMA-NARNN model were established based on monthly schistosomiasis infection rates from January 2005 to February 2015 in Jiangsu Province, China. The fitting and prediction performances of the three models were compared. Compared to the ARIMA model and the NARNN model, the mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) of the ARIMA-NARNN model were the smallest, with values of 0.0111, 0.0900 and 0.2824, respectively. The ARIMA-NARNN model could effectively fit and predict schistosomiasis infection rates of the population, which might have great application value for the prevention and control of schistosomiasis.
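    The combination described above (a linear ARIMA fit followed by a neural network that models what the ARIMA leaves unexplained) can be sketched as follows. This is an illustrative sketch only, not the authors' code: an sklearn MLPRegressor stands in for the NARNN, the lag order is assumed, and the monthly series is synthetic.

    ```python
    # ARIMA + neural-network residual correction (hybrid sketch, assumptions noted above).
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    y = 1.0 + 0.3 * np.sin(2 * np.pi * np.arange(120) / 12) + rng.normal(0, 0.1, 120)

    arima_res = ARIMA(y, order=(1, 1, 1)).fit()
    resid = arima_res.resid                      # residuals left for the network to explain

    lags = 3                                     # assumed NARNN-style lag order
    X = np.column_stack([resid[i:len(resid) - lags + i] for i in range(lags)])
    t = resid[lags:]
    nn = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X, t)

    # One-step-ahead hybrid forecast: ARIMA forecast plus the network's residual prediction.
    arima_fc = arima_res.forecast(steps=1)[0]
    resid_fc = nn.predict(resid[-lags:].reshape(1, -1))[0]
    print("hybrid forecast:", arima_fc + resid_fc)
    ```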

  12. Optimal design and experimental analyses of a new micro-vibration control payload-platform

    NASA Astrophysics Data System (ADS)

    Sun, Xiaoqing; Yang, Bintang; Zhao, Long; Sun, Xiaofen

    2016-07-01

    This paper presents a new payload-platform, for precision devices, which possesses the capability of isolating the complex space micro-vibration in low frequency range below 5 Hz. The novel payload-platform equipped with smart material actuators is investigated and designed through optimization strategy based on the minimum energy loss rate, for the aim of achieving high drive efficiency and reducing the effect of the magnetic circuit nonlinearity. Then, the dynamic model of the driving element is established by using the Lagrange method and the performance of the designed payload-platform is further discussed through the combination of the controlled auto regressive moving average (CARMA) model with modified generalized prediction control (MGPC) algorithm. Finally, an experimental prototype is developed and tested. The experimental results demonstrate that the payload-platform has an impressive potential of micro-vibration isolation.

  13. Symbiosis of Steel, Energy, and CO2 Evolution in Korea

    NASA Astrophysics Data System (ADS)

    Lee, Hyunjoung; Matsuura, Hiroyuki; Sohn, Il

    2016-09-01

    This study looks at the energy intensity of the steel industry and the greenhouse gas intensity involved with the production of steel. Using several sources of steel production data and the corresponding energy sources, a time-series analysis of the greenhouse gas (GHG) and energy intensity from 1990 to 2014 is provided. The relationship of the steel economy with the gross domestic product (GDP) indicates the indirect importance of the general manufacturing sector within Korea, and in particular the steel industry. Beyond 2008, the shift in excess materials production and the significant increase in total imports have led to an imbalance in the Korean steel market and continue to inhibit the growth of the domestic steel market. A forecast of the GHG and energy intensity along with the steel production up to 2030 is provided using auto regressive integrated moving average analysis.

  14. A Modified LS+AR Model to Improve the Accuracy of the Short-term Polar Motion Prediction

    NASA Astrophysics Data System (ADS)

    Wang, Z. W.; Wang, Q. X.; Ding, Y. Q.; Zhang, J. J.; Liu, S. S.

    2017-03-01

    There are two problems with the LS (Least Squares)+AR (AutoRegressive) model in polar motion forecasting: the residuals of the LS fit are reasonable within the fitting interval but poor for LS extrapolation; and the LS fitting residual sequence is non-linear, so it is unsuitable to establish an AR model for the residual sequence to be forecast based on the residual sequence before the forecast epoch. In this paper, we address these two problems in two steps. First, restrictions are added to the two endpoints of the LS fitting data to fix them on the LS fitting curve, so that the fitting values next to the two endpoints are very close to the observation values. Secondly, we select the interpolation residual sequence of an inward LS fitting curve, which has a similar variation trend to the LS extrapolation residual sequence, as the modeling object of AR for the residual forecast. Calculation examples show that this solution can effectively improve the short-term polar motion prediction accuracy of the LS+AR model. In addition, the comparison results with the forecast models of RLS (Robustified Least Squares)+AR, RLS+ARIMA (AutoRegressive Integrated Moving Average), and LS+ANN (Artificial Neural Network) confirm the feasibility and effectiveness of the solution for polar motion forecasting. The results, especially for the polar motion forecast at 1-10 days, show that the forecast accuracy of the proposed model can reach the internationally reported level.

  15. A vertical handoff decision algorithm based on ARMA prediction model

    NASA Astrophysics Data System (ADS)

    Li, Ru; Shen, Jiao; Chen, Jun; Liu, Qiuhuan

    2012-01-01

    With the development of computer technology and the increasing demand for mobile communications, next generation wireless networks will be composed of various wireless networks (e.g., WiMAX and WiFi). Vertical handoff is a key technology of next generation wireless networks. During the vertical handoff procedure, the handoff decision is a crucial issue for efficient mobility. Based on an auto-regressive moving average (ARMA) prediction model, we propose a vertical handoff decision algorithm, which aims to improve the performance of vertical handoff and avoid unnecessary handoffs. Based on the current received signal strength (RSS) and the previous RSS, the proposed approach adopts an ARMA model to predict the next RSS, and then uses the predicted RSS to determine whether to trigger the link layer triggering event and complete the vertical handoff. The simulation results indicate that the proposed algorithm outperforms the threshold-based RSS scheme in terms of handoff performance and the number of handoffs.
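    The core of the decision rule above (predict the next RSS with an ARMA model, then compare the prediction with a trigger level) can be sketched as below. The RSS trace, the ARMA(2,1) order, and the -85 dBm threshold are assumptions for illustration, not values from the paper.

    ```python
    # ARMA one-step RSS prediction feeding a handoff decision (sketch, assumptions noted above).
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(1)
    rss = -70.0 - 0.05 * np.arange(300) + rng.normal(0, 1.0, 300)   # slowly fading signal (dBm)

    res = ARIMA(rss, order=(2, 0, 1)).fit()        # ARMA(2,1) is ARIMA with d = 0
    next_rss = res.forecast(steps=1)[0]

    THRESHOLD_DBM = -85.0                          # assumed link-layer trigger level
    if next_rss < THRESHOLD_DBM:
        print(f"predicted RSS {next_rss:.1f} dBm -> trigger vertical handoff")
    else:
        print(f"predicted RSS {next_rss:.1f} dBm -> stay on current network")
    ```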

  16. Simulation of high-energy radiation belt electron fluxes using NARMAX-VERB coupled codes

    PubMed Central

    Pakhotin, I P; Drozdov, A Y; Shprits, Y Y; Boynton, R J; Subbotin, D A; Balikhin, M A

    2014-01-01

    This study presents a fusion of data-driven and physics-driven methodologies of energetic electron flux forecasting in the outer radiation belt. Data-driven NARMAX (Nonlinear AutoRegressive Moving Averages with eXogenous inputs) model predictions for geosynchronous orbit fluxes have been used as an outer boundary condition to drive the physics-based Versatile Electron Radiation Belt (VERB) code, to simulate energetic electron fluxes in the outer radiation belt environment. The coupled system has been tested for three extended time periods totalling several weeks of observations. The time periods involved periods of quiet, moderate, and strong geomagnetic activity and captured a range of dynamics typical of the radiation belts. The model has successfully simulated energetic electron fluxes for various magnetospheric conditions. Physical mechanisms that may be responsible for the discrepancies between the model results and observations are discussed. PMID:26167432

  17. Watershed Regressions for Pesticides (WARP) for Predicting Annual Maximum and Annual Maximum Moving-Average Concentrations of Atrazine in Streams

    USGS Publications Warehouse

    Stone, Wesley W.; Gilliom, Robert J.; Crawford, Charles G.

    2008-01-01

    Regression models were developed for predicting annual maximum and selected annual maximum moving-average concentrations of atrazine in streams using the Watershed Regressions for Pesticides (WARP) methodology developed by the National Water-Quality Assessment Program (NAWQA) of the U.S. Geological Survey (USGS). The current effort builds on the original WARP models, which were based on the annual mean and selected percentiles of the annual frequency distribution of atrazine concentrations. Estimates of annual maximum and annual maximum moving-average concentrations for selected durations are needed to characterize the levels of atrazine and other pesticides for comparison to specific water-quality benchmarks for evaluation of potential concerns regarding human health or aquatic life. Separate regression models were derived for the annual maximum and annual maximum 21-day, 60-day, and 90-day moving-average concentrations. Development of the regression models used the same explanatory variables, transformations, model development data, model validation data, and regression methods as those used in the original development of WARP. The models accounted for 72 to 75 percent of the variability in the concentration statistics among the 112 sampling sites used for model development. Predicted concentration statistics from the four models were within a factor of 10 of the observed concentration statistics for most of the model development and validation sites. Overall, performance of the models for the development and validation sites supports the application of the WARP models for predicting annual maximum and selected annual maximum moving-average atrazine concentration in streams and provides a framework to interpret the predictions in terms of uncertainty. For streams with inadequate direct measurements of atrazine concentrations, the WARP model predictions for the annual maximum and the annual maximum moving-average atrazine concentrations can be used to characterize the probable levels of atrazine for comparison to specific water-quality benchmarks. Sites with a high probability of exceeding a benchmark for human health or aquatic life can be prioritized for monitoring.

  18. Forecasting carbon dioxide emissions based on a hybrid of mixed data sampling regression model and back propagation neural network in the USA.

    PubMed

    Zhao, Xin; Han, Meng; Ding, Lili; Calin, Adrian Cantemir

    2018-01-01

    The accurate forecast of carbon dioxide emissions is critical for policy makers to take proper measures to establish a low carbon society. This paper discusses a hybrid of the mixed data sampling (MIDAS) regression model and BP (back propagation) neural network (MIDAS-BP model) to forecast carbon dioxide emissions. Such analysis uses mixed frequency data to study the effects of quarterly economic growth on annual carbon dioxide emissions. The forecasting ability of MIDAS-BP is remarkably better than MIDAS, ordinary least square (OLS), polynomial distributed lags (PDL), autoregressive distributed lags (ADL), and auto-regressive moving average (ARMA) models. The MIDAS-BP model is suitable for forecasting carbon dioxide emissions for both the short and longer term. This research is expected to influence the methodology for forecasting carbon dioxide emissions by improving the forecast accuracy. Empirical results show that economic growth has both negative and positive effects on carbon dioxide emissions that last 15 quarters. Carbon dioxide emissions are also affected by their own change within 3 years. Therefore, there is a need for policy makers to explore an alternative way to develop the economy, especially applying new energy policies to establish a low carbon society.

  19. A stepwise model to predict monthly streamflow

    NASA Astrophysics Data System (ADS)

    Mahmood Al-Juboori, Anas; Guven, Aytac

    2016-12-01

    In this study, a stepwise model empowered with genetic programming is developed to predict the monthly flows of the Hurman River in Turkey and the Diyalah and Lesser Zab Rivers in Iraq. The model divides the monthly flow data into twelve intervals representing the number of months in a year. The flow of a month t is considered as a function of the antecedent month's flow (t - 1) and is predicted by multiplying the antecedent monthly flow by a constant value called K. The optimum value of K is obtained by a stepwise procedure which employs Gene Expression Programming (GEP) and Nonlinear Generalized Reduced Gradient Optimization (NGRGO) as alternatives to the traditional nonlinear regression technique. The coefficient of determination and the root mean squared error are used to evaluate the performance of the proposed models. The results of the proposed model are compared with the conventional Markovian and Auto Regressive Integrated Moving Average (ARIMA) models based on observed monthly flow data. The comparison results, based on five different statistical measures, show that the proposed stepwise model performed better than the Markovian and ARIMA models. The R2 values of the proposed model range between 0.81 and 0.92 for the three rivers in this study.

  20. Stochastic modelling of the monthly average maximum and minimum temperature patterns in India 1981-2015

    NASA Astrophysics Data System (ADS)

    Narasimha Murthy, K. V.; Saravana, R.; Vijaya Kumar, K.

    2018-04-01

    The paper investigates the stochastic modelling and forecasting of monthly average maximum and minimum temperature patterns through a suitable seasonal auto regressive integrated moving average (SARIMA) model for the period 1981-2015 in India. The variations and distributions of monthly maximum and minimum temperatures are analyzed through box plots and cumulative distribution functions. The time series plot indicates that the maximum temperature series contains sharp peaks in almost all the years, while this is not true for the minimum temperature series, so the two series are modelled separately. The candidate SARIMA model has been chosen by examining the autocorrelation function (ACF), partial autocorrelation function (PACF), and inverse autocorrelation function (IACF) of the logarithmically transformed temperature series. The SARIMA (1, 0, 0) × (0, 1, 1)12 model is selected for the monthly average maximum and minimum temperature series based on the minimum Bayesian information criterion. The model parameters are obtained using the maximum-likelihood method together with the standard errors of the residuals. The adequacy of the selected model is assessed using correlation diagnostic checking through the ACF, PACF, IACF, and p values of the Ljung-Box test statistic of the residuals, and using normality diagnostic checking through the kernel and normal density curves of the histogram and the Q-Q plot. Finally, the monthly maximum and minimum temperature patterns of India for the next 3 years have been forecast with the selected model.
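    The specific model named above, SARIMA (1, 0, 0) × (0, 1, 1)12 fitted to a log-transformed monthly series and selected by BIC, can be reproduced in outline with statsmodels. This is an illustrative sketch: the temperature series here is synthetic, not the Indian data.

    ```python
    # SARIMA(1,0,0)x(0,1,1,12) on a log-transformed monthly series (sketch with synthetic data).
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(2)
    months = pd.date_range("1981-01", periods=420, freq="MS")
    temp = 30 + 5 * np.sin(2 * np.pi * months.month / 12) + rng.normal(0, 0.5, 420)
    log_temp = pd.Series(np.log(temp), index=months)

    model = SARIMAX(log_temp, order=(1, 0, 0), seasonal_order=(0, 1, 1, 12))
    res = model.fit(disp=False)
    print("BIC:", res.bic)                        # criterion used for model selection above

    forecast = np.exp(res.forecast(steps=36))     # next 3 years, back-transformed to degrees
    print(forecast.head())
    ```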

  1. Nonlinear Recurrent Neural Network Predictive Control for Energy Distribution of a Fuel Cell Powered Robot

    PubMed Central

    Chen, Qihong; Long, Rong; Quan, Shuhai

    2014-01-01

    This paper presents a neural network predictive control strategy to optimize power distribution for a fuel cell/ultracapacitor hybrid power system of a robot. We model the nonlinear power system by employing a time-variant auto-regressive moving average model with exogenous inputs (ARMAX), using a recurrent neural network to represent the complicated coefficients of the ARMAX model. Because the dynamics of the system are viewed in this framework as operating-state-dependent, time-varying, locally linear behavior, a linear constrained model predictive control algorithm is developed to optimize the power splitting between the fuel cell and the ultracapacitor. The proposed algorithm significantly simplifies implementation of the controller and can handle multiple constraints, such as limiting substantial fluctuations of the fuel cell current. Experiment and simulation results demonstrate that the control strategy can optimally split power between the fuel cell and the ultracapacitor and limit the rate of change of the fuel cell current, so as to extend the lifetime of the fuel cell. PMID:24707206

  2. Two-stage damage diagnosis based on the distance between ARMA models and pre-whitening filters

    NASA Astrophysics Data System (ADS)

    Zheng, H.; Mita, A.

    2007-10-01

    This paper presents a two-stage damage diagnosis strategy for damage detection and localization. Auto-regressive moving-average (ARMA) models are fitted to time series of vibration signals recorded by sensors. In the first stage, a novel damage indicator, which is defined as the distance between ARMA models, is applied to damage detection. This stage can determine the existence of damage in the structure. Such an algorithm uses output only and does not require operator intervention. Therefore it can be embedded in the sensor board of a monitoring network. In the second stage, a pre-whitening filter is used to minimize the cross-correlation of multiple excitations. With this technique, the damage indicator can further identify the damage location and severity when the damage has been detected in the first stage. The proposed methodology is tested using simulation and experimental data. The analysis results clearly illustrate the feasibility of the proposed two-stage damage diagnosis methodology.

  3. An efficient approach to ARMA modeling of biological systems with multiple inputs and delays

    NASA Technical Reports Server (NTRS)

    Perrott, M. H.; Cohen, R. J.

    1996-01-01

    This paper presents a new approach to AutoRegressive Moving Average (ARMA or ARX) modeling which automatically seeks the best model order to represent investigated linear, time invariant systems using their input/output data. The algorithm seeks the ARMA parameterization which accounts for variability in the output of the system due to input activity and contains the fewest number of parameters required to do so. The unique characteristics of the proposed system identification algorithm are its simplicity and efficiency in handling systems with delays and multiple inputs. We present results of applying the algorithm to simulated data and experimental biological data. In addition, a technique for assessing the error associated with the impulse responses calculated from estimated ARMA parameterizations is presented. The mapping from ARMA coefficients to impulse response estimates is nonlinear, which complicates any effort to construct confidence bounds for the obtained impulse responses. Here a method for obtaining a linearization of this mapping is derived, which leads to a simple procedure to approximate the confidence bounds.
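    As a rough, generic stand-in for the order-selection task described above (not the authors' variability-based criterion), the sketch below searches candidate ARMAX orders for a single-input system by an information criterion, with the input delay handled by a simple one-step shift of the exogenous series; the system, orders, and delay handling are assumptions.

    ```python
    # Generic ARMAX order search by AIC for a simulated single-input system (sketch).
    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(10)
    u = rng.normal(0, 1, 400)                             # system input
    y = np.zeros(400)
    for k in range(2, 400):                               # simulated ARX(2) system with one-step delay
        y[k] = 0.6 * y[k - 1] - 0.2 * y[k - 2] + 0.8 * u[k - 1] + rng.normal(0, 0.1)

    u_delayed = np.concatenate([[0.0], u[:-1]])           # crude handling of the input delay

    best = None
    for p in range(1, 4):
        for q in range(0, 3):
            res = SARIMAX(y, exog=u_delayed, order=(p, 0, q)).fit(disp=False)
            if best is None or res.aic < best[0]:
                best = (res.aic, p, q)

    print("selected (p, q):", best[1:], "AIC:", round(best[0], 1))
    ```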

  4. Freely chosen cadence during a covert manipulation of ambient temperature.

    PubMed

    Hartley, Geoffrey L; Cheung, Stephen S

    2013-01-01

    The present study investigated the relationships of changes in power output (PO) to torque (TOR) and freely chosen cadence (FCC) during thermal loading. Twenty participants cycled at a constant rating of perceived exertion while ambient temperature (Ta) was covertly manipulated at 20-min intervals of 20 °C, 35 °C, and 20 °C. The magnitude responses of PO, FCC and TOR were analyzed using repeated-measures ANOVA, while the temporal correlations were analyzed using Auto-Regressive Integrated Moving Average (ARIMA) models. Increases in Ta caused significant thermal strain (p < .01) and, subsequently, a decrease in PO and TOR magnitude (p < .01), whereas FCC remained unchanged (p = .51). The ARIMA analysis indicates that changes in PO were highly correlated to TOR (stationary r2 = .954, p = .04), while FCC was moderately correlated to PO (stationary r2 = .717, p = .01). In conclusion, changes in PO are caused by a modulation in TOR, whereas FCC remains unchanged and is therefore unaffected by thermal stressors.

  5. Shape classification of malignant lymphomas and leukemia by morphological watersheds and ARMA modeling

    NASA Astrophysics Data System (ADS)

    Celenk, Mehmet; Song, Yinglei; Ma, Limin; Zhou, Min

    2003-05-01

    A new algorithm that can be used to automatically recognize and classify malignant lymphomas and leukemia is proposed in this paper. The algorithm utilizes the morphological watershed to extract boundaries of cells from their grey-level images. It generates a sequence of Euclidean distances by selecting pixels in a clockwise direction on the boundary of the cell and calculating the Euclidean distances of the selected pixels from the centroid of the cell. A feature vector associated with each cell is then obtained by applying the auto-regressive moving-average (ARMA) model to the generated sequence of Euclidean distances. The clustering measure J3 = trace{Sw^(-1) Sm}, involving the within-class (Sw) and mixed (Sm) class-scattering matrices, is computed for both cell classes to provide an insight into the extent to which different cell classes in the training data are separated. Our test results suggest that the algorithm is highly accurate for the development of an interactive, computer-assisted diagnosis (CAD) tool.
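    The feature-extraction step described above (centroid-distance signature of the cell boundary, summarized by ARMA parameters) could look roughly like the sketch below. Assumptions: the boundary is already available as ordered (x, y) pixel coordinates (e.g. from a watershed segmentation), and the ARMA(2,2) order is arbitrary; the toy boundary is a noisy ellipse, not cell data.

    ```python
    # Centroid-distance signature of a closed boundary -> ARMA parameter feature vector (sketch).
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    def arma_shape_features(boundary_xy, order=(2, 0, 2)):
        boundary_xy = np.asarray(boundary_xy, dtype=float)
        centroid = boundary_xy.mean(axis=0)
        distances = np.linalg.norm(boundary_xy - centroid, axis=1)  # clockwise distance signature
        res = ARIMA(distances, order=order).fit()
        return res.params                          # estimated parameters used as the shape features

    theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
    ellipse = np.column_stack([30 * np.cos(theta), 18 * np.sin(theta)])
    ellipse += np.random.default_rng(3).normal(0, 0.5, ellipse.shape)
    print(arma_shape_features(ellipse))
    ```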

  6. Inter-comparison of time series models of lake levels predicted by several modeling strategies

    NASA Astrophysics Data System (ADS)

    Khatibi, R.; Ghorbani, M. A.; Naghipour, L.; Jothiprakash, V.; Fathima, T. A.; Fazelifard, M. H.

    2014-04-01

    Five modeling strategies are employed to analyze water level time series of six lakes with different physical characteristics such as shape, size, altitude and range of variations. The models comprise chaos theory, Auto-Regressive Integrated Moving Average (ARIMA) - treated for seasonality and hence SARIMA, Artificial Neural Networks (ANN), Gene Expression Programming (GEP) and Multiple Linear Regression (MLR). Each is formulated on a different premise with different underlying assumptions. Chaos theory is elaborated in a greater detail as it is customary to identify the existence of chaotic signals by a number of techniques (e.g. average mutual information and false nearest neighbors) and future values are predicted using the Nonlinear Local Prediction (NLP) technique. This paper takes a critical view of past inter-comparison studies seeking a superior performance, against which it is reported that (i) the performances of all five modeling strategies vary from good to poor, hampering the recommendation of a clear-cut predictive model; (ii) the performances of the datasets of two cases are consistently better with all five modeling strategies; (iii) in other cases, their performances are poor but the results can still be fit-for-purpose; (iv) the simultaneous good performances of NLP and SARIMA pull their underlying assumptions to different ends, which cannot be reconciled. A number of arguments are presented including the culture of pluralism, according to which the various modeling strategies facilitate an insight into the data from different vantages.

  7. SU-E-J-112: The Impact of Cine EPID Image Acquisition Frame Rate On Markerless Soft-Tissue Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yip, S; Rottmann, J; Berbeco, R

    2014-06-01

    Purpose: Although reduction of the cine EPID acquisition frame rate through multiple frame averaging may reduce hardware memory burden and decrease image noise, it can hinder the continuity of soft-tissue motion leading to poor auto-tracking results. The impact of motion blurring and image noise on the tracking performance was investigated. Methods: Phantom and patient images were acquired at a frame rate of 12.87 Hz on an AS1000 portal imager. Low frame rate images were obtained by continuous frame averaging. A previously validated tracking algorithm was employed for auto-tracking. The difference between the programmed and auto-tracked positions of a Las Vegas phantom moving in the superior-inferior direction defined the tracking error (δ). Motion blurring was assessed by measuring the area change of the circle with the greatest depth. Additionally, lung tumors on 1747 frames acquired at eleven field angles from four radiotherapy patients were manually and automatically tracked with varying frame averaging. δ was defined by the position difference of the two tracking methods. Image noise was defined as the standard deviation of the background intensity. Motion blurring and image noise were correlated with δ using the Pearson correlation coefficient (R). Results: For both phantom and patient studies, the auto-tracking errors increased at frame rates lower than 4.29 Hz. Above 4.29 Hz, changes in errors were negligible with δ < 1.60 mm. Motion blurring and image noise were observed to increase and decrease with frame averaging, respectively. Motion blurring and tracking errors were significantly correlated for the phantom (R = 0.94) and patient studies (R = 0.72). Moderate to poor correlation was found between image noise and tracking error, with R = -0.58 and -0.19 for the two studies, respectively. Conclusion: An image acquisition frame rate of at least 4.29 Hz is recommended for cine EPID tracking. Motion blurring in images with frame rates below 4.29 Hz can substantially reduce the accuracy of auto-tracking. This work is supported in part by Varian Medical Systems, Inc.

  8. The effect of climate variability on urinary stone attacks: increased incidence associated with temperature over 18 °C: a population-based study.

    PubMed

    Park, Hyoung Keun; Bae, Sang Rak; Kim, Satbyul E; Choi, Woo Suk; Paick, Sung Hyun; Ho, Kim; Kim, Hyeong Gon; Lho, Yong Soo

    2015-02-01

    The aim of this study was to evaluate the effect of seasonal variation and climate parameters on urinary tract stone attacks and to investigate whether stone attacks increase sharply at a specific point. Nationwide data on total urinary tract stone attack numbers per month between January 2006 and December 2010 were obtained from the Korean Health Insurance Review and Assessment Service. The effects of climatic factors on monthly urinary stone attacks were assessed using the auto-regressive integrated moving average (ARIMA) regression method. A total of 1,702,913 stone attack cases were identified. Mean monthly and monthly average daily urinary stone attack cases were 28,382 ± 2,760 and 933 ± 85, respectively. Stone attacks showed a seasonal trend of a sharp incline in June, a peak plateau from July to September, and a sharp decline after September. The correlation analysis showed that ambient temperature (r = 0.557, p < 0.001) and relative humidity (r = 0.513, p < 0.001) were significantly associated with urinary stone attack cases. However, after adjustment for trend and seasonality, ambient temperature was the only climate factor associated with stone attack cases in the ARIMA regression test (p = 0.04). The threshold temperature was estimated as 18.4 °C. The risk of urinary stone attack significantly increases by 1.71% (95% confidence interval: 1.02-2.41%) with a 1 °C increase of ambient temperature above the threshold point. In conclusion, monthly urinary stone attack cases changed according to seasonal variation. Among the climate variables, only temperature had a consistent association with stone attacks, and when the temperature is over 18.4 °C, urinary stone attacks increase sharply.

  9. Very-short-term wind power prediction by a hybrid model with single- and multi-step approaches

    NASA Astrophysics Data System (ADS)

    Mohammed, E.; Wang, S.; Yu, J.

    2017-05-01

    Very-short-term wind power prediction (VSTWPP) plays an essential role in the operation of electric power systems. This paper aims at improving and applying a hybrid method of VSTWPP based on historical data. The hybrid method combines multiple linear regression and least squares (MLR&LS) and is intended to reduce prediction errors. The predicted values are obtained through two sub-processes: 1) transform the time-series data of actual wind power into the power ratio, and then predict the power ratio; 2) use the predicted power ratio to predict the wind power. In addition, the proposed method includes two prediction approaches: single-step prediction (SSP) and multi-step prediction (MSP). The WPP is tested comparatively against the auto-regressive moving average (ARMA) model in terms of predicted values and errors. The validity of the proposed hybrid method is confirmed through error analysis using the probability density function (PDF), mean absolute percent error (MAPE) and mean square error (MSE). Meanwhile, comparison of the correlation coefficients between the actual values and the predicted values for different prediction times and windows confirms that the MSP approach using the hybrid model is the most accurate compared to the SSP approach and ARMA. The MLR&LS method is accurate and promising for solving problems in WPP.

  10. Real time detection of farm-level swine mycobacteriosis outbreak using time series modeling of the number of condemned intestines in abattoirs.

    PubMed

    Adachi, Yasumoto; Makita, Kohei

    2015-09-01

    Mycobacteriosis in swine is a common zoonosis found in abattoirs during meat inspections, and the veterinary authority is expected to inform the producer for corrective actions when an outbreak is detected. The expected value of the number of condemned carcasses due to mycobacteriosis therefore would be a useful threshold to detect an outbreak, and the present study aims to develop such an expected value through time series modeling. The model was developed using eight years of inspection data (2003 to 2010) obtained at 2 abattoirs of the Higashi-Mokoto Meat Inspection Center, Japan. The resulting model was validated by comparing the predicted time-dependent values for the subsequent 2 years with the actual data for 2 years between 2011 and 2012. For the modeling, at first, periodicities were checked using Fast Fourier Transformation, and the ensemble average profiles for weekly periodicities were calculated. An Auto-Regressive Integrated Moving Average (ARIMA) model was fitted to the residual of the ensemble average on the basis of minimum Akaike's information criterion (AIC). The sum of the ARIMA model and the weekly ensemble average was regarded as the time-dependent expected value. During 2011 and 2012, the number of whole or partial condemned carcasses exceeded the 95% confidence interval of the predicted values 20 times. All of these events were associated with the slaughtering of pigs from three producers with the highest rate of condemnation due to mycobacteriosis.
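    A simplified sketch of the detection scheme described above follows: remove a weekly ensemble-average profile, fit an ARIMA model to the residual, and flag days whose observed counts exceed the upper 95% forecast bound. The daily counts, the weekly profile, and the ARIMA order are synthetic assumptions, not the authors' data or settings.

    ```python
    # Weekly ensemble average + ARIMA residual model + 95% threshold for outbreak flagging (sketch).
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(4)
    days = pd.date_range("2003-01-01", periods=8 * 365, freq="D")
    weekly = np.tile([5.0, 9.0, 8.0, 8.0, 7.0, 6.0, 0.0], len(days) // 7 + 1)[: len(days)]
    counts = pd.Series(weekly + rng.poisson(2, len(days)), index=days)

    profile = counts.groupby(counts.index.dayofweek).mean()         # weekly ensemble average
    residual = counts - profile.loc[counts.index.dayofweek].values  # deseasonalized counts

    res = ARIMA(residual, order=(1, 0, 1)).fit()
    upper = res.get_forecast(steps=7).conf_int(alpha=0.05).iloc[:, 1].values

    future = pd.date_range(days[-1] + pd.Timedelta(days=1), periods=7, freq="D")
    threshold = upper + profile.loc[future.dayofweek].values        # expected upper bound per day

    new_obs = np.array([6.0, 10.0, 9.0, 25.0, 7.0, 6.0, 1.0])       # hypothetical next week's counts
    print("flagged days:", list(future[new_obs > threshold].date))
    ```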

  11. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Sim, Alex

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of the resource utilization and scheduling of data movements on high-bandwidth networks to accommodate ever increasing data volume for large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with the traditional approach such as Box-Jenkins methodology to train the ARIMA model, our forecast model reduces computation time up to 92.6 %. It also shows resilience against abrupt network usage changes. Finally, our forecast model conducts the large number of multi-step forecast, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.
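    A hedged sketch of an STL-plus-ARIMA forecast of the kind described above is given below. The 5-minute synthetic utilization series, the daily period of 288 samples, the ARIMA order, and the simple "add back the last seasonal cycle" step are assumptions for illustration, not the paper's implementation.

    ```python
    # STL decomposition + ARIMA on the seasonally adjusted series, multi-step forecast (sketch).
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.seasonal import STL
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(5)
    n, period = 288 * 14, 288                           # two weeks of 5-minute SNMP samples
    t = np.arange(n)
    util = 40 + 20 * np.sin(2 * np.pi * t / period) + rng.normal(0, 3, n)
    series = pd.Series(util)

    stl = STL(series, period=period).fit()
    adjusted = series - stl.seasonal                    # deseasonalized path utilization

    res = ARIMA(adjusted, order=(1, 1, 1)).fit()
    fc = res.forecast(steps=period) + stl.seasonal.iloc[-period:].values  # one day ahead

    mad = stl.resid.abs().mean()                        # MAD of the in-sample decomposition fit
    print(fc.head(), "MAD:", round(mad, 2))
    ```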

  12. Preliminary evidence for the influence of physiography and scale upon the autocorrelation function of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Labovitz, M. L.; Toll, D. L.; Kennard, R. E.

    1980-01-01

    Previously established results demonstrate that LANDSAT data are autocorrelated and can be described by a univariate linear stochastic process known as the auto-regressive integrated moving average model of order (1, 0, 1), or ARIMA(1, 0, 1). This model has two coefficients of interest for interpretation, phi(1) and theta(1). In a comparison of LANDSAT thematic mapper simulator (TMS) data and LANDSAT MSS data, several results were established: (1) The form of the relatedness as described by this model is not dependent upon system look angle or pixel size. (2) The phi(1) coefficient increases with decreasing pixel size and increasing topographic complexity. (3) Changes in topography have a greater influence upon phi(1) than changes in land cover class. (4) Theta(1) seems to vary with the amount of atmospheric haze. These patterns of variation in phi(1) and theta(1) are potentially exploitable by the remote sensing community to yield stochastically independent sets of observations, characterize topography, and reduce the number of bytes needed to store remotely sensed data.

  13. Water quality management using statistical analysis and time-series prediction model

    NASA Astrophysics Data System (ADS)

    Parmar, Kulwinder Singh; Bhardwaj, Rashmi

    2014-12-01

    This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness, and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values and confidence limits. Using an auto regressive integrated moving average model, future values of the water quality parameters have been estimated. It is observed that the predictive model is useful at the 95 % confidence limits and the curve is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen, and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. Also, it is observed that the predicted series is close to the original series, which provides a good fit. All parameters except pH and WT cross the prescribed limits of the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agriculture and industrial use.

  14. A New Navigation Satellite Clock Bias Prediction Method Based on Modified Clock-bias Quadratic Polynomial Model

    NASA Astrophysics Data System (ADS)

    Wang, Y. P.; Lu, Z. P.; Sun, D. S.; Wang, N.

    2016-01-01

    In order to better express the characteristics of satellite clock bias (SCB) and improve SCB prediction precision, this paper proposes a new SCB prediction model which takes the physical characteristics of the space-borne atomic clock, the cyclic variation, and the random part of SCB into consideration. First, the new model employs a quadratic polynomial model with periodic items to fit and extract the trend term and cyclic term of SCB; then, based on the characteristics of the fitting residuals, a time series ARIMA (Auto-Regressive Integrated Moving Average) model is used to model the residuals; eventually, the results from the two models are combined to obtain the final SCB prediction values. Finally, this paper uses precise SCB data from IGS (International GNSS Service) to conduct prediction tests, and the results show that the proposed model is effective and has better prediction performance compared with the quadratic polynomial model, grey model, and ARIMA model. In addition, the new method can also overcome the insufficiency of the ARIMA model in model recognition and order determination.
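    The two-stage structure described above (quadratic-plus-periodic trend fit, then an ARIMA model of the fit residuals, summed for the final prediction) can be sketched as follows. The synthetic clock-bias series, the single 12-hour periodic term, and the ARIMA order are assumptions for demonstration only.

    ```python
    # Quadratic + periodic least-squares trend, ARIMA on the residuals, combined forecast (sketch).
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(6)
    epoch = np.arange(2880.0)                              # 300 s epochs over 10 days
    scb_ns = (100.0 + 0.05 * epoch + 1e-5 * epoch**2
              + 5.0 * np.sin(2 * np.pi * epoch / 144) + rng.normal(0, 1.0, epoch.size))

    # Stage 1: least-squares fit of a quadratic polynomial with one periodic term (12 h = 144 epochs).
    P = 144.0
    A = np.column_stack([np.ones_like(epoch), epoch, epoch**2,
                         np.sin(2 * np.pi * epoch / P), np.cos(2 * np.pi * epoch / P)])
    coef, *_ = np.linalg.lstsq(A, scb_ns, rcond=None)
    resid = scb_ns - A @ coef

    # Stage 2: ARIMA on the residuals; final prediction = trend/periodic extrapolation + residual forecast.
    res = ARIMA(resid, order=(1, 0, 1)).fit()
    new_epoch = epoch[-1] + np.arange(1, 13)
    A_new = np.column_stack([np.ones_like(new_epoch), new_epoch, new_epoch**2,
                             np.sin(2 * np.pi * new_epoch / P), np.cos(2 * np.pi * new_epoch / P)])
    prediction = A_new @ coef + res.forecast(steps=12)
    print(prediction[:3])
    ```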

  15. The Lateral Tracking Control for the Intelligent Vehicle Based on Adaptive PID Neural Network.

    PubMed

    Han, Gaining; Fu, Weiping; Wang, Wen; Wu, Zongsheng

    2017-05-30

    The intelligent vehicle is a complicated nonlinear system, and the design of a path tracking controller is one of the key technologies in intelligent vehicle research. This paper mainly designs a lateral control dynamic model of the intelligent vehicle, which is used for lateral tracking control. Firstly, the vehicle dynamics model (i.e., transfer function) is established according to the vehicle parameters. Secondly, according to the vehicle steering control system and the CARMA (Controlled Auto-Regression and Moving-Average) model, a second-order control system model is built. Using forgetting factor recursive least square estimation (FFRLS), the system parameters are identified. Finally, a neural network PID (Proportion Integral Derivative) controller is established for lateral path tracking control based on the vehicle model and the steering system model. Experimental simulation results show that the proposed model and algorithm have high real-time performance and robustness in path tracking control. This provides a certain theoretical basis for intelligent vehicle autonomous navigation tracking control, and lays the foundation for the vertical and lateral coupling control.

  16. The Satellite Clock Bias Prediction Method Based on Takagi-Sugeno Fuzzy Neural Network

    NASA Astrophysics Data System (ADS)

    Cai, C. L.; Yu, H. G.; Wei, Z. C.; Pan, J. D.

    2017-05-01

    Continuous improvement of the prediction accuracy of Satellite Clock Bias (SCB) is a key problem in precision navigation. In order to improve the precision of SCB prediction and better reflect the change characteristics of SCB, this paper proposes an SCB prediction method based on the Takagi-Sugeno fuzzy neural network. Firstly, the SCB values are pre-treated based on their characteristics. Then, an accurate Takagi-Sugeno fuzzy neural network model is established based on the preprocessed data to predict SCB. This paper uses precise SCB data with different sampling intervals provided by IGS (International Global Navigation Satellite System Service) to carry out short-term prediction experiments, and the results are compared with the ARIMA (Auto-Regressive Integrated Moving Average) model, the GM(1,1) model, and the quadratic polynomial model. The results show that the Takagi-Sugeno fuzzy neural network model is feasible and effective for short-term SCB prediction and performs well for different types of clocks. The prediction results of the proposed method are clearly better than those of the conventional methods.

  17. The Lateral Tracking Control for the Intelligent Vehicle Based on Adaptive PID Neural Network

    PubMed Central

    Han, Gaining; Fu, Weiping; Wang, Wen; Wu, Zongsheng

    2017-01-01

    The intelligent vehicle is a complicated nonlinear system, and the design of a path tracking controller is one of the key technologies in intelligent vehicle research. This paper mainly designs a lateral control dynamic model of the intelligent vehicle, which is used for lateral tracking control. Firstly, the vehicle dynamics model (i.e., transfer function) is established according to the vehicle parameters. Secondly, according to the vehicle steering control system and the CARMA (Controlled Auto-Regression and Moving-Average) model, a second-order control system model is built. Using forgetting factor recursive least square estimation (FFRLS), the system parameters are identified. Finally, a neural network PID (Proportion Integral Derivative) controller is established for lateral path tracking control based on the vehicle model and the steering system model. Experimental simulation results show that the proposed model and algorithm have high real-time performance and robustness in path tracking control. This provides a certain theoretical basis for intelligent vehicle autonomous navigation tracking control, and lays the foundation for the vertical and lateral coupling control. PMID:28556817

  18. Stock price forecasting based on time series analysis

    NASA Astrophysics Data System (ADS)

    Chi, Wan Le

    2018-05-01

    Using historical stock price data to set up a sequence model that explains the intrinsic relationships in the data, the future stock price can be forecasted. The models used are the auto-regressive model, the moving-average model and the autoregressive moving-average model. A unit root test was used to judge whether the original data sequence was stationary. A non-stationary original sequence needed further processing by first order differencing. Then the stationarity of the differenced sequence was re-inspected. If it was still non-stationary, second order differencing of the sequence was carried out. The autocorrelation and partial autocorrelation diagrams were used to evaluate the parameters of the identified ARMA model, including the coefficients of the model and the model order. Finally, the model was used to forecast the Shanghai composite index daily closing price with precision. Results showed that the non-stationary original data series was stationary after the second order difference. The forecast values of the Shanghai composite index daily closing price were close to the actual values, indicating that the ARMA model in the paper achieved a certain accuracy.
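    The workflow described above (unit-root test, difference until stationary, inspect the ACF/PACF to choose an order, fit and forecast) can be sketched as below. The price series is synthetic, and the final (1, d, 1) order is a placeholder that would normally be read off the ACF/PACF plots; the Shanghai Composite data are not reproduced here.

    ```python
    # Unit-root testing, differencing, ACF/PACF inspection, and ARMA forecasting (sketch).
    import numpy as np
    from statsmodels.tsa.stattools import adfuller, acf, pacf
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(7)
    price = 3000 + np.cumsum(np.cumsum(rng.normal(0, 2, 500)))   # I(2)-like synthetic closing prices

    series, d = price.copy(), 0
    while adfuller(series)[1] > 0.05:              # p-value of the unit-root (ADF) test
        series = np.diff(series)                   # difference until stationary
        d += 1

    print("order of differencing:", d)
    print("ACF :", np.round(acf(series, nlags=5), 2))
    print("PACF:", np.round(pacf(series, nlags=5), 2))

    res = ARIMA(price, order=(1, d, 1)).fit()      # placeholder (p, q); d from the test above
    print("next 5 closing prices:", res.forecast(steps=5))
    ```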

  19. Riemannian multi-manifold modeling and clustering in brain networks

    NASA Astrophysics Data System (ADS)

    Slavakis, Konstantinos; Salsabilian, Shiva; Wack, David S.; Muldoon, Sarah F.; Baidoo-Williams, Henry E.; Vettel, Jean M.; Cieslak, Matthew; Grafton, Scott T.

    2017-08-01

    This paper introduces Riemannian multi-manifold modeling in the context of brain-network analytics: Brain-network time-series yield features which are modeled as points lying in or close to a union of a finite number of submanifolds within a known Riemannian manifold. Distinguishing disparate time series amounts thus to clustering multiple Riemannian submanifolds. To this end, two feature-generation schemes for brain-network time series are put forth. The first one is motivated by Granger-causality arguments and uses an auto-regressive moving average model to map low-rank linear vector subspaces, spanned by column vectors of appropriately defined observability matrices, to points into the Grassmann manifold. The second one utilizes (non-linear) dependencies among network nodes by introducing kernel-based partial correlations to generate points in the manifold of positive-definite matrices. Based on recently developed research on clustering Riemannian submanifolds, an algorithm is provided for distinguishing time series based on their Riemannian-geometry properties. Numerical tests on time series, synthetically generated from real brain-network structural connectivity matrices, reveal that the proposed scheme outperforms classical and state-of-the-art techniques in clustering brain-network states/structures.

  20. Predicting groundwater level fluctuations with meteorological effect implications—A comparative study among soft computing techniques

    NASA Astrophysics Data System (ADS)

    Shiri, Jalal; Kisi, Ozgur; Yoon, Heesung; Lee, Kang-Kun; Hossein Nazemi, Amir

    2013-07-01

    The knowledge of groundwater table fluctuations is important in agricultural lands as well as in the studies related to groundwater utilization and management levels. This paper investigates the abilities of Gene Expression Programming (GEP), Adaptive Neuro-Fuzzy Inference System (ANFIS), Artificial Neural Networks (ANN) and Support Vector Machine (SVM) techniques for groundwater level forecasting in following day up to 7-day prediction intervals. Several input combinations comprising water table level, rainfall and evapotranspiration values from Hongcheon Well station (South Korea), covering a period of eight years (2001-2008) were used to develop and test the applied models. The data from the first six years were used for developing (training) the applied models and the last two years data were reserved for testing. A comparison was also made between the forecasts provided by these models and the Auto-Regressive Moving Average (ARMA) technique. Based on the comparisons, it was found that the GEP models could be employed successfully in forecasting water table level fluctuations up to 7 days beyond data records.

  1. Statistical modeling of valley fever data in Kern County, California

    NASA Astrophysics Data System (ADS)

    Talamantes, Jorge; Behseta, Sam; Zender, Charles S.

    2007-03-01

    Coccidioidomycosis (valley fever) is a fungal infection found in the southwestern US, northern Mexico, and some places in Central and South America. The fungus that causes it ( Coccidioides immitis) is normally soil-dwelling but, if disturbed, becomes air-borne and infects the host when its spores are inhaled. It is thus natural to surmise that weather conditions that foster the growth and dispersal of the fungus must have an effect on the number of cases in the endemic areas. We present here an attempt at the modeling of valley fever incidence in Kern County, California, by the implementation of a generalized auto regressive moving average (GARMA) model. We show that the number of valley fever cases can be predicted mainly by considering only the previous history of incidence rates in the county. The inclusion of weather-related time sequences improves the model only to a relatively minor extent. This suggests that fluctuations of incidence rates (about a seasonally varying background value) are related to biological and/or anthropogenic reasons, and not so much to weather anomalies.

  2. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE PAGES

    Yoo, Wucherl; Sim, Alex

    2016-06-24

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of the resource utilization and scheduling of data movements on high-bandwidth networks to accommodate ever increasing data volume for large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with the traditional approach such as Box-Jenkins methodology to train the ARIMA model, our forecast model reduces computation time up to 92.6 %. It also shows resilience against abrupt network usage changes. Finally, our forecast model conducts the large number of multi-step forecast, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.

  3. Seasonality and Trend Forecasting of Tuberculosis Prevalence Data in Eastern Cape, South Africa, Using a Hybrid Model.

    PubMed

    Azeez, Adeboye; Obaromi, Davies; Odeyemi, Akinwumi; Ndege, James; Muntabayi, Ruffin

    2016-07-26

    Tuberculosis (TB) is a deadly infectious disease caused by Mycobacterium tuberculosis. Tuberculosis, as a chronic and highly infectious disease, is prevalent in almost every part of the globe. More than 95% of TB mortality occurs in low/middle income countries. In 2014, approximately 10 million people were diagnosed with active TB and two million died from the disease. In this study, our aim is to compare the predictive powers of the seasonal autoregressive integrated moving average (SARIMA) and the SARIMA-neural network auto-regression (SARIMA-NNAR) models of TB incidence and to analyse its seasonality in South Africa. TB incidence case data from January 2010 to December 2015 were extracted from the Eastern Cape Health facility report of the electronic Tuberculosis Register (ERT.Net). A SARIMA model and a combined SARIMA and neural network auto-regression (SARIMA-NNAR) model were used in analysing and predicting the TB data from 2010 to 2015. Simulation performance parameters of mean square error (MSE), root mean square error (RMSE), mean absolute error (MAE), mean percent error (MPE), mean absolute scaled error (MASE) and mean absolute percentage error (MAPE) were applied to compare the prediction performance of the models. Though both models could predict TB incidence in practice, the combined model displayed better performance. For the combined model, the Akaike information criterion (AIC), second-order AIC (AICc) and Bayesian information criterion (BIC) are 288.56, 308.31 and 299.09, respectively, which are lower than the corresponding values of 329.02, 327.20 and 341.99 for the SARIMA model. The SARIMA-NNAR model forecast a slightly increased seasonal trend in TB incidence compared to the single model. The combined model provided better TB incidence forecasting, with a lower AICc. The model also indicates the need for resolute interventions to reduce infectious disease transmission, given co-infection with HIV and other concomitant diseases, and also at festival peak periods.

  4. The role of personal values, urban form, and auto availability in the analysis of walking for transportation.

    PubMed

    Coogan, Matthew A; Karash, Karla H; Adler, Thomas; Sallis, James

    2007-01-01

    To examine the association of personal values, the built environment, and auto availability with walking for transportation. Participants were drawn from 11 U.S. metropolitan areas with good transit services: 865 adults who had recently made or were contemplating making a residential move. Respondents reported whether walking was their primary mode for nine trip purposes. "Personal values" reflected ratings of 15 variables assessing attitudes about urban and environmental attributes, with high reliability (α = 0.85). Neighborhood form was indicated by a three-item scale. Three binary variables were created to reflect (1) personal values, (2) neighborhood form, and (3) auto availability. The association with walking was reported for each of the three variables, each combination of two variables, and the combination of three variables. An analysis of covariance was applied, and a hierarchic linear regression model was developed. All three variables were associated with walking, and all three variables interacted. The standardized coefficients were 0.23 for neighborhood form, 0.21 for autos per person, and 0.18 for personal values. Positive attitudes about urban attributes, living in a supportive neighborhood, and low automobile availability significantly predicted more walking for transportation. A framework for further research is proposed in which a factor representing the role of the automobile is examined explicitly in addition to personal values and urban form.

  5. Evaluation of statistical models for forecast errors from the HBV model

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjørn; Renard, Benjamin; Steinsland, Ingelin; Kolberg, Sjur

    2010-04-01

    Three statistical models for the forecast errors for inflow into the Langvatn reservoir in Northern Norway have been constructed and tested according to the agreement between (i) the forecast distribution and the observations and (ii) median values of the forecast distribution and the observations. For the first model, observed and forecasted inflows were transformed by the Box-Cox transformation before a first order auto-regressive model was constructed for the forecast errors. The parameters were conditioned on weather classes. In the second model the Normal Quantile Transformation (NQT) was applied on observed and forecasted inflows before a similar first order auto-regressive model was constructed for the forecast errors. For the third model positive and negative errors were modeled separately. The errors were first NQT-transformed before conditioning the mean error values on climate, forecasted inflow and yesterday's error. To test the three models we applied three criteria: we wanted (a) the forecast distribution to be reliable; (b) the forecast intervals to be narrow; (c) the median values of the forecast distribution to be close to the observed values. Models 1 and 2 gave almost identical results. The median values improved the forecast, with the Nash-Sutcliffe R_eff increasing from 0.77 for the original forecast to 0.87 for the corrected forecasts. Models 1 and 2 over-estimated the forecast intervals but gave the narrowest intervals. Their main drawback was that the distributions are less reliable than Model 3. For Model 3 the median values did not fit well since the auto-correlation was not accounted for. Since Model 3 did not benefit from the potential variance reduction that lies in bias estimation and removal, it gave on average wider forecast intervals than the two other models. At the same time Model 3 on average slightly under-estimated the forecast intervals, probably explained by the use of average measures to evaluate the fit.
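    A minimal sketch of the structure of Model 1 (Box-Cox transform of observed and forecasted inflow, then a first-order auto-regressive model of the transformed forecast error) is given below. The inflow series is synthetic and the weather-class conditioning is omitted, so this is a simplification, not the paper's model.

    ```python
    # Box-Cox transform + AR(1) model of transformed forecast errors (simplified sketch).
    import numpy as np
    from scipy.stats import boxcox
    from statsmodels.tsa.ar_model import AutoReg

    rng = np.random.default_rng(8)
    observed = 50 + 20 * np.abs(np.sin(np.arange(730) / 58.0)) + rng.gamma(2, 2, 730)
    forecasted = observed * rng.normal(1.0, 0.1, 730)            # imperfect inflow forecasts

    obs_bc, lam = boxcox(observed)                               # a common lambda is assumed
    fc_bc = (forecasted**lam - 1) / lam                          # same transform on the forecasts
    errors = obs_bc - fc_bc

    ar1 = AutoReg(errors, lags=1).fit()
    print("AR(1) parameters [const, phi_1]:", ar1.params)

    # One-step-ahead error correction in transformed space, then back-transform.
    next_error = ar1.params[0] + ar1.params[1] * errors[-1]
    corrected_bc = fc_bc[-1] + next_error                        # last forecast as a stand-in for tomorrow's
    corrected = (lam * corrected_bc + 1) ** (1 / lam)
    print("bias-corrected inflow forecast:", round(corrected, 2))
    ```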

  6. Using Baidu Search Index to Predict Dengue Outbreak in China

    NASA Astrophysics Data System (ADS)

    Liu, Kangkang; Wang, Tao; Yang, Zhicong; Huang, Xiaodong; Milinovich, Gabriel J.; Lu, Yi; Jing, Qinlong; Xia, Yao; Zhao, Zhengyang; Yang, Yang; Tong, Shilu; Hu, Wenbiao; Lu, Jiahai

    2016-12-01

    This study identified possible thresholds for predicting dengue fever (DF) outbreaks using the Baidu Search Index (BSI). Time-series classification and regression tree models based on the BSI were used to develop a predictive model for DF outbreaks in Guangzhou and Zhongshan, China. In the regression tree models, the mean autochthonous DF incidence rate increased approximately 30-fold in Guangzhou when the weekly BSI for DF at the lagged moving average of 1-3 weeks was more than 382. When the weekly BSI for DF at the lagged moving average of 1-5 weeks was more than 91.8, there was an approximately 9-fold increase in the mean autochthonous DF incidence rate in Zhongshan. In the classification tree models, the results showed that when the weekly BSI for DF at the lagged moving average of 1-3 weeks was more than 99.3, there was an 89.28% chance of a DF outbreak in Guangzhou, while in Zhongshan, when the weekly BSI for DF at the lagged moving average of 1-5 weeks was more than 68.1, the chance of a DF outbreak rose to 100%. The study indicated that low-cost internet-based surveillance systems can be a valuable complement to traditional DF surveillance in China.

  7. Auto-optimization of dewetting rates by rim instabilities in slipping polymer films.

    PubMed

    Reiter, G; Sharma, A

    2001-10-15

    We investigated the instability of the moving rim in dewetting of slipping polymer films. Small fluctuations of the width of the rim get spontaneously amplified since narrower sections of the rim move faster than wider ones due to frictional forces being proportional to the width of the rim. Instability leads eventually to an autocontrol of the rim width by the continuous formation of droplets with a mean size proportional to the initial film thickness. Surprisingly, the mean dewetting velocity at late stages, averaged over the length of the rim, was found to be constant. Thus, the instability of the rim enabled a more efficient, i.e., faster, "drying" of the substrate. Nonslipping films did not show this instability.

  8. Auto-Optimization of Dewetting Rates by Rim Instabilities in Slipping Polymer Films

    NASA Astrophysics Data System (ADS)

    Reiter, Günter; Sharma, Ashutosh

    2001-10-01

    We investigated the instability of the moving rim in dewetting of slipping polymer films. Small fluctuations of the width of the rim get spontaneously amplified since narrower sections of the rim move faster than wider ones due to frictional forces being proportional to the width of the rim. Instability leads eventually to an autocontrol of the rim width by the continuous formation of droplets with a mean size proportional to the initial film thickness. Surprisingly, the mean dewetting velocity at late stages, averaged over the length of the rim, was found to be constant. Thus, the instability of the rim enabled a more efficient, i.e., faster, ``drying'' of the substrate. Nonslipping films did not show this instability.

  9. Tracking Electroencephalographic Changes Using Distributions of Linear Models: Application to Propofol-Based Depth of Anesthesia Monitoring.

    PubMed

    Kuhlmann, Levin; Manton, Jonathan H; Heyse, Bjorn; Vereecke, Hugo E M; Lipping, Tarmo; Struys, Michel M R F; Liley, David T J

    2017-04-01

    Tracking brain states with electrophysiological measurements often relies on short-term averages of extracted features, and this may not adequately capture the variability of brain dynamics. The objective is to assess the hypotheses that this can be overcome by tracking distributions of linear models using anesthesia data, and that the anesthetic brain-state tracking performance of linear models is comparable to that of a high-performing depth-of-anesthesia monitoring feature. Individuals' brain states are classified by comparing the distribution of linear (auto-regressive moving average, ARMA) model parameters estimated from electroencephalographic (EEG) data obtained with a sliding window to distributions of linear model parameters for each brain state. The method is applied to frontal EEG data from 15 subjects undergoing propofol anesthesia and classified by the observer's assessment of alertness/sedation (OAA/S) scale. Classification of the OAA/S score was performed using distributions of either ARMA parameters or the benchmark feature, Higuchi fractal dimension. The highest average testing sensitivity of 59% (chance sensitivity: 17%) was found for ARMA(2,1) models, while the Higuchi fractal dimension achieved 52%; however, no statistically significant difference was observed. For the same ARMA case, there was no statistical difference if medians are used instead of distributions (sensitivity: 56%). The model-based distribution approach is not necessarily more effective than a median/short-term average approach; however, it performs well compared with a distribution approach based on a high-performing anesthesia monitoring measure. These techniques hold potential for anesthesia monitoring and may be generally applicable for tracking brain states.
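
    A sketch of the sliding-window ARMA(2,1) parameter extraction that underlies the distribution comparison, using statsmodels. The EEG trace, sampling rate, window length and hop are all assumed for illustration; the state-classification step is not reproduced.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        # Hypothetical single-channel EEG segment (arbitrary units).
        rng = np.random.default_rng(2)
        eeg = rng.normal(size=5000)

        fs = 250                 # assumed sampling rate, Hz
        win = 2 * fs             # 2-second sliding window
        step = fs // 2           # 0.5-second hop

        params = []
        for start in range(0, len(eeg) - win, step):
            segment = eeg[start:start + win]
            # ARMA(2,1) is ARIMA with no differencing: order = (p, d, q) = (2, 0, 1).
            fit = ARIMA(segment, order=(2, 0, 1)).fit()
            params.append(fit.params)

        # The per-window parameter vectors form the empirical distribution that is
        # compared against reference parameter distributions for each brain state.
        params = np.array(params)
        print("windows x parameters:", params.shape)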

  10. Naive vs. Sophisticated Methods of Forecasting Public Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Two sophisticated--autoregressive integrated moving average (ARIMA), straight-line regression--and two naive--simple average, monthly average--forecasting techniques were used to forecast monthly circulation totals of 34 public libraries. Comparisons of forecasts and actual totals revealed that ARIMA and monthly average methods had smallest mean…

  11. Long-range correlation and market segmentation in bond market

    NASA Astrophysics Data System (ADS)

    Wang, Zhongxing; Yan, Yan; Chen, Xiaosong

    2017-09-01

    This paper investigates long-range auto-correlations and cross-correlations in the bond market. Based on the Detrended Moving Average (DMA) method, the empirical results present clear evidence of long-range persistence on the one-year scale. The degree of long-range correlation related to maturity shows an upward tendency, with a peak at the short end. These findings confirm the expectations of the fractal market hypothesis (FMH). Furthermore, we have developed a complex-network-based method to study the long-range cross-correlation structure and applied it to our data, finding a clear pattern of market segmentation in the long run. We also examined the long-range correlations in the sub-periods 2007-2012 and 2011-2016. The results show that long-range auto-correlations have been weakening in recent years, while long-range cross-correlations have been strengthening.
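
    A minimal numpy sketch of the detrended moving average (DMA) scaling estimate used to quantify long-range persistence. The input series is synthetic white noise (expected exponent near 0.5); the backward-average variant and the window set are assumptions, not the paper's exact settings.

        import numpy as np

        def dma_exponent(x, windows):
            """Detrended Moving Average scaling exponent of a 1-D series."""
            y = np.cumsum(x - np.mean(x))            # integrated profile
            F = []
            for n in windows:
                # Backward moving average of the profile over window n.
                kernel = np.ones(n) / n
                ma = np.convolve(y, kernel, mode="valid")
                # Fluctuation of the profile around its moving average (aligned part).
                resid = y[n - 1:] - ma
                F.append(np.sqrt(np.mean(resid ** 2)))
            # Slope of log F(n) vs log n estimates the scaling (Hurst-like) exponent.
            slope, _ = np.polyfit(np.log(windows), np.log(F), 1)
            return slope

        rng = np.random.default_rng(3)
        returns = rng.normal(size=4096)              # uncorrelated noise: exponent near 0.5
        print(dma_exponent(returns, windows=[4, 8, 16, 32, 64, 128]))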

  12. Momentum conserving Brownian dynamics propagator for complex soft matter fluids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padding, J. T.; Briels, W. J.

    2014-12-28

    We present a Galilean invariant, momentum conserving first order Brownian dynamics scheme for coarse-grained simulations of highly frictional soft matter systems. Friction forces are taken to be with respect to moving background material. The motion of the background material is described by locally averaged velocities in the neighborhood of the dissolved coarse coordinates. The velocity variables are updated by a momentum conserving scheme. The properties of the stochastic updates are derived through the Chapman-Kolmogorov and Fokker-Planck equations for the evolution of the probability distribution of coarse-grained position and velocity variables, by requiring the equilibrium distribution to be a stationary solution. We test our new scheme on concentrated star polymer solutions and find that the transverse current and velocity time auto-correlation functions behave as expected from hydrodynamics. In particular, the velocity auto-correlation functions display a long time tail in complete agreement with hydrodynamics.

  13. Assessing the impacts of Saskatchewan's minimum alcohol pricing regulations on alcohol-related crime.

    PubMed

    Stockwell, Tim; Zhao, Jinhui; Sherk, Adam; Callaghan, Russell C; Macdonald, Scott; Gatley, Jodi

    2017-07-01

    Saskatchewan's introduction in April 2010 of minimum prices graded by alcohol strength led to an average minimum price increase of 9.1% per Canadian standard drink (=13.45 g ethanol). This increase was shown to be associated with reduced consumption and switching to lower alcohol content beverages. Police also informally reported marked reductions in night-time alcohol-related crime. This study aims to assess the impacts of changes to Saskatchewan's minimum alcohol-pricing regulations between 2008 and 2012 on selected crime events often related to alcohol use. Data were obtained from Canada's Uniform Crime Reporting Survey. Auto-regressive integrated moving average time series models were used to test immediate and lagged associations between minimum price increases and rates of night-time and police identified alcohol-related crimes. Controls were included for simultaneous crime rates in the neighbouring province of Alberta, economic variables, linear trend, seasonality and autoregressive and/or moving-average effects. The introduction of increased minimum-alcohol prices was associated with an abrupt decrease in night-time alcohol-related traffic offences for men (-8.0%, P < 0.001), but not women. No significant immediate changes were observed for non-alcohol-related driving offences, disorderly conduct or violence. Significant monthly lagged effects were observed for violent offences (-19.7% at month 4 to -18.2% at month 6), which broadly corresponded to lagged effects in on-premise alcohol sales. Increased minimum alcohol prices may contribute to reductions in alcohol-related traffic-related and violent crimes perpetrated by men. Observed lagged effects for violent incidents may be due to a delay in bars passing on increased prices to their customers, perhaps because of inventory stockpiling. [Stockwell T, Zhao J, Sherk A, Callaghan RC, Macdonald S, Gatley J. Assessing the impacts of Saskatchewan's minimum alcohol pricing regulations on alcohol-related crime. Drug Alcohol Rev 2017;36:492-501]. © 2016 Australasian Professional Society on Alcohol and other Drugs.

  14. Comparison of 3-D Multi-Lag Cross-Correlation and Speckle Brightness Aberration Correction Algorithms on Static and Moving Targets

    PubMed Central

    Ivancevich, Nikolas M.; Dahl, Jeremy J.; Smith, Stephen W.

    2010-01-01

    Phase correction has the potential to increase the image quality of 3-D ultrasound, especially transcranial ultrasound. We implemented and compared 2 algorithms for aberration correction, multi-lag cross-correlation and speckle brightness, using static and moving targets. We corrected three 75-ns rms electronic aberrators with full-width at half-maximum (FWHM) auto-correlation lengths of 1.35, 2.7, and 5.4 mm. Cross-correlation proved the better algorithm at 2.7 and 5.4 mm correlation lengths (P < 0.05). Static cross-correlation performed better than moving-target cross-correlation at the 2.7 mm correlation length (P < 0.05). Finally, we compared the static and moving-target cross-correlation on a flow phantom with a skull casting aberrator. Using signal from static targets, the correction resulted in an average contrast increase of 22.2%, compared with 13.2% using signal from moving targets. The contrast-to-noise ratio (CNR) increased by 20.5% and 12.8% using static and moving targets, respectively. Doppler signal strength increased by 5.6% and 4.9% for the static and moving-targets methods, respectively. PMID:19942503

  15. Comparison of 3-D multi-lag cross- correlation and speckle brightness aberration correction algorithms on static and moving targets.

    PubMed

    Ivancevich, Nikolas M; Dahl, Jeremy J; Smith, Stephen W

    2009-10-01

    Phase correction has the potential to increase the image quality of 3-D ultrasound, especially transcranial ultrasound. We implemented and compared 2 algorithms for aberration correction, multi-lag cross-correlation and speckle brightness, using static and moving targets. We corrected three 75-ns rms electronic aberrators with full-width at half-maximum (FWHM) auto-correlation lengths of 1.35, 2.7, and 5.4 mm. Cross-correlation proved the better algorithm at 2.7 and 5.4 mm correlation lengths (P < 0.05). Static cross-correlation performed better than moving-target cross-correlation at the 2.7 mm correlation length (P < 0.05). Finally, we compared the static and moving-target cross-correlation on a flow phantom with a skull casting aberrator. Using signal from static targets, the correction resulted in an average contrast increase of 22.2%, compared with 13.2% using signal from moving targets. The contrast-to-noise ratio (CNR) increased by 20.5% and 12.8% using static and moving targets, respectively. Doppler signal strength increased by 5.6% and 4.9% for the static and moving-targets methods, respectively.

  16. Climate variability, weather and enteric disease incidence in New Zealand: time series analysis.

    PubMed

    Lal, Aparna; Ikeda, Takayoshi; French, Nigel; Baker, Michael G; Hales, Simon

    2013-01-01

    Evaluating the influence of climate variability on enteric disease incidence may improve our ability to predict how climate change may affect these diseases. To examine the associations between regional climate variability and enteric disease incidence in New Zealand, associations between monthly climate and enteric diseases (campylobacteriosis, salmonellosis, cryptosporidiosis, giardiasis) were investigated using Seasonal Auto Regressive Integrated Moving Average (SARIMA) models. No climatic factors were significantly associated with campylobacteriosis and giardiasis, with similar predictive power for univariate and multivariate models. Cryptosporidiosis was positively associated with the average temperature of the previous month (β = 0.130, SE = 0.060, p < 0.01) and inversely related to the Southern Oscillation Index (SOI) two months previously (β = -0.008, SE = 0.004, p < 0.05). By contrast, salmonellosis was positively associated with the temperature of the current month (β = 0.110, SE = 0.020, p < 0.001) and the SOI of the current (β = 0.005, SE = 0.002, p < 0.05) and previous month (β = 0.005, SE = 0.002, p < 0.05). Forecasting accuracy of the multivariate models for cryptosporidiosis and salmonellosis was significantly higher. Although spatial heterogeneity in the observed patterns could not be assessed, these results suggest that temporally lagged relationships between climate variables and national communicable disease incidence data can contribute to disease prediction models and early warning systems.
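
    A sketch of a seasonal ARIMA with lagged climate covariates, in the spirit of the SARIMA models described here, using statsmodels' SARIMAX. The monthly case, temperature and SOI series are synthetic, and the (1,0,1)(1,0,1,12) order is illustrative rather than the study's selected model.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        # Hypothetical monthly notification counts and climate covariates.
        rng = np.random.default_rng(4)
        idx = pd.date_range("2000-01", periods=120, freq="MS")
        temp = pd.Series(15 + 8 * np.sin(2 * np.pi * idx.month / 12) + rng.normal(0, 1, 120), index=idx)
        soi = pd.Series(rng.normal(0, 10, 120), index=idx)
        cases = pd.Series(np.exp(2 + 0.05 * temp + rng.normal(0, 0.2, 120)), index=idx)

        # Lagged covariates: temperature of the previous month, SOI of two months earlier.
        exog = pd.concat([temp.shift(1).rename("temp_lag1"),
                          soi.shift(2).rename("soi_lag2")], axis=1).dropna()
        y = cases.loc[exog.index]

        # Seasonal ARIMA with exogenous regressors (order chosen for illustration only).
        model = SARIMAX(y, exog=exog, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12))
        res = model.fit(disp=False)
        print(res.params[["temp_lag1", "soi_lag2"]])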

  17. Development, Verification and Use of Gust Modeling in the NASA Computational Fluid Dynamics Code FUN3D

    NASA Technical Reports Server (NTRS)

    Bartels, Robert E.

    2012-01-01

    This paper presents the implementation of a gust modeling capability in the CFD code FUN3D. The gust capability is verified by computing the response of an airfoil to a sharp-edged gust and comparing it with the theoretical result; the present simulations are also compared with other CFD gust simulations. This paper also serves as a user's manual for FUN3D gust analyses using a variety of gust profiles. Finally, the development of an Auto-Regressive Moving-Average (ARMA) reduced-order gust model using a gust with a Gaussian profile in the FUN3D code is presented. ARMA-simulated results for a sequence of one-minus-cosine gusts are shown to compare well with the same gust profile computed with FUN3D. Proper Orthogonal Decomposition (POD) is combined with the ARMA modeling technique to predict the time-varying pressure coefficient increment distribution due to a novel gust profile. The aeroelastic response of a pitch/plunge airfoil to a gust environment is computed with a reduced-order model and compared with a direct simulation of the system in the FUN3D code; the two results are found to agree very well.

  18. Population drinking and fatal injuries in Eastern Europe: a time-series analysis of six countries.

    PubMed

    Landberg, Jonas

    2010-01-01

    To estimate to what extent injury mortality rates in 6 Eastern European countries are affected by changes in population drinking during the post-war period. The analysis included injury mortality rates and per capita alcohol consumption in Russia, Belarus, Poland, Hungary, Bulgaria and the former Czechoslovakia. Total population and gender-specific models were estimated using auto regressive integrated moving average time-series modelling. The estimates for the total population were generally positive and significant. For Russia and Belarus, a 1-litre increase in per capita consumption was associated with an increase in injury mortality of 7.5 and 5.5 per 100,000 inhabitants, respectively. The estimates for the remaining countries ranged between 1.4 and 2.0. The gender-specific estimates displayed national variations similar to the total population estimates although the estimates for males were higher than for females in all countries. The results suggest that changes in per capita consumption have a significant impact on injury mortality in these countries, but the strength of the association tends to be stronger in countries where intoxication-oriented drinking is more common. Copyright 2009 S. Karger AG, Basel.

  19. Reference Aid: Glossary of Acronyms, Abbreviations, and Special Terms Used in the Western Europe Romance-Language Press.

    DTIC Science & Technology

    1977-11-25

    Fragmentary glossary entries recoverable from this record include: Auto-Metralhadora Ligeira (Por), light machine gun; Aviazione Militare (Ita), air force; Italian Socialist Youth Federation (Ita); General Federation of Labor of Belgium (Bel); FIA, Fédération Internationale de l'Auto…; Obrero Autogestionario (Spa), Self-Management Workers Movement; MOC, Mouvement des Ouvriers Chrétiens; Mocidade Portuguesa; MODEF, Mouvement de…

  20. Road Traffic Injury Trends in the City of Valledupar, Colombia. A Time Series Study from 2008 to 2012

    PubMed Central

    Rodríguez, Jorge Martín; Peñaloza, Rolando Enrique; Moreno Montoya, José

    2015-01-01

    Objective To analyze the temporal behavior of road-traffic injuries (RTI) in Valledupar, Colombia, from January 2008 to December 2012. Methodology An observational study was conducted based on records from the Colombian National Legal Medicine and Forensic Sciences Institute regional office in Valledupar. Different variables were analyzed, such as the injured person's sex, age, education level, and type of road user; the timeframe, place and circumstances of crashes; and the vehicles associated with the occurrence. Furthermore, a time series analysis was conducted using an auto-regressive integrated moving average model. Results There were 105 events per month on average; 64.9% of RTI involved men; 82.3% of the persons injured were from 18 to 59 years of age; the average age was 35.4 years; the road users most involved in RTI were motorcyclists (69%), followed by pedestrians (12%); 70% had up to upper-secondary education. Sunday was the day with the most RTI occurrences, and 93% of the RTI occurred in the urban area. The time series showed a seasonal pattern and a significant trend effect. The modeling process verified the existence of both memory and related extrinsic variables. Conclusions An RTI occurrence pattern was identified, which showed an upward trend during the period analyzed. Motorcyclists were the main road users involved in RTI, which suggests the need to design and implement specific measures for that type of road user, from regulations for graduated licensing for young drivers to monitoring road user behavior for the promotion of road safety. PMID:26657887

  1. Road Traffic Injury Trends in the City of Valledupar, Colombia. A Time Series Study from 2008 to 2012.

    PubMed

    Rodríguez, Jorge Martín; Peñaloza, Rolando Enrique; Moreno Montoya, José

    2015-01-01

    To analyze the temporal behavior of road-traffic injuries (RTI) in Valledupar, Colombia, from January 2008 to December 2012. An observational study was conducted based on records from the Colombian National Legal Medicine and Forensic Sciences Institute regional office in Valledupar. Different variables were analyzed, such as the injured person's sex, age, education level, and type of road user; the timeframe, place and circumstances of crashes; and the vehicles associated with the occurrence. Furthermore, a time series analysis was conducted using an auto-regressive integrated moving average model. There were 105 events per month on average; 64.9% of RTI involved men; 82.3% of the persons injured were from 18 to 59 years of age; the average age was 35.4 years; the road users most involved in RTI were motorcyclists (69%), followed by pedestrians (12%); 70% had up to upper-secondary education. Sunday was the day with the most RTI occurrences, and 93% of the RTI occurred in the urban area. The time series showed a seasonal pattern and a significant trend effect. The modeling process verified the existence of both memory and related extrinsic variables. An RTI occurrence pattern was identified, which showed an upward trend during the period analyzed. Motorcyclists were the main road users involved in RTI, which suggests the need to design and implement specific measures for that type of road user, from regulations for graduated licensing for young drivers to monitoring road user behavior for the promotion of road safety.

  2. A landslide-quake detection algorithm with STA/LTA and diagnostic functions of moving average and scintillation index: A preliminary case study of the 2009 Typhoon Morakot in Taiwan

    NASA Astrophysics Data System (ADS)

    Wu, Yu-Jie; Lin, Guan-Wei

    2017-04-01

    Since 1999, Taiwan has experienced a rapid rise in the number of landslides, which reached a peak after the 2009 Typhoon Morakot. Although ground-motion signals induced by slope processes can be recorded by seismographs, they are difficult to distinguish in continuous seismic records due to the lack of distinct P and S waves. In this study, we combine three common seismic detectors: the short-term average/long-term average (STA/LTA) approach and two diagnostic functions based on the moving average and the scintillation index. Based on these detectors, we establish an auto-detection algorithm for landslide-quakes, with detection thresholds defined to distinguish landslide-quakes from earthquakes and background noise. To further evaluate the proposed algorithm, we apply it to seismic archives recorded by the Broadband Array in Taiwan for Seismology (BATS) during the 2009 Typhoon Morakot and locate the discrete landslide-quakes detected automatically. The detection results are consistent with those of visual inspection, so the algorithm can be used to monitor landslide-quakes automatically.
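
    A minimal numpy sketch of the STA/LTA component: short- and long-term moving averages of a characteristic function and a trigger threshold. The trace, window lengths and threshold are hypothetical, and the moving-average and scintillation-index diagnostics of the full algorithm are not reproduced.

        import numpy as np

        def sta_lta(signal, fs, sta_win=1.0, lta_win=30.0):
            """Classic STA/LTA ratio on the squared signal (characteristic function)."""
            cf = signal.astype(float) ** 2
            n_sta = int(sta_win * fs)
            n_lta = int(lta_win * fs)
            sta = np.convolve(cf, np.ones(n_sta) / n_sta, mode="same")
            lta = np.convolve(cf, np.ones(n_lta) / n_lta, mode="same")
            return sta / np.maximum(lta, 1e-12)

        # Hypothetical 100 Hz trace with an emergent, low-frequency "landslide-quake" onset.
        rng = np.random.default_rng(5)
        fs = 100
        trace = rng.normal(0, 1, 60 * fs)
        trace[3000:3600] += 6 * np.sin(2 * np.pi * 2 * np.arange(600) / fs)

        ratio = sta_lta(trace, fs)
        threshold = 4.0                      # illustrative trigger level
        triggers = np.where(ratio > threshold)[0]
        print("first trigger sample:", triggers[0] if triggers.size else None)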

  3. [Correlation coefficient-based classification method of hydrological dependence variability: With auto-regression model as example].

    PubMed

    Zhao, Yu Xi; Xie, Ping; Sang, Yan Fang; Wu, Zi Yi

    2018-04-01

    Hydrological processes are temporally dependent. Hydrological time series that include dependence components do not meet the consistency assumption underlying hydrological computation. Both factors cause great difficulty for water research. Given the existence of hydrological dependence variability, we proposed a correlation-coefficient-based method for evaluating the significance of hydrological dependence based on an auto-regression model. By calculating the correlation coefficient between the original series and its dependence component and selecting reasonable thresholds of the correlation coefficient, the method divides the significance of dependence into no variability, weak variability, mid variability, strong variability, and drastic variability. By deducing the relationship between the correlation coefficient and the auto-correlation coefficients at each order of the series, we found that the correlation coefficient is mainly determined by the magnitude of the auto-correlation coefficients from order 1 to order p, which clarifies the theoretical basis of the method. With the first-order and second-order auto-regression models as examples, the reasonableness of the deduced formula was verified through Monte Carlo experiments that classify the relationship between the correlation coefficient and the auto-correlation coefficient. The method was used to analyze three observed hydrological time series. The results indicate the coexistence of stochastic and dependence characteristics in hydrological processes.
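
    A small Monte Carlo sketch in the spirit of the deduction checked here: for a simulated AR(1) series, the correlation between the series and its fitted dependence component tracks the lag-1 auto-correlation coefficient. The series length and coefficients are illustrative, and the paper's variability grades and thresholds are not reproduced.

        import numpy as np
        from statsmodels.tsa.ar_model import AutoReg

        rng = np.random.default_rng(6)

        def ar1_series(phi, n=2000):
            x = np.zeros(n)
            eps = rng.normal(size=n)
            for t in range(1, n):
                x[t] = phi * x[t - 1] + eps[t]
            return x

        for phi in (0.1, 0.3, 0.6, 0.9):
            x = ar1_series(phi)
            fit = AutoReg(x, lags=1).fit()
            # Dependence component: the part of x explained by its own past.
            dep = fit.params[0] + fit.params[1] * x[:-1]
            r = np.corrcoef(x[1:], dep)[0, 1]
            print(f"phi={phi:.1f}  corr(series, dependence)={r:.3f}  lag-1 coeff={fit.params[1]:.3f}")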

  4. XMGR5 users manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, K.R.; Fisher, J.E.

    1997-03-01

    ACE/gr is an XY plotting tool for workstations or X-terminals using X. A few of its features are: user-defined scaling, tick marks, labels, symbols, line styles, colors; batch mode for unattended plotting; reading and writing of parameters used during a session; polynomial regression, splines, running averages, DFT/FFT, cross-/auto-correlation; hardcopy support for PostScript, HP-GL, and FrameMaker .mif formats. While ACE/gr has a convenient point-and-click interface, most parameter settings and operations are available through a command-line interface (found in Files/Commands).

  5. Creating Reconfigurable Materials Using ``Colonies'' of Oscillating Polymer Gels

    NASA Astrophysics Data System (ADS)

    Deb, Debabrata; Dayal, Pratyush; Kuksenok, Olga; Balazs, Anna

    2013-03-01

    Species ranging from single-cell organisms to social insects can undergo auto-chemotaxis, where the entities move towards a chemo-attractant that they themselves emit. This mode of signaling allows the organisms to form large-scale structures. Using computational modeling, we show that millimeter-sized polymer gels can display similar auto-chemotaxis. In particular, we demonstrate that gels undergoing the self-oscillating Belousov-Zhabotinsky (BZ) reaction not only respond to a chemical signal from the surrounding solution, but also emit this signal and thus, multiple gel pieces can spontaneously self-aggregate. We focus on the collective behavior of ``colonies'' of BZ gels and show that communication between the individual pieces critically depends on all the neighboring gels. We isolate the conditions at which the BZ gels can undergo a type of self-recombining: if a larger gel is cut into distinct pieces that are moved relatively far apart, then their auto-chemotactic behavior drives them to move and autonomously recombine into a structure resembling the original, uncut sample. These findings reveal that the BZ gels can be used as autonomously moving building blocks to construct multiple structures and thus, provide a new route for creating dynamically reconfigurable materials.

  6. [Design of longitudinal auto-tracking of the detector on X-ray in digital radiography].

    PubMed

    Yu, Xiaomin; Jiang, Tianhao; Liu, Zhihong; Zhao, Xu

    2018-04-01

    An algorithm is designed to implement longitudinal auto-tracking of the detector on the X-ray in a digital radiography (DR) system with a manual collimator. In this study, when the longitudinal length of the field of view (LFOV) on the detector coincides with the longitudinal effective imaging size of the detector, the collimator half opening angle (Ψ), the maximum centric distance (e max) between the center of the X-ray field of view and the projection center of the focal spot, and the detector moving distance for auto-tracking can be calculated automatically. When the LFOV is smaller than the longitudinal effective imaging size of the detector because Ψ is reduced, e max can still be used to calculate the detector moving distance. Using this auto-tracking algorithm in a DR system with a manual collimator, the test results show that the X-ray projection is totally covered by the effective imaging area of the detector, even though the center of the field of view is not aligned with the center of the effective imaging area of the detector. As a simple and low-cost design, the algorithm can be used for longitudinal auto-tracking of the detector on the X-ray in manual-collimator DR.

  7. AutoDockFR: Advances in Protein-Ligand Docking with Explicitly Specified Binding Site Flexibility

    PubMed Central

    Ravindranath, Pradeep Anand; Forli, Stefano; Goodsell, David S.; Olson, Arthur J.; Sanner, Michel F.

    2015-01-01

    Automated docking of drug-like molecules into receptors is an essential tool in structure-based drug design. While modeling receptor flexibility is important for correctly predicting ligand binding, it still remains challenging. This work focuses on an approach in which receptor flexibility is modeled by explicitly specifying a set of receptor side-chains a priori. The challenges of this approach include: (1) the exponential growth of the search space, demanding more efficient search methods; and (2) the increased number of false positives, calling for scoring functions tailored to flexible receptor docking. We present AutoDockFR (AutoDock for Flexible Receptors, ADFR), a new docking engine based on the AutoDock4 scoring function, which addresses the aforementioned challenges with a new Genetic Algorithm (GA) and a customized scoring function. We validate ADFR using the Astex Diverse Set, demonstrating an increase in efficiency and reliability of its GA over the one implemented in AutoDock4. We demonstrate greatly increased success rates when cross-docking ligands into apo receptors that require side-chain conformational changes for ligand binding. These cross-docking experiments are based on two datasets: (1) SEQ17, a receptor diversity set containing 17 pairs of apo-holo structures; and (2) CDK2, a ligand diversity set composed of one CDK2 apo structure and 52 known bound inhibitors. We show that, when cross-docking ligands into the apo conformation of receptors with up to 14 flexible side-chains, ADFR reports more correctly cross-docked ligands than AutoDock Vina on both datasets, with solutions found for 70.6% vs. 35.3% of systems on SEQ17, and 76.9% vs. 61.5% on CDK2. ADFR also outperforms AutoDock Vina in the number of top-ranking solutions on both datasets. Furthermore, we show that correctly docked CDK2 complexes re-create on average 79.8% of all pairwise atomic interactions between the ligand and moving receptor atoms in the holo complexes. Finally, we show that down-weighting the receptor internal energy improves the ranking of correctly docked poses and that runtime for AutoDockFR scales linearly when side-chain flexibility is added. PMID:26629955

  8. Forecasting Daily Patient Outflow From a Ward Having No Real-Time Clinical Data

    PubMed Central

    Tran, Truyen; Luo, Wei; Phung, Dinh; Venkatesh, Svetha

    2016-01-01

    Background: Modeling patient flow is crucial in understanding resource demand and prioritization. We study patient outflow from an open ward in an Australian hospital, where bed allocation is currently carried out by a manager relying on past experience and observed demand. Automatic methods that provide a reasonable estimate of total next-day discharges can aid efficient bed management. The challenges in building such methods lie in dealing with the large amount of discharge noise introduced by the nonlinear nature of hospital procedures, and the nonavailability of real-time clinical information in wards. Objective: Our study investigates different models to forecast the total number of next-day discharges from an open ward having no real-time clinical data. Methods: We compared 5 popular regression algorithms to model total next-day discharges: (1) autoregressive integrated moving average (ARIMA), (2) autoregressive moving average with exogenous variables (ARMAX), (3) k-nearest neighbor regression, (4) random forest regression, and (5) support vector regression. While the ARIMA model relied on the past 3 months of discharges, nearest-neighbor forecasting used the median of similar past discharges to estimate the next-day discharge. In addition, the ARMAX model used the day of the week and the number of patients currently in the ward as exogenous variables. For the random forest and support vector regression models, we designed a predictor set of 20 patient features and 88 ward-level features. Results: Our data consisted of 12,141 patient visits over 1826 days. Forecasting quality was measured using mean forecast error, mean absolute error, symmetric mean absolute percentage error, and root mean square error. When compared with a moving average prediction model, all 5 models demonstrated superior performance, with the random forests achieving a 22.7% improvement in mean absolute error for all days in the year 2014. Conclusions: In the absence of clinical information, our study recommends using patient-level and ward-level data in predicting next-day discharges. Random forest and support vector regression models are able to use all available features from such data, resulting in superior performance over traditional autoregressive methods. An intelligent estimate of available beds in wards plays a crucial role in relieving access block in emergency departments. PMID:27444059
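
    A sketch of the comparison between a moving-average baseline and a random forest regressor built on lagged discharges and day-of-week, using scikit-learn. The daily discharge counts are synthetic and the feature set is far smaller than the 20 patient-level and 88 ward-level features described above.

        import numpy as np
        import pandas as pd
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.metrics import mean_absolute_error

        # Hypothetical daily discharge counts with a weekly pattern.
        rng = np.random.default_rng(7)
        idx = pd.date_range("2010-01-01", periods=1826, freq="D")
        discharges = pd.Series(rng.poisson(6 + 3 * (idx.dayofweek < 5)), index=idx, name="discharges")

        # Features: previous 7 days of discharges and the day of week of the target day.
        df = pd.DataFrame({f"lag{k}": discharges.shift(k) for k in range(1, 8)})
        df["dow"] = idx.dayofweek
        df["y"] = discharges
        df = df.dropna()

        train, test = df.iloc[:-365], df.iloc[-365:]
        rf = RandomForestRegressor(n_estimators=200, random_state=0)
        rf.fit(train.drop(columns="y"), train["y"])

        baseline = test[[f"lag{k}" for k in range(1, 8)]].mean(axis=1)  # 7-day moving average
        print("MAE moving average:", mean_absolute_error(test["y"], baseline))
        print("MAE random forest :", mean_absolute_error(test["y"], rf.predict(test.drop(columns="y"))))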

  9. Elbow joint angle and elbow movement velocity estimation using NARX-multiple layer perceptron neural network model with surface EMG time domain parameters.

    PubMed

    Raj, Retheep; Sivanandan, K S

    2017-01-01

    Estimation of elbow dynamics has been the object of numerous investigations. In this work, a solution is proposed for estimating elbow movement velocity and elbow joint angle from surface electromyography (SEMG) signals. The SEMG signals are acquired from the biceps brachii muscle of the human arm. Two time-domain parameters, integrated EMG (IEMG) and zero crossing (ZC), are extracted from the SEMG signal. The relationships between the time-domain parameters IEMG and ZC and the elbow angular displacement and elbow angular velocity during extension and flexion of the elbow are studied. A multiple-input multiple-output model is derived for identifying the kinematics of the elbow. A Nonlinear Auto Regressive with eXogenous inputs (NARX) structure based multiple layer perceptron neural network (MLPNN) model is proposed for the estimation of elbow joint angle and elbow angular velocity. The proposed NARX MLPNN model is trained using a Levenberg-Marquardt based algorithm. The proposed model estimates the elbow joint angle and elbow movement angular velocity with appreciable accuracy, and is validated using the regression coefficient value (R). The average regression coefficient value (R) obtained for elbow angular displacement prediction is 0.9641 and for elbow angular velocity prediction is 0.9347. The NARX-based MLPNN model can therefore be used for estimating the angular displacement and movement angular velocity of the elbow with good accuracy.

  10. Trans-dimensional joint inversion of seabed scattering and reflection data.

    PubMed

    Steininger, Gavin; Dettmer, Jan; Dosso, Stan E; Holland, Charles W

    2013-03-01

    This paper examines joint inversion of acoustic scattering and reflection data to resolve seabed interface roughness parameters (spectral strength, exponent, and cutoff) and geoacoustic profiles. Trans-dimensional (trans-D) Bayesian sampling is applied with both the number of sediment layers and the order (zeroth or first) of auto-regressive parameters in the error model treated as unknowns. A prior distribution that allows fluid sediment layers over an elastic basement in a trans-D inversion is derived and implemented. Three cases are considered: Scattering-only inversion, joint scattering and reflection inversion, and joint inversion with the trans-D auto-regressive error model. Including reflection data improves the resolution of scattering and geoacoustic parameters. The trans-D auto-regressive model further improves scattering resolution and correctly differentiates between strongly and weakly correlated residual errors.

  11. An Optimization of Inventory Demand Forecasting in University Healthcare Centre

    NASA Astrophysics Data System (ADS)

    Bon, A. T.; Ng, T. K.

    2017-01-01

    Healthcare has become an important industry because it directly concerns people's health. Forecasting demand for health services is therefore an important step in managerial decision making for all healthcare organizations. A case study was conducted in the University Health Centre, collecting historical demand data for Panadol 650 mg over 68 months, from January 2009 until August 2014. The aim of the research is to optimize overall inventory demand through forecasting techniques. Quantitative (time series) forecasting models were used in the case study to forecast future demand as a function of past data. The data pattern must be identified before applying the forecasting techniques; the series exhibits a trend, and ten forecasting techniques were then applied using Risk Simulator software. Lastly, the best forecasting technique was identified as the one with the smallest forecasting error. The ten forecasting techniques are single moving average, single exponential smoothing, double moving average, double exponential smoothing, regression, Holt-Winters additive, seasonal additive, Holt-Winters multiplicative, seasonal multiplicative and autoregressive integrated moving average (ARIMA). According to the forecasting accuracy measurement, the best forecasting technique is regression analysis.
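
    A sketch of how a few of the listed techniques (single moving average, single exponential smoothing and ARIMA) can be compared on a hold-out set by forecasting error, here using statsmodels instead of Risk Simulator. The monthly demand series, hold-out length and model orders are all illustrative.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.holtwinters import SimpleExpSmoothing
        from statsmodels.tsa.arima.model import ARIMA

        # Hypothetical monthly demand for a drug item over 68 months with a mild trend.
        rng = np.random.default_rng(8)
        demand = pd.Series(200 + 2 * np.arange(68) + rng.normal(0, 20, 68))

        train, test = demand.iloc[:-12], demand.iloc[-12:]

        def rmse(actual, forecast):
            return float(np.sqrt(np.mean((np.asarray(actual) - np.asarray(forecast)) ** 2)))

        # Single moving average: last 3-month mean carried forward.
        sma = np.repeat(train.iloc[-3:].mean(), len(test))

        # Simple (single) exponential smoothing.
        ses = SimpleExpSmoothing(train).fit().forecast(len(test))

        # Non-seasonal ARIMA with a first difference to absorb the trend.
        arima = ARIMA(train, order=(1, 1, 1)).fit().forecast(len(test))

        for name, fc in [("moving average", sma), ("exp. smoothing", ses), ("ARIMA(1,1,1)", arima)]:
            print(f"{name:15s} RMSE = {rmse(test, fc):.1f}")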

  12. Adaptive and accelerated tracking-learning-detection

    NASA Astrophysics Data System (ADS)

    Guo, Pengyu; Li, Xin; Ding, Shaowen; Tian, Zunhua; Zhang, Xiaohu

    2013-08-01

    This paper introduces an improved online long-term visual tracking algorithm, adaptive and accelerated TLD (AA-TLD), based on the Tracking-Learning-Detection (TLD) framework. The improvement focuses on two aspects. The first is adaptation: the algorithm no longer depends on pre-defined scanning grids, because it generates the scale space online. The second is efficiency: it uses algorithm-level acceleration, such as scale prediction with an auto-regressive moving average (ARMA) model that learns the object motion to narrow the detector's search range, and a fixed number of positive and negative samples that ensures a constant retrieval time, as well as CPU and GPU parallel technology for hardware acceleration. In addition, to obtain a better effect, some details of TLD are redesigned: results are integrated with a weight that combines the normalized correlation coefficient and the scale size, and the distance-metric thresholds are adjusted online. Comparative experiments on success rate, center location error and execution time show a performance and efficiency improvement over state-of-the-art TLD on partial TLD datasets and Shenzhou IX return-capsule image sequences. The algorithm can be used in the field of video surveillance to meet the need for real-time video tracking.

  13. Hybrid Wavelet De-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series

    NASA Astrophysics Data System (ADS)

    WANG, D.; Wang, Y.; Zeng, X.

    2017-12-01

    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, Wavelet De-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influences at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when the extreme events are included within a time series.
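
    A sketch of the wavelet de-noising (WD) step alone, using PyWavelets with soft thresholding of the detail coefficients; the RSPA forecasting stage is not reproduced, and the wavelet, decomposition level and threshold rule are assumptions.

        import numpy as np
        import pywt

        def wavelet_denoise(x, wavelet="db4", level=3):
            """Soft-threshold the detail coefficients and reconstruct the series."""
            coeffs = pywt.wavedec(x, wavelet, level=level)
            # Universal threshold estimated from the finest-scale details.
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            thr = sigma * np.sqrt(2 * np.log(len(x)))
            denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(denoised, wavelet)[:len(x)]

        # Hypothetical noisy monthly runoff series with an annual cycle.
        rng = np.random.default_rng(9)
        t = np.arange(240)
        runoff = 50 + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, 240)
        clean = wavelet_denoise(runoff)
        print("std before/after de-noising:", runoff.std(), clean.std())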

  14. Effect of climate variables on cocoa black pod incidence in Sabah using ARIMAX model

    NASA Astrophysics Data System (ADS)

    Ling Sheng Chang, Albert; Ramba, Haya; Mohd. Jaaffar, Ahmad Kamil; Kim Phin, Chong; Chong Mun, Ho

    2016-06-01

    Cocoa black pod disease is one of the major diseases affecting cocoa production in Malaysia and around the world. Studies have shown that climate variables influence black pod disease incidence, so it is important to quantify the variation in black pod disease due to climate. Time series analysis, especially the auto-regressive integrated moving average (ARIMA) model, has been widely used in economic studies and can be used to quantify the effect of climate variables on black pod incidence and to forecast the right time to control the disease. However, the ARIMA model does not capture some turning points in cocoa black pod incidence. To improve forecasting performance, explanatory variables such as climate variables should be included in the ARIMA model, giving an ARIMAX model. This paper therefore studies the effect of climate variables on cocoa black pod disease incidence using an ARIMAX model. The findings showed that an ARIMAX model using MA(1) and relative humidity at a lag of 7 days (RHt-7) gave a better R-squared value than an ARIMA model using MA(1), and could be used to forecast black pod incidence to help farmers determine the timely application of fungicide spraying and cultural practices to control the disease.

  15. Do drug warnings and market withdrawals have an impact on the number of calls to teratogen information services?

    PubMed

    Sheehy, O; Gendron, M-P; Martin, B; Bérard, A

    2012-06-01

    IMAGe provides information on the risks and benefits of medication use during pregnancy and lactation. The aim of this study was to determine the impact of Health Canada warnings on the number of calls received at IMAGe. We analyzed calls received between January 2003 and March 2008. The impacts of the following warnings/withdrawal were studied: paroxetine and the risk of cardiac malformations (09/29/2005), selective serotonin reuptake inhibitors (SSRIs) and the risk of persistent pulmonary hypertension of the newborn (PPHN) (03/10/2006), and the rofecoxib market withdrawal (09/30/2004). Interrupted auto-regressive integrated moving average (ARIMA) analyses were used to test the impact of each warning on the number of calls received by IMAGe. 61,505 calls were analyzed. The paroxetine warning had a temporary impact, increasing the overall number of calls to IMAGe, and an abrupt permanent effect on the number of calls related to antidepressant exposures. The PPHN warning had no impact, but we observed a significant increase in the number of calls following the rofecoxib market withdrawal. Health Canada needs to consider the increase in the demand for information to IMAGe following warnings on the risks of medication use during pregnancy. © Georg Thieme Verlag KG Stuttgart · New York.
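
    A sketch of an interrupted (intervention) ARIMA analysis with a step dummy coded 0 before and 1 after a warning date, using statsmodels' SARIMAX. The monthly call counts, the warning month and the ARIMA order are hypothetical.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        # Hypothetical monthly call counts, Jan 2003 - Mar 2008, with a level shift in Oct 2005.
        rng = np.random.default_rng(10)
        idx = pd.date_range("2003-01", "2008-03", freq="MS")
        calls = pd.Series(900 + rng.normal(0, 40, len(idx)), index=idx)
        calls[idx >= "2005-10-01"] += 120

        # Step dummy: 0 before the warning, 1 afterwards (abrupt, permanent effect).
        step = pd.Series((idx >= "2005-10-01").astype(int), index=idx, name="warning_step")

        res = SARIMAX(calls, exog=step, order=(1, 0, 0)).fit(disp=False)
        print(res.summary().tables[1])   # the warning_step coefficient estimates the level change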

  16. High mean water vapour pressure promotes the transmission of bacillary dysentery.

    PubMed

    Li, Guo-Zheng; Shao, Feng-Feng; Zhang, Hao; Zou, Chun-Pu; Li, Hui-Hui; Jin, Jue

    2015-01-01

    Bacillary dysentery is an infectious disease caused by Shigella dysenteriae, which has a seasonal distribution. External environmental factors, including climate, play a significant role in its transmission. This paper identifies climate-related risk factors and their role in bacillary dysentery transmission. Harbin, in northeast China, with a temperate climate, and Quzhou, in southern China, with a subtropical climate, are chosen as the study locations. The least absolute shrinkage and selection operator (LASSO) is applied to select the climate factors relevant to the transmission of bacillary dysentery. Based on the selected climate factors and incidence rates, an AutoRegressive Integrated Moving Average (ARIMA) model is established as a time series prediction model. The numerical results demonstrate that the mean water vapour pressure over the previous month results in a high relative risk for bacillary dysentery transmission in both cities, and that the ARIMA model can successfully perform such a prediction. These results provide better explanations for the relationship between climate factors and bacillary dysentery transmission than those put forth in other studies that use only correlation coefficients or fitting models. The findings demonstrate that the mean water vapour pressure over the previous month is an important predictor for the transmission of bacillary dysentery.

  17. Comparison of Artificial Neural Networks and ARIMA statistical models in simulations of target wind time series

    NASA Astrophysics Data System (ADS)

    Kolokythas, Kostantinos; Vasileios, Salamalikis; Athanassios, Argiriou; Kazantzidis, Andreas

    2015-04-01

    The wind is the result of complex interactions of numerous mechanisms taking place on small or large scales, so better knowledge of its behavior is essential in a variety of applications, especially in the field of power production from wind turbines. In the literature there is a considerable number of models, either physical or statistical, dealing with the simulation and prediction of wind speed. Among others, Artificial Neural Networks (ANNs) are widely used for wind forecasting and, in the great majority of cases, outperform other conventional statistical models. In this study, a number of ANNs with different architectures, created and applied to a dataset of wind time series, are compared to Auto Regressive Integrated Moving Average (ARIMA) statistical models. The data consist of mean hourly wind speeds from a wind farm in a hilly Greek region and cover a period of one year (2013). The main goal is to evaluate the models' ability to successfully simulate the wind speed at a significant point (target). Goodness-of-fit statistics are used for the comparison of the different methods. In general, the ANNs showed the best performance in the estimation of wind speed, prevailing over the ARIMA models.

  18. Long-range prediction of the low-frequency mode in the low-level Indian monsoon circulation with a simple statistical method

    NASA Astrophysics Data System (ADS)

    Chen, Tsing-Chang; Yen, Ming-Cheng; Wu, Kuang-Der; Ng, Thomas

    1992-08-01

    The time evolution of the Indian monsoon is closely related to the locations of the northward-migrating monsoon troughs and ridges, which can be well depicted with the 30-60-day filtered 850-mb streamfunction. Thus, long-range forecasts of the large-scale low-level monsoon can be obtained from those of the filtered 850-mb streamfunction. These long-range forecasts were made in this study in terms of the Auto-Regressive (AR) Moving-Average process. The historical series of the AR model were constructed with the 30-60-day filtered 850-mb streamfunction [ψ̃(850 mb)] time series of 4 months. However, the phase of the last low-frequency cycle in the ψ̃(850 mb) time series can be skewed by the bandpass filtering. To reduce this phase skewness, a simple scheme is introduced. With this phase modification of the filtered 850-mb streamfunction, we performed pilot forecast experiments for three summers with the AR forecast process. The forecast errors in the positions of the northward-propagating monsoon troughs and ridges at Day 20 are generally within the range of 1-2 days behind the observed, except in some extreme cases.

  19. Distributed Peer-to-Peer Target Tracking in Wireless Sensor Networks

    PubMed Central

    Wang, Xue; Wang, Sheng; Bi, Dao-Wei; Ma, Jun-Jie

    2007-01-01

    Target tracking is usually a challenging application for wireless sensor networks (WSNs) because it is always computation-intensive and requires real-time processing. This paper proposes a practical target tracking system based on the auto regressive moving average (ARMA) model in a distributed peer-to-peer (P2P) signal processing framework. In the proposed framework, wireless sensor nodes act as peers that perform target detection, feature extraction, classification and tracking, whereas target localization requires the collaboration between wireless sensor nodes for improving the accuracy and robustness. For carrying out target tracking under the constraints imposed by the limited capabilities of the wireless sensor nodes, some practically feasible algorithms, such as the ARMA model and the 2-D integer lifting wavelet transform, are adopted in single wireless sensor nodes due to their outstanding performance and light computational burden. Furthermore, a progressive multi-view localization algorithm is proposed in distributed P2P signal processing framework considering the tradeoff between the accuracy and energy consumption. Finally, a real world target tracking experiment is illustrated. Results from experimental implementations have demonstrated that the proposed target tracking system based on a distributed P2P signal processing framework can make efficient use of scarce energy and communication resources and achieve target tracking successfully.

  20. Brillouin Scattering Spectrum Analysis Based on Auto-Regressive Spectral Estimation

    NASA Astrophysics Data System (ADS)

    Huang, Mengyun; Li, Wei; Liu, Zhangyun; Cheng, Linghao; Guan, Bai-Ou

    2018-06-01

    Auto-regressive (AR) spectral estimation is proposed to analyze the Brillouin scattering spectrum in Brillouin optical time-domain reflectometry. It is shown that the AR-based method can reliably estimate the Brillouin frequency shift with much better accuracy than fast Fourier transform (FFT) based methods, provided the data length is not too short. It enables about a threefold improvement over FFT at a moderate spatial resolution.
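
    A sketch of AR spectral estimation via the Yule-Walker equations and the resulting AR power spectrum, with the peak location serving as the frequency estimate. The simulated noisy sinusoid, sampling rate and model order are illustrative stand-ins for the Brillouin beat signal and are not the paper's processing chain.

        import numpy as np
        from statsmodels.regression.linear_model import yule_walker

        # Hypothetical noisy tone whose frequency plays the role of the Brillouin shift.
        rng = np.random.default_rng(11)
        fs = 1.0e3                      # arbitrary sampling rate
        n = 512                         # short record, where FFT resolution suffers
        t = np.arange(n) / fs
        f0 = 123.4                      # "true" frequency to recover
        x = np.sin(2 * np.pi * f0 * t) + rng.normal(0, 1.0, n)

        # Fit AR coefficients with the Yule-Walker equations.
        order = 8
        rho, sigma = yule_walker(x, order=order, method="mle")

        # AR power spectrum: sigma^2 / |1 - sum_k rho_k exp(-i 2 pi f k / fs)|^2
        freqs = np.linspace(0, fs / 2, 2048)
        z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, order + 1)) / fs)
        psd = sigma ** 2 / np.abs(1 - z @ rho) ** 2
        print("AR peak frequency estimate:", freqs[np.argmax(psd)])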

  1. Modeling Complex Phenomena Using Multiscale Time Sequences

    DTIC Science & Technology

    2009-08-24

    Complex phenomena are modeled across multiple scales by combining a set of statistical fractal measures based on Hurst and Hölder exponents, auto-regressive methods, and Fourier and wavelet decomposition methods, capturing how the different scales relate to each other. The applications for this technology…

  2. SU-C-BRA-05: Delineating High-Dose Clinical Target Volumes for Head and Neck Tumors Using Machine Learning Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardenas, C; The University of Texas Graduate School of Biomedical Sciences, Houston, TX; Wong, A

    Purpose: To develop and test population-based machine learning algorithms for delineating high-dose clinical target volumes (CTVs) in H&N tumors. Automating and standardizing the contouring of CTVs can reduce both physician contouring time and inter-physician variability, which is one of the largest sources of uncertainty in H&N radiotherapy. Methods: Twenty-five node-negative patients treated with definitive radiotherapy were selected (6 right base of tongue, 11 left and 9 right tonsil). All patients had GTV and CTVs manually contoured by an experienced radiation oncologist prior to treatment. This contouring process, which is driven by anatomical, pathological, and patient-specific information, typically results in non-uniform margin expansions about the GTV. Therefore, we tested two methods to delineate high-dose CTV given a manually-contoured GTV: (1) regression-support vector machines (SVM) and (2) classification-SVM. These models were trained and tested on each patient group using leave-one-out cross-validation. The volume difference (VD) and Dice similarity coefficient (DSC) between the manual and auto-contoured CTV were calculated to evaluate the results. Distances from GTV to CTV were computed about each patient's GTV, and these distances, in addition to distances from GTV to surrounding anatomy in the expansion direction, were utilized in the regression-SVM method. The classification-SVM method used categorical voxel information (GTV, selected anatomical structures, else) from a 3×3×3 cm3 ROI centered about the voxel to classify voxels as CTV. Results: Volumes for the auto-contoured CTVs ranged from 17.1 to 149.1 cc and 17.4 to 151.9 cc; the average (range) VD between manual and auto-contoured CTV were 0.93 (0.48–1.59) and 1.16 (0.48–1.97), while average (range) DSC values were 0.75 (0.59–0.88) and 0.74 (0.59–0.81) for the regression-SVM and classification-SVM methods, respectively. Conclusion: We developed two novel machine learning methods to delineate high-dose CTV for H&N patients. Both methods showed promising results that hint at a solution to the standardization of the contouring process of clinical target volumes. Varian Medical Systems grant.

  3. Methodology for the AutoRegressive Planet Search (ARPS) Project

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric; Caceres, Gabriel; ARPS Collaboration

    2018-01-01

    The detection of periodic signals of transiting exoplanets is often impeded by the presence of aperiodic photometric variations. This variability is intrinsic to the host star in space-based observations (typically arising from magnetic activity) and comes from observational conditions in ground-based observations. The most common statistical procedures to remove stellar variations are nonparametric, such as wavelet decomposition or Gaussian Processes regression. However, many stars display variability with autoregressive properties, wherein later flux values are correlated with previous ones. Provided the time series is evenly spaced, parametric autoregressive models can prove very effective. Here we present the methodology of the Autoregressive Planet Search (ARPS) project, which uses Autoregressive Integrated Moving Average (ARIMA) models to treat a wide variety of stochastic short-memory processes, as well as nonstationarity. Additionally, we introduce a planet-search algorithm to detect periodic transits in the time-series residuals after application of ARIMA models. Our matched-filter algorithm, the Transit Comb Filter (TCF), replaces the traditional box-fitting step. We construct a periodogram based on the TCF to concentrate the signal of these periodic spikes. Various features of the original light curves, the ARIMA fits, the TCF periodograms, and the folded light curves at peaks of the TCF periodogram can then be collected to provide constraints for planet detection. These features provide input to a multivariate classifier when a training set is available. The ARPS procedure has been applied to NASA's Kepler mission observations of ~200,000 stars (Caceres, Dissertation Talk, this meeting) and will be applied in the future to other datasets.

  4. [Prediction and spatial distribution of recruitment trees of natural secondary forest based on geographically weighted Poisson model].

    PubMed

    Zhang, Ling Yu; Liu, Zhao Gang

    2017-12-01

    Based on data collected from 108 permanent plots in the forest resources survey of Maoershan Experimental Forest Farm during 2004-2016, this study investigated the spatial distribution of recruitment trees in natural secondary forest using global Poisson regression and geographically weighted Poisson regression (GWPR) with four bandwidths of 2.5, 5, 10 and 15 km. The simulation performance of the 5 regressions and the factors influencing recruitment trees in stands were analyzed, and the spatial autocorrelation of the regression residuals was described at global and local levels using Moran's I. The results showed that the spatial distribution of the number of natural secondary forest recruitment trees was significantly influenced by stand and topographic factors, especially average DBH. The GWPR model at small scale (2.5 km) had high fitting accuracy, generated a large range of model parameter estimates, and captured the localized spatial distribution effect of the model parameters. The GWPR models at small scales (2.5 and 5 km) produced a small range of model residuals, and the stability of the model was improved. The global spatial auto-correlation of the GWPR model residuals at the small scale (2.5 km) was the lowest, and the local spatial auto-correlation was significantly reduced, forming an ideal spatial distribution pattern of small clusters with different observations. The local model at small scale (2.5 km) was much better than the global model in simulating the spatial distribution of recruitment tree numbers.

  5. 77 FR 39767 - Self-Regulatory Organizations; National Stock Exchange, Inc.; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-05

    ... has executed on average per trading day (excluding partial trading days) in AutoEx or Order Delivery... (``AutoEx'') shall mean only those executed shares of the ETP Holder that are submitted in AutoEx mode... period, a combined ADV in both AutoEx and Order Delivery of at least 11.5 million shares, of which at...

  6. Estimates of Average Glandular Dose with Auto-modes of X-ray Exposures in Digital Breast Tomosynthesis.

    PubMed

    Kamal, Izdihar; Chelliah, Kanaga K; Mustafa, Nawal

    2015-05-01

    The aim of this research was to examine the average glandular dose (AGD) of radiation among different breast compositions of glandular and adipose tissue with auto-modes of exposure factor selection in digital breast tomosynthesis. This experimental study was carried out in the National Cancer Society, Kuala Lumpur, Malaysia, between February 2012 and February 2013 using a tomosynthesis digital mammography X-ray machine. The entrance surface air kerma and the half-value layer were determined using a 100H thermoluminescent dosimeter on 50% glandular and 50% adipose tissue (50/50) and 20% glandular and 80% adipose tissue (20/80) commercially available breast phantoms (Computerized Imaging Reference Systems, Inc., Norfolk, Virginia, USA) with auto-time, auto-filter and auto-kilovolt modes. The lowest AGD for the 20/80 phantom with auto-time was 2.28 milliGray (mGy) for two-dimensional (2D) and 2.48 mGy for three-dimensional (3D) images. The lowest AGD for the 50/50 phantom with auto-time was 0.97 mGy for 2D and 1.0 mGy for 3D. The AGD values for both phantoms were lower at a high kilovolt peak, and the use of auto-filter mode was more practical for quick acquisition while limiting the probability of operator error.

  7. Local Linear Regression for Data with AR Errors.

    PubMed

    Li, Runze; Li, Yan

    2009-07-01

    In many statistical applications, data are collected over time, and they are likely correlated. In this paper, we investigate how to incorporate the correlation information into the local linear regression. Under the assumption that the error process is an auto-regressive process, a new estimation procedure is proposed for the nonparametric regression by using local linear regression method and the profile least squares techniques. We further propose the SCAD penalized profile least squares method to determine the order of auto-regressive process. Extensive Monte Carlo simulation studies are conducted to examine the finite sample performance of the proposed procedure, and to compare the performance of the proposed procedures with the existing one. From our empirical studies, the newly proposed procedures can dramatically improve the accuracy of naive local linear regression with working-independent error structure. We illustrate the proposed methodology by an analysis of real data set.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faranda, Davide, E-mail: davide.faranda@cea.fr; Dubrulle, Bérengère; Daviaud, François

    We introduce a novel way to extract information from turbulent datasets by applying an Auto Regressive Moving Average (ARMA) statistical analysis. Such analysis goes well beyond the analysis of the mean flow and of the fluctuations and links the behavior of the recorded time series to a discrete version of a stochastic differential equation which is able to describe the correlation structure in the dataset. We introduce a new index Υ that measures the difference between the resulting analysis and the Obukhov model of turbulence, the simplest stochastic model reproducing both Richardson law and the Kolmogorov spectrum. We test the method on datasets measured in a von Kármán swirling flow experiment. We found that the ARMA analysis is well correlated with spatial structures of the flow, and can discriminate between two different flows with comparable mean velocities, obtained by changing the forcing. Moreover, we show that Υ is highest in regions where shear layer vortices are present, thereby establishing a link between deviations from the Kolmogorov model and coherent structures. These deviations are consistent with the ones observed by computing the Hurst exponents for the same time series. We show that some salient features of the analysis are preserved when considering global instead of local observables. Finally, we analyze flow configurations with multistability features where the ARMA technique is efficient in discriminating different stability branches of the system.
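
    As a hedged sketch of the ARMA fitting step only (the Υ index is not reproduced here), the code below selects an ARMA(p, q) order for a stand-in velocity signal by AIC using statsmodels; the signal and the candidate orders are illustrative.

        # Illustrative ARMA order selection for a turbulence-like time series.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(2)
        # Hypothetical stand-in for a velocity probe signal (short-range correlated noise)
        v = np.convolve(rng.normal(size=5000), np.ones(5) / 5, mode="same")

        best = None
        for p in range(1, 4):
            for q in range(0, 3):
                res = ARIMA(v, order=(p, 0, q)).fit()
                if best is None or res.aic < best[0]:
                    best = (res.aic, p, q)
        print("selected ARMA order (p, q):", best[1:], "AIC:", best[0])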

  9. A time series analysis of presentations to Queensland health facilities for alcohol-related conditions, following the increase in 'alcopops' tax.

    PubMed

    Kisely, Steve; Crowe, Elizabeth; Lawrence, David; White, Angela; Connor, Jason

    2013-08-01

    In response to concerns about the health consequences of high-risk drinking by young people, the Australian Government increased the tax on pre-mixed alcoholic beverages ('alcopops') favoured by this demographic. We measured changes in admissions for alcohol-related harm to health throughout Queensland, before and after the tax increase in April 2008. We used data from the Queensland Trauma Register, Hospitals Admitted Patients Data Collection, and the Emergency Department Information System to calculate alcohol-related admission rates per 100,000 people, for 15 - 29 year-olds. We analysed data over 3 years (April 2006 - April 2009), using interrupted time-series analyses. This covered 2 years before, and 1 year after, the tax increase. We investigated both mental and behavioural consequences (via F10 codes), and intentional/unintentional injuries (S and T codes). We fitted an auto-regressive integrated moving average (ARIMA) model, to test for any changes following the increased tax. There was no decrease in alcohol-related admissions in 15 - 29 year-olds. We found similar results for males and females, as well as definitions of alcohol-related harms that were narrow (F10 codes only) and broad (F10, S and T codes). The increased tax on 'alcopops' was not associated with any reduction in hospital admissions for alcohol-related harms in Queensland 15 - 29 year-olds.
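
    As a hedged illustration of an interrupted time-series ARIMA analysis of the kind described above (not the study's own code or data), the sketch below fits an ARIMA model with a post-intervention step indicator as an exogenous regressor; the monthly admission rates, intervention date handling and model order are all assumptions.

        # Minimal interrupted time-series sketch: ARIMA errors plus a step dummy.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(3)
        idx = pd.date_range("2006-04-01", periods=36, freq="MS")          # Apr 2006 - Mar 2009
        rate = pd.Series(50 + np.cumsum(rng.normal(scale=1.0, size=36)), index=idx)  # per 100,000
        step = pd.Series((idx >= "2008-04-01").astype(float), index=idx)  # post-tax indicator

        model = SARIMAX(rate, exog=step, order=(1, 0, 0)).fit(disp=False)
        print(model.params)   # the exog coefficient estimates the level change after the tax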

  10. [Application of R-based multiple seasonal ARIMA model, in predicting the incidence of hand, foot and mouth disease in Shaanxi province].

    PubMed

    Liu, F; Zhu, N; Qiu, L; Wang, J J; Wang, W H

    2016-08-10

    To apply the auto-regressive integrated moving average product seasonal model in predicting the number of hand, foot and mouth disease cases in Shaanxi province. The trend of hand, foot and mouth disease in Shaanxi province between January 2009 and June 2015 was analyzed and tested using R software. A multiple seasonal ARIMA model was then fitted to the time series to predict the number of hand, foot and mouth disease cases in 2016 and 2017. A seasonal effect was seen in hand, foot and mouth disease in Shaanxi province. A multiple seasonal ARIMA (2,1,0)×(1,1,0)12 model was established, with the equation (1 - B)(1 - B^12)Ln(X_t) = [(1 - 1.000B) / ((1 - 0.532B - 0.363B^2)(1 - 0.644B^12 - 0.454B^24))] ε_t. The mean absolute error and the relative error were 531.535 and 0.114, respectively, when compared with the simulated number of patients from June to December 2015. Predictions from the multiple seasonal ARIMA model showed that the numbers of patients in both 2016 and 2017 were similar to that of 2015 in Shaanxi province. The multiple seasonal ARIMA (2,1,0)×(1,1,0)12 model could be used to successfully predict the incidence of hand, foot and mouth disease in Shaanxi province.
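
    A minimal sketch, assuming monthly case counts in a pandas Series, of fitting the reported seasonal ARIMA(2,1,0)×(1,1,0)12 structure on the log scale with statsmodels; the counts below are simulated, not the Shaanxi data.

        # Fit a seasonal ARIMA(2,1,0)(1,1,0)12 to log monthly counts and forecast 24 months.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(4)
        idx = pd.date_range("2009-01-01", "2015-06-01", freq="MS")
        month = idx.month.to_numpy()
        cases = pd.Series(3000 + 1500 * np.sin(2 * np.pi * month / 12) +
                          rng.normal(scale=100, size=len(idx)), index=idx)   # hypothetical counts

        fit = SARIMAX(np.log(cases), order=(2, 1, 0),
                      seasonal_order=(1, 1, 0, 12)).fit(disp=False)
        forecast = np.exp(fit.forecast(steps=24))    # back-transform to predicted case counts
        print(forecast.head())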

  11. Extracting information from AGN variability

    NASA Astrophysics Data System (ADS)

    Kasliwal, Vishal P.; Vogeley, Michael S.; Richards, Gordon T.

    2017-09-01

    Active galactic nuclei (AGNs) exhibit rapid, high-amplitude stochastic flux variations across the entire electromagnetic spectrum on time-scales ranging from hours to years. The cause of this variability is poorly understood. We present a Green's function-based method for using variability to (1) measure the time-scales on which flux perturbations evolve and (2) characterize the driving flux perturbations. We model the observed light curve of an AGN as a linear differential equation driven by stochastic impulses. We analyse the light curve of the Kepler AGN Zw 229-15 and find that the observed variability behaviour can be modelled as a damped harmonic oscillator perturbed by a coloured noise process. The model power spectrum turns over on a time-scale of 385 d. On shorter time-scales, the log-power-spectrum slope varies between 2 and 4, explaining the behaviour noted by previous studies. We recover and identify both the 5.6 and 67 d time-scales reported by previous work using the Green's function of the Continuous-time AutoRegressive Moving Average equation rather than by directly fitting the power spectrum of the light curve. These are the time-scales on which flux perturbations grow, and on which flux perturbations decay back to the steady-state flux level, respectively. We make available to the community the software package kālī used to study light curves with our method.

  12. Fluctuations of the experience of togetherness within the team over time: task-cohesion and shared understanding throughout a sporting regular season.

    PubMed

    Bourbousson, Jérôme; Fortes-Bourbousson, Marina

    2017-06-01

    Based on a diagnosis action research design, the present study assessed the fluctuations of the team experience of togetherness. Reported experiences of 12 basketball team members playing in the under-18 national championship were studied during a four-month training and competitive period. Time series analysis (Auto-Regressive Integrated Moving Average procedures) served to describe the temporal properties of the way in which the fluctuations of task-cohesion and shared understanding were experienced step by step over time. Correlations, running correlations and cross-lagged correlations were used to describe the temporal links that governed the relationships between both phenomena. The results indicated that the task-cohesion dimensions differed from shared-understanding dynamics mainly in that their time fluctuations were not embedded in external events, and that variations in shared understanding tended to precede 'individual attractions to the task' variations by seven team practice sessions. This study argues for further investigation of how 'togetherness' is experienced alternatively as a feeling of cohesion or shared understanding. Practitioner Summary: The present action research study investigated the experience that the team members have to share information during practice, and the subsequent benefits for team cohesion. Results call for specific interventions that help team members accept the fluctuating nature of team phenomena, to help them maintain their daily efforts.

  13. Impact of weather factors on hand, foot and mouth disease, and its role in short-term incidence trend forecast in Huainan City, Anhui Province.

    PubMed

    Zhao, Desheng; Wang, Lulu; Cheng, Jian; Xu, Jun; Xu, Zhiwei; Xie, Mingyu; Yang, Huihui; Li, Kesheng; Wen, Lingying; Wang, Xu; Zhang, Heng; Wang, Shusi; Su, Hong

    2017-03-01

    Hand, foot, and mouth disease (HFMD) is one of the most common communicable diseases in China, and current climate change has been recognized as a significant contributor. Nevertheless, no reliable models have been put forward to predict the dynamics of HFMD cases based on short-term weather variations. The present study aimed to examine the association between weather factors and HFMD, and to explore the accuracy of a seasonal auto-regressive integrated moving average (SARIMA) model with local weather conditions in forecasting HFMD. Weather and HFMD data from 2009 to 2014 in Huainan, China, were used. A Poisson regression model combined with a distributed lag non-linear model (DLNM) was applied to examine the relationship between weather factors and HFMD. The forecasting model for HFMD was built using the SARIMA model. The results showed that temperature rise was significantly associated with an elevated risk of HFMD. Yet, no correlations between relative humidity, barometric pressure and rainfall, and HFMD were observed. SARIMA models with the temperature variable fitted the HFMD data better than the model without it (sR2 increased, while the BIC decreased), and the SARIMA (0, 1, 1)(0, 1, 0)52 offered the best fit for the HFMD data. In addition, compared with females and nursery children, the SARIMA model may be more suitable for predicting the number of HFMD cases among males and scattered children, with high precision. In conclusion, high temperature could increase the risk of contracting HFMD. A SARIMA model with the temperature variable can effectively improve forecast accuracy, which can provide valuable information for policy makers and public health practitioners to construct a best-fitting model and optimize HFMD prevention.

  14. Impact of weather factors on hand, foot and mouth disease, and its role in short-term incidence trend forecast in Huainan City, Anhui Province

    NASA Astrophysics Data System (ADS)

    Zhao, Desheng; Wang, Lulu; Cheng, Jian; Xu, Jun; Xu, Zhiwei; Xie, Mingyu; Yang, Huihui; Li, Kesheng; Wen, Lingying; Wang, Xu; Zhang, Heng; Wang, Shusi; Su, Hong

    2017-03-01

    Hand, foot, and mouth disease (HFMD) is one of the most common communicable diseases in China, and current climate change has been recognized as a significant contributor. Nevertheless, no reliable models have been put forward to predict the dynamics of HFMD cases based on short-term weather variations. The present study aimed to examine the association between weather factors and HFMD, and to explore the accuracy of a seasonal auto-regressive integrated moving average (SARIMA) model with local weather conditions in forecasting HFMD. Weather and HFMD data from 2009 to 2014 in Huainan, China, were used. A Poisson regression model combined with a distributed lag non-linear model (DLNM) was applied to examine the relationship between weather factors and HFMD. The forecasting model for HFMD was built using the SARIMA model. The results showed that temperature rise was significantly associated with an elevated risk of HFMD. Yet, no correlations between relative humidity, barometric pressure and rainfall, and HFMD were observed. SARIMA models with the temperature variable fitted the HFMD data better than the model without it (sR2 increased, while the BIC decreased), and the SARIMA (0, 1, 1)(0, 1, 0)52 offered the best fit for the HFMD data. In addition, compared with females and nursery children, the SARIMA model may be more suitable for predicting the number of HFMD cases among males and scattered children, with high precision. In conclusion, high temperature could increase the risk of contracting HFMD. A SARIMA model with the temperature variable can effectively improve forecast accuracy, which can provide valuable information for policy makers and public health practitioners to construct a best-fitting model and optimize HFMD prevention.

  15. Neuro-fuzzy and neural network techniques for forecasting sea level in Darwin Harbor, Australia

    NASA Astrophysics Data System (ADS)

    Karimi, Sepideh; Kisi, Ozgur; Shiri, Jalal; Makarynskyy, Oleg

    2013-03-01

    Accurate predictions of sea level with different forecast horizons are important for coastal and ocean engineering applications, as well as in land drainage and reclamation studies. The methodology of tidal harmonic analysis, which is generally used for obtaining a mathematical description of the tides, is data demanding, requiring the processing of tidal observations collected over several years. In the present study, hourly sea levels for Darwin Harbor, Australia were predicted using two different data-driven techniques, the adaptive neuro-fuzzy inference system (ANFIS) and the artificial neural network (ANN). The multiple linear regression (MLR) technique was used for selecting the optimal input combinations (lag times) of hourly sea level; the combination comprising the current sea level and the five previous values was found to be optimal. For the ANFIS models, five different membership functions, namely triangular, trapezoidal, generalized bell, Gaussian and two-sided Gaussian, were tested and employed for predicting sea level for the next 1 h, 24 h, 48 h and 72 h. The ANN models were trained using three different algorithms, namely Levenberg-Marquardt, conjugate gradient and gradient descent. Predictions of the optimal ANFIS and ANN models were compared with those of the optimal auto-regressive moving average (ARMA) models. The coefficient of determination, root mean square error and variance account statistics were used as comparison criteria. The obtained results indicated that the triangular membership function was optimal for predictions with the ANFIS models, while the adaptive learning rate and Levenberg-Marquardt algorithms were most suitable for training the ANN models. Consequently, the ANFIS and ANN models gave similar forecasts and performed better than the ARMA models developed for the same purpose for all the prediction intervals.

  16. Validation of a Magnetic Resonance Imaging-based Auto-contouring Software Tool for Gross Tumour Delineation in Head and Neck Cancer Radiotherapy Planning.

    PubMed

    Doshi, T; Wilson, C; Paterson, C; Lamb, C; James, A; MacKenzie, K; Soraghan, J; Petropoulakis, L; Di Caterina, G; Grose, D

    2017-01-01

    To carry out statistical validation of a newly developed magnetic resonance imaging (MRI) auto-contouring software tool for gross tumour volume (GTV) delineation in head and neck tumours to assist in radiotherapy planning. Axial MRI baseline scans were obtained for 10 oropharyngeal and laryngeal cancer patients. GTV was present on 102 axial slices and auto-contoured using the modified fuzzy c-means clustering integrated with the level set method (FCLSM). Peer-reviewed (C-gold) manual contours were used as the reference standard to validate auto-contoured GTVs (C-auto) and mean manual contours (C-manual) from two expert clinicians (C1 and C2). Multiple geometric metrics, including the Dice similarity coefficient (DSC), were used for quantitative validation. A DSC≥0.7 was deemed acceptable. Inter- and intra-variabilities among the manual contours were also validated. The two-dimensional contours were then reconstructed in three dimensions for GTV volume calculation, comparison and three-dimensional visualisation. The mean DSC between C-gold and C-auto was 0.79. The mean DSC between C-gold and C-manual was 0.79 and that between C1 and C2 was 0.80. The average time for GTV auto-contouring per patient was 8 min (range 6-13 min; mean 45 s per axial slice) compared with 15 min (range 6-23 min; mean 88 s per axial slice) for C1. The average volume concordance between C-gold and C-auto volumes was 86.51% compared with 74.16% between C-gold and C-manual. The average volume concordance between C1 and C2 volumes was 86.82%. This newly designed MRI-based auto-contouring software tool shows initial acceptable results in GTV delineation of oropharyngeal and laryngeal tumours using FCLSM. This auto-contouring software tool may help reduce inter- and intra-variability and can assist clinical oncologists with time-consuming, complex radiotherapy planning. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  17. Development and validation of an automated delirium risk assessment system (Auto-DelRAS) implemented in the electronic health record system.

    PubMed

    Moon, Kyoung-Ja; Jin, Yinji; Jin, Taixian; Lee, Sun-Mi

    2018-01-01

    A key component of the delirium management is prevention and early detection. To develop an automated delirium risk assessment system (Auto-DelRAS) that automatically alerts health care providers of an intensive care unit (ICU) patient's delirium risk based only on data collected in an electronic health record (EHR) system, and to evaluate the clinical validity of this system. Cohort and system development designs were used. Medical and surgical ICUs in two university hospitals in Seoul, Korea. A total of 3284 patients for the development of Auto-DelRAS, 325 for external validation, 694 for validation after clinical applications. The 4211 data items were extracted from the EHR system and delirium was measured using CAM-ICU (Confusion Assessment Method for Intensive Care Unit). The potential predictors were selected and a logistic regression model was established to create a delirium risk scoring algorithm to construct the Auto-DelRAS. The Auto-DelRAS was evaluated at three months and one year after its application to clinical practice to establish the predictive validity of the system. Eleven predictors were finally included in the logistic regression model. The results of the Auto-DelRAS risk assessment were shown as high/moderate/low risk on a Kardex screen. The predictive validity, analyzed after the clinical application of Auto-DelRAS after one year, showed a sensitivity of 0.88, specificity of 0.72, positive predictive value of 0.53, negative predictive value of 0.94, and a Youden index of 0.59. A relatively high level of predictive validity was maintained with the Auto-DelRAS system, even one year after it was applied to clinical practice. Copyright © 2017. Published by Elsevier Ltd.

  18. Forecasting ESKAPE infections through a time-varying auto-adaptive algorithm using laboratory-based surveillance data.

    PubMed

    Ballarin, Antonio; Posteraro, Brunella; Demartis, Giuseppe; Gervasi, Simona; Panzarella, Fabrizio; Torelli, Riccardo; Paroni Sterbini, Francesco; Morandotti, Grazia; Posteraro, Patrizia; Ricciardi, Walter; Gervasi Vidal, Kristian A; Sanguinetti, Maurizio

    2014-12-06

    Mathematical or statistical tools can provide valuable help in improving surveillance systems for healthcare- and non-healthcare-associated bacterial infections. The aim of this work is to evaluate the time-varying auto-adaptive (TVA) algorithm-based use of a clinical microbiology laboratory database to forecast medically important drug-resistant bacterial infections. Using the TVA algorithm, six distinct time series were modelled, each one representing the number of episodes per single 'ESKAPE' (Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa and Enterobacter species) infecting pathogen that had occurred monthly between the 2002 and 2011 calendar years at the Università Cattolica del Sacro Cuore general hospital. Monthly moving-averaged numbers of observed and forecasted ESKAPE infectious episodes showed complete overlap of their respective smoothed time-series curves. Overall good forecast accuracy was observed, with percentages ranging from 82.14% for E. faecium infections to 90.36% for S. aureus infections. Our approach may regularly provide physicians with forecasted bacterial infection rates to alert them about the spread of antibiotic-resistant bacterial species, especially when clinical microbiological results of patients' specimens are delayed.

  19. Estimating linear temporal trends from aggregated environmental monitoring data

    USGS Publications Warehouse

    Erickson, Richard A.; Gray, Brian R.; Eager, Eric A.

    2017-01-01

    Trend estimates are often used as part of environmental monitoring programs. These trends inform managers (e.g., are desired species increasing or undesired species decreasing?). Data collected from environmental monitoring programs is often aggregated (i.e., averaged), which confounds sampling and process variation. State-space models allow sampling variation and process variations to be separated. We used simulated time-series to compare linear trend estimations from three state-space models, a simple linear regression model, and an auto-regressive model. We also compared the performance of these five models to estimate trends from a long term monitoring program. We specifically estimated trends for two species of fish and four species of aquatic vegetation from the Upper Mississippi River system. We found that the simple linear regression had the best performance of all the given models because it was best able to recover parameters and had consistent numerical convergence. Conversely, the simple linear regression did the worst job estimating populations in a given year. The state-space models did not estimate trends well, but estimated population sizes best when the models converged. We found that a simple linear regression performed better than more complex autoregression and state-space models when used to analyze aggregated environmental monitoring data.

  20. Inhalant Use among Indiana School Children, 1991-2004

    ERIC Educational Resources Information Center

    Ding, Kele; Torabi, Mohammad R.; Perera, Bilesha; Jun, Mi Kyung; Jones-McKyer, E. Lisako

    2007-01-01

    Objective: To examine the prevalence and trend of inhalant use among Indiana public school students. Methods: The Alcohol, Tobacco, and Other Drug Use among Indiana Children and Adolescents surveys conducted annually between 1991 and 2004 were reanalyzed using 2-way moving average, Poisson regression, and ANOVA tests. Results: The prevalence had…

  1. Effect of external PEEP in patients under controlled mechanical ventilation with an auto-PEEP of 5 cmH2O or higher.

    PubMed

    Natalini, Giuseppe; Tuzzo, Daniele; Rosano, Antonio; Testa, Marco; Grazioli, Michele; Pennestrì, Vincenzo; Amodeo, Guido; Berruto, Francesco; Fiorillo, Marialinda; Peratoner, Alberto; Tinnirello, Andrea; Filippini, Matteo; Marsilia, Paolo F; Minelli, Cosetta; Bernardini, Achille

    2016-12-01

    In some patients with auto-positive end-expiratory pressure (auto-PEEP), application of PEEP lower than auto-PEEP maintains a constant total PEEP, therefore reducing the inspiratory threshold load without detrimental cardiovascular or respiratory effects. We refer to these patients as "complete PEEP-absorbers." Conversely, adverse effects of PEEP application could occur in patients with auto-PEEP when the total PEEP rises as a consequence. From a pathophysiological perspective, all subjects with flow limitation are expected to be "complete PEEP-absorbers," whereas PEEP should increase total PEEP in all other patients. This study aimed to empirically assess the extent to which flow limitation alone explains a "complete PEEP-absorber" behavior (i.e., absence of further hyperinflation with PEEP), and to identify other factors associated with it. One hundred patients with auto-PEEP of at least 5 cmH2O at zero end-expiratory pressure (ZEEP) during controlled mechanical ventilation were enrolled. Total PEEP (i.e., end-expiratory plateau pressure) was measured both at ZEEP and after applied PEEP equal to 80 % of auto-PEEP measured at ZEEP. All measurements were repeated three times, and the average value was used for analysis. Forty-seven percent of the patients suffered from chronic pulmonary disease and 52 % from acute pulmonary disease; 61 % showed flow limitation at ZEEP, assessed by manual compression of the abdomen. The mean total PEEP was 7 ± 2 cmH2O at ZEEP and 9 ± 2 cmH2O after the application of PEEP (p < 0.001). Thirty-three percent of the patients were "complete PEEP-absorbers." Multiple logistic regression was used to predict the behavior of "complete PEEP-absorber." The best model included a respiratory rate lower than 20 breaths/min and the presence of flow limitation. The predictive ability of the model was excellent, with an overoptimism-corrected area under the receiver operating characteristics curve of 0.89 (95 % CI 0.80-0.97). Expiratory flow limitation was associated with both high and complete "PEEP-absorber" behavior, but setting a relatively high respiratory rate on the ventilator can prevent from observing complete "PEEP-absorption." Therefore, the effect of PEEP application in patients with auto-PEEP can be accurately predicted at the bedside by measuring the respiratory rate and observing the flow-volume loop during manual compression of the abdomen.

  2. Validation of Fully Automated VMAT Plan Generation for Library-Based Plan-of-the-Day Cervical Cancer Radiotherapy.

    PubMed

    Sharfo, Abdul Wahab M; Breedveld, Sebastiaan; Voet, Peter W J; Heijkoop, Sabrina T; Mens, Jan-Willem M; Hoogeman, Mischa S; Heijmen, Ben J M

    2016-01-01

    To develop and validate fully automated generation of VMAT plan-libraries for plan-of-the-day adaptive radiotherapy in locally-advanced cervical cancer. Our framework for fully automated treatment plan generation (Erasmus-iCycle) was adapted to create dual-arc VMAT treatment plan libraries for cervical cancer patients. For each of 34 patients, automatically generated VMAT plans (autoVMAT) were compared to manually generated, clinically delivered 9-beam IMRT plans (CLINICAL), and to dual-arc VMAT plans generated manually by an expert planner (manVMAT). Furthermore, all plans were benchmarked against 20-beam equi-angular IMRT plans (autoIMRT). For all plans, a PTV coverage of 99.5% by at least 95% of the prescribed dose (46 Gy) had the highest planning priority, followed by minimization of V45Gy for small bowel (SB). Other OARs considered were bladder, rectum, and sigmoid. All plans had a highly similar PTV coverage, within the clinical constraints (above). After plan normalizations for exactly equal median PTV doses in corresponding plans, all evaluated OAR parameters in autoVMAT plans were on average lower than in the CLINICAL plans with an average reduction in SB V45Gy of 34.6% (p<0.001). For 41/44 autoVMAT plans, SB V45Gy was lower than for manVMAT (p<0.001, average reduction 30.3%), while SB V15Gy increased by 2.3% (p = 0.011). AutoIMRT reduced SB V45Gy by another 2.7% compared to autoVMAT, while also resulting in a 9.0% reduction in SB V15Gy (p<0.001), but with a prolonged delivery time. Differences between manVMAT and autoVMAT in bladder, rectal and sigmoid doses were ≤ 1%. Improvements in SB dose delivery with autoVMAT instead of manVMAT were higher for empty bladder PTVs compared to full bladder PTVs, due to differences in concavity of the PTVs. Quality of automatically generated VMAT plans was superior to manually generated plans. Automatic VMAT plan generation for cervical cancer has been implemented in our clinical routine. Due to the achieved workload reduction, extension of plan libraries has become feasible.

  3. Bayesian Analysis of Non-Gaussian Long-Range Dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, T.; Franzke, C.; Gramacy, R. B.; Watkins, N. W.

    2012-12-01

    Recent studies have strongly suggested that surface temperatures exhibit long-range dependence (LRD). The presence of LRD would hamper the identification of deterministic trends and the quantification of their significance. It is well established that LRD processes exhibit stochastic trends over rather long periods of time. Thus, accurate methods for discriminating between physical processes that possess long memory and those that do not are an important adjunct to climate modeling. We have used Markov Chain Monte Carlo algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average (ARFIMA) processes, which are capable of modeling LRD. Our principal aim is to obtain inference about the long memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series such as the Central England Temperature. Many physical processes, for example the Faraday time series from Antarctica, are highly non-Gaussian. We have therefore extended this work by weakening the Gaussianity assumption. Specifically, we assume a symmetric α-stable distribution for the innovations. Such processes provide good, flexible, initial models for non-Gaussian processes with long memory. We will present a study of the dependence of the posterior variance σ_d of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other measures of d.

  4. Time series analysis of temporal trends in the pertussis incidence in Mainland China from 2005 to 2016.

    PubMed

    Zeng, Qianglin; Li, Dandan; Huang, Gui; Xia, Jin; Wang, Xiaoming; Zhang, Yamei; Tang, Wanping; Zhou, Hui

    2016-08-31

    Short-term forecasts of pertussis incidence are helpful for advance warning and for planning resource needs for future epidemics. Utilizing the Auto-Regressive Integrated Moving Average (ARIMA) model and the Exponential Smoothing (ETS) model as alternative models, implemented with R software, this paper analyzed data from the Chinese Center for Disease Control and Prevention (China CDC) between January 2005 and June 2016. The ARIMA (0,1,0)(1,1,1)12 model (AICc = 1342.2, BIC = 1350.3) was selected as the best performing ARIMA model and the ETS (M,N,M) model (AICc = 1678.6, BIC = 1715.4) was selected as the best performing ETS model; the ETS (M,N,M) model, with the minimum RMSE, was finally selected for in-sample simulation and out-of-sample forecasting. Descriptive statistics showed that the number of pertussis cases reported by China CDC increased by 66.20% from 2005 (4058 cases) to 2015 (6744 cases). According to the Hodrick-Prescott filter, there was an apparent cyclicity and seasonality in the pertussis reports. In out-of-sample forecasting, the model forecasted a relatively high number of incident cases in 2016, which predicts an increasing risk of ongoing pertussis resurgence in the near future. In this regard, the ETS model would be a useful tool in simulating and forecasting the incidence of pertussis, helping decision makers to take efficient decisions based on advance warning of disease incidence.
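
    As a rough illustration of the ARIMA-versus-ETS comparison described above (in Python rather than R, and on simulated data rather than the China CDC counts), the sketch below compares a seasonal ARIMA fit with a multiplicative-seasonal exponential smoothing fit, used here as a stand-in for the ETS(M,N,M) form, by AIC and in-sample RMSE.

        # Illustrative seasonal ARIMA vs exponential smoothing comparison.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        rng = np.random.default_rng(5)
        idx = pd.date_range("2005-01-01", "2016-06-01", freq="MS")
        month = idx.month.to_numpy()
        y = pd.Series(400 + 200 * np.sin(2 * np.pi * month / 12) +
                      np.linspace(0, 300, len(idx)) + rng.normal(scale=50, size=len(idx)),
                      index=idx)

        arima = SARIMAX(y, order=(0, 1, 0), seasonal_order=(1, 1, 1, 12)).fit(disp=False)
        ets = ExponentialSmoothing(y, trend=None, seasonal="mul", seasonal_periods=12).fit()

        def rmse(e):
            return float(np.sqrt(np.mean(np.square(e))))

        # Skip the first 13 months so the ARIMA differencing burn-in does not distort the RMSE
        print("ARIMA AIC:", arima.aic, "RMSE:", rmse(arima.resid[13:]))
        print("ETS RMSE :", rmse((y - ets.fittedvalues)[13:]))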

  5. Flood Nowcasting With Linear Catchment Models, Radar and Kalman Filters

    NASA Astrophysics Data System (ADS)

    Pegram, Geoff; Sinclair, Scott

    A pilot study using real time rainfall data as input to a parsimonious linear distributed flood forecasting model is presented. The aim of the study is to deliver an operational system capable of producing flood forecasts, in real time, for the Mgeni and Mlazi catchments near the city of Durban in South Africa. The forecasts can be made at time steps which are of the order of a fraction of the catchment response time. To this end, the model is formulated in Finite Difference form in an equation similar to an Auto Regressive Moving Average (ARMA) model; it is this formulation which provides the required computational efficiency. The ARMA equation is a discretely coincident form of the State-Space equations that govern the response of an arrangement of linear reservoirs. This results in a functional relationship between the reservoir response constants and the ARMA coefficients, which guarantees stationarity of the ARMA model. Input to the model is a combined "Best Estimate" spatial rainfall field, derived from a combination of weather RADAR and Satellite rainfield estimates with point rainfall given by a network of telemetering raingauges. Several strategies are employed to overcome the uncertainties associated with forecasting. Principal among these are the use of optimal (double Kalman) filtering techniques to update the model states and parameters in response to current streamflow observations and the application of short term forecasting techniques to provide future estimates of the rainfield as input to the model.

  6. Crimean-Congo hemorrhagic fever and its relationship with climate factors in southeast Iran: a 13-year experience.

    PubMed

    Ansari, Hossein; Shahbaz, Babak; Izadi, Shahrokh; Zeinali, Mohammad; Tabatabaee, Seyyed Mehdi; Mahmoodi, Mahmood; Holakouie Naieni, Kourosh; Mansournia, Mohammad Ali

    2014-06-11

    Crimean-Congo hemorrhagic fever (CCHF) is endemic in southeast Iran. In this study we present the epidemiological features of CCHF and its relationship with climate factors over a 13-year span. Surveillance system data on CCHF from 2000 to 2012 were obtained from the Province Health Centre of Zahedan University of Medical Sciences in southeast Iran. The climate data were obtained from the climate organization. The seasonal auto-regressive integrated moving average (SARIMA) model was used for time series analysis to produce a model as applicable as possible for predicting the variations in the occurrence of the disease. Between 2000 and 2012, 647 confirmed CCHF cases were reported from Sistan-va-Baluchistan province. The total case fatality rate was about 10.0%. Climate variables including mean temperature (°C), accumulated rainfall (mm), and maximum relative humidity (%) were significantly correlated with the monthly incidence of CCHF (p < 0.05). There was no clear pattern of decline in the reported number of cases within the study's time span. The first spike in the number of CCHF cases in Iran occurred after the first surge of the disease in Pakistan. This study shows the potential of climate indicators as predictive factors in modeling the occurrence of CCHF, although it remains to be established whether a practically applicable model is needed. There are also other factors, such as entomological indicators and virological findings, that must be considered.

  7. Long Memory in STOCK Market Volatility: the International Evidence

    NASA Astrophysics Data System (ADS)

    Yang, Chunxia; Hu, Sen; Xia, Bingying; Wang, Rui

    2012-08-01

    Capturing the auto-dependence behavior of volatility remains a topic of active research. Here, based on the measurement of average volatility under different observation window sizes, we investigated the dependence of successive volatility of several main stock indices and their simulated GARCH(1, 1) models; there was obvious linear auto-dependence in the logarithm of volatility under a small observation window size and nonlinear auto-dependence under a large one. After calculating the correlation and mutual information of the logarithm of volatility for the Dow Jones Industrial Average during different periods, we find that some influential events can change the correlation structure and that the volatilities of different periods have distinct influence on those of the remote future. Besides, the GARCH model can produce dependence behavior similar to that of real data, as well as the long memory property. However, our analyses show that the auto-dependence of volatility in GARCH differs from that in real data, and that the long memory is undervalued by GARCH.
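
    The sketch below is a minimal numpy illustration, not the paper's analysis: it simulates a GARCH(1,1) return series with assumed parameters and reports the autocorrelation of a log-volatility proxy at increasing lags, the kind of rapidly decaying dependence that is contrasted above with the behaviour of real data.

        # Simulate GARCH(1,1) returns and probe dependence in a log-volatility proxy.
        import numpy as np

        rng = np.random.default_rng(6)
        n, omega, alpha, beta = 20000, 1e-6, 0.09, 0.90        # assumed GARCH(1,1) parameters
        r = np.zeros(n)
        sigma2 = np.full(n, omega / (1 - alpha - beta))
        for t in range(1, n):
            sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
            r[t] = np.sqrt(sigma2[t]) * rng.normal()

        logvol = np.log(np.abs(r) + 1e-12)                      # crude proxy for log-volatility
        x = logvol - logvol.mean()
        for lag in (1, 10, 100, 1000):
            acf = np.mean(x[:-lag] * x[lag:]) / np.var(x)
            print(f"lag {lag:4d}  autocorrelation {acf:.3f}")   # decays quickly for GARCH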

  8. Review the number of accidents in Tehran over a two-year period and prediction of the number of events based on a time-series model

    PubMed Central

    Teymuri, Ghulam Heidar; Sadeghian, Marzieh; Kangavari, Mehdi; Asghari, Mehdi; Madrese, Elham; Abbasinia, Marzieh; Ahmadnezhad, Iman; Gholizadeh, Yavar

    2013-01-01

    Background: One of the significant dangers that threaten people's lives is the increased risk of accidents. Annually, more than 1.3 million people die around the world as a result of accidents, and it has been estimated that approximately 300 deaths occur daily due to traffic accidents in the world, with more than 50% of that number being people who were not even passengers in the cars. The aim of this study was to examine traffic accidents in Tehran and forecast the number of future accidents using a time-series model. Methods: The study was a cross-sectional study that was conducted in 2011. The sample population was all traffic accidents that caused death and physical injuries in Tehran in 2010 and 2011, as registered in the Tehran Emergency ward. The present study used Minitab 15 software to provide a description of accidents in Tehran for the specified time period as well as a prediction of those occurring during April 2012. Results: The results indicated that the average number of daily traffic accidents in Tehran in 2010 was 187 with a standard deviation of 83.6. In 2011, there was an average of 180 daily traffic accidents with a standard deviation of 39.5. One-way analysis of variance indicated that the average number of accidents in the city was different for different months of the year (P < 0.05). Most of the accidents occurred in March, July, August, and September. Thus, more accidents occurred in the summer than in the other seasons. The number of accidents was predicted for April 2012 based on an auto-regressive moving average (ARMA) model. The number of accidents displayed a seasonal trend. The prediction of the number of accidents in the city during April of 2012 indicated that a total of 4,459 accidents would occur, with a mean of 149 accidents per day during these three months. Conclusion: The number of accidents in Tehran displayed a seasonal trend, and the number of accidents was different for different seasons of the year. PMID:26120405

  9. An improved portmanteau test for autocorrelated errors in interrupted time-series regression models.

    PubMed

    Huitema, Bradley E; McKean, Joseph W

    2007-08-01

    A new portmanteau test for autocorrelation among the errors of interrupted time-series regression models is proposed. Simulation results demonstrate that the inferential properties of the proposed Q(H-M) test statistic are considerably more satisfactory than those of the well known Ljung-Box test and moderately better than those of the Box-Pierce test. These conclusions generally hold for a wide variety of autoregressive (AR), moving average (MA), and ARMA error processes that are associated with time-series regression models of the form described in Huitema and McKean (2000a, 2000b).
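
    For reference, the two classical portmanteau statistics mentioned above can be computed directly from residual autocorrelations; the sketch below is illustrative only (the proposed Q(H-M) statistic is not reproduced here), and the residuals are simulated rather than taken from an interrupted time-series regression.

        # Box-Pierce and Ljung-Box statistics from residual autocorrelations.
        import numpy as np

        def portmanteau(resid, max_lag=10):
            e = resid - resid.mean()
            n = len(e)
            r = np.array([np.sum(e[:-k] * e[k:]) / np.sum(e**2) for k in range(1, max_lag + 1)])
            q_bp = n * np.sum(r**2)                                              # Box-Pierce
            q_lb = n * (n + 2) * np.sum(r**2 / (n - np.arange(1, max_lag + 1)))  # Ljung-Box
            return q_bp, q_lb

        rng = np.random.default_rng(7)
        resid = rng.normal(size=200)          # stand-in for regression residuals
        print(portmanteau(resid))             # compare against a chi-square reference distribution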

  10. Hybrid Support Vector Regression and Autoregressive Integrated Moving Average Models Improved by Particle Swarm Optimization for Property Crime Rates Forecasting with Economic Indicators

    PubMed Central

    Alwee, Razana; Hj Shamsuddin, Siti Mariyam; Sallehuddin, Roselina

    2013-01-01

    Crime forecasting is an important area in the field of criminology. Linear models, such as regression and econometric models, are commonly applied in crime forecasting. However, in real crime data, it is common that the data consist of both linear and nonlinear components. A single model may not be sufficient to identify all the characteristics of the data. The purpose of this study is to introduce a hybrid model that combines support vector regression (SVR) and autoregressive integrated moving average (ARIMA) to be applied in crime rate forecasting. SVR is very robust with small training data and high-dimensional problems. Meanwhile, ARIMA has the ability to model several types of time series. However, the accuracy of the SVR model depends on the values of its parameters, while ARIMA is not robust when applied to small data sets. Therefore, to overcome this problem, particle swarm optimization is used to estimate the parameters of the SVR and ARIMA models. The proposed hybrid model is used to forecast the property crime rates of the United States based on economic indicators. The experimental results show that the proposed hybrid model is able to produce more accurate forecasting results than the individual models. PMID:23766729
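
    A hedged sketch of the hybrid idea, not the authors' implementation: an ARIMA model captures the linear structure and a support vector regression is trained on the ARIMA residuals to pick up nonlinear structure. The particle swarm optimization step is omitted here (default SVR parameters are used instead), and the series is simulated rather than actual crime-rate data.

        # Hybrid ARIMA + SVR: SVR models the nonlinear part left in the ARIMA residuals.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA
        from sklearn.svm import SVR

        rng = np.random.default_rng(8)
        n = 200
        y = np.cumsum(rng.normal(size=n)) + 0.5 * np.sin(np.arange(n) / 3)   # illustrative series

        arima = ARIMA(y, order=(1, 1, 1)).fit()
        linear_fit = arima.fittedvalues
        resid = y - linear_fit

        lags = 3   # SVR inputs: the previous `lags` residuals
        Xr = np.column_stack([resid[i:n - lags + i] for i in range(lags)])
        yr = resid[lags:]
        svr = SVR(kernel="rbf").fit(Xr, yr)

        hybrid_fit = linear_fit[lags:] + svr.predict(Xr)
        print("RMSE ARIMA :", np.sqrt(np.mean((y[lags:] - linear_fit[lags:]) ** 2)))
        print("RMSE hybrid:", np.sqrt(np.mean((y[lags:] - hybrid_fit) ** 2)))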

  11. Hybrid support vector regression and autoregressive integrated moving average models improved by particle swarm optimization for property crime rates forecasting with economic indicators.

    PubMed

    Alwee, Razana; Shamsuddin, Siti Mariyam Hj; Sallehuddin, Roselina

    2013-01-01

    Crime forecasting is an important area in the field of criminology. Linear models, such as regression and econometric models, are commonly applied in crime forecasting. However, in real crime data, it is common that the data consist of both linear and nonlinear components. A single model may not be sufficient to identify all the characteristics of the data. The purpose of this study is to introduce a hybrid model that combines support vector regression (SVR) and autoregressive integrated moving average (ARIMA) to be applied in crime rate forecasting. SVR is very robust with small training data and high-dimensional problems. Meanwhile, ARIMA has the ability to model several types of time series. However, the accuracy of the SVR model depends on the values of its parameters, while ARIMA is not robust when applied to small data sets. Therefore, to overcome this problem, particle swarm optimization is used to estimate the parameters of the SVR and ARIMA models. The proposed hybrid model is used to forecast the property crime rates of the United States based on economic indicators. The experimental results show that the proposed hybrid model is able to produce more accurate forecasting results than the individual models.

  12. Wildfire suppression cost forecasts from the US Forest Service

    Treesearch

    Karen L. Abt; Jeffrey P. Prestemon; Krista M. Gebert

    2009-01-01

    The US Forest Service and other land-management agencies seek better tools for anticipating future expenditures for wildfire suppression. We developed regression models for forecasting US Forest Service suppression spending at 1-, 2-, and 3-year lead times. We compared these models to another readily available forecast model, the 10-year moving average model,...

  13. Comparative Performance Evaluation of Rainfall-runoff Models, Six of Black-box Type and One of Conceptual Type, From The Galway Flow Forecasting System (gffs) Package, Applied On Two Irish Catchments

    NASA Astrophysics Data System (ADS)

    Goswami, M.; O'Connor, K. M.; Shamseldin, A. Y.

    The "Galway Real-Time River Flow Forecasting System" (GFFS) is a software pack- age developed at the Department of Engineering Hydrology, of the National University of Ireland, Galway, Ireland. It is based on a selection of lumped black-box and con- ceptual rainfall-runoff models, all developed in Galway, consisting primarily of both the non-parametric (NP) and parametric (P) forms of two black-box-type rainfall- runoff models, namely, the Simple Linear Model (SLM-NP and SLM-P) and the seasonally-based Linear Perturbation Model (LPM-NP and LPM-P), together with the non-parametric wetness-index-based Linearly Varying Gain Factor Model (LVGFM), the black-box Artificial Neural Network (ANN) Model, and the conceptual Soil Mois- ture Accounting and Routing (SMAR) Model. Comprised of the above suite of mod- els, the system enables the user to calibrate each model individually, initially without updating, and it is capable also of producing combined (i.e. consensus) forecasts us- ing the Simple Average Method (SAM), the Weighted Average Method (WAM), or the Artificial Neural Network Method (NNM). The updating of each model output is achieved using one of four different techniques, namely, simple Auto-Regressive (AR) updating, Linear Transfer Function (LTF) updating, Artificial Neural Network updating (NNU), and updating by the Non-linear Auto-Regressive Exogenous-input method (NARXM). The models exhibit a considerable range of variation in degree of complexity of structure, with corresponding degrees of complication in objective func- tion evaluation. Operating in continuous river-flow simulation and updating modes, these models and techniques have been applied to two Irish catchments, namely, the Fergus and the Brosna. A number of performance evaluation criteria have been used to comparatively assess the model discharge forecast efficiency.

  14. Organizational Response to the Introduction of New Computer Software Technology

    DTIC Science & Technology

    1991-07-01

    the documentation to be much use at all." Another said that "the tutorial did a good job, but ... the manual did an average job." The Lotus Manuscript...when they have a specific use in mind and believe they can find the information easily in the manual . 12 The AutoCAD users were also split on their...AutoCAD user with AutoLISP , a programming language included in the package. (Some CADD packages come with these features and others as part of the

  15. A study of sound generation in subsonic rotors, volume 2

    NASA Technical Reports Server (NTRS)

    Chalupnik, J. D.; Clark, L. T.

    1975-01-01

    Computer programs were developed for use in the analysis of sound generation by subsonic rotors. Program AIRFOIL computes the spectrum of radiated sound from a single airfoil immersed in a laminar flow field. Program ROTOR extends this to a rotating frame, and provides a model for sound generation in subsonic rotors. The program also computes tone sound generation due to steady state forces on the blades. Program TONE uses a moving source analysis to generate a time series for an array of forces moving in a circular path. The resultant time series are then Fourier transformed to render the results in spectral form. Program SDATA is a standard time series analysis package. It reads in two discrete time series and forms auto and cross covariances and normalizes these to form correlations. The program then transforms the covariances to yield auto and cross power spectra by means of a Fourier transformation.
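
    A rough Python sketch of the SDATA-style processing described above, on synthetic signals: two discrete time series are reduced to auto- and cross-covariances, normalised to correlations, and Fourier transformed to spectra. The signal frequency, lag and record length are all illustrative.

        # Auto/cross covariance, normalised correlation, and spectra from covariances.
        import numpy as np

        rng = np.random.default_rng(10)
        n, dt = 4096, 1e-4
        t = np.arange(n) * dt
        x = np.sin(2 * np.pi * 500 * t) + rng.normal(scale=0.5, size=n)
        y = np.roll(x, 20) + rng.normal(scale=0.5, size=n)      # delayed, noisier copy of x

        def covariance(a, b, max_lag=200):
            a, b = a - a.mean(), b - b.mean()
            return np.array([np.mean(a[:n - k] * b[k:]) for k in range(max_lag)])

        acov = covariance(x, x)
        ccov = covariance(x, y)
        ccorr = ccov / np.sqrt(acov[0] * covariance(y, y)[0])   # normalised cross-correlation
        print("lag of peak cross-correlation:", int(np.argmax(ccorr)))          # ~20 samples

        auto_spectrum = np.abs(np.fft.rfft(acov))               # power spectrum from covariance
        freqs = np.fft.rfftfreq(len(acov), d=dt)
        print("auto-spectrum peak near", freqs[np.argmax(auto_spectrum[1:]) + 1], "Hz")  # ~500 Hz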

  16. Resource-Efficient, Hierarchical Auto-Tuning of a Hybrid Lattice Boltzmann Computation on the Cray XT4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Computational Research Division, Lawrence Berkeley National Laboratory; NERSC, Lawrence Berkeley National Laboratory; Computer Science Department, University of California, Berkeley

    2009-05-04

    We apply auto-tuning to a hybrid MPI-pthreads lattice Boltzmann computation running on the Cray XT4 at National Energy Research Scientific Computing Center (NERSC). Previous work showed that multicore-specific auto-tuning can improve the performance of lattice Boltzmann magnetohydrodynamics (LBMHD) by a factor of 4x when running on dual- and quad-core Opteron dual-socket SMPs. We extend these studies to the distributed memory arena via a hybrid MPI/pthreads implementation. In addition to conventional auto-tuning at the local SMP node, we tune at the message-passing level to determine the optimal aspect ratio as well as the correct balance between MPI tasks and threads per MPI task. Our study presents a detailed performance analysis when moving along an isocurve of constant hardware usage: fixed total memory, total cores, and total nodes. Overall, our work points to approaches for improving intra- and inter-node efficiency on large-scale multicore systems for demanding scientific applications.

  17. Considerations for monitoring raptor population trends based on counts of migrants

    USGS Publications Warehouse

    Titus, K.; Fuller, M.R.; Ruos, J.L.; Meyburg, B-U.; Chancellor, R.D.

    1989-01-01

    Various problems were identified with standardized hawk count data as annually collected at six sites. Some of the hawk lookouts increased their hours of observation from 1979-1985, thereby confounding the total counts. Data recording and missing data hamper coding of data and their use with modern analytical techniques. Coefficients of variation among years in counts averaged about 40%. The advantages and disadvantages of various analytical techniques are discussed including regression, non-parametric rank correlation trend analysis, and moving averages.

  18. Quantitation of mandibular symphysis volume as a source of bone grafting.

    PubMed

    Verdugo, Fernando; Simonian, Krikor; Smith McDonald, Roberto; Nowzari, Hessam

    2010-06-01

    Autogenous intramembranous bone grafts present several advantages, such as minimal resorption and a high concentration of bone morphogenetic proteins. A method for measuring the amount of bone that can be harvested from the symphysis area has not been reported in real patients. The aim of the present study was to intrasurgically quantitate the volume of the symphysis bone graft that can be safely harvested in live patients and compare it with AutoCAD (version 16.0, Autodesk, Inc., San Rafael, CA, USA) tomographic calculations. The AutoCAD software program quantitated the symphysis bone graft in 40 patients using computerized tomographies. Direct intrasurgical measurements were recorded thereafter and compared with the AutoCAD data. The bone volume was measured at the recipient sites of a subgroup of 10 patients, 6 months post sinus augmentation. The volume of bone graft measured by AutoCAD averaged 1.4 mL (SD 0.6 mL, range: 0.5-2.7 mL). The volume of bone graft measured intrasurgically averaged 2.3 mL (SD 0.4 mL, range 1.7-2.8 mL). The statistical difference between the two measurement methods was significant. The bone volume measured at the recipient sites 6 months post sinus augmentation averaged 1.9 mL (SD 0.3 mL, range 1.3-2.6 mL), with a mean loss of 0.4 mL. AutoCAD did not overestimate the volume of bone that can be safely harvested from the mandibular symphysis. The use of the design software program may improve surgical treatment planning prior to sinus augmentation.

  19. Causes of Urban Sprawl in the United States: Auto Reliance as Compared to Natural Evolution, Flight from Blight, and Local Revenue Reliance

    ERIC Educational Resources Information Center

    Wassmer, Robert W.

    2008-01-01

    This paper describes a statistical study of the contribution of theories previously offered by economists to explain differences in the degree of urban decentralization in the U.S. The focus is on a relative comparison of the influence of auto reliance. A regression analysis reveals that a 10 percent reduction in the percentage of households…

  20. Validation of Fully Automated VMAT Plan Generation for Library-Based Plan-of-the-Day Cervical Cancer Radiotherapy

    PubMed Central

    Breedveld, Sebastiaan; Voet, Peter W. J.; Heijkoop, Sabrina T.; Mens, Jan-Willem M.; Hoogeman, Mischa S.; Heijmen, Ben J. M.

    2016-01-01

    Purpose To develop and validate fully automated generation of VMAT plan-libraries for plan-of-the-day adaptive radiotherapy in locally-advanced cervical cancer. Material and Methods Our framework for fully automated treatment plan generation (Erasmus-iCycle) was adapted to create dual-arc VMAT treatment plan libraries for cervical cancer patients. For each of 34 patients, automatically generated VMAT plans (autoVMAT) were compared to manually generated, clinically delivered 9-beam IMRT plans (CLINICAL), and to dual-arc VMAT plans generated manually by an expert planner (manVMAT). Furthermore, all plans were benchmarked against 20-beam equi-angular IMRT plans (autoIMRT). For all plans, a PTV coverage of 99.5% by at least 95% of the prescribed dose (46 Gy) had the highest planning priority, followed by minimization of V45Gy for small bowel (SB). Other OARs considered were bladder, rectum, and sigmoid. Results All plans had a highly similar PTV coverage, within the clinical constraints (above). After plan normalizations for exactly equal median PTV doses in corresponding plans, all evaluated OAR parameters in autoVMAT plans were on average lower than in the CLINICAL plans with an average reduction in SB V45Gy of 34.6% (p<0.001). For 41/44 autoVMAT plans, SB V45Gy was lower than for manVMAT (p<0.001, average reduction 30.3%), while SB V15Gy increased by 2.3% (p = 0.011). AutoIMRT reduced SB V45Gy by another 2.7% compared to autoVMAT, while also resulting in a 9.0% reduction in SB V15Gy (p<0.001), but with a prolonged delivery time. Differences between manVMAT and autoVMAT in bladder, rectal and sigmoid doses were ≤ 1%. Improvements in SB dose delivery with autoVMAT instead of manVMAT were higher for empty bladder PTVs compared to full bladder PTVs, due to differences in concavity of the PTVs. Conclusions Quality of automatically generated VMAT plans was superior to manually generated plans. Automatic VMAT plan generation for cervical cancer has been implemented in our clinical routine. Due to the achieved workload reduction, extension of plan libraries has become feasible. PMID:28033342

  1. Macrocell path loss prediction using artificial intelligence techniques

    NASA Astrophysics Data System (ADS)

    Usman, Abraham U.; Okereke, Okpo U.; Omizegba, Elijah E.

    2014-04-01

    The prediction of propagation loss is a practical non-linear function approximation problem which linear regression or auto-regression models are limited in their ability to handle. However, some computational intelligence techniques such as artificial neural networks (ANNs) and adaptive neuro-fuzzy inference systems (ANFISs) have been shown to have great ability to handle non-linear function approximation and prediction problems. In this study, the multiple layer perceptron neural network (MLP-NN), radial basis function neural network (RBF-NN) and an ANFIS network were trained using actual signal strength measurements taken in certain suburban areas of Bauchi metropolis, Nigeria. The trained networks were then used to predict propagation losses at the stated areas under differing conditions. The predictions were compared with the prediction accuracy of the popular Hata model. It was observed that the ANFIS model gave a better fit in all cases, having higher R2 values in each case, and on average it is more robust than the MLP and RBF models as it generalises better to different data.

  2. Zika pandemic online trends, incidence and health risk communication: a time trend study.

    PubMed

    Adebayo, Gbenga; Neumark, Yehuda; Gesser-Edelsburg, Anat; Abu Ahmad, Wiessam; Levine, Hagai

    2017-01-01

    We aimed to describe the online search trends of Zika and examine their association with Zika incidence, assess the content of Zika-related press releases issued by leading health authorities and examine the association between online trends and press release timing. Using Google Trends, the 1 May 2015 to 30 May 2016 online trends of Zika and associated search terms were studied globally and in the five countries with the highest numbers of suspected cases. Correlations were then examined between online trends and Zika incidence in these countries. All Zika-related press releases issued by the WHO/Pan American Health Organization (PAHO) and the Centers for Disease Control and Prevention (CDC) during the study period were assessed for transparency, uncertainty and audience segmentation. Witte's Extended Parallel Process Model was applied to assess self-efficacy, response efficacy, susceptibility and severity. AutoRegressive Integrated Moving Average with an eXogenous predictor variable (ARIMAX) (p,d,q) regression modelling was used to quantify the association between online trends and the timing of press releases. Globally, Zika online search trends were low until the beginning of 2016, when interest rose steeply. Strong correlations (r=0.748-0.922; p<0.001) were observed between online trends and the number of suspected Zika cases in four of the five countries studied. Compared with press releases issued by WHO/PAHO, CDC press releases were significantly more likely to provide contact details and links to other resources, include figures/graphs, be risk-advisory in nature and be more readable and briefer. ARIMAX modelling results indicate that online trends preceded press releases by WHO (stationary R2 = 0.345; p<0.001) and CDC (stationary R2 = 0.318; p=0.014) by one week. These results suggest that online trends can aid in pandemic surveillance. Identification of shortcomings in the content and timing of Zika press releases can help guide health communication efforts in the current pandemic and future public health emergencies.
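
    The correlation analysis described above can be illustrated with a small, hypothetical computation: a lagged Pearson correlation between a weekly search-interest series and weekly case counts. The data below are synthetic placeholders, not Google Trends or surveillance data, and the ARIMAX step is not reproduced.

    ```python
    # Illustrative sketch (not the study's code): lagged Pearson correlation
    # between a weekly search-interest series and weekly suspected-case counts.
    # Both series are synthetic; search interest is built to lead cases by one week.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(1)
    weeks = 56
    search_interest = np.convolve(rng.uniform(0, 100, weeks), np.ones(3) / 3, mode="same")
    cases = 2.0 * np.roll(search_interest, 1) + rng.normal(0, 10, weeks)   # cases follow searches

    for lag in range(4):   # lag = number of weeks by which search interest leads cases
        r, p = pearsonr(cases[lag:], search_interest[:weeks - lag])
        print(f"lead of {lag} week(s): r = {r:.3f}, p = {p:.3g}")
    ```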

  3. Linking Bakhtin with feminist poststructuralism to unravel the allure of auto/biographies

    NASA Astrophysics Data System (ADS)

    Rodriguez, Alberto J.

    2000-03-01

    By linking feminist poststructuralism with Bakhtin's concepts of voice and ventriloquation, an approach is proposed for the critical engagement with auto/biographical text. It is argued that by becoming better aware of the teller's intentionality and her/his insights gained from telling a (re)constructed version of self, the listener and the teller can engage in personal and socially transformative dialog. This dialog can assist the teller/listener to move from superficial affirmation of (re)interpreted lived experiences to more socially responsive action. An example is provided to illustrate implications of this approach for science teaching and education research.

  4. Development of temporal modelling for forecasting and prediction of malaria infections using time-series and ARIMAX analyses: a case study in endemic districts of Bhutan.

    PubMed

    Wangdi, Kinley; Singhasivanon, Pratap; Silawan, Tassanee; Lawpoolsri, Saranath; White, Nicholas J; Kaewkungwal, Jaranit

    2010-09-03

    Malaria still remains a public health problem in some districts of Bhutan despite a marked reduction of cases in the last few years. To strengthen the country's prevention and control measures, this study was carried out to develop forecasting and prediction models of malaria incidence in the endemic districts of Bhutan using time series and ARIMAX analyses. The study was carried out retrospectively using the monthly malaria cases reported by health centres to the Vector-borne Disease Control Programme (VDCP) and meteorological data from the Meteorological Unit, Department of Energy, Ministry of Economic Affairs. Time series analysis was performed on monthly malaria cases, from 1994 to 2008, in seven malaria endemic districts. Multiplicative seasonal autoregressive integrated moving average (ARIMA) models were used to identify the best model using data from 1994 to 2006. The best-fit model was selected for each individual district and for the overall endemic area, and the monthly cases from January to December of 2009 and 2010 were forecasted. In developing the prediction model, the monthly reported malaria cases and the meteorological factors from 1996 to 2008 of the seven districts were analysed. ARIMAX modelling was employed to determine predictors of malaria in the subsequent month. It was found that the ARIMA (p, d, q) (P, D, Q)s model (p and P representing the autoregressive and seasonal autoregressive terms; d and D the non-seasonal and seasonal differencing; q and Q the moving average and seasonal moving average parameters, respectively; and s the length of the seasonal period) for the overall endemic districts was (2,1,1)(0,1,1)12; the district-level data revealed two most common ARIMA models, (2,1,1)(0,1,1)12 and (1,1,1)(0,1,1)12. The forecasted monthly malaria cases from January to December varied from 15 to 82 cases in 2009 and 67 to 149 cases in 2010, with a population of 285,375 in 2009 and an expected population of 289,085 in 2010. The ARIMAX models of monthly cases and climatic factors showed considerable variation among the districts. In general, the mean maximum temperature lagged by one month was a strong positive predictor of increased malaria cases for four districts. The number of cases in the previous month was also a significant predictor in one district, whereas no variable could predict malaria cases for two districts. The ARIMA time-series models were useful in forecasting the number of cases in the endemic areas of Bhutan. There was no consistency in the predictors of malaria cases when using ARIMAX models with selected lag times and climatic predictors. The ARIMA forecasting models could be employed for planning and managing the malaria prevention and control programme in Bhutan.
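
    As a sketch of the seasonal ARIMA step reported above, the snippet below fits the (2,1,1)(0,1,1)12 model form with statsmodels and forecasts 24 months ahead. The input series is synthetic; only the model orders are taken from the abstract.

    ```python
    # Hedged sketch: fitting the reported (2,1,1)(0,1,1)12 seasonal ARIMA to a
    # monthly case series and forecasting 24 months ahead. The series here is
    # synthetic; in the study the input was monthly reported malaria cases.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(2)
    idx = pd.date_range("1994-01-01", "2008-12-01", freq="MS")
    seasonal = 30 + 20 * np.sin(2 * np.pi * idx.month / 12)
    cases = pd.Series(np.maximum(0, seasonal + rng.normal(0, 5, len(idx))), index=idx)

    model = SARIMAX(cases, order=(2, 1, 1), seasonal_order=(0, 1, 1, 12))
    fit = model.fit(disp=False)
    forecast = fit.forecast(steps=24)   # Jan 2009 - Dec 2010
    print(forecast.round(1).head())
    ```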

  5. Calibrators measurement system for headlamp tester of motor vehicle base on machine vision

    NASA Astrophysics Data System (ADS)

    Pan, Yue; Zhang, Fan; Xu, Xi-ping; Zheng, Zhe

    2014-09-01

    With the development of photoelectric detection technology, machine vision is more widely used in industry. This paper mainly introduces the calibrator measuring system of an auto lamp tester, the core of which is a CCD image sampling system. It presents the measuring principle of optical axial angle and light intensity, and establishes the linear relationship between the calibrator's facula illumination and the image plane illumination. The paper provides an important specification of the CCD imaging system. Image processing in MATLAB yields the flare's geometric midpoint and average gray level. By fitting these data with the method of least squares, a regression equation relating illumination and gray level is obtained. The error of the experimental results of the measurement system is analyzed, and the combined standard uncertainty and its sources for the optical axial angle are given. The average measuring accuracy of the optical axial angle is controlled within 40''. The whole testing process uses digital means instead of manual intervention, giving higher accuracy and better repeatability than other measuring systems.
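
    The least-squares calibration step described above (fitting a regression between illumination and mean gray level) can be sketched in a few lines; the numbers below are invented for illustration and do not come from the paper.

    ```python
    # Minimal sketch of the described calibration step: a least-squares line
    # relating facula illumination to mean image gray level. Values are made up.
    import numpy as np

    illuminance_lx = np.array([50, 100, 150, 200, 250, 300], dtype=float)
    mean_gray = np.array([21.0, 43.5, 66.2, 88.0, 110.4, 132.1])

    slope, intercept = np.polyfit(illuminance_lx, mean_gray, deg=1)
    predicted = slope * illuminance_lx + intercept
    r_squared = 1 - (np.sum((mean_gray - predicted) ** 2)
                     / np.sum((mean_gray - mean_gray.mean()) ** 2))
    print(f"gray ~= {slope:.4f} * lux + {intercept:.3f}, R^2 = {r_squared:.4f}")
    ```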

  6. Determinants of isocyanate exposures in auto body repair and refinishing shops.

    PubMed

    Woskie, S R; Sparer, J; Gore, R J; Stowe, M; Bello, D; Liu, Y; Youngs, F; Redlich, C; Eisen, E; Cullen, M

    2004-07-01

    As part of the Survey of Painters and Repairers of Auto bodies by Yale (SPRAY), the determinants of isocyanate exposure in auto body repair shops were evaluated. Measurements (n = 380) of hexamethylene diisocyanate-based monomer and polyisocyanate and isophorone diisocyanate-based polyisocyanate were collected from 33 auto body shops. The median total reactive isocyanate concentrations expressed as mass concentration of the NCO functional group were: 206 microg NCO/m3 for spray operations; 0.93 microg NCO/m3 for samples collected in the vicinity of spray operations done on the shop floor (near spray); 0.05 microg NCO/m3 for office or other shop areas adjacent to spray areas (workplace background); 0.17 microg NCO/m3 for paint mixing and gun cleaning operations (mixing); 0.27 microg NCO/m3 for sanding operations. Exposure determinants for the sample NCO mass load were identified using linear regression, tobit regression and logistic regression models. For spray samples in a spray booth the significant determinants were the number of milliliters of NCO applied, the gallons of clear coat used by the shop each month and the type of spray booth used (custom built crossdraft, prefabricated crossdraft or downdraft/semi-downdraft). For near spray (bystander) samples, outdoor temperature >65 degrees F (18 degrees C) and shop size >5000 feet2 (465 m2) were significant determinants of exposure levels. For workplace background samples the shop annual income was the most important determinant. For sanding samples, the shop annual income and outdoor temperature >65 degrees F (18 degrees C) were the most significant determinants. Identification of these key exposure determinants will be useful in targeting exposure evaluation and control efforts to reduce isocyanate exposures.

  7. The dynamic correlation between policy uncertainty and stock market returns in China

    NASA Astrophysics Data System (ADS)

    Yang, Miao; Jiang, Zhi-Qiang

    2016-11-01

    We examine the dynamic correlation between government policy uncertainty and Chinese stock market returns over the period from January 1995 to December 2014. We find that the stock market is significantly correlated with policy uncertainty based on the results of the Vector Auto Regression (VAR) and Structural Vector Auto Regression (SVAR) models. In contrast, the results of the Dynamic Conditional Correlation Generalized Multivariate Autoregressive Conditional Heteroscedasticity (DCC-MGARCH) model surprisingly show a low dynamic correlation coefficient between policy uncertainty and market returns, suggesting that the fluctuations of each variable are greatly influenced by their own values in the preceding period. Our analysis improves the understanding of the dynamic relationship between the stock market and fiscal and monetary policy.
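
    As an illustration of the VAR step mentioned above, the sketch below fits a two-variable VAR with statsmodels on synthetic monthly series; the series, lag-order selection and sign of the relationship are assumptions for the example, and the SVAR and DCC-MGARCH stages are not shown.

    ```python
    # Illustrative sketch (not the paper's data): fitting a small VAR to monthly
    # policy-uncertainty and stock-return series with statsmodels.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(3)
    n = 240   # Jan 1995 - Dec 2014
    uncertainty = rng.normal(0, 1, n)
    returns = 0.3 * np.roll(uncertainty, 1) + rng.normal(0, 1, n)
    data = pd.DataFrame({"policy_uncertainty": uncertainty, "returns": returns})

    model = VAR(data)
    results = model.fit(maxlags=6, ic="aic")   # lag order chosen by AIC
    print(results.summary())
    ```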

  8. AGN Variability: Probing Black Hole Accretion

    NASA Astrophysics Data System (ADS)

    Moreno, Jackeline; O'Brien, Jack; Vogeley, Michael S.; Richards, Gordon T.; Kasliwal, Vishal P.

    2017-01-01

    We combine the long temporal baseline of Sloan Digital Sky Survey (SDSS) for quasars in Stripe 82 with the high precision photometry of the Kepler/K2 Satellite to study the physics of optical variability in the accretion disk and supermassive black hole engine. We model the lightcurves directly as Continuous-time Auto Regressive Moving Average processes (C-ARMA) with the Kali analysis package (Kasliwal et al. 2016). These models are extremely robust to irregular sampling and can capture aperiodic variability structure on various timescales. We also estimate the power spectral density and structure function of both the model family and the data. A Green's function kernel may also be estimated for the resulting C-ARMA parameter fit, which may be interpreted as the response to driving impulses such as hotspots in the accretion disk. We also examine available spectra for our AGN sample to relate observed and modelled behavior to spectral properties. The objective of this work is twofold: to explore the proper physical interpretation of different families of C-ARMA models applied to AGN optical flux variability and to relate empirical characteristic timescales of our AGN sample to physical theory or to properties estimated from spectra or simulations like the disk viscosity and temperature. We find that AGN with strong variability features on timescales resolved by K2 are well modelled by a low order C-ARMA family while K2 lightcurves with weak amplitude variability are dominated by outliers and measurement errors which force higher order model fits. This work explores a novel approach to combining SDSS and K2 data sets and presents recovered characteristic timescales of AGN variability.

  9. Predictability of monthly temperature and precipitation using automatic time series forecasting methods

    NASA Astrophysics Data System (ADS)

    Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris

    2018-02-01

    We investigate the predictability of monthly temperature and precipitation by applying automatic univariate time series forecasting methods to a sample of 985 40-year-long monthly temperature and 1552 40-year-long monthly precipitation time series. The methods include a naïve one based on the monthly values of the last year, as well as the random walk (with drift), AutoRegressive Fractionally Integrated Moving Average (ARFIMA), exponential smoothing state-space model with Box-Cox transformation, ARMA errors, Trend and Seasonal components (BATS), simple exponential smoothing, Theta and Prophet methods. Prophet is a recently introduced model inspired by the nature of time series forecasted at Facebook and has not been applied to hydrometeorological time series before, while the use of random walk, BATS, simple exponential smoothing and Theta is rare in hydrology. The methods are tested in performing multi-step ahead forecasts for the last 48 months of the data. We further investigate how different choices of handling the seasonality and non-normality affect the performance of the models. The results indicate that: (a) all the examined methods apart from the naïve and random walk ones are accurate enough to be used in long-term applications; (b) monthly temperature and precipitation can be forecasted to a level of accuracy which can barely be improved using other methods; (c) the externally applied classical seasonal decomposition results mostly in better forecasts compared to the automatic seasonal decomposition used by the BATS and Prophet methods; and (d) Prophet is competitive, especially when it is combined with externally applied classical seasonal decomposition.
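
    One of the comparisons described above, a seasonal forecasting method against the naive "same month last year" benchmark over the last 48 months, can be sketched as follows. The series is synthetic and only exponential smoothing is shown; the other methods from the study are omitted.

    ```python
    # Hedged sketch of the benchmarking idea: forecast the last 48 months of a
    # monthly series with seasonal exponential smoothing and compare against the
    # naive "same month last year" method. Data are synthetic.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    rng = np.random.default_rng(4)
    idx = pd.date_range("1978-01-01", periods=480, freq="MS")   # 40 years of months
    y = pd.Series(15 + 10 * np.sin(2 * np.pi * idx.month / 12) + rng.normal(0, 2, 480),
                  index=idx)
    train, test = y[:-48], y[-48:]

    es_fit = ExponentialSmoothing(train, trend=None, seasonal="add",
                                  seasonal_periods=12).fit()
    es_forecast = es_fit.forecast(48)
    naive_forecast = np.tile(train[-12:].to_numpy(), 4)   # repeat the last observed year

    rmse = lambda f: float(np.sqrt(np.mean((test.to_numpy() - np.asarray(f)) ** 2)))
    print("RMSE exponential smoothing:", rmse(es_forecast))
    print("RMSE naive last-year:      ", rmse(naive_forecast))
    ```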

  10. Year-round spatiotemporal distribution of harbour porpoises within and around the Maryland wind energy area

    PubMed Central

    O’Brien, Michael; Lyubchich, Vyacheslav; Roberts, Jason J.; Halpin, Patrick N.; Rice, Aaron N.; Bailey, Helen

    2017-01-01

    Offshore windfarms provide renewable energy, but activities during the construction phase can affect marine mammals. To understand how the construction of an offshore windfarm in the Maryland Wind Energy Area (WEA) off Maryland, USA, might impact harbour porpoises (Phocoena phocoena), it is essential to determine their poorly understood year-round distribution. Although habitat-based models can help predict the occurrence of species in areas with limited or no sampling, they require validation to determine the accuracy of the predictions. Incorporating more than 18 months of harbour porpoise detection data from passive acoustic monitoring, generalized auto-regressive moving average and generalized additive models were used to investigate harbour porpoise occurrence within and around the Maryland WEA in relation to temporal and environmental variables. Acoustic detection metrics were compared to habitat-based density estimates derived from aerial and boat-based sightings to validate the model predictions. Harbour porpoises occurred significantly more frequently during January to May, and foraged significantly more often in the evenings to early mornings at sites within and outside the Maryland WEA. Harbour porpoise occurrence peaked at sea surface temperatures of 5°C and chlorophyll a concentrations of 4.5 to 7.4 mg m-3. The acoustic detections were significantly correlated with the predicted densities, except at the most inshore site. This study provides insight into previously unknown fine-scale spatial and temporal patterns in distribution of harbour porpoises offshore of Maryland. The results can be used to help inform future monitoring and mitigate the impacts of windfarm construction and other human activities. PMID:28467455

  11. Year-round spatiotemporal distribution of harbour porpoises within and around the Maryland wind energy area.

    PubMed

    Wingfield, Jessica E; O'Brien, Michael; Lyubchich, Vyacheslav; Roberts, Jason J; Halpin, Patrick N; Rice, Aaron N; Bailey, Helen

    2017-01-01

    Offshore windfarms provide renewable energy, but activities during the construction phase can affect marine mammals. To understand how the construction of an offshore windfarm in the Maryland Wind Energy Area (WEA) off Maryland, USA, might impact harbour porpoises (Phocoena phocoena), it is essential to determine their poorly understood year-round distribution. Although habitat-based models can help predict the occurrence of species in areas with limited or no sampling, they require validation to determine the accuracy of the predictions. Incorporating more than 18 months of harbour porpoise detection data from passive acoustic monitoring, generalized auto-regressive moving average and generalized additive models were used to investigate harbour porpoise occurrence within and around the Maryland WEA in relation to temporal and environmental variables. Acoustic detection metrics were compared to habitat-based density estimates derived from aerial and boat-based sightings to validate the model predictions. Harbour porpoises occurred significantly more frequently during January to May, and foraged significantly more often in the evenings to early mornings at sites within and outside the Maryland WEA. Harbour porpoise occurrence peaked at sea surface temperatures of 5°C and chlorophyll a concentrations of 4.5 to 7.4 mg m-3. The acoustic detections were significantly correlated with the predicted densities, except at the most inshore site. This study provides insight into previously unknown fine-scale spatial and temporal patterns in distribution of harbour porpoises offshore of Maryland. The results can be used to help inform future monitoring and mitigate the impacts of windfarm construction and other human activities.

  12. Impact of meteorological changes on the incidence of scarlet fever in Hefei City, China

    NASA Astrophysics Data System (ADS)

    Duan, Yu; Huang, Xiao-lei; Wang, Yu-jie; Zhang, Jun-qing; Zhang, Qi; Dang, Yue-wen; Wang, Jing

    2016-10-01

    Few studies of scarlet fever have included meteorological factors. We aimed to illustrate the effects of meteorological factors on the monthly incidence of scarlet fever. Cases of scarlet fever were collected from the notifiable infectious disease reports of Hefei City from 1985 to 2006; the meteorological data were obtained from the weather bureau of Hefei City. Monthly incidence and the corresponding meteorological data over these 22 years were used to develop the model. An auto-regressive integrated moving average model with covariates was used for the statistical analyses. There was a major peak from March to June and a smaller peak from November to January. The incidence of scarlet fever ranged from 0 to 0.71502 (per 10^5 population). A SARIMAX (1,0,0)(1,0,0)12 model optimally fitted the monthly incidence and meteorological data. It was shown that relative humidity (β = -0.002, p = 0.020), mean temperature (β = 0.006, p = 0.004), and minimum temperature lagged by one month (β = -0.007, p < 0.001) had an effect on the incidence of scarlet fever in Hefei. In addition, the incidence in the previous month (AR(β) = 0.469, p < 0.001) and 12 months before (SAR(β) = 0.255, p < 0.001) was positively associated with the incidence. This study shows that scarlet fever incidence was negatively associated with monthly minimum temperature and relative humidity, and positively associated with mean temperature, in Hefei City, China. Moreover, the ARIMA model could be useful not only for prediction but also for the analysis of multiple correlations.
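
    The covariate model form reported above can be sketched with statsmodels' SARIMAX using exogenous regressors. Everything below (incidence, weather series, coefficient values) is a synthetic placeholder; only the (1,0,0)(1,0,0)12 order and the choice of covariates follow the abstract.

    ```python
    # Sketch only: a SARIMAX(1,0,0)(1,0,0)12 model of monthly incidence with
    # meteorological covariates, mirroring the model form reported above.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(5)
    idx = pd.date_range("1985-01-01", "2006-12-01", freq="MS")
    n = len(idx)
    mean_temp = 16 + 10 * np.sin(2 * np.pi * (idx.month - 4) / 12) + rng.normal(0, 1, n)
    rel_humidity = 70 + rng.normal(0, 5, n)
    min_temp_lag1 = pd.Series(mean_temp - 5).shift(1).bfill()
    incidence = np.maximum(0, 0.3 + 0.006 * mean_temp - 0.002 * rel_humidity
                           + rng.normal(0, 0.05, n))

    exog = pd.DataFrame({"mean_temp": mean_temp,
                         "rel_humidity": rel_humidity,
                         "min_temp_lag1": min_temp_lag1.to_numpy()}, index=idx)
    model = SARIMAX(pd.Series(incidence, index=idx), exog=exog,
                    order=(1, 0, 0), seasonal_order=(1, 0, 0, 12))
    print(model.fit(disp=False).summary())
    ```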

  13. Evaluation of the effects of climate and man intervention on ground waters and their dependent ecosystems using time series analysis

    NASA Astrophysics Data System (ADS)

    Gemitzi, Alexandra; Stefanopoulos, Kyriakos

    2011-06-01

    Groundwaters and their dependent ecosystems are affected both by meteorological conditions and by human intervention, mainly in the form of groundwater abstractions for irrigation. This work aims at investigating the quantitative effects of meteorological conditions and human intervention on groundwater resources and their dependent ecosystems. Various seasonal Auto-Regressive Integrated Moving Average (ARIMA) models with external predictor variables were used to model the influence of meteorological conditions and human intervention on the groundwater level time series. Initially, a seasonal ARIMA model that simulates the abstraction time series using temperature (T) as an external predictor variable was prepared. Thereafter, seasonal ARIMA models were developed to simulate groundwater level time series at 8 monitoring locations, using the appropriate predictor variables determined for each individual case. The spatial component was introduced through the use of Geographical Information Systems (GIS). Application of the proposed methodology took place in the Neon Sidirochorion alluvial aquifer (Northern Greece), for which a 7-year-long time series (2003-2010) of piezometric and groundwater abstraction data exists. According to the developed ARIMA models, three distinct groups of groundwater level time series exist: the first proves to be dependent only on the meteorological parameters, the second demonstrates a mixed dependence on both meteorological conditions and human intervention, whereas the third shows a clear influence from human intervention. Moreover, there is evidence that groundwater abstraction has affected an important protected ecosystem.

  14. Analysis and prediction of rainfall trends over Bangladesh using Mann-Kendall, Spearman's rho tests and ARIMA model

    NASA Astrophysics Data System (ADS)

    Rahman, Mohammad Atiqur; Yunsheng, Lou; Sultana, Nahid

    2017-08-01

    In this study, 60 years of monthly rainfall data for Bangladesh were analysed to detect trends. The modified Mann-Kendall and Spearman's rho tests and Sen's slope estimators were applied to find the long-term annual, dry season and monthly trends. Sequential Mann-Kendall analysis was applied to detect potential trend turning points. Spatial variations of the trends were examined using inverse distance weighting (IDW) interpolation. An auto-regressive integrated moving average (ARIMA) model was used for the country's mean rainfall and for the data of two other stations, which showed the highest and lowest trends in the Mann-Kendall and Spearman's rho tests. Results showed no significant trend in the annual rainfall pattern except increasing trends for the Cox's Bazar, Khulna and Satkhira areas and a decreasing trend for the Srimangal area. For the dry season, only the Bogra area showed a significant decreasing trend. Long-term monthly trends demonstrated a mixed pattern; both negative and positive changes were found from February to September. The Comilla area showed a significant decreasing trend for 3 consecutive months, while the Rangpur and Khulna stations showed significant rising trends for three different months in the month-wise trend analysis. Rangpur station data gave a maximum increasing trend in April, whereas a maximum decreasing trend was found in August for Comilla station. The ARIMA models predict +3.26, +8.6 and -2.30 mm rainfall per year for the country, Cox's Bazar and Srimangal areas, respectively. All the test results and predictions showed good agreement with one another.
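
    For illustration, a minimal Mann-Kendall trend test and Sen's slope can be written from their textbook definitions as below (no tie correction, unlike the modified test used in the study); the rainfall values are invented.

    ```python
    # Minimal Mann-Kendall trend test (no tie correction) plus Sen's slope,
    # written from the textbook definitions as an illustration of the tests
    # named above; it is not the authors' code.
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        x = np.asarray(x, dtype=float)
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0          # variance without tie correction
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        p = 2 * (1 - norm.cdf(abs(z)))                    # two-sided p-value
        sen_slope = np.median([(x[j] - x[i]) / (j - i)
                               for i in range(n - 1) for j in range(i + 1, n)])
        return z, p, sen_slope

    annual_rainfall_mm = [2100, 2250, 1980, 2300, 2410, 2200, 2520, 2480, 2600, 2550]
    z, p, slope = mann_kendall(annual_rainfall_mm)
    print(f"Z = {z:.2f}, p = {p:.3f}, Sen's slope = {slope:.1f} mm/year")
    ```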

  15. Active structural acoustic control of helicopter interior multifrequency noise using input-output-based hybrid control

    NASA Astrophysics Data System (ADS)

    Ma, Xunjun; Lu, Yang; Wang, Fengjiao

    2017-09-01

    This paper presents recent advances in the reduction of multifrequency noise inside the helicopter cabin using an active structural acoustic control system based on the active-gearbox-strut technical approach. To attenuate the multifrequency gearbox vibrations and the resulting noise, a new scheme of discrete model predictive sliding mode control has been proposed based on a controlled auto-regressive moving average (CARMA) model. Its implementation needs only input/output data, hence a broader frequency range of the controlled system is modelled and the burden of state observer design is removed. Furthermore, a new iteration form of the algorithm is designed, improving development efficiency and run speed. To verify the algorithm's effectiveness and self-adaptability, real-time active control experiments are performed on a newly developed helicopter model system. The helicopter model, with a specially designed gearbox and active struts, can generate gear meshing vibration/noise similar to that of a real helicopter. The algorithm's control abilities are checked progressively by single-input single-output and multiple-input multiple-output experiments via different feedback strategies: (1) controlling gear meshing noise by attenuating vibrations at key points on the transmission path, and (2) directly controlling the gear meshing noise in the cabin using the actuators. Results confirm that the active control system is practical for cancelling multifrequency helicopter interior noise and also weakens the frequency modulation of the tones. In many cases, the attenuation of the measured noise exceeds 15 dB, with the maximum reduction reaching 31 dB. The control process is also demonstrated to be smoother and faster.
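
    The input/output-only modelling idea can be illustrated, in a much simplified form, by recursive least-squares identification of a second-order ARX model; the CARMA-based predictive sliding mode controller itself is not reproduced, and the plant coefficients below are arbitrary.

    ```python
    # Hedged illustration of input/output-only modelling: recursive least-squares
    # identification of a second-order ARX model (a simplification of the CARMA
    # structure mentioned above) from input/output samples.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 400
    u = rng.normal(0, 1, n)                       # actuator input
    y = np.zeros(n)
    for k in range(2, n):                         # "true" plant: y(k) = 1.5y(k-1) - 0.7y(k-2) + 0.5u(k-1)
        y[k] = 1.5 * y[k - 1] - 0.7 * y[k - 2] + 0.5 * u[k - 1] + rng.normal(0, 0.05)

    theta = np.zeros(3)                           # estimated [a1, a2, b1]
    P = np.eye(3) * 1000.0
    lam = 0.995                                   # forgetting factor
    for k in range(2, n):
        phi = np.array([y[k - 1], y[k - 2], u[k - 1]])
        gain = P @ phi / (lam + phi @ P @ phi)
        theta += gain * (y[k] - phi @ theta)
        P = (P - np.outer(gain, phi @ P)) / lam

    print("estimated [a1, a2, b1]:", np.round(theta, 3))   # expect roughly [1.5, -0.7, 0.5]
    ```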

  16. A hybrid wavelet de-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series.

    PubMed

    Wang, Dong; Borthwick, Alistair G; He, Handan; Wang, Yuankun; Zhu, Jieyu; Lu, Yuan; Xu, Pengcheng; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-01-01

    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, wavelet de-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influence at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. Compared to the three other generic methods, the results generated by the WD-RSPA model invariably presented smaller error measures, which means the forecasting capability of the WD-RSPA model is better than that of the other models. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when extreme events are included within a time series. Copyright © 2017 Elsevier Inc. All rights reserved.
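
    The wavelet de-noising (WD) stage can be sketched with PyWavelets as below; the wavelet, decomposition level and universal soft threshold are assumptions for the example, and the RSPA forecasting stage is not shown.

    ```python
    # Sketch of the wavelet de-noising (WD) step only, using PyWavelets with a
    # universal soft threshold; wavelet choice and threshold rule are assumptions.
    import numpy as np
    import pywt

    rng = np.random.default_rng(7)
    t = np.linspace(0, 8 * np.pi, 512)
    clean = np.sin(t) + 0.4 * np.sin(3.1 * t)
    noisy = clean + rng.normal(0, 0.3, t.size)

    coeffs = pywt.wavedec(noisy, "db4", level=4)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from finest scale
    threshold = sigma * np.sqrt(2 * np.log(noisy.size))     # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "db4")[: noisy.size]

    print("RMSE noisy vs clean:   ", np.sqrt(np.mean((noisy - clean) ** 2)))
    print("RMSE denoised vs clean:", np.sqrt(np.mean((denoised - clean) ** 2)))
    ```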

  17. [An auto-iatrogenic disease].

    PubMed

    Reinhart, W H

    2004-12-01

    A 55-year-old practitioner from an island in the North Sea felt an increasing hypersensitivity of his entire body to various ambient and nutritional allergens and toxins. He started to treat himself with increasing doses of glucocorticoids and moved to a southern climate in Lanzarote and later on to the Swiss mountains in the Grisons. On admission to our hospital in December he was in a disastrous psychotic condition, trying to cool down his body by lying naked on his bed at ambient temperatures around the freezing point. He had consumed on average 250 mg of prednisone daily over several weeks. As we found out later, his personal assistant, who was travelling with him, was giving him glucocorticoids through the infusion during his hospital stay. He developed a necrotizing septic phlebitis at the infusion site followed by Pseudomonas aeruginosa sepsis with fatal multiorgan failure. This case illustrates the dangers of self-treatment by doctors and the difficulties in treating a physician.

  18. Time series analysis of temporal networks

    NASA Astrophysics Data System (ADS)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies, knowing these properties alone is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard time series forecasting model, predict the properties of the temporal network at a later time instance. To this aim, we consider eight properties, such as the number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as an Auto-Regressive Integrated Moving Average (ARIMA) process. We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series, obtained from spectrogram analysis, could be used to refine the prediction framework by identifying beforehand the cases where the prediction error is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application, we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue "Temporal Network Theory and Applications", edited by Petter Holme.
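
    The pipeline described above (compute a property per network snapshot, then forecast it) can be sketched as follows; the snapshots are random graphs rather than the face-to-face contact data, and the ARIMA order is an arbitrary choice for the example.

    ```python
    # Illustrative sketch of the pipeline: compute a property (average degree)
    # for each network snapshot, then forecast it with an ARIMA model.
    import numpy as np
    import networkx as nx
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(8)
    avg_degree = []
    for t in range(60):                                   # 60 hypothetical snapshots
        p = 0.05 + 0.02 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.005)
        g = nx.gnp_random_graph(100, max(p, 0.01), seed=int(rng.integers(1_000_000)))
        avg_degree.append(2 * g.number_of_edges() / g.number_of_nodes())

    fit = ARIMA(np.array(avg_degree[:-10]), order=(2, 1, 2)).fit()
    print("forecast of next 10 snapshots:", np.round(fit.forecast(10), 2))
    print("observed:                     ", np.round(avg_degree[-10:], 2))
    ```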

  19. Female Infertility and Serum Auto-antibodies: a Systematic Review.

    PubMed

    Deroux, Alban; Dumestre-Perard, Chantal; Dunand-Faure, Camille; Bouillet, Laurence; Hoffmann, Pascale

    2017-08-01

    On average, 10 % of infertile couples have unexplained infertility. Auto-immune disease (systemic lupus erythematosus, anti-phospholipid syndrome) accounts for a part of these cases. In the last 20 years, aspecific auto-immunity, defined as positivity of auto-antibodies in blood sample without clinical or biological criteria for defined diseases, has been evoked in a subpopulation of infertile women. A systematic review was performed (PUBMED) using the MESH search terms "infertility" and "auto-immunity" or "reproductive technique" or "assisted reproduction" or "in vitro fertilization" and "auto-immunity." We retained clinical and physiopathological studies that were applicable to the clinician in assuming joint management of both infertility associated with serum auto-antibodies in women. Thyroid auto-immunity which affects thyroid function could be a cause of infertility; even in euthyroidia, the presence of anti-thyroperoxydase antibodies and/or thyroglobulin are related to infertility. The presence of anti-phospholipid (APL) and/or anti-nuclear (ANA) antibodies seems to be more frequent in the population of infertile women; serum auto-antibodies are associated with early ovarian failure, itself responsible for fertility disorders. However, there exist few publications on this topic. The methods of dosage, as well as the clinical criteria of unexplained infertility deserve to be standardized to allow a precise response to the question of the role of serum auto-antibodies in these women. The direct pathogenesis of this auto-immunity is unknown, but therapeutic immunomodulators, prescribed on a case-by-case basis, could favor pregnancy even in cases of unexplained primary or secondary infertility.

  20. Is emotional dysregulation a risk indicator for auto-aggression behaviors in adolescents with oppositional defiant disorder?

    PubMed

    Muratori, Pietro; Pisano, Simone; Milone, Annarita; Masi, Gabriele

    2017-01-15

    The Child Behavior Checklist Dysregulation Profile (CBCL-DP) (high scores on the Anxious/Depressed, Attention Problems, and Aggressive Behavior subscales) has been related to poor emotional and behavioral self-regulation in children and adolescents. Our aim was to evaluate whether it may be associated with auto-aggression in youth with oppositional defiant disorder (ODD). Method In 72 consecutively referred youths with ODD, emotional dysregulation was assessed with the CBCL-DP, and auto-aggression and physical aggression against other persons with the Modified Overt Aggression Scale. Regression analysis showed that higher CBCL-DP scores were associated with higher levels of auto-aggression, even when controlling for the levels of physical aggression against others and the CBCL Total score. The small sample size, the cross-sectional design, and the lack of a control group limit the generalization of our findings. Referred ODD youths with higher CBCL-DP scores are more likely to present auto-aggression, in addition to aggression against others. The CBCL could improve the screening and detection of these high-risk patients. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. More steps towards process automation for optical fabrication

    NASA Astrophysics Data System (ADS)

    Walker, David; Yu, Guoyu; Beaucamp, Anthony; Bibby, Matt; Li, Hongyu; McCluskey, Lee; Petrovic, Sanja; Reynolds, Christina

    2017-06-01

    In the context of Industrie 4.0, we have previously described the roles of robots in optical processing, and their complementarity with classical CNC machines, providing both processing and automation functions. After having demonstrated robotic moving of parts between a CNC polisher and metrology station, and auto-fringe-acquisition, we have moved on to automate the wash-down operation. This is part of a wider strategy we describe in this paper, leading towards automating the decision-making operations required before and throughout an optical manufacturing cycle.

  2. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    PubMed

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only covers company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach. The AutoMealRecord data were examined to determine whether they could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses; in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R(2) = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross-validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
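
    The analysis pattern described above (PCA-derived dietary pattern scores, linear regression on BMI, leave-one-out cross-validation) can be sketched with scikit-learn; the data below are random placeholders, not the AutoMealRecord data.

    ```python
    # Hedged sketch: PCA "dietary pattern" scores, linear regression of BMI on
    # those scores, and leave-one-out cross-validated prediction. Data are random.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    rng = np.random.default_rng(9)
    n_people, n_food_items = 200, 30
    food_frequencies = rng.poisson(3, size=(n_people, n_food_items)).astype(float)

    pca = PCA(n_components=5)                      # five "dietary pattern" scores
    pattern_scores = pca.fit_transform(food_frequencies)
    bmi = 22 + 0.8 * pattern_scores[:, 0] + rng.normal(0, 1.5, n_people)

    pred = cross_val_predict(LinearRegression(), pattern_scores, bmi, cv=LeaveOneOut())
    accuracy = np.mean((pred >= 23) == (bmi >= 23))      # "would-be obese" classification
    print(f"R^2-like fit: {1 - np.var(bmi - pred) / np.var(bmi):.2f}, "
          f"LOO accuracy for BMI >= 23: {accuracy:.1%}")
    ```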

  3. Auto Service Career Preparation Moves to the Fast Track.

    ERIC Educational Resources Information Center

    Gray, Don

    2000-01-01

    Automotive Youth Educational Systems (AYES) is a school-to-career partnership among automotive manufacturers and dealers and selected high schools. AYES is designed to encourage students to consider careers in retail automotive service and to prepare them for entry-level positions at dealerships or colleges. (JOW)

  4. [Spatial patterns and influence factors of specialization in tea cultivation based on geographically weighted regression model: A case study of Anxi County of Fujian Province, China].

    PubMed

    Shui, Wei; DU, Yong; Chen, Yi Ping; Jian, Xiao Mei; Fan, Bing Xiong

    2017-04-18

    Anxi County, which specializes in tea cultivation, was taken as a case study in this research. Pearson correlation analysis, an ordinary least squares (OLS) model and a geographically weighted regression (GWR) model were used to select four primary influence factors of specialization in tea cultivation (i.e., average elevation, net income per capita, proportion of agricultural population, and distance from roads) by analyzing the degree of specialization of each town in Anxi County. The spatial patterns of specialization in tea cultivation in Anxi County were also evaluated. The results indicated that specialization in tea cultivation in Anxi County showed obvious spatial auto-correlation, and a spatial pattern with a "low-middle-high" circle structure, similar to Von Thünen's circle structure model, appeared from the county town outward to its surrounding region. Meanwhile, GWR (0.624) had a better goodness of fit than OLS (0.595), and GWR could more reasonably explain the spatial data. Contrary to the agricultural location theory of Von Thünen's model, which holds that distance from the market is a determining factor, the degree of specialization in tea cultivation in Anxi was mainly decided by the natural conditions of the mountain area rather than by social factors. The degree of specialization in tea cultivation was positively correlated with average elevation, net income per capita and the proportion of agricultural population, while a negative correlation was found between distance from roads and the degree of specialization. The regression coefficients between the degree of specialization in tea cultivation and two factors (i.e., average elevation and net income per capita) showed a spatial pattern of higher values in the north and lower values in the south. On the contrary, the regression coefficients for the proportion of agricultural population increased from south to north of Anxi County. Furthermore, the regression coefficient for distance from roads showed a spatial pattern of higher values in the northeast and lower values in the southwest of Anxi County.
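
    As a from-scratch illustration of the GWR idea used above (it is not the software the authors used), the sketch below fits, at each location, a weighted least-squares regression with Gaussian distance weights; coordinates, bandwidth and the drifting elevation effect are all invented.

    ```python
    # Simplified geographically weighted regression: at each location, an
    # ordinary weighted least-squares fit with Gaussian distance weights.
    import numpy as np

    rng = np.random.default_rng(10)
    n = 120
    coords = rng.uniform(0, 50, size=(n, 2))                     # town centroids (km)
    elevation = rng.uniform(100, 900, n)
    # Synthetic response whose elevation effect drifts from south to north.
    specialization = (0.002 + 0.00005 * coords[:, 1]) * elevation + rng.normal(0, 0.1, n)

    X = np.column_stack([np.ones(n), elevation])
    bandwidth = 10.0                                             # km, fixed Gaussian kernel

    local_coefs = np.empty((n, 2))
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)
        W = np.diag(w)
        local_coefs[i] = np.linalg.solve(X.T @ W @ X, X.T @ W @ specialization)

    print("elevation coefficient, south vs north (mean):",
          local_coefs[coords[:, 1] < 25, 1].mean(),
          local_coefs[coords[:, 1] >= 25, 1].mean())
    ```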

  5. A multimodel approach to interannual and seasonal prediction of Danube discharge anomalies

    NASA Astrophysics Data System (ADS)

    Rimbu, Norel; Ionita, Monica; Patrut, Simona; Dima, Mihai

    2010-05-01

    Interannual and seasonal predictability of Danube river discharge is investigated using three model types: 1) time series models, 2) linear regression models of discharge with large-scale climate mode indices, and 3) models based on stable teleconnections. All models are calibrated using discharge and climatic data for the period 1901-1977 and validated for the period 1978-2008. Various time series models, like autoregressive (AR), moving average (MA), autoregressive moving average (ARMA) or singular spectrum analysis combined with autoregressive moving average (SSA+ARMA) models, have been calibrated and their skills evaluated. The best results were obtained using SSA+ARMA models. SSA+ARMA models proved to have the highest forecast skill also for other European rivers (Gamiz-Fortis et al. 2008). Multiple linear regression models using large-scale climatic mode indices as predictors have a higher forecast skill than the time series models. The best predictors for Danube discharge are the North Atlantic Oscillation (NAO) and the East Atlantic/Western Russia patterns during winter and spring. Other patterns, like the Polar/Eurasian or the Tropical Northern Hemisphere (TNH) pattern, are good predictors for summer and autumn discharge. Based on the stable teleconnection approach (Ionita et al. 2008) we construct prediction models through a combination of sea surface temperature (SST), temperature (T) and precipitation (PP) from the regions where discharge and SST, T and PP variations are stably correlated. Forecast skills of these models are higher than the forecast skills of the time series and multiple regression models. The models calibrated and validated in our study can be used for operational prediction of interannual and seasonal Danube discharge anomalies. References: Gamiz-Fortis, S., D. Pozo-Vazquez, R.M. Trigo, and Y. Castro-Diez, Quantifying the predictability of winter river flow in Iberia. Part I: interannual predictability. J. Climate, 2484-2501, 2008. Gamiz-Fortis, S., D. Pozo-Vazquez, R.M. Trigo, and Y. Castro-Diez, Quantifying the predictability of winter river flow in Iberia. Part II: seasonal predictability. J. Climate, 2503-2518, 2008. Ionita, M., G. Lohmann, and N. Rimbu, Prediction of spring Elbe river discharge based on stable teleconnections with global temperature and precipitation. J. Climate, 6215-6226, 2008.

  6. TU-H-CAMPUS-JeP2-05: Can Automatic Delineation of Cardiac Substructures On Noncontrast CT Be Used for Cardiac Toxicity Analysis?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Y; Liao, Z; Jiang, W

    Purpose: To evaluate the feasibility of using an automatic segmentation tool to delineate cardiac substructures from computed tomography (CT) images for cardiac toxicity analysis for non-small cell lung cancer (NSCLC) patients after radiotherapy. Methods: A multi-atlas segmentation tool developed in-house was used to delineate eleven cardiac substructures including the whole heart, four heart chambers, and six greater vessels automatically from the averaged 4DCT planning images for 49 NSCLC patients. The automatically segmented contours were edited appropriately by two experienced radiation oncologists. The modified contours were compared with the auto-segmented contours using the Dice similarity coefficient (DSC) and mean surface distance (MSD) to evaluate how much modification was needed. In addition, the dose volume histograms (DVH) of the modified contours were compared with those of the auto-segmented contours to evaluate the dosimetric difference between modified and auto-segmented contours. Results: Of the eleven structures, the averaged DSC values ranged from 0.73 ± 0.08 to 0.95 ± 0.04 and the averaged MSD values ranged from 1.3 ± 0.6 mm to 2.9 ± 5.1 mm for the 49 patients. Overall, the modification was small. The pulmonary vein (PV) and the inferior vena cava required the most modification. The V30 (volume receiving 30 Gy or above) for the whole heart and the mean dose to the whole heart and four heart chambers did not show a statistically significant difference between modified and auto-segmented contours. The maximum dose to the greater vessels did not show a statistically significant difference except for the PV. Conclusion: The automatic segmentation of the cardiac substructures did not require substantial modification. The dosimetric evaluation showed no statistically significant difference between auto-segmented and modified contours except for the PV, which suggests that auto-segmented contours are feasible for cardiac dose-response studies in clinical practice with minor modification of the PV.
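
    The two agreement metrics used above, the Dice similarity coefficient and the mean surface distance, can be computed for a pair of binary masks as in the toy example below; the masks are small 2D arrays rather than CT contours.

    ```python
    # Minimal illustration of the two agreement metrics named above, computed
    # for a pair of toy binary masks (not CT contours).
    import numpy as np
    from scipy import ndimage

    def dice(a, b):
        a, b = a.astype(bool), b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    def mean_surface_distance(a, b, spacing=1.0):
        # Distance from each structure's surface voxels to the other structure's surface.
        surf_a = a ^ ndimage.binary_erosion(a)
        surf_b = b ^ ndimage.binary_erosion(b)
        dt_a = ndimage.distance_transform_edt(~surf_a, sampling=spacing)
        dt_b = ndimage.distance_transform_edt(~surf_b, sampling=spacing)
        return 0.5 * (dt_b[surf_a].mean() + dt_a[surf_b].mean())

    auto = np.zeros((40, 40), dtype=bool); auto[10:30, 10:30] = True
    edited = np.zeros((40, 40), dtype=bool); edited[11:31, 10:31] = True
    print("DSC:", round(dice(auto, edited), 3),
          " MSD:", round(mean_surface_distance(auto, edited), 2))
    ```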

  7. Automated IMRT planning in Pinnacle : A study in head-and-neck cancer.

    PubMed

    Kusters, J M A M; Bzdusek, K; Kumar, P; van Kollenburg, P G M; Kunze-Busch, M C; Wendling, M; Dijkema, T; Kaanders, J H A M

    2017-12-01

    This study evaluates the performance and planning efficacy of the Auto-Planning (AP) module in the clinical version of Pinnacle 9.10 (Philips Radiation Oncology Systems, Fitchburg, WI, USA). Twenty automated intensity-modulated radiotherapy (IMRT) plans were compared with the original manually planned clinical IMRT plans from patients with oropharyngeal cancer. Auto-Planning with IMRT offers similar coverage of the planning target volume as the original manually planned clinical plans, as well as better sparing of the contralateral parotid gland, contralateral submandibular gland, larynx, mandible, and brainstem. The mean dose of the contralateral parotid gland and contralateral submandibular gland could be reduced by 2.5 Gy and 1.7 Gy on average. The number of monitor units was reduced with an average of 143.9 (18%). Hands-on planning time was reduced from 1.5-3 h to less than 1 h. The Auto-Planning module was able to produce clinically acceptable head and neck IMRT plans with consistent quality.

  8. Fast and Adaptive Auto-focusing Microscope

    NASA Astrophysics Data System (ADS)

    Obara, Takeshi; Igarashi, Yasunobu; Hashimoto, Koichi

    Optical microscopes are widely used in biological and medical research. Using a microscope, we can observe cellular movements, including intracellular ions and molecules tagged with fluorescent dyes, at high magnification. However, a freely motile cell easily escapes from the 3D field of view of a typical microscope. Therefore, we propose a novel auto-focusing algorithm and develop an auto-focusing and tracking microscope. The XYZ positions of the microscope stage are feedback-controlled to focus on and track the cell automatically. A bright-field image is used to estimate the cellular position. The XY centroids are used to estimate the XY position of the tracked cell. To estimate the Z position, we use the diffraction pattern around the cell membrane. This estimation method is called Depth from Diffraction (DFDi). However, this method is not robust to individual differences between cells because the diffraction pattern depends on each cell's shape. Therefore, in this study, we propose a real-time correction of DFDi using the 2D Laplacian of an intracellular area as a measure of focus quality. To evaluate the performance of the developed algorithm and microscope, we auto-focus on and track a freely moving paramecium. In this experiment, the paramecium is auto-focused and kept inside the field of view of the microscope for 45 s. The evaluated focal error is within 5 µm, while the length and thickness of the paramecium are about 200 µm and 50 µm, respectively.
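
    A Laplacian-based focus measure of the kind described above can be sketched as the variance of the Laplacian over an image region; the images below are synthetic, and no microscope hardware or DFDi estimation is modelled.

    ```python
    # Sketch of a Laplacian-based focus measure: the variance of the Laplacian
    # over a region, used as a goodness-of-focus score. The "images" are synthetic.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(11)

    def focus_score(image):
        return ndimage.laplace(image.astype(float)).var()

    sharp = rng.normal(0, 1, (128, 128))                 # high-frequency content = "in focus"
    blurred = ndimage.gaussian_filter(sharp, sigma=3)    # defocus modelled as Gaussian blur

    print("focus score, sharp:  ", round(focus_score(sharp), 3))
    print("focus score, blurred:", round(focus_score(blurred), 5))
    ```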

  9. Effect of air pollution on pediatric respiratory emergency room visits and hospital admissions.

    PubMed

    Farhat, S C L; Paulo, R L P; Shimoda, T M; Conceição, G M S; Lin, C A; Braga, A L F; Warth, M P N; Saldiva, P H N

    2005-02-01

    In order to assess the effect of air pollution on pediatric respiratory morbidity, we carried out a time series study using daily levels of PM10, SO2, NO2, ozone, and CO and daily numbers of pediatric respiratory emergency room visits and hospital admissions at the Children's Institute of the University of Sao Paulo Medical School, from August 1996 to August 1997. In this period there were 43,635 hospital emergency room visits, 4534 of which were due to lower respiratory tract disease. The total number of hospital admissions was 6785, 1021 of which were due to lower respiratory tract infectious and/or obstructive diseases. The three health end-points under investigation were the daily number of emergency room visits due to lower respiratory tract diseases, hospital admissions due to pneumonia, and hospital admissions due to asthma or bronchiolitis. Generalized additive Poisson regression models were fitted, controlling for smooth functions of time, temperature and humidity, and an indicator of weekdays. NO2 was positively associated with all outcomes. Interquartile range increases (65.04 microg/m3) in NO2 moving averages were associated with an 18.4% increase (95% confidence interval, 95% CI = 12.5-24.3) in emergency room visits due to lower respiratory tract diseases (4-day moving average), a 17.6% increase (95% CI = 3.3-32.7) in hospital admissions due to pneumonia or bronchopneumonia (3-day moving average), and a 31.4% increase (95% CI = 7.2-55.7) in hospital admissions due to asthma or bronchiolitis (2-day moving average). The study showed that air pollution considerably affects children's respiratory morbidity, deserving attention from the health authorities.
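
    The regression form described above can be sketched as a Poisson GLM of daily counts on a pollutant moving average, adjusted for temperature; the smooth time terms and full covariate set of the study are omitted, and all data below are synthetic.

    ```python
    # Hedged sketch: Poisson GLM of daily admission counts on a 4-day NO2 moving
    # average, adjusted for temperature. Data are synthetic placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(12)
    days = 365
    no2 = np.clip(80 + rng.normal(0, 30, days), 5, None)          # daily NO2 (ug/m3)
    no2_ma4 = pd.Series(no2).rolling(4).mean().bfill()            # 4-day moving average
    temperature = 20 + 8 * np.sin(2 * np.pi * np.arange(days) / 365)
    rate = np.exp(1.0 + 0.003 * no2_ma4 - 0.01 * temperature)
    admissions = rng.poisson(rate)

    X = sm.add_constant(pd.DataFrame({"no2_ma4": no2_ma4, "temperature": temperature}))
    fit = sm.GLM(admissions, X, family=sm.families.Poisson()).fit()
    iqr = np.percentile(no2_ma4, 75) - np.percentile(no2_ma4, 25)
    print(fit.params)
    print("percent increase per IQR of NO2 moving average:",
          round(100 * (np.exp(fit.params["no2_ma4"] * iqr) - 1), 1))
    ```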

  10. Challenges of Electronic Medical Surveillance Systems

    DTIC Science & Technology

    2004-06-01

    More sophisticated approaches, such as regression models and classical autoregressive moving average (ARIMA) models that make estimates based on ... with those predicted by a mathematical model. The primary benefit of ARIMA models is their ability to correct for local trends in the data so that ... works well, for example, during a particularly severe flu season, where prolonged periods of high visit rates are adjusted to by the ARIMA model, thus

  11. Automated Computer Access Request System

    NASA Technical Reports Server (NTRS)

    Snook, Bryan E.

    2010-01-01

    The Automated Computer Access Request (AutoCAR) system is a Web-based account provisioning application that replaces the time-consuming paper-based computer-access request process at Johnson Space Center (JSC). AutoCAR combines rules-based and role-based functionality in one application to provide a centralized system that is easily and widely accessible. The system features a work-flow engine that facilitates request routing, a user registration directory containing contact information and user metadata, an access request submission and tracking process, and a system administrator account management component. This provides full, end-to-end disposition approval chain accountability from the moment a request is submitted. By blending both rules-based and role-based functionality, AutoCAR has the flexibility to route requests based on a user's nationality, JSC affiliation status, and other export-control requirements, while ensuring a user's request is addressed by either a primary or backup approver. All user accounts that are tracked in AutoCAR are recorded and mapped to the native operating system schema on the target platform where user accounts reside. This allows for future extensibility for supporting creation, deletion, and account management directly on the target platforms by way of AutoCAR. The system's directory-based lookup and day-to-day change analysis of directory information determines personnel moves, deletions, and additions, and automatically notifies a user via e-mail to revalidate his/her account access as a result of such changes. AutoCAR is a Microsoft classic active server page (ASP) application hosted on a Microsoft Internet Information Server (IIS).

  12. The MOVES (Motor tic, Obsessions and compulsions, Vocal tic Evaluation Survey): cross-cultural evaluation of the French version and additional psychometric assessment.

    PubMed

    Jalenques, Isabelle; Guiguet-Auclair, Candy; Derost, Philippe; Joubert, Pauline; Foures, Louis; Hartmann, Andreas; Muellner, Julia; Rondepierre, Fabien

    2018-03-01

    The Motor tic, Obsessions and compulsions, Vocal tic Evaluation Survey (MOVES) is a self-report scale suggested as a severity scale for tics and related sensory phenomena observed in Gilles de la Tourette syndrome (GTS) and recommended as a screening instrument by the Committee on Rating Scale Development of the International Parkinson's Disease and Movement Disorder Society. The aim was to cross-culturally adapt a French version of the MOVES and to evaluate its psychometric properties. After the cross-cultural adaptation of the MOVES, we assessed its psychometric properties in 53 patients aged 12-16 years and in 54 patients aged 16 years and above: reliability and construct validity (relationships between items and scales), internal consistency and concurrent validity with the Yale Global Tic Severity Scale (YGTSS) and the Children's Yale-Brown Obsessive-Compulsive Scale (CY-BOCS) or the auto-Yale-Brown scale. The results showed very good acceptability with response rates greater than 92%, good internal consistency (Cronbach's alpha ranging from 0.62 to 0.89) and good test-retest reliability (ICCs ranging from 0.59 to 0.91). Concurrent validity with the YGTSS, CY-BOCS and auto-Yale-Brown scales showed strong expected correlations. The cut-off points tested for diagnostic performance gave satisfactory values of sensitivity, specificity, and positive and negative predictive values. Our study provides evidence of the good psychometric properties of the French version of the MOVES. The cross-cultural adaptation of this specific instrument will allow investigators to include French-speaking persons with GTS aged 12 years and over in national and international collaboration research projects.

  13. A novel Hartman Shack-based topography system: repeatability and agreement for corneal power with Scheimpflug+Placido topographer and rotating prism auto-keratorefractor.

    PubMed

    Prakash, Gaurav; Srivastava, Dhruv; Choudhuri, Sounak

    2015-12-01

    The purpose of this study is to analyze the repeatability and agreement of corneal power measured with a new Hartman type topographer in comparison to Scheimpflug+Placido and autorefractor devices. In this cross-sectional, observational study performed at the cornea services of a specialty hospital, 100 normal eyes (100 consecutive candidates) without any previous ocular surgery or morbidity except refractive error were evaluated. All candidates underwent three measurements each on a Full gradient, Hartman type topographer (FG) (iDesign, AMO), a Scheimpflug+Placido topographer (SP) (Sirius, CSO) and a rotating prism auto-keratorefractor (AR) (KR1, Nidek). The parameters assessed were flat keratometry (K1), steep keratometry (K2), steep axis (K2 axis), mean K, J0 and J45. Intra-device repeatability and inter-device agreement were evaluated. On repeatability analysis, the intra-device means were not significantly different (ANOVA, p > 0.05). Intraclass correlations (ICC) were >0.98 except for J0 and J45. In terms of intra-measurement standard deviation (Sw), the SP and FG groups fared better than the AR group (p < 0.001, ANOVA). On Sw versus average plots, no significantly predictive fit was seen (p > 0.05, R(2) < 0.1 for all values). On inter-device agreement analysis, there was no difference in means (ANOVA, p > 0.05). ICC ranged from 0.92 to 0.99 (p < 0.001). Regression fits on Bland-Altman plots suggested no clinically significant effect of the average values on the difference in means. The repeatability of the Hartman type topographer in normal eyes is comparable to that of the SP combination device and better than that of the AR. The agreement between the three devices is good. However, we recommend against interchanging these devices between follow-ups or pooling their data.

  14. Impact of single-site axonal GABAergic synaptic events on cerebellar interneuron activity.

    PubMed

    de San Martin, Javier Zorrilla; Jalil, Abdelali; Trigo, Federico F

    2015-12-01

    Axonal ionotropic receptors are present in a variety of neuronal types, and their function has largely been associated with the modulation of axonal activity and synaptic release. It is usually assumed that activation of axonal GABA(A)Rs comes from spillover, but in cerebellar molecular layer interneurons (MLIs) the GABA source is different: in these cells, GABA release activates presynaptic GABA(A) autoreceptors (autoRs) together with postsynaptic targets, producing an autoR-mediated synaptic event. The frequency of presynaptic, autoR-mediated miniature currents is twice that of their somatodendritic counterparts, suggesting that autoR-mediated responses have an important effect on interneuron activity. Here, we used local Ca(2+) photolysis in MLI axons of juvenile rats to evoke GABA release from individual varicosities to study the activation of axonal autoRs in single release sites. Our data show that single-site autoR conductances are similar to postsynaptic dendritic conductances. In conditions of high [Cl(-)](i), autoR-mediated conductances range from 1 to 5 nS; this corresponds to ∼30-150 GABA(A) channels per presynaptic varicosity, a value close to the number of channels in postsynaptic densities. Voltage responses produced by the activation of autoRs in single varicosities are amplified by a Na(v)-dependent mechanism and propagate along the axon with a length constant of 91 µm. Immunolabeling determination of synapse location shows that on average, one third of the synapses produce autoR-mediated signals that are large enough to reach the axon initial segment. Finally, we show that single-site activation of presynaptic GABA(A) autoRs leads to an increase in MLI excitability and thus conveys a strong feedback signal that contributes to spiking activity. © 2015 Zorrilla de San Martin et al.
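    The channel-count estimate quoted above follows from dividing the measured conductance by an assumed single-channel conductance. A back-of-envelope sketch; the ~33 pS value is an assumption chosen only to reproduce the quoted 1-5 nS to ~30-150 channel correspondence, not a figure from the paper.

```python
# Back-of-envelope: number of channels N = G_total / gamma_single.
# gamma_single ~ 33 pS is an assumed single-channel conductance chosen to
# match the ratio quoted in the abstract (1-5 nS -> ~30-150 channels).
gamma_single = 33e-12          # S, assumed single-channel conductance
for g_total in (1e-9, 5e-9):   # S, autoreceptor-mediated conductance range
    print(f"{g_total * 1e9:.0f} nS -> ~{g_total / gamma_single:.0f} channels")
```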

  15. Impact of single-site axonal GABAergic synaptic events on cerebellar interneuron activity

    PubMed Central

    Zorrilla de San Martin, Javier; Jalil, Abdelali

    2015-01-01

    Axonal ionotropic receptors are present in a variety of neuronal types, and their function has largely been associated with the modulation of axonal activity and synaptic release. It is usually assumed that activation of axonal GABAARs comes from spillover, but in cerebellar molecular layer interneurons (MLIs) the GABA source is different: in these cells, GABA release activates presynaptic GABAA autoreceptors (autoRs) together with postsynaptic targets, producing an autoR-mediated synaptic event. The frequency of presynaptic, autoR-mediated miniature currents is twice that of their somatodendritic counterparts, suggesting that autoR-mediated responses have an important effect on interneuron activity. Here, we used local Ca2+ photolysis in MLI axons of juvenile rats to evoke GABA release from individual varicosities to study the activation of axonal autoRs in single release sites. Our data show that single-site autoR conductances are similar to postsynaptic dendritic conductances. In conditions of high [Cl−]i, autoR-mediated conductances range from 1 to 5 nS; this corresponds to ∼30–150 GABAA channels per presynaptic varicosity, a value close to the number of channels in postsynaptic densities. Voltage responses produced by the activation of autoRs in single varicosities are amplified by a Nav-dependent mechanism and propagate along the axon with a length constant of 91 µm. Immunolabeling determination of synapse location shows that on average, one third of the synapses produce autoR-mediated signals that are large enough to reach the axon initial segment. Finally, we show that single-site activation of presynaptic GABAA autoRs leads to an increase in MLI excitability and thus conveys a strong feedback signal that contributes to spiking activity. PMID:26621773

  16. A drift line bias estimator: ARMA-based filter or calibration method, and its application in BDS/GPS-based attitude determination

    NASA Astrophysics Data System (ADS)

    Liang, Zhang; Yanqing, Hou; Jie, Wu

    2016-12-01

    The multi-antenna synchronized receiver (using a common clock) is widely applied in GNSS-based attitude determination (AD), terrain deformation monitoring, and many other applications, since the high-accuracy single-differenced carrier phase can be used to improve the positioning or AD accuracy. The line bias (LB) parameter (which isolates the fractional bias) therefore has to be calibrated in the single-differenced phase equations. In past decades, researchers estimated the LB as a constant parameter in advance and compensated for it in real time. However, the constant-LB assumption is inappropriate in practical applications because of changes in the physical length and permittivity of the cables, caused by environmental temperature variation, and because of the instability of the receiver's internal circuit transmission delay. Considering the LB drift (or colored LB) in practical circumstances, this paper introduces a real-time estimator using an auto regressive moving average-based (ARMA) prediction/whitening filter model or a moving average-based (MA) constant calibration model. In the ARMA-based filter model, four cases, namely AR(1), ARMA(1, 1), AR(2) and ARMA(2, 1), are applied for the LB prediction. The real-time relative positioning model using the ARMA-predicted LB is derived, and it is theoretically proved that its positioning accuracy is better than that of the traditional double-difference carrier phase (DDCP) model. The drifting LB is defined by an integral of the phase temperature changing rate, which is a random walk process if the phase temperature changing rate is white noise; this is validated by the analysis of the AR model coefficient. The auto-covariance function shows that the LB indeed varies in time and that estimating it as a constant is not safe, which is also demonstrated by the analysis of the LB variation of each visible satellite during a zero- and short-baseline BDS/GPS experiment. Compared to the DDCP approach, in the zero-baseline experiment the LB constant calibration (LBCC) and MA approaches improved the positioning accuracy of the vertical component, while slightly degrading the accuracy of the horizontal components. The ARMA(1, 0) model, however, improved the positioning accuracy of all three components, with 40 and 50 % improvement of the vertical component for BDS and GPS, respectively. In the short-baseline experiment, compared to the DDCP approach, the LBCC approach yielded poor positioning solutions and degraded the AD accuracy; both the MA and ARMA-based filter approaches improved the AD accuracy. Moreover, the ARMA(1, 0) and ARMA(1, 1) models had relatively better performance, with improvements of 55 % and 48 % in the elevation angle for the ARMA(1, 1) and MA models for GPS, respectively. Furthermore, the drifting LB variation is found to be continuous and slowly cumulative; the variation magnitudes in units of length are almost identical on the different carrier-phase frequencies, so the LB variation does not show an obvious correlation between frequencies. Consequently, the wide-lane LB in units of cycles is very stable, while the narrow-lane LB varies considerably in time. This reasoning probably also explains the phenomenon that the wide-lane LB originating in the satellites is stable, while the narrow-lane LB varies. The results of the ARMA-based filters are better than those of the MA model, which probably implies that modeling the drifting LB can further improve precise point positioning accuracy.
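    To make the ARMA-based prediction filter concrete, the sketch below runs a rolling one-step-ahead AR(1) prediction of a drifting bias with statsmodels. The simulated random-walk-plus-noise series and the noise levels are assumptions standing in for the real line bias; ARMA(1, 1), AR(2), etc. follow by changing the order argument.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Simulated drifting line bias: slow random walk plus measurement noise (assumption)
lb = np.cumsum(rng.normal(0, 0.002, 600)) + rng.normal(0, 0.005, 600)

train, test = lb[:550], lb[550:]
history = list(train)
preds = []
for obs in test:                                      # rolling one-step-ahead prediction
    fit = ARIMA(history, order=(1, 0, 0)).fit()       # AR(1); use (1,0,1) for ARMA(1,1), etc.
    preds.append(fit.forecast(1)[0])
    history.append(obs)

rmse = np.sqrt(np.mean((np.array(preds) - test) ** 2))
print(f"one-step AR(1) RMSE: {rmse:.4f}")
```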

  17. Development of temporal modelling for forecasting and prediction of malaria infections using time-series and ARIMAX analyses: A case study in endemic districts of Bhutan

    PubMed Central

    2010-01-01

    Background Malaria still remains a public health problem in some districts of Bhutan despite a marked reduction of cases in the last few years. To strengthen the country's prevention and control measures, this study was carried out to develop forecasting and prediction models of malaria incidence in the endemic districts of Bhutan using time series and ARIMAX. Methods This study was carried out retrospectively using the monthly malaria cases reported from the health centres to the Vector-borne Disease Control Programme (VDCP) and the meteorological data from the Meteorological Unit, Department of Energy, Ministry of Economic Affairs. Time series analysis was performed on monthly malaria cases, from 1994 to 2008, in seven malaria endemic districts. Time series models derived from a multiplicative seasonal autoregressive integrated moving average (ARIMA) were used to identify the best model using data from 1994 to 2006. The best-fit model was selected for each individual district and for the overall endemic area, and the monthly cases from January to December of 2009 and 2010 were forecast. In developing the prediction model, the monthly reported malaria cases and the meteorological factors from 1996 to 2008 of the seven districts were analysed. ARIMAX modelling was employed to determine predictors of malaria in the subsequent month. Results It was found that the ARIMA (p, d, q) (P, D, Q)s model (p and P representing the autoregressive and seasonal autoregressive parameters; d and D the non-seasonal and seasonal differencing; q and Q the moving average and seasonal moving average parameters, respectively; and s the length of the seasonal period) for the overall endemic districts was (2,1,1)(0,1,1)12; the modelling data from each district revealed two most common ARIMA models, (2,1,1)(0,1,1)12 and (1,1,1)(0,1,1)12. The forecasted monthly malaria cases varied from 15 to 82 cases in 2009 and from 67 to 149 cases in 2010, with a population of 285,375 in 2009 and an expected population of 289,085 in 2010. The ARIMAX model of monthly cases and climatic factors showed considerable variation among the different districts. In general, the mean maximum temperature lagged at one month was a strong positive predictor of increased malaria cases for four districts. The number of cases in the previous month was also a significant predictor in one district, whereas no variable could predict malaria cases for two districts. Conclusions The ARIMA models of time-series analysis were useful in forecasting the number of cases in the endemic areas of Bhutan. There was no consistency in the predictors of malaria cases when using the ARIMAX model with selected lag times and climatic predictors. The ARIMA forecasting models could be employed for planning and managing the malaria prevention and control programme in Bhutan. PMID:20813066
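    A minimal sketch of fitting the reported ARIMA(2,1,1)(0,1,1)12 model and forecasting two years ahead with statsmodels is shown below. The monthly case series is synthetic, since the VDCP data are not reproduced in this record.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly malaria case counts, 1994-2006
idx = pd.date_range("1994-01", periods=156, freq="MS")
rng = np.random.default_rng(1)
cases = pd.Series(60 + 30 * np.sin(2 * np.pi * idx.month / 12) + rng.poisson(10, 156),
                  index=idx)

# ARIMA(2,1,1)(0,1,1)12, the order reported for the overall endemic area
fit = SARIMAX(cases, order=(2, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
print(fit.forecast(steps=24))   # monthly forecasts for the next two years
```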

  18. Trade and Industry Articulation Project: Final Report, 1981-82.

    ERIC Educational Resources Information Center

    Stewart, Betsy; And Others

    A project was undertaken at Cerritos College (CC) to establish a statewide model for the development of articulated trade and industry curriculum materials and methods that would allow students to move from the secondary to the college level without loss of time or resources. Six subject areas were chosen: auto body, automotive, drafting,…

  19. Lowering transit crime may save energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Shifting travelers from less-energy-efficient automobiles to more-energy-efficient transit vehicles is an essential energy conservation measure. During peak travel periods the average auto carries 1.4 persons and consumes 16 times more fuel per passenger mile than an urban bus carrying an average of 75 passengers. Today's travelers are using transit for less than 3 percent of their urban trips. Travelers reject transit because its costs--in terms of time, money, and quality of service--are higher than those for the auto. One element of the higher cost of using transit is the increased exposure to crime which occurs when a traveler shifts from his private car to mass transit. The increased exposure is the result of the additional time transit travelers spend getting to and waiting at transit stops, as well as the additional time spent riding, and the lack of privacy while on transit. Furthermore, transit travelers have no control over their route, which may go through high-crime areas. In contrast, traveling by auto not only eliminates the time getting to and waiting at transit stops, but it also provides a secure compartment which can be locked. Traveling companions can be chosen to limit exposure to crime. In addition, auto travel provides the opportunity to select the safest and shortest route. Between the two extremes of high exposure to crime presented by public transit and low exposure to crime offered by private autos lies para-transit, such as taxis, carpools, and jitneys (small buses that carry passengers over a regular route according to a flexible schedule). (MCW)

  20. Modeling and forecasting of the under-five mortality rate in Kermanshah province in Iran: a time series analysis.

    PubMed

    Rostami, Mehran; Jalilian, Abdollah; Hamzeh, Behrooz; Laghaei, Zahra

    2015-01-01

    The target of the Fourth Millennium Development Goal (MDG-4) is to reduce the rate of under-five mortality by two-thirds between 1990 and 2015. Despite substantial progress towards achieving the target of the MDG-4 in Iran at the national level, differences at the sub-national levels should be taken into consideration. The under-five mortality data available from the Deputy of Public Health, Kermanshah University of Medical Sciences, were used in order to perform a time series analysis of the monthly under-five mortality rate (U5MR) from 2005 to 2012 in Kermanshah province in the west of Iran. After primary analysis, a seasonal auto-regressive integrated moving average model was chosen as the best fitting model based on model selection criteria. The model was assessed and proved to be adequate in describing variations in the data. However, the unexpected presence of a stochastic increasing trend and a seasonal component with a periodicity of six months in the fitted model are very likely to be consequences of the poor quality of the data collection and reporting systems. The present work is the first attempt at time series modeling of the U5MR in Iran, and reveals that improvement of under-five mortality data collection in health facilities and their corresponding systems is a major challenge to fully achieving the MDG-4 in Iran. Studies similar to the present work can enhance the understanding of the invisible patterns in the U5MR, monitor progress towards the MDG-4, and predict the impact of future variations on the U5MR.
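    Model selection by information criteria, as described above, can be sketched as a small grid search over candidate seasonal ARIMA orders, keeping the fit with the lowest AIC. The series, the candidate grid, and the seasonal period of 12 are assumptions for illustration only.

```python
import itertools
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly under-five mortality rate series, 2005-2012
idx = pd.date_range("2005-01", periods=96, freq="MS")
u5mr = pd.Series(np.random.default_rng(2).gamma(2.0, 1.0, 96), index=idx)

best = None
for p, d, q, P, D, Q in itertools.product(range(2), repeat=6):
    try:
        fit = SARIMAX(u5mr, order=(p, d, q),
                      seasonal_order=(P, D, Q, 12)).fit(disp=False)
        if best is None or fit.aic < best[0]:
            best = (fit.aic, (p, d, q), (P, D, Q, 12))
    except Exception:
        continue   # skip non-converging candidates

print("best by AIC:", best)
```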

  1. Impact of meteorological changes on the incidence of scarlet fever in Hefei City, China.

    PubMed

    Duan, Yu; Huang, Xiao-Lei; Wang, Yu-Jie; Zhang, Jun-Qing; Zhang, Qi; Dang, Yue-Wen; Wang, Jing

    2016-10-01

    Few studies of scarlet fever have included meteorological factors. We aimed to illustrate the effects of meteorological factors on the monthly incidence of scarlet fever. Cases of scarlet fever were collected from the legally notifiable infectious disease reports in Hefei City from 1985 to 2006; the meteorological data were obtained from the weather bureau of Hefei City. Monthly incidence and the corresponding meteorological data over these 22 years were used to develop the model. An auto regressive integrated moving average model with covariates was used in the statistical analyses. There was a main peak from March to June and a smaller peak from November to January. The incidence of scarlet fever ranged from 0 to 0.71502 (per 10^5 population). A SARIMAX (1,0,0)(1,0,0)12 model fitted the monthly incidence and meteorological data optimally. It was shown that relative humidity (β = -0.002, p = 0.020), mean temperature (β = 0.006, p = 0.004), and minimum temperature at a one-month lag (β = -0.007, p < 0.001) had an effect on the incidence of scarlet fever in Hefei. Besides, the incidence in the previous month (AR(β) = 0.469, p < 0.001) and 12 months before (SAR(β) = 0.255, p < 0.001) was positively associated with the incidence. This study shows that scarlet fever incidence was negatively associated with monthly minimum temperature and relative humidity, and positively associated with mean temperature, in Hefei City, China. Besides, the ARIMA model could be useful not only for prediction but also for the analysis of multiple correlations.
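    The covariate-augmented seasonal model (SARIMAX) described above can be expressed with statsmodels by passing the meteorological variables as exogenous regressors, including a one-month-lagged minimum temperature. All data below are hypothetical placeholders for the Hefei series.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly data standing in for the 1985-2006 Hefei series
idx = pd.date_range("1985-01", periods=264, freq="MS")
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "incidence": rng.gamma(1.5, 0.05, 264),
    "rel_humidity": rng.uniform(50, 90, 264),
    "mean_temp": 16 + 10 * np.sin(2 * np.pi * idx.month / 12),
    "min_temp": 11 + 10 * np.sin(2 * np.pi * idx.month / 12),
}, index=idx)

df["min_temp_lag1"] = df["min_temp"].shift(1)   # one-month-lag minimum temperature
df = df.dropna()

exog = df[["rel_humidity", "mean_temp", "min_temp_lag1"]]
fit = SARIMAX(df["incidence"], exog=exog,
              order=(1, 0, 0), seasonal_order=(1, 0, 0, 12)).fit(disp=False)
print(fit.params)   # coefficients analogous to the reported beta values
```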

  2. The effectiveness of faecal removal methods of pasture management to control the cyathostomin burden of donkeys.

    PubMed

    Corbett, Christopher J; Love, Sandy; Moore, Anna; Burden, Faith A; Matthews, Jacqui B; Denwood, Matthew J

    2014-01-24

    The level of anthelmintic resistance within some cyathostomin parasite populations has increased to the point where sole reliance on anthelmintic-based control protocols is not possible. Management-based nematode control methods, including removal of faeces from pasture, are widely recommended for use in association with a reduction in anthelmintic use to reduce selection pressure for drug resistance; however, very little work has been performed to quantitatively assess the effectiveness of such methods. We analysed data obtained from 345 donkeys at The Donkey Sanctuary (Devon, UK), managed under three different pasture management techniques, to investigate the effectiveness of faeces removal in strongyle control in equids. The management groups were as follows: no removal of faeces from pasture; manual, twice-weekly removal of faeces from pasture; and automatic, twice-weekly removal of faeces from pasture (using a mechanical pasture sweeper). From turn-out onto pasture in May, monthly faecal egg counts were obtained for each donkey and the dataset was subjected to an auto-regressive moving average model. There was little to no difference in faecal egg counts between the two methods of faecal removal; both resulted in significantly improved cyathostomin control compared to the results obtained from the donkeys that grazed pasture from which there was no faecal removal. This study represents a valuable and unique assessment of the effectiveness of the removal of equine faeces from pasture, and provides an evidence base from which to advocate twice-weekly removal of faeces from pasture as an adjunct for equid nematode control. Widespread adoption of this practice could substantially reduce anthelmintic usage, and hence reduce selection pressure for nematode resistance to the currently effective anthelmintic products.

  3. The effectiveness of faecal removal methods of pasture management to control the cyathostomin burden of donkeys

    PubMed Central

    2014-01-01

    Background The level of anthelmintic resistance within some cyathostomin parasite populations has increased to the point where sole reliance on anthelmintic-based control protocols is not possible. Management-based nematode control methods, including removal of faeces from pasture, are widely recommended for use in association with a reduction in anthelmintic use to reduce selection pressure for drug resistance; however, very little work has been performed to quantitatively assess the effectiveness of such methods. Methods We analysed data obtained from 345 donkeys at The Donkey Sanctuary (Devon, UK), managed under three different pasture management techniques, to investigate the effectiveness of faeces removal in strongyle control in equids. The management groups were as follows: no removal of faeces from pasture; manual, twice-weekly removal of faeces from pasture; and automatic, twice-weekly removal of faeces from pasture (using a mechanical pasture sweeper). From turn-out onto pasture in May, monthly faecal egg counts were obtained for each donkey and the dataset was subjected to an auto-regressive moving average model. Results There was little to no difference in faecal egg counts between the two methods of faecal removal; both resulted in significantly improved cyathostomin control compared to the results obtained from the donkeys that grazed pasture from which there was no faecal removal. Conclusions This study represents a valuable and unique assessment of the effectiveness of the removal of equine faeces from pasture, and provides an evidence base from which to advocate twice-weekly removal of faeces from pasture as an adjunct for equid nematode control. Widespread adoption of this practice could substantially reduce anthelmintic usage, and hence reduce selection pressure for nematode resistance to the currently effective anthelmintic products. PMID:24460700

  4. An Adaptive Network-based Fuzzy Inference System for the detection of thermal and TEC anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake of 11 August 2012

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-09-01

    Anomaly detection is extremely important for forecasting the date, location and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS) is proposed to detect the thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake that struck NW Iran on 11 August 2012. ANFIS is a well-known hybrid neuro-fuzzy network for modeling complex non-linear systems. In this study, the thermal and TEC anomalies detected using the proposed method are also compared with the anomalies obtained by applying classical and intelligent methods, including the interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN) and Support Vector Machine (SVM) methods. The dataset, comprising Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), spans 62 days. If the difference between the value predicted by the ANFIS method and the observed value exceeds a pre-defined threshold, then the observed precursor value, in the absence of non-seismic effective parameters, can be regarded as a precursory anomaly. For the two precursors, LST and TEC, the ANFIS method shows very good agreement with the other implemented classical and intelligent methods, indicating that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. This paper indicates that the detection of the thermal and TEC anomalies derives its credibility from the overall efficiencies and potentialities of the five integrated methods.
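    The thresholding rule described above (flag an observation when the model-prediction residual exceeds a pre-defined bound) can be sketched generically; here an interquartile-range bound is used, in the spirit of the classical interquartile method mentioned in the abstract. The series, the injected anomaly, and the multiplier k are illustrative assumptions.

```python
import numpy as np

def flag_anomalies(observed, predicted, k=1.5):
    """Flag samples whose prediction residual falls outside an interquartile band."""
    resid = np.asarray(observed, float) - np.asarray(predicted, float)
    q1, q3 = np.percentile(resid, [25, 75])
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr      # threshold; k is a tunable assumption
    return (resid < lo) | (resid > hi)

# Hypothetical 62-day LST series: model prediction vs. observation
rng = np.random.default_rng(4)
pred = 20 + 3 * np.sin(np.linspace(0, 6, 62))
obs = pred + rng.normal(0, 0.4, 62)
obs[55] += 3.0                                 # injected anomalous jump
print(np.where(flag_anomalies(obs, pred))[0])  # indices flagged as anomalous
```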

  5. One-year delayed effect of fog on malaria transmission: a time-series analysis in the rain forest area of Mengla County, south-west China

    PubMed Central

    Tian, Linwei; Bi, Yan; Ho, Suzanne C; Liu, Wenjie; Liang, Song; Goggins, William B; Chan, Emily YY; Zhou, Shuisen; Sung, Joseph JY

    2008-01-01

    Background Malaria is a major public health burden in the tropics with the potential to significantly increase in response to climate change. Analyses of data from the recent past can elucidate how short-term variations in weather factors affect malaria transmission. This study explored the impact of climate variability on the transmission of malaria in the tropical rain forest area of Mengla County, south-west China. Methods Ecological time-series analysis was performed on data collected between 1971 and 1999. Auto-regressive integrated moving average (ARIMA) models were used to evaluate the relationship between weather factors and malaria incidence. Results At the time scale of months, the predictors for malaria incidence included: minimum temperature, maximum temperature, and fog day frequency. The effect of minimum temperature on malaria incidence was greater in the cool months than in the hot months. The fog day frequency in October had a positive effect on malaria incidence in May of the following year. At the time scale of years, the annual fog day frequency was the only weather predictor of the annual incidence of malaria. Conclusion Fog day frequency was for the first time found to be a predictor of malaria incidence in a rain forest area. The one-year delayed effect of fog on malaria transmission may involve providing water input and maintaining aquatic breeding sites for mosquitoes in vulnerable times when there is little rainfall in the 6-month dry seasons. These findings should be considered in the prediction of future patterns of malaria for similar tropical rain forest areas worldwide. PMID:18565224
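    The one-year-lagged association reported above can be screened with simple lagged correlations before any ARIMA modelling. A minimal sketch with synthetic annual series; the built-in one-year lag and the noise level are assumptions, not the Mengla data.

```python
import numpy as np

def lagged_corr(x, y, lag):
    """Pearson correlation between x shifted back by `lag` steps and y."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    return np.corrcoef(x, y)[0, 1]

# Hypothetical annual series standing in for 1971-1999 data
rng = np.random.default_rng(5)
fog_days = rng.poisson(120, 29).astype(float)
# incidence built with a one-year dependence on fog (wrap-around at index 0 is ignored here)
incidence = 0.02 * np.roll(fog_days, 1) + rng.normal(0, 0.5, 29)

for lag in (0, 1, 2):
    print(lag, round(lagged_corr(fog_days, incidence, lag), 2))
```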

  6. Time Series Analysis of Onchocerciasis Data from Mexico: A Trend towards Elimination

    PubMed Central

    Pérez-Rodríguez, Miguel A.; Adeleke, Monsuru A.; Orozco-Algarra, María E.; Arrendondo-Jiménez, Juan I.; Guo, Xianwu

    2013-01-01

    Background In Latin America, there are 13 geographically isolated endemic foci distributed among Mexico, Guatemala, Colombia, Venezuela, Brazil and Ecuador. The communities of the three endemic foci found within Mexico have been receiving ivermectin treatment since 1989. In this study, we predicted the trend of occurrence of cases in Mexico by applying time series analysis to monthly onchocerciasis data reported by the Mexican Secretariat of Health between 1988 and 2011 using the software R. Results A total of 15,584 cases were reported in Mexico from 1988 to 2011. The onchocerciasis case data come mainly from the principal endemic foci of Chiapas and Oaxaca. The last case in Oaxaca was reported in 1998, but new cases were reported in the Chiapas foci up to 2011. Time series analysis performed for the foci in Mexico showed a decreasing trend of the disease over time. The best-fitted models with the smallest Akaike Information Criterion (AIC) were Auto-Regressive Integrated Moving Average (ARIMA) models, which were used to predict the tendency of onchocerciasis cases two years ahead. According to the ARIMA model predictions, a very low number of cases (below 1) is expected between 2012 and 2013 in Chiapas, the last endemic region in Mexico. Conclusion The endemic regions of Mexico evolved from highly onchocerciasis-endemic states to the interruption of transmission due to the strategies followed by the MSH, based on treatment with ivermectin. The extremely low number of expected cases predicted by the ARIMA models for the next two years suggests that onchocerciasis is being eliminated in Mexico. To our knowledge, this is the first study utilizing time series for predicting case dynamics of onchocerciasis, which could be used as a benchmark during monitoring and post-treatment surveillance. PMID:23459370

  7. Outpatient Utilization by Infants Auto-assigned to Medicaid Managed Care Plans

    PubMed Central

    Cohn, Lisa M.; Clark, Sarah J.

    2013-01-01

    To test the hypothesis that infants auto-assigned to a Medicaid managed care plan would have lower primary care and higher emergency department (ED) utilization compared to infants with a chosen plan. Retrospective cohort study. Medicaid administrative data were used to identify all children 0–3 months of age at enrollment in Michigan Medicaid managed care in 2005–2008 with 18-months of subsequent enrollment. Medicaid encounter and state immunization registry data were then acquired. Auto-assigned infants were compared versus chosen plan infants on: (1) well-child visits (WCVs); (2) immunizations; (3) acute office visits; and (4) ED visits. Chi squared and rank-sum tests and logistic and negative binomial regression were used in bivariate and multivariable analyses for dichotomous and count data, respectively. 18 % of infants were auto-assigned. Auto-assigned infants were less likely to meet goal number of WCVs in 18-months of managed care enrollment (32 vs. 53 %, p < 0.001) and to be up-to-date on immunizations at 12 months of age (75 vs. 85 %, p < 0.001). Auto-assigned infants had fewer acute office visits (median: 4 vs. 5, p < 0.001) but were only slightly more likely to have 2 or more ED visits (51 vs. 46 %, p < 0.001) in 18-months of enrollment. All results were significant in multivariable analyses. Auto-assigned infants were less likely to use preventive and acute primary care but only slightly more likely to use emergency care. Future work is needed to understand mechanisms of differences in utilization, but auto-assigned children may represent a target group for efforts to promote pediatric preventive care in Medicaid. PMID:23775252

  8. Outpatient utilization by infants auto-assigned to Medicaid managed care plans.

    PubMed

    Zickafoose, Joseph S; Cohn, Lisa M; Clark, Sarah J

    2014-04-01

    To test the hypothesis that infants auto-assigned to a Medicaid managed care plan would have lower primary care and higher emergency department (ED) utilization compared to infants with a chosen plan. Retrospective cohort study. Medicaid administrative data were used to identify all children 0-3 months of age at enrollment in Michigan Medicaid managed care in 2005-2008 with 18-months of subsequent enrollment. Medicaid encounter and state immunization registry data were then acquired. Auto-assigned infants were compared versus chosen plan infants on: (1) well-child visits (WCVs); (2) immunizations; (3) acute office visits; and (4) ED visits. Chi squared and rank-sum tests and logistic and negative binomial regression were used in bivariate and multivariable analyses for dichotomous and count data, respectively. 18% of infants were auto-assigned. Auto-assigned infants were less likely to meet goal number of WCVs in 18-months of managed care enrollment (32 vs. 53%, p < 0.001) and to be up-to-date on immunizations at 12 months of age (75 vs. 85%, p < 0.001). Auto-assigned infants had fewer acute office visits (median: 4 vs. 5, p < 0.001) but were only slightly more likely to have 2 or more ED visits (51 vs. 46%, p < 0.001) in 18-months of enrollment. All results were significant in multivariable analyses. Auto-assigned infants were less likely to use preventive and acute primary care but only slightly more likely to use emergency care. Future work is needed to understand mechanisms of differences in utilization, but auto-assigned children may represent a target group for efforts to promote pediatric preventive care in Medicaid.

  9. A framework for feature extraction from hospital medical data with applications in risk prediction.

    PubMed

    Tran, Truyen; Luo, Wei; Phung, Dinh; Gupta, Sunil; Rana, Santu; Kennedy, Richard Lee; Larkins, Ann; Venkatesh, Svetha

    2014-12-30

    Feature engineering is a time-consuming component of predictive modeling. We propose a versatile platform to automatically extract features for risk prediction, based on a pre-defined and extensible entity schema. The extraction is independent of disease type or risk prediction task. We contrast auto-extracted features with baselines generated from the Elixhauser comorbidities. Hospital medical records were transformed into event sequences, to which filters were applied to extract feature sets capturing diversity in temporal scales and data types. The features were evaluated on a readmission prediction task, comparing with baseline feature sets generated from the Elixhauser comorbidities. The prediction model was logistic regression with elastic net regularization. Prediction horizons of 1, 2, 3, 6 and 12 months were considered for four diverse diseases: diabetes, COPD, mental disorders and pneumonia, with derivation and validation cohorts defined on non-overlapping data-collection periods. For unplanned readmissions, the auto-extracted feature set using socio-demographic information and medical records outperformed baselines derived from the socio-demographic information and Elixhauser comorbidities over 20 settings (5 prediction horizons over 4 diseases). In particular, for 30-day prediction the AUCs are: COPD-baseline: 0.60 (95% CI: 0.57, 0.63), auto-extracted: 0.67 (0.64, 0.70); diabetes-baseline: 0.60 (0.58, 0.63), auto-extracted: 0.67 (0.64, 0.69); mental disorders-baseline: 0.57 (0.54, 0.60), auto-extracted: 0.69 (0.64, 0.70); pneumonia-baseline: 0.61 (0.59, 0.63), auto-extracted: 0.70 (0.67, 0.72). The advantages of auto-extracting standard features from complex medical records in a disease- and task-agnostic manner were demonstrated. Auto-extracted features have good predictive power over multiple time horizons. Such feature sets have the potential to form the foundation of complex automated analytic tasks.
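    A minimal sketch of the prediction step, logistic regression with elastic net regularization evaluated by AUC, is shown below using scikit-learn. The feature matrix, outcome definition, and regularization settings are hypothetical stand-ins for the auto-extracted features.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical feature matrix: rows = admissions, columns = auto-extracted features
rng = np.random.default_rng(6)
X = rng.normal(size=(2000, 200))
y = (X[:, :5].sum(axis=1) + rng.normal(0, 2, 2000) > 0).astype(int)  # readmission flag

X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=1.0, max_iter=5000)
clf.fit(X_dev, y_dev)
print("validation AUC:", round(roc_auc_score(y_val, clf.predict_proba(X_val)[:, 1]), 3))
```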

  10. Perceptions and Efficacy of Flight Operational Quality Assurance (FOQA) Programs Among Small-scale Operators

    DTIC Science & Technology

    2012-01-01

    regressive Integrated Moving Average (ARIMA) model for the data, eliminating the need to identify an appropriate model through trial and error alone... In general, ARIMA models address three... performance standards and measurement processes and a prevailing climate of organizational trust were important factors. Unfortunately, uneven

  11. Accuracy of a novel auto-CPAP device to evaluate the residual apnea-hypopnea index in patients with obstructive sleep apnea.

    PubMed

    Nigro, Carlos Alberto; González, Sergio; Arce, Anabella; Aragone, María Rosario; Nigro, Luciana

    2015-05-01

    Patients under treatment with continuous positive airway pressure (CPAP) may have residual sleep apnea (RSA). The main objective of our study was to evaluate a novel auto-CPAP device for the diagnosis of RSA. All patients referred to the sleep laboratory to undergo CPAP polysomnography were evaluated. Patients treated with oxygen or noninvasive ventilation, and those with split-night polysomnography (PSG), PSG with artifacts, or a total sleep time of less than 180 min, were excluded. The PSG was manually analyzed before generating the automatic report from the auto-CPAP. PSG variables (respiratory disturbance index (RDI), obstructive apnea index, hypopnea index, and central apnea index) were compared with their counterparts from the auto-CPAP through Bland-Altman plots and intraclass correlation coefficients. The diagnostic accuracy of autoscoring from the auto-CPAP using different cut-off points of RDI (≥5 and ≥10) was evaluated by the receiver operating characteristic (ROC) curve. The study included 114 patients (24 women; mean age and BMI, 59 years and 33 kg/m²; median RDI and apnea/hypopnea index (AHI)-auto, 5 and 2, respectively). The average difference between the AHI-auto and the RDI was -3.5 ± 3.9. The intraclass correlation coefficients (ICC) for the total numbers of central apneas, obstructive apneas, and hypopneas between the PSG and the auto-CPAP were 0.69, 0.16, and 0.15, respectively. An AHI-auto >2 (for RDI ≥ 5) or >4 (for RDI ≥ 10) had an area under the ROC curve, sensitivity, specificity, and positive and negative likelihood ratios for the diagnosis of residual sleep apnea of 0.84/0.89, 84/81%, 82/91%, 4.5/9.5, and 0.22/0.2, respectively. The automatic analysis from the auto-CPAP (S9 Autoset) showed good diagnostic accuracy in identifying residual sleep apnea. The absolute agreement between the PSG and the auto-CPAP in classifying the respiratory events correctly varied from very low (obstructive apneas, hypopneas) to moderate (central apneas).
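    The cut-off performance figures above (sensitivity, specificity, likelihood ratios) come from a 2x2 table at a chosen AHI-auto threshold. A minimal sketch with hypothetical paired AHI-auto and PSG RDI values, not data from the study:

```python
import numpy as np

def cutoff_performance(ahi_auto, rdi, auto_cut, rdi_cut=5):
    """Sensitivity, specificity and likelihood ratios of AHI-auto > auto_cut
    against residual sleep apnea defined as PSG RDI >= rdi_cut."""
    test_pos = np.asarray(ahi_auto) > auto_cut
    disease = np.asarray(rdi) >= rdi_cut
    tp = np.sum(test_pos & disease)
    fp = np.sum(test_pos & ~disease)
    fn = np.sum(~test_pos & disease)
    tn = np.sum(~test_pos & ~disease)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, sens / (1 - spec), (1 - sens) / spec

# Hypothetical paired values for a handful of patients
ahi_auto = [1.0, 2.5, 3.0, 6.0, 0.5, 8.0, 4.5, 1.5]
rdi      = [2.0, 3.0, 7.0, 12.0, 1.0, 15.0, 9.0, 6.0]
print(cutoff_performance(ahi_auto, rdi, auto_cut=2))
```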

  12. Alveolar ridge preservation of an extraction socket using autogenous tooth bone graft material for implant site development: prospective case series

    PubMed Central

    Yun, Pil-Young; Um, In-Woong; Lee, Hyo-Jung; Yi, Yang-Jin; Bae, Ji-Hyun; Lee, Junho

    2014-01-01

    This case series evaluated the clinical efficacy of autogenous tooth bone graft material (AutoBT) in alveolar ridge preservation of an extraction socket. Thirteen patients who received extraction socket grafts using AutoBT followed by delayed implant placement from Nov. 2008 to Aug. 2010 were evaluated. A total of fifteen implants were placed. The primary and secondary stability of the placed implants averaged 58 ISQ and 77.9 ISQ, respectively. The average amount of crestal bone loss around the implants was 0.05 mm during an average of 22.5 months (from 12 to 34 months) of functional loading. Newly formed tissues were evident from the 3-month specimen. Within the limitations of this case series, autogenous tooth bone graft material can be a favorable bone substitute for extraction socket grafting due to its good bone remodeling and osteoconductivity. PMID:25551013

  13. Wavelet regression model in forecasting crude oil price

    NASA Astrophysics Data System (ADS)

    Hamid, Mohd Helmie; Shabri, Ani

    2017-05-01

    This study presents the performance of the wavelet multiple linear regression (WMLR) technique in daily crude oil forecasting. The WMLR model was developed by integrating the discrete wavelet transform (DWT) and the multiple linear regression (MLR) model. The original time series was decomposed into sub-time series of different scales by wavelet theory. Correlation analysis was conducted to assist in the selection of optimal decomposed components as inputs for the WMLR model. The daily WTI crude oil price series was used in this study to test the prediction capability of the proposed model. The forecasting performance of the WMLR model was also compared with regular multiple linear regression (MLR), Autoregressive Integrated Moving Average (ARIMA) and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) models using the root mean square error (RMSE) and mean absolute error (MAE). Based on the experimental results, it appears that the WMLR model performs better than the other forecasting techniques tested in this study.
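    A rough sketch of the WMLR idea, decompose the series with a DWT and regress the next-day value on the reconstructed sub-series, is shown below using PyWavelets and scikit-learn. The price series, the db4 wavelet, the decomposition level, and the use of all components (rather than the correlation-selected subset described above) are assumptions for illustration.

```python
import numpy as np
import pywt
from sklearn.linear_model import LinearRegression

def wavelet_subseries(x, wavelet="db4", level=3):
    """Decompose x and return one reconstructed sub-series per coefficient level."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    subs = []
    for i in range(len(coeffs)):
        keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        subs.append(pywt.waverec(keep, wavelet)[: len(x)])
    return np.column_stack(subs)

# Hypothetical daily crude-oil price series (the WTI data are not reproduced here)
rng = np.random.default_rng(7)
price = 60 + np.cumsum(rng.normal(0, 0.5, 1000))

X = wavelet_subseries(price)[:-1]    # today's sub-series as predictors
y = price[1:]                        # tomorrow's price as target
split = 900
model = LinearRegression().fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("RMSE:", round(float(np.sqrt(np.mean((pred - y[split:]) ** 2))), 3))
print("MAE :", round(float(np.mean(np.abs(pred - y[split:]))), 3))
```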

  14. Inductance analyzer based on auto-balanced circuit for precision measurement of fluxgate impedance

    NASA Astrophysics Data System (ADS)

    Setiadi, Rahmondia N.; Schilling, Meinhard

    2018-05-01

    An instrument for fluxgate sensor impedance measurement based on an auto-balanced circuit has been designed and characterized. The circuit design is adjusted to comply with the fluxgate sensor characteristics, which are low impedance and a highly saturable core with very high permeability. The system utilizes a NI-DAQ card and LabVIEW to process the signal acquisition and evaluation. Several fixed reference resistances are employed for system calibration using linear regression. A multimeter HP 34401A and an impedance analyzer Agilent 4294A are used as calibrator and validator for the resistance and inductance measurements. Here, we realized a fluxgate analyzer instrument based on an auto-balanced circuit, which measures the resistance and inductance of the device under test with a small error and a much lower excitation current than the calibrator used, thereby avoiding core saturation.
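    The linear-regression calibration against fixed reference resistances described above can be sketched in a few lines; the raw readings and reference values below are hypothetical, not measurements from the instrument.

```python
import numpy as np

# Reference resistances (ohms) and the hypothetical raw instrument readings they produced
r_ref = np.array([1.0, 2.2, 4.7, 10.0, 22.0])
raw = np.array([0.103, 0.221, 0.468, 0.995, 2.190])   # e.g. demodulated voltage ratio

# Least-squares calibration line: R = a*raw + b
a, b = np.polyfit(raw, r_ref, 1)

def to_resistance(reading):
    return a * reading + b

print(round(to_resistance(0.500), 3))   # convert a new raw reading to ohms
```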

  15. Carpooling: status and potential

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kendall, D.C.

    1975-06-01

    Studies were conducted to analyze the status and potential of work-trip carpooling as a means of achieving more efficient use of the automobile. Current and estimated maximum potential levels of carpooling are presented together with analyses revealing characteristics of carpool trips, incentives, impacts of increased carpooling and issues related to carpool matching services. National survey results indicate the average auto occupancy for urban work-trip is 1.2 passengers per auto. This value, and average carpool occupancy of 2.5, have been relatively stable over the last five years. An increase in work-trip occupancy from 1.2 to 1.8 would require a 100% increase in the number of carpoolers. A model was developed to predict the maximum potential level of carpooling in an urban area. Results from applying the model to the Boston region were extrapolated to estimate a maximum nationwide potential between 47 and 71% of peak period auto commuters. Maximum benefits of increased carpooling include up to 10% savings in auto fuel consumption. A technique was developed for estimating the number of participants required in a carpool matching service to achieve a chosen level of matching among respondents, providing insight into tradeoffs between employer and regional or centralized matching services. Issues recommended for future study include incentive policies and their impacts on other modes, and the evaluation of new and ongoing carpool matching services. (11 references) (GRA)

  16. Digital Daily Cycles of Individuals

    NASA Astrophysics Data System (ADS)

    Aledavood, Talayeh; Lehmann, Sune; Saramäki, Jari

    2015-10-01

    Humans, like almost all animals, are phase-locked to the diurnal cycle. Most of us sleep at night and are active through the day. Because we have evolved to function with this cycle, the circadian rhythm is deeply ingrained and even detectable at the biochemical level. However, within the broader day-night pattern, there are individual differences: e.g., some of us are intrinsically morning-active, while others prefer evenings. In this article, we look at digital daily cycles: circadian patterns of activity viewed through the lens of auto-recorded data of communication and online activity. We begin at the aggregate level, discuss earlier results, and illustrate differences between population-level daily rhythms in different media. Then we move on to the individual level, and show that there is a strong individual-level variation beyond averages: individuals typically have their distinctive daily pattern that persists in time. We conclude by discussing the driving forces behind these signature daily patterns, from personal traits (morningness/eveningness) to variation in activity level and external constraints, and outline possibilities for future research.

  17. Depression May Reduce Adherence during CPAP Titration Trial

    PubMed Central

    Law, Mandy; Naughton, Matthew; Ho, Sally; Roebuck, Teanau; Dabscheck, Eli

    2014-01-01

    Study Objectives: Depression is a risk factor for medication non-compliance. We aimed to identify if depression is associated with poorer adherence during home-based autotitrating continuous positive airway pressure (autoPAP) titration. Design: Mixed retrospective-observational study. Setting: Academic center. Participants: Two-hundred forty continuous positive airway pressure-naïve obstructive sleep apnea (OSA) patients. Measurements: Patients underwent approximately 1 week of home-based autoPAP titration with adherence data downloaded from the device. Electronic hospital records were reviewed in a consecutive manner for inclusion. Three areas of potential predictors were examined: (i) demographics and clinical factors, (ii) disease severity, and (iii) device-related variables. Depression and anxiety were assessed using the Hospital Anxiety and Depression Scale (HADS). Scores on the subscales were categorized as normal or clinical diagnoses of depression (≥ 8) and anxiety (≥ 11). The primary outcome variable was the mean hours of autoPAP used per night. Results: Patients were diagnosed with OSA by either attended polysomnography (n = 73, AHI 25.5[15.1-41.5]) or unattended home oximetry (n = 167, ODI3 34.0[22.4-57.4]) and had home-based autoPAP titration over 6.2 ± 1.2 nights. Mean autoPAP use was 4.5 ± 2.4 hours per night. Multiple linear regression analysis revealed that depression and lower 95th percentile pressures significantly predicted lesser hours of autoPAP use (R2 = 0.19, p < 0.001). Significantly milder OSA in those requiring lower pressures may have confounded the relationship between 95th percentile pressure and autoPAP use. Conclusion: Depression was independently associated with poorer adherence during home-based autoPAP titration. Depression may be a potential target for clinicians and future research aimed at enhancing adherence to autoPAP therapy. Citation: Law M; Naughton M; Ho S; Roebuck T; Dabscheck E. Depression may reduce adherence during CPAP titration trial. J Clin Sleep Med 2014;10(2):163-169. PMID:24532999

  18. Evaluation of a commercial automatic treatment planning system for prostate cancers.

    PubMed

    Nawa, Kanabu; Haga, Akihiro; Nomoto, Akihiro; Sarmiento, Raniel A; Shiraishi, Kenshiro; Yamashita, Hideomi; Nakagawa, Keiichi

    2017-01-01

    Recent developments in radiation oncology treatment planning have led to software packages that facilitate automated intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) planning. Such solutions include site-specific modules, plan library methods, and algorithm-based methods. In this study, the plan quality for prostate cancer generated by the Auto-Planning module of the Pinnacle 3 radiation therapy treatment planning system (v9.10, Fitchburg, WI) is retrospectively evaluated. The Auto-Planning module of Pinnacle 3 uses a progressive optimization algorithm. Twenty-three prostate cancer cases, which had previously been planned and treated without lymph node irradiation, were replanned using the Auto-Planning module. Dose distributions were statistically compared with those of manual planning by the paired t-test at the 5% significance level. Auto-Planning was performed without any manual intervention. The planning target volume (PTV) dose and the dose to the rectum were comparable between Auto-Planning and manual planning. The former, however, significantly reduced the dose to the bladder and femurs. Regression analysis was performed to examine the correlation between the bladder-PTV volume overlap divided by the total bladder volume and the resultant V70. The findings showed that manual planning typically follows a logistic pattern for this dose constraint, whereas Auto-Planning shows a more linear tendency. By calculating the Akaike information criterion (AIC) to validate the statistical model, a reduction of inter-operator variation with Auto-Planning was shown. We showed that, for prostate cancer, the Auto-Planning module provided plans that are better than or comparable with those of manual planning. Comparing our results with those previously reported for head and neck cancer treatment, we recommend the homogeneous plan quality generated by the Auto-Planning module, which exhibits less dependence on anatomic complexity. Copyright © 2017 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.

  19. MetMSLine: an automated and fully integrated pipeline for rapid processing of high-resolution LC-MS metabolomic datasets.

    PubMed

    Edmands, William M B; Barupal, Dinesh K; Scalbert, Augustin

    2015-03-01

    MetMSLine represents a complete collection of functions in the R programming language as an accessible GUI for biomarker discovery in large-scale liquid-chromatography high-resolution mass spectral datasets from acquisition through to final metabolite identification forming a backend to output from any peak-picking software such as XCMS. MetMSLine automatically creates subdirectories, data tables and relevant figures at the following steps: (i) signal smoothing, normalization, filtration and noise transformation (PreProc.QC.LSC.R); (ii) PCA and automatic outlier removal (Auto.PCA.R); (iii) automatic regression, biomarker selection, hierarchical clustering and cluster ion/artefact identification (Auto.MV.Regress.R); (iv) Biomarker-MS/MS fragmentation spectra matching and fragment/neutral loss annotation (Auto.MS.MS.match.R) and (v) semi-targeted metabolite identification based on a list of theoretical masses obtained from public databases (DBAnnotate.R). All source code and suggested parameters are available in an un-encapsulated layout on http://wmbedmands.github.io/MetMSLine/. Readme files and a synthetic dataset of both X-variables (simulated LC-MS data), Y-variables (simulated continuous variables) and metabolite theoretical masses are also available on our GitHub repository. © The Author 2014. Published by Oxford University Press.

  20. MetMSLine: an automated and fully integrated pipeline for rapid processing of high-resolution LC–MS metabolomic datasets

    PubMed Central

    Edmands, William M. B.; Barupal, Dinesh K.; Scalbert, Augustin

    2015-01-01

    Summary: MetMSLine represents a complete collection of functions in the R programming language as an accessible GUI for biomarker discovery in large-scale liquid-chromatography high-resolution mass spectral datasets from acquisition through to final metabolite identification forming a backend to output from any peak-picking software such as XCMS. MetMSLine automatically creates subdirectories, data tables and relevant figures at the following steps: (i) signal smoothing, normalization, filtration and noise transformation (PreProc.QC.LSC.R); (ii) PCA and automatic outlier removal (Auto.PCA.R); (iii) automatic regression, biomarker selection, hierarchical clustering and cluster ion/artefact identification (Auto.MV.Regress.R); (iv) Biomarker—MS/MS fragmentation spectra matching and fragment/neutral loss annotation (Auto.MS.MS.match.R) and (v) semi-targeted metabolite identification based on a list of theoretical masses obtained from public databases (DBAnnotate.R). Availability and implementation: All source code and suggested parameters are available in an un-encapsulated layout on http://wmbedmands.github.io/MetMSLine/. Readme files and a synthetic dataset of both X-variables (simulated LC–MS data), Y-variables (simulated continuous variables) and metabolite theoretical masses are also available on our GitHub repository. Contact: ScalbertA@iarc.fr PMID:25348215

  1. [Study of blending method for the extracts of herbal plants].

    PubMed

    Liu, Yongsuo; Cao, Min; Chen, Yuying; Hu, Yuzhu; Wang, Yiming; Luo, Guoan

    2006-03-01

    The composition of herbal plants is irregular and influenced by multiple factors. For the quality control of traditional Chinese medicine, the most critical challenge is to ensure dosage content uniformity. This content uniformity can be improved by blending different batches of the extracts of herbal plants. Nonlinear least-squares regression was used to calculate the blending coefficients, so that no large absolute differences remain for any of the ingredients. For traditional Chinese medicines, however, even relatively small differences can be very important for all the ingredients. Auto-scaling pretreatment was therefore used prior to the calculation of the blending coefficients. Because this pretreatment buffers the characteristics of the individual ingredient data in different batches, an improved auto-scaling pretreatment method was proposed. With the improved auto-scaling pretreatment, the relative differences decreased after blending different batches of extracts of herbal plants according to the reference samples, and the content uniformity of specific ingredients could be controlled through the error control coefficient. In studies of the extracts of Fructus Gardeniae, the relative differences of all the ingredients were less than 3% after blending different batches of the extracts. The results show that nonlinear least-squares regression can be used to calculate the blending coefficients of herbal plant extracts.
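    One way to picture the blending calculation is as a constrained least-squares fit of batch weights to a reference composition after auto-scaling each ingredient. The sketch below is a generic illustration with hypothetical contents and a sum-to-one weight constraint; it is not the authors' exact nonlinear least-squares formulation or their improved pretreatment.

```python
import numpy as np
from scipy.optimize import minimize

# Rows = batches, columns = ingredient contents (hypothetical values)
batches = np.array([[1.10, 0.42, 2.30],
                    [0.95, 0.55, 2.10],
                    [1.25, 0.38, 2.60]])
reference = np.array([1.05, 0.45, 2.30])      # target (reference sample) composition

# Auto-scaling: centre and scale each ingredient so all contribute comparably
mu, sigma = batches.mean(axis=0), batches.std(axis=0, ddof=1)
b_scaled = (batches - mu) / sigma
ref_scaled = (reference - mu) / sigma

def objective(w):                              # scaled residual of the blended product
    return np.sum((w @ b_scaled - ref_scaled) ** 2)

n = len(batches)
res = minimize(objective, x0=np.full(n, 1 / n),
               bounds=[(0, 1)] * n,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}])
print("blending coefficients:", np.round(res.x, 3))
print("blended composition  :", np.round(res.x @ batches, 3))
```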

  2. Atmospheric mold spore counts in relation to meteorological parameters

    NASA Astrophysics Data System (ADS)

    Katial, R. K.; Zhang, Yiming; Jones, Richard H.; Dyer, Philip D.

    Fungal spore counts of Cladosporium, Alternaria, and Epicoccum were studied over 8 years in Denver, Colorado. Fungal spore counts were obtained daily during the pollinating season by a Rotorod sampler. Weather data were obtained from the National Climatic Data Center. Daily averages of temperature, relative humidity, daily precipitation, barometric pressure, and wind speed were studied. A time series analysis was performed on the data to mathematically model the spore counts in relation to weather parameters. Using SAS PROC ARIMA software, a regression analysis was performed, regressing the spore counts on the weather variables assuming an autoregressive moving average (ARMA) error structure. Cladosporium was found to be positively correlated (P<0.02) with average daily temperature and relative humidity, and negatively correlated with precipitation. Alternaria and Epicoccum did not show increased predictability with weather variables. A mathematical model was derived for Cladosporium spore counts using the annual seasonal cycle and the significant weather variables. The models for Alternaria and Epicoccum incorporated the annual seasonal cycle. Fungal spore counts can thus be modeled by time series analysis and related to meteorological parameters while controlling for seasonality; this modeling can provide estimates of exposure to fungal aeroallergens.

  3. Zika pandemic online trends, incidence and health risk communication: a time trend study

    PubMed Central

    Neumark, Yehuda; Gesser-Edelsburg, Anat; Abu Ahmad, Wiessam

    2017-01-01

    Objectives We aimed to describe the online search trends of Zika and examine their association with Zika incidence, assess the content of Zika-related press releases issued by leading health authorities and examine the association between online trends and press release timing. Design Using Google Trends, the 1 May 2015 to 30 May 2016 online trends of Zika and associated search terms were studied globally and in the five countries with the highest numbers of suspected cases. Correlations were then examined between online trends and Zika incidence in these countries. All Zika-related press releases issued by the WHO/Pan American Health Organization (PAHO) and the Centers for Disease Control and Prevention (CDC) during the study period were assessed for transparency, uncertainty and audience segmentation. Witte's Extended Parallel Process Model was applied to assess self-efficacy, response efficacy, susceptibility and severity. AutoRegressive Integrated Moving Average with an eXogenous predictor variable (ARIMAX) (p,d,q) regression modelling was used to quantify the association between online trends and the timing of press releases. Results Globally, Zika online search trends were low until the beginning of 2016, when interest rose steeply. Strong correlations (r=0.748–0.922; p<0.001) were observed between online trends and the number of suspected Zika cases in four of the five countries studied. Compared with press releases issued by WHO/PAHO, CDC press releases were significantly more likely to provide contact details and links to other resources, include figures/graphs, be risk-advisory in nature and be more readable and briefer. ARIMAX modelling results indicate that online trends preceded press releases by WHO (stationary R²=0.345; p<0.001) and CDC (stationary R²=0.318; p=0.014) by 1 week. Conclusions These results suggest that online trends can aid in pandemic surveillance. Identification of shortcomings in the content and timing of Zika press releases can help guide health communication efforts in the current pandemic and future public health emergencies. PMID:29082006

  4. Moving Forward: College and Career Transitions of LAMP Graduates. Findings from the LAMP Longitudinal Study.

    ERIC Educational Resources Information Center

    MacAllum, Keith; Yoder, Karla; Kim, Scott; Bozick, Robert

    A longitudinal study examined the college and career transitions of graduates of the Lansing Area Manufacturing Partnership (LAMP) program, which is a school-to-career (STC) program sponsored by the United Auto Workers, General Motors Corporation, and Michigan's Ingham County Intermediate School District. The progress of three cohorts of LAMP…

  5. Drought Patterns Forecasting using an Auto-Regressive Logistic Model

    NASA Astrophysics Data System (ADS)

    del Jesus, M.; Sheffield, J.; Méndez Incera, F. J.; Losada, I. J.; Espejo, A.

    2014-12-01

    Drought is characterized by a water deficit that may manifest across a large range of spatial and temporal scales. Drought may create important socio-economic consequences, at times of catastrophic dimensions. A quantifiable definition of drought is elusive because, depending on its impacts, consequences and generation mechanism, different water deficit periods may be identified as a drought by virtue of some definitions but not by others. Droughts are linked to the water cycle and, although a climate change signal may not have emerged yet, they are also intimately linked to climate. In this work we develop an auto-regressive logistic model for drought prediction at different temporal scales that makes use of a spatially explicit framework. Our model allows covariates, continuous or categorical, to be included to improve the performance of the auto-regressive component. Our approach makes use of dimensionality reduction (principal component analysis) and classification techniques (K-Means and maximum dissimilarity) to simplify the representation of complex climatic patterns, such as sea surface temperature (SST) and sea level pressure (SLP), while including information on their spatial structure, i.e. considering their spatial patterns. This procedure allows us to include in the analysis multivariate representations of complex climatic phenomena, such as the El Niño-Southern Oscillation. We also explore the impact of other climate-related variables such as sunspots. The model allows the uncertainty of the forecasts to be quantified and can be easily adapted to make predictions under future climatic scenarios. The framework presented here may be extended to other applications such as flash flood analysis or risk assessment of natural hazards.
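    The core of an auto-regressive logistic model, the previous drought state plus PCA-reduced climate covariates entering a logistic regression, can be sketched as follows. The gridded field, the persistence rule used to simulate the drought indicator, and the number of components are all assumptions for illustration, not the authors' configuration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)
T = 240                                   # months
sst_fields = rng.normal(size=(T, 500))    # hypothetical gridded SST anomalies, flattened

# Reduce the climate field to a few principal components
pcs = PCA(n_components=3).fit_transform(sst_fields)

# Hypothetical binary drought indicator with some persistence
drought = np.zeros(T, dtype=int)
for t in range(1, T):
    p = 0.15 + 0.5 * drought[t - 1] + 0.05 * pcs[t, 0]
    drought[t] = rng.random() < min(max(p, 0.01), 0.99)

# Auto-regressive logistic model: previous state + PCA covariates
X = np.column_stack([drought[:-1], pcs[1:]])
y = drought[1:]
model = LogisticRegression(max_iter=1000).fit(X, y)
print("coefficients (lagged state, PC1-PC3):", np.round(model.coef_[0], 2))
print("P(drought next month | drought now, mean climate):",
      round(float(model.predict_proba([[1, 0, 0, 0]])[0, 1]), 2))
```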

  6. Earthquakes Magnitude Predication Using Artificial Neural Network in Northern Red Sea Area

    NASA Astrophysics Data System (ADS)

    Alarifi, A. S.; Alarifi, N. S.

    2009-12-01

    Earthquakes are natural hazards that do not happen very often; however, they may cause huge losses of life and property. Early preparation for these hazards is a key factor in reducing their damage and consequences. Since early ages, people have tried to predict earthquakes using simple observations such as strange or atypical animal behavior. In this paper, we study data collected from an existing earthquake catalogue to give better forecasts of future earthquakes. The 16,000 events cover a time span of 1970 to 2009; the magnitudes range from greater than 0 to less than 7.2, while the depths range from greater than 0 to less than 100 km. We propose a new artificial-intelligence prediction system based on an artificial neural network, which can be used to predict the magnitude of future earthquakes in the northern Red Sea area, including the Sinai Peninsula, the Gulf of Aqaba, and the Gulf of Suez. We propose a new feed-forward neural network model with multiple hidden layers to predict earthquake occurrences and magnitudes in the northern Red Sea area. Although similar models have been published before for different areas, to the best of our knowledge this is the first neural network model to predict earthquakes in the northern Red Sea area. Furthermore, we present other forecasting methods such as moving averages over different intervals, a normally distributed random predictor, and a uniformly distributed random predictor. In addition, we present different statistical methods and data fits such as linear, quadratic, and cubic regression. We present a detailed performance analysis of the proposed methods for different evaluation metrics. The results show that the neural network model provides higher forecast accuracy than the other proposed methods: the neural network achieves an average absolute error of 2.6%, compared with average absolute errors of 3.8%, 7.3% and 6.17% for the moving average, linear regression and cubic regression, respectively. In this work, we also present an analysis of earthquake data in the northern Red Sea area for different statistical parameters such as correlation, mean, and standard deviation. This analysis provides a deeper understanding of the seismicity of the area and its existing patterns.
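
    For the simple baseline predictors mentioned above (moving averages and polynomial regression), a toy comparison on a synthetic magnitude series might look like the following; the data, window length and mean-absolute-error metric are illustrative assumptions, not the paper's experiment.

```python
# Illustrative only: a naive moving-average forecaster versus a linear fit on a
# synthetic magnitude series, scored by mean absolute error.
import numpy as np

rng = np.random.default_rng(2)
mags = rng.uniform(2.0, 5.5, 500)                  # synthetic magnitude sequence
window = 10

ma_pred = np.convolve(mags, np.ones(window) / window, mode="valid")[:-1]
ma_true = mags[window:]                            # each prediction targets the next event

t = np.arange(len(mags))
coef = np.polyfit(t[:400], mags[:400], deg=1)      # fit a trend on the first 400 events
lin_pred = np.polyval(coef, t[400:])
lin_true = mags[400:]

print("moving-average MAE:   ", np.mean(np.abs(ma_true - ma_pred)))
print("linear-regression MAE:", np.mean(np.abs(lin_true - lin_pred)))
```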

  7. A generic sun-tracking algorithm for on-axis solar collector in mobile platforms

    NASA Astrophysics Data System (ADS)

    Lai, An-Chow; Chong, Kok-Keong; Lim, Boon-Han; Ho, Ming-Cheng; Yap, See-Hao; Heng, Chun-Kit; Lee, Jer-Vui; King, Yeong-Jin

    2015-04-01

    This paper proposes a novel dynamic sun-tracking algorithm which allows accurate tracking of the sun for both non-concentrated and concentrated photovoltaic systems located on mobile platforms to maximize solar energy extraction. The proposed algorithm takes not only the date, time, and geographical information, but also the dynamic changes of coordinates of the mobile platforms into account to calculate the sun position angle relative to ideal azimuth-elevation axes in real time using general sun-tracking formulas derived by Chong and Wong. The algorithm acquires data from open-loop sensors, i.e. global position system (GPS) and digital compass, which are readily available in many off-the-shelf portable gadgets, such as smart phone, to instantly capture the dynamic changes of coordinates of mobile platforms. Our experiments found that a highly accurate GPS is not necessary as the coordinate changes of practical mobile platforms are not fast enough to produce significant differences in the calculation of the incident angle. On the contrary, it is critical to accurately identify the quadrant and angle where the mobile platforms are moving toward in real time, which can be resolved by using digital compass. In our implementation, a noise filtering mechanism is found necessary to remove unexpected spikes in the readings of the digital compass to ensure stability in motor actuations and effectiveness in continuous tracking. Filtering mechanisms being studied include simple moving average and linear regression; the results showed that a compound function of simple moving average and linear regression produces a better outcome. Meanwhile, we found that a sampling interval is useful to avoid excessive motor actuations and power consumption while not sacrificing the accuracy of sun-tracking.
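
    A possible form of the compound filtering step described above, a simple moving average of the compass readings followed by a linear-regression extrapolation, is sketched below. The function name, window length and data are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a compound smoothing step (assumed form): moving-average smoothing of
# compass headings, then a one-step linear-regression extrapolation.
import numpy as np

def smooth_and_predict(headings, window=8):
    """Return the moving-average-smoothed series and a one-step linear extrapolation."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(headings, kernel, mode="valid")
    x = np.arange(len(smoothed))
    slope, intercept = np.polyfit(x, smoothed, deg=1)
    next_value = slope * len(smoothed) + intercept
    return smoothed, next_value

readings = 180 + np.cumsum(np.random.default_rng(3).normal(0, 0.5, 60))  # noisy compass trace
smoothed, predicted = smooth_and_predict(readings)
print("next predicted heading:", round(predicted, 2))
```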

  8. Ionospheric TEC from the Turkish Permanent GNSS Network (TPGN) and comparison with ARMA and IRI models

    NASA Astrophysics Data System (ADS)

    Ansari, Kutubuddin; Panda, Sampad Kumar; Althuwaynee, Omar F.; Corumluoglu, Ozsen

    2017-09-01

    The present study investigates the ionospheric Total Electron Content (TEC) variations in the lower mid-latitude Turkish region from the Turkish Permanent GNSS Network (TPGN) and International GNSS Services (IGS) observations during the year 2016. The corresponding vertical TEC (VTEC) predictions of the Auto Regressive Moving Average (ARMA) and International Reference Ionosphere 2016 (IRI-2016) models are evaluated to assess their effectiveness over the region. The spatial, diurnal and seasonal behavior of VTEC and the relative VTEC variations are modeled with an Ordinary Least Squares Estimator (OLSE). The spatial behavior of the modeled results during the March equinox and June solstice indicates an inverse relationship of VTEC with longitude across the region, while the VTEC in all four seasons (March and September equinoxes, June and December solstices) decreases with increasing latitude. The GNSS-observed and modeled diurnal variations of VTEC show that the VTEC slowly increases from dawn, attains a broad peak around 09.00 to 12.00 UT, and thereafter decreases gradually, reaching a minimum around 21.00 UT. The seasonal variation of VTEC shows an annual mode, with maxima at the equinoxes and minima at the solstices. The average VTEC during the June solstice is slightly higher than during the March equinox, although the variations during the latter season are larger; the minimum average value occurs during the December solstice at all stations. The comparative analysis shows that the prediction errors of OLSE, ARMA and IRI remain between 0.23-1.17%, 2.40-4.03% and 24.82-25.79%, respectively. The observed seasonal variation of VTEC agrees well with the OLSE and ARMA models, whereas IRI-VTEC often underestimates the observed values at each location. Hence, the deviations of the IRI-estimated VTEC from the ARMA and OLSE models call for further improvement of the IRI model over the Turkish region. Although IRI estimations are well accepted over the mid-latitudes, the performance over the lower mid-latitudes is not satisfactory and needs further improvement. The long-term TEC data from the TPGN network can be incorporated into the underlying IRI database with appropriate calibration to further improve estimation accuracy over the region.
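
    As a generic illustration of the ARMA prediction used for comparison above, the following sketch fits an ARMA(2,1) model (an ARIMA model with d=0) to a synthetic VTEC-like series using statsmodels; the order and data are placeholders.

```python
# Hedged sketch: ARMA(2, 1) fit to a synthetic VTEC-like diurnal series, then a
# one-day-ahead forecast (order and data are illustrative assumptions).
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(4)
hours = pd.date_range("2016-01-01", periods=24 * 30, freq="h")
diurnal = 10 + 8 * np.sin(2 * np.pi * (hours.hour - 6) / 24)   # crude diurnal VTEC shape
vtec = diurnal + rng.normal(0, 1, len(hours))

fit = ARIMA(pd.Series(vtec, index=hours), order=(2, 0, 1)).fit()
forecast = fit.forecast(steps=24)
print(forecast.head())
```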

  9. AutoNR: an automated system that measures ECAP thresholds with the Nucleus Freedom cochlear implant via machine intelligence.

    PubMed

    Botros, Andrew; van Dijk, Bas; Killian, Matthijs

    2007-05-01

    AutoNRT is an automated system that measures electrically evoked compound action potential (ECAP) thresholds from the auditory nerve with the Nucleus Freedom cochlear implant. ECAP thresholds along the electrode array are useful in objectively fitting cochlear implant systems for individual use. This paper provides the first detailed description of the AutoNRT algorithm and its expert systems, and reports the clinical success of AutoNRT to date. AutoNRT determines thresholds by visual detection, using two decision tree expert systems that automatically recognise ECAPs. The expert systems are guided by a dataset of 5393 neural response measurements. The algorithm approaches threshold from lower stimulus levels, ensuring recipient safety during postoperative measurements. Intraoperative measurements use the same algorithm but proceed faster by beginning at stimulus levels much closer to threshold. When searching for ECAPs, AutoNRT uses a highly specific expert system (specificity of 99% during training, 96% during testing; sensitivity of 91% during training, 89% during testing). Once ECAPs are established, AutoNRT uses an unbiased expert system to determine an accurate threshold. Throughout the execution of the algorithm, recording parameters (such as implant amplifier gain) are automatically optimised when needed. In a study that included 29 intraoperative and 29 postoperative subjects (a total of 418 electrodes), AutoNRT determined a threshold in 93% of cases where a human expert also determined a threshold. When compared to the median threshold of multiple human observers on 77 randomly selected electrodes, AutoNRT performed as accurately as the 'average' clinician. AutoNRT has demonstrated a high success rate and a level of performance that is comparable with human experts. It has been used in many clinics worldwide throughout the clinical trial and commercial launch of Nucleus Custom Sound Suite, significantly streamlining the clinical procedures associated with cochlear implant use.

  10. Do alcohol excise taxes affect traffic accidents? Evidence from Estonia.

    PubMed

    Saar, Indrek

    2015-01-01

    This article examines the association between alcohol excise tax rates and alcohol-related traffic accidents in Estonia. Monthly time series of traffic accidents involving drunken motor vehicle drivers from 1998 through 2013 were regressed on real average alcohol excise tax rates while controlling for changes in economic conditions and the traffic environment. Specifically, regression models with autoregressive integrated moving average (ARIMA) errors were estimated in order to deal with serial correlation in residuals. Counterfactual models were also estimated in order to check the robustness of the results, using the level of non-alcohol-related traffic accidents as a dependent variable. A statistically significant (P <.01) strong negative relationship between the real average alcohol excise tax rate and alcohol-related traffic accidents was disclosed under alternative model specifications. For instance, the regression model with ARIMA (0, 1, 1)(0, 1, 1) errors revealed that a 1-unit increase in the tax rate is associated with a 1.6% decrease in the level of accidents per 100,000 population involving drunk motor vehicle drivers. No similar association was found in the cases of counterfactual models for non-alcohol-related traffic accidents. This article indicates that the level of alcohol-related traffic accidents in Estonia has been affected by changes in real average alcohol excise taxes during the period 1998-2013. Therefore, in addition to other measures, the use of alcohol taxation is warranted as a policy instrument in tackling alcohol-related traffic accidents.
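
    A regression with ARIMA(0,1,1)(0,1,1)12 errors of the kind described above can be approximated with a SARIMAX model in statsmodels, treating the tax rate as an exogenous regressor. The sketch below uses synthetic monthly data and is not the study's model.

```python
# Illustrative regression with ARIMA(0,1,1)(0,1,1)12 errors: synthetic monthly
# accident counts regressed on a synthetic excise-tax rate.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(5)
months = pd.date_range("1998-01-01", "2013-12-01", freq="MS")
tax_rate = np.linspace(5, 12, len(months)) + rng.normal(0, 0.2, len(months))
accidents = 40 - 1.5 * tax_rate + rng.normal(0, 2, len(months))   # synthetic outcome

model = SARIMAX(
    pd.Series(accidents, index=months),
    exog=tax_rate,
    order=(0, 1, 1),
    seasonal_order=(0, 1, 1, 12),
)
print(model.fit(disp=False).params)
```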

  11. Climate variations and salmonellosis transmission in Adelaide, South Australia: a comparison between regression models

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Bi, Peng; Hiller, Janet

    2008-01-01

    This is the first study to identify appropriate regression models for the association between climate variation and salmonellosis transmission. A comparison between different regression models was conducted using surveillance data in Adelaide, South Australia. By using notified salmonellosis cases and climatic variables from the Adelaide metropolitan area over the period 1990-2003, four regression methods were examined: standard Poisson regression, autoregressive adjusted Poisson regression, multiple linear regression, and a seasonal autoregressive integrated moving average (SARIMA) model. Notified salmonellosis cases in 2004 were used to test the forecasting ability of the four models. Parameter estimation, goodness-of-fit and forecasting ability of the four regression models were compared. Temperatures occurring 2 weeks prior to cases were positively associated with cases of salmonellosis. Rainfall was also inversely related to the number of cases. The comparison of the goodness-of-fit and forecasting ability suggest that the SARIMA model is better than the other three regression models. Temperature and rainfall may be used as climatic predictors of salmonellosis cases in regions with climatic characteristics similar to those of Adelaide. The SARIMA model could, thus, be adopted to quantify the relationship between climate variations and salmonellosis transmission.

  12. Model variations in predicting incidence of Plasmodium falciparum malaria using 1998-2007 morbidity and meteorological data from south Ethiopia.

    PubMed

    Loha, Eskindir; Lindtjørn, Bernt

    2010-06-16

    Malaria transmission is complex and is believed to be associated with local climate changes. However, simple attempts to extrapolate malaria incidence rates from averaged regional meteorological conditions have proven unsuccessful. Therefore, the objective of this study was to determine if variations in specific meteorological factors are able to consistently predict P. falciparum malaria incidence at different locations in south Ethiopia. Retrospective data from 42 locations were collected, including P. falciparum malaria incidence for the period 1998-2007 and meteorological variables such as monthly rainfall (all locations), temperature (17 locations), and relative humidity (three locations). Thirty-five data sets qualified for the analysis. The Ljung-Box Q statistic was used for model diagnosis, and R squared or stationary R squared was taken as the goodness-of-fit measure. Time series modelling was carried out using Transfer Function (TF) models, and univariate auto-regressive integrated moving average (ARIMA) models when there was no significant meteorological predictor variable. Of 35 models, five were discarded because of significant Ljung-Box Q statistics. Past P. falciparum malaria incidence alone (17 locations) or coupled with meteorological variables (four locations) was able to predict P. falciparum malaria incidence within statistical significance. All seasonal ARIMA orders were from locations at altitudes above 1742 m. Monthly rainfall, minimum temperature and maximum temperature were able to predict incidence at four, five and two locations, respectively. In contrast, relative humidity was not able to predict P. falciparum malaria incidence. The R squared values for the models ranged from 16% to 97%, with the exception of one model which had a negative value. Models with seasonal ARIMA orders were found to perform better. However, the models for predicting P. falciparum malaria incidence varied from location to location, and among lagged effects, data transformation forms, and ARIMA and TF orders. This study describes P. falciparum malaria incidence models linked with meteorological data. Variability in the models was principally attributed to regional differences, and no single model was found that fits all locations. Past P. falciparum malaria incidence appeared to be a better predictor than meteorology. Future efforts in malaria modelling may benefit from the inclusion of non-meteorological factors.

  13. Alternatives to the Moving Average

    Treesearch

    Paul C. van Deusen

    2001-01-01

    There are many possible estimators that could be used with annual inventory data. The 5-year moving average has been selected as a default estimator to provide initial results for states having available annual inventory data. User objectives for these estimates are discussed. The characteristics of a moving average are outlined. It is shown that moving average...
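
    As a toy example of the default 5-year moving-average estimator discussed above (synthetic annual estimates, purely for illustration):

```python
# 5-year moving average over synthetic annual inventory estimates.
import numpy as np

annual_estimates = np.array([102.0, 98.5, 110.2, 105.7, 99.9, 108.3, 111.0])
window = 5
moving_avg = np.convolve(annual_estimates, np.ones(window) / window, mode="valid")
print(moving_avg)   # one estimate per 5-year window
```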

  14. An Evaluation of a Biological Slide-Tutorial Program.

    ERIC Educational Resources Information Center

    Chan, Gordon L.

    Described is an auto-tutorial slide program for zoology students. A self-paced system was devised for observing the subject matter covered in the twelve study units of a zoology course. The post-testing evaluation revealed that students with lower grade point averages achieved scores comparable with students of higher grade point averages.…

  15. [Comparison analysis of outcomes in primary light chain amyloidosis patients treated by auto peripheral blood stem cell transplantation or bortezomib plus dexamethasone].

    PubMed

    Zhao, Qian; Wang, Liping; Song, Ping; Li, Feng; Zhou, Xiaogang; Yu, Yaping; An, Zhiming; Wang, Xuli; Zhai, Yongping

    2016-04-01

    To explore the outcomes of primary light chain amyloidosis patients treated with high-dose melphalan plus auto peripheral blood stem cell transplantation (auto-PBSCT) or with bortezomib plus dexamethasone (VD). Thirty-eight patients diagnosed from September 2004 to September 2012 were analyzed retrospectively, including 15 cases who received auto-PBSCT and 23 cases treated with VD. The median follow-up duration for all patients was 34 months (range, 1-112 months): 38 months (range, 5-112 months) in the auto-PBSCT group and 31 months (range, 1-108 months) in the VD group. The organ response rate in all patients was 39.5% (15/38), and the organ response rates of the two groups showed no significant difference [33.3% (5/15) vs 43.5% (10/23), P=0.532]. However, the median time to organ response differed significantly [6 (3-10) months vs 3 (1-6) months, respectively, P=0.032]. The 3-year overall survival (OS) rates in the two groups were 72.0% and 66.9%, and the mean survival times were 84.7 months and 75.9 months, respectively (P=0.683). In the patients with auto-PBSCT, the occurrence of grade III-IV bone marrow suppression (P<0.001), fever (P<0.001), and nausea and infection (P=0.006) was significantly higher than in those treated with VD, but there was no statistically significant difference in pulmonary infection (P=0.069) or bloodstream infection (P=0.059). These preliminary results indicate that primary light chain amyloidosis patients treated with auto-PBSCT or VD had similar organ response rates and survival; however, more adverse events occurred in the auto-PBSCT group.

  16. Associations between Changes in City and Address Specific Temperature and QT Interval - The VA Normative Aging Study

    PubMed Central

    Mehta, Amar J.; Kloog, Itai; Zanobetti, Antonella; Coull, Brent A.; Sparrow, David; Vokonas, Pantel; Schwartz, Joel

    2014-01-01

    Background The underlying mechanisms of the association between ambient temperature and cardiovascular morbidity and mortality are not well understood, particularly for daily temperature variability. We evaluated if daily mean temperature and standard deviation of temperature were associated with heart rate-corrected QT interval (QTc) duration, a marker of ventricular repolarization, in a prospective cohort of older men. Methods This longitudinal analysis included 487 older men participating in the VA Normative Aging Study with up to three visits between 2000–2008 (n = 743). We analyzed associations between QTc and moving averages (1–7, 14, 21, and 28 days) of the 24-hour mean and standard deviation of temperature as measured from a local weather monitor, and the 24-hour mean temperature estimated from a spatiotemporal prediction model, in time-varying linear mixed-effect regression. Effect modification by season, diabetes, coronary heart disease, obesity, and age was also evaluated. Results Higher mean temperature, as measured from the local monitor and estimated from the prediction model, was associated with longer QTc at moving averages of 21 and 28 days. Increased 24-hr standard deviation of temperature was associated with longer QTc at moving averages from 4 and up to 28 days; a 1.9°C interquartile range increase in 4-day moving average standard deviation of temperature was associated with a 2.8 msec (95%CI: 0.4, 5.2) longer QTc. Associations between 24-hr standard deviation of temperature and QTc were stronger in colder months, and in participants with diabetes and coronary heart disease. Conclusion/Significance In this sample of older men, elevated mean temperature was associated with longer QTc, and increased variability of temperature was associated with longer QTc, particularly during colder months and among individuals with diabetes and coronary heart disease. These findings may offer insight into an important underlying mechanism of temperature-related cardiovascular morbidity and mortality in an older population. PMID:25238150
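
    Constructing moving-average exposure metrics of the kind described above, for example the 4-day moving average of the 24-hour mean and standard deviation of temperature, can be sketched with pandas as follows; the temperature series and dates are synthetic placeholders.

```python
# Sketch: 4-day moving averages of daily mean and standard deviation of temperature
# derived from a synthetic hourly temperature trace.
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
hourly = pd.Series(
    15 + 5 * np.sin(np.arange(24 * 60) * 2 * np.pi / 24) + rng.normal(0, 1, 24 * 60),
    index=pd.date_range("2004-06-01", periods=24 * 60, freq="h"),
)
daily = hourly.resample("D").agg(["mean", "std"])
exposure = daily.rolling(window=4).mean()     # 4-day moving average of daily mean and SD
print(exposure.tail())
```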

  17. Application and System Design of Elastomer Based Optofluidic Lenses

    NASA Astrophysics Data System (ADS)

    Savidis, Nickolaos

    Adaptive optic technology has revolutionized real time correction of wavefront aberrations. Optofluidic based applied optic devices have offered an opportunity to produce flexible refractive lenses in the correction of wavefronts. Fluidic lenses have superiority relative to their solid lens counterparts in their capabilities of producing tunable optical systems, that when synchronized, can produce real time variable systems with no moving parts. We have developed optofluidic fluidic lenses for applications of applied optical devices, as well as ophthalmic optic devices. The first half of this dissertation discusses the production of fluidic lenses as optical devices. In addition, the design and testing of various fluidic systems made with these components are evaluated. We begin with the creation of spherical or defocus singlet fluidic lenses. We then produced zoom optical systems with no moving parts by synchronizing combinations of these fluidic spherical lenses. The variable power zoom system incorporates two singlet fluidic lenses that are synchronized. The coupled device has no moving parts and has produced a magnification range of 0.1 x to 10 x or a 20 x magnification range. The chapter after fluidic zoom technology focuses on producing achromatic lens designs. We offer an analysis of a hybrid diffractive and refractive achromat that offers discrete achromatized variable focal lengths. In addition, we offer a design of a fully optofluidic based achromatic lens. By synchronizing the two membrane surfaces of the fluidic achromat we develop a design for a fluidic achromatic lens. The second half of this dissertation discusses the production of optofluidic technology in ophthalmic applications. We begin with an introduction to an optofluidic phoropter system. A fluidic phoropter is designed through the combination of a defocus lens with two cylindrical fluidic lenses that are orientated 45° relative to each other. Here we discuss the designs of the fluidic cylindrical lens coupled with a previously discussed defocus singlet lens. We then couple this optofluidic phoropter with relay optics and Shack-Hartmann wavefront sensing technology to produce an auto-phoropter device. The auto-phoropter system combines a refractometer designed Shack-Hartmann wavefront sensor with the compact refractive fluidic lens phoropter. This combination allows for the identification and control of ophthalmic cylinder, cylinder axis, as well as refractive error. The closed loop system of the fluidic phoropter with refractometer enables for the creation of our see-through auto-phoropter system. The design and testing of several generations of transmissive see-through auto-phoropter devices are presented in this section.

  18. [Evaluation of selected endocrine complications in patients treated with auto- and allo-haematopoietic stem cell transplantation].

    PubMed

    Niedzielska, Ewa; Wójcik, Dorota; Barg, Ewa; Pietras, Wojciech; Sega-Pondel, Dorota; Doroszko, Adrian; Niedzielska, Małgorzata; Skarzyńska, Małgorzata; Chybicka, Alicja

    2008-01-01

    The aim of this study was to evaluate the endocrine complications, in particular disorders of growth and thyroid function and glucose metabolism dysfunctions in patients treated with allo- and auto-haematopoietic stem cell transplantation (HSCT). The investigated group consisted of: I. 16 patients after auto-HSCT (6 girls, 10 boys) aged 3-20 years (average 10,8+/-) because of acute myelogenous leukaemia (n=5), non Hodgkin lymphoma (n=3), neuroblastoma (n=3), embryonal cancer (n=2), medulloblastoma (n=1), Ewing's sarcoma/PNET (n=1), hyper eosinophilic syndrome (n=1). High dose chemiotherapy (HDC/T) included: BU/MEL (busulfan/melfalan) (n=7), BEAM (carmustine, eteposide, cytosine arabinose, melfalan) (n=3). II. 30 patients after allo-HSCT (20 girls, 10 boys) aged 3-17 years (average 9,56). Indication for HSCT was acute lymphoblastic leukaemia (n=11), acute myelogenous leukaemia (n=5), chronic myeloid leukaemia-CML (n=6), myelodysplastic syndromes (n=2), non Hodgkin lymphoma (n=1), juvenile myelomonocytic leukemia (n=1), severe aplastic anaemia (n=1), Blackfan-Diamond anaemia (n=1), severe combined immune deficiency (n=1), rhabdomyosarcoma (n=1). The patients underwent the following types of transplantation: HSCT of matched sibling donor (n=13), HSCT of matched unrelated donor (n=11) and HLA-mismatched related donor (n=6). The preparative regimens consisted of HDC/T usually BU/MEL (n=3); BU/CY/VP (busulfan, cyclophosphamide, etoposide) (6); BU/CY/ATG (anti-thymocyte globulin) (n=5), VP/ATG/TBI (total body irradiation) (n=3). 19 children received CI (cranial irradiation) prior to grafting: auto-HSCT (n=6) and allo-HSCT (n=13) and 6 patients underwent TBI. 18 children received high steroid doses at least 28 days before transplant, 4 patients in the auto-HSCT group, and in the allo-HSCT group 14 patients before and 20 after HSCT procedure. The analysis of thyroid-stimulating hormone (TSH), free triiodothyronine (fT3), free thyroxine (fT4), insulin-like growth factor 1 (IGF-1), insulin-like growth factor binding protein 3 (IGFBP-3), hemoglobin A1c (HbA1c), prolactine (PRL), oral glucose tolerance test, growth hormone (GH test) and thyrotropin releasing hormone (TRH) test was performed in each case. Hypothyroidism was found in 5 patients (3 after allo-HSCT, 2 after auto-HSCT). Thyroid hormone substitution was applied. No case of hyperthyroidism was diagnosed. Growth deficit was found in 8 patients (6 girls, 2 boys) between 13 to 70 months after allo-transplantation (average 36 months). Three children from the above group received CI. Growth hormone substitution was applied in 1 girl (ALL, HLA MM REL, CI). An impaired excretion of GH after stimulation was diagnosed in 14 pts (10 after allo-HSCT, 4 after auto-HSCT). The growth process should still be observed in this subgroup. Glucose intolerance was found in 7 patients: in 4 treated with auto-HSCT and in 3 after allo-HSCT. Diabetes mellitus was diagnosed in none of them. An impaired glucose tolerance curve with increased excretion of insulin was diagnosed in 12 children. Early endocrinological care is necessary in patients treated both with auto-HSCT and allo-HSCT due to high risk of hormonal disorders.

  19. Forecasting daily patient volumes in the emergency department.

    PubMed

    Jones, Spencer S; Thomas, Alun; Evans, R Scott; Welch, Shari J; Haug, Peter J; Snow, Gregory L

    2008-02-01

    Shifts in the supply of and demand for emergency department (ED) resources make the efficient allocation of ED resources increasingly important. Forecasting is a vital activity that guides decision-making in many areas of economic, industrial, and scientific planning, but has gained little traction in the health care industry. There are few studies that explore the use of forecasting methods to predict patient volumes in the ED. The goals of this study are to explore and evaluate the use of several statistical forecasting methods to predict daily ED patient volumes at three diverse hospital EDs and to compare the accuracy of these methods to the accuracy of a previously proposed forecasting method. Daily patient arrivals at three hospital EDs were collected for the period January 1, 2005, through March 31, 2007. The authors evaluated the use of seasonal autoregressive integrated moving average, time series regression, exponential smoothing, and artificial neural network models to forecast daily patient volumes at each facility. Forecasts were made for horizons ranging from 1 to 30 days in advance. The forecast accuracy achieved by the various forecasting methods was compared to the forecast accuracy achieved when using a benchmark forecasting method already available in the emergency medicine literature. All time series methods considered in this analysis provided improved in-sample model goodness of fit. However, post-sample analysis revealed that time series regression models that augment linear regression models by accounting for serial autocorrelation offered only small improvements in terms of post-sample forecast accuracy, relative to multiple linear regression models, while seasonal autoregressive integrated moving average, exponential smoothing, and artificial neural network forecasting models did not provide consistently accurate forecasts of daily ED volumes. This study confirms the widely held belief that daily demand for ED services is characterized by seasonal and weekly patterns. The authors compared several time series forecasting methods to a benchmark multiple linear regression model. The results suggest that the existing methodology proposed in the literature, multiple linear regression based on calendar variables, is a reasonable approach to forecasting daily patient volumes in the ED. However, the authors conclude that regression-based models that incorporate calendar variables, account for site-specific special-day effects, and allow for residual autocorrelation provide a more appropriate, informative, and consistently accurate approach to forecasting daily ED patient volumes.
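
    One of the compared approaches, exponential smoothing with a weekly seasonal component, can be sketched with statsmodels as below; the daily arrival counts and model settings are assumptions, not the study's fitted models.

```python
# Hedged sketch: Holt-Winters exponential smoothing with a weekly season applied to
# synthetic daily ED arrival counts, with a 30-day-ahead forecast horizon.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(7)
days = pd.date_range("2005-01-01", "2007-03-31", freq="D")
weekly = 10 * np.sin(2 * np.pi * days.dayofweek / 7)
arrivals = 150 + weekly + rng.normal(0, 8, len(days))

fit = ExponentialSmoothing(
    pd.Series(arrivals, index=days), trend="add", seasonal="add", seasonal_periods=7
).fit()
print(fit.forecast(30).round(1))
```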

  20. Image analysis of multiple moving wood pieces in real time

    NASA Astrophysics Data System (ADS)

    Wang, Weixing

    2006-02-01

    This paper presents algorithms for image processing and image analysis of wood piece materials. The algorithms were designed for auto-detection of wood piece materials on a moving conveyor belt or a truck. When the wood objects are moving, the hard task is to trace the contours of the objects in an optimal way. To make the algorithms work efficiently in the plant, a flexible online system was designed and developed, which mainly consists of image acquisition, image processing, object delineation and analysis. A number of newly-developed algorithms can delineate wood objects with high accuracy and high speed, and in the wood piece analysis part, each wood piece can be characterized by a number of visual parameters which can also be used for constructing experimental models directly in the system.

  1. Robust scoring functions for protein-ligand interactions with quantum chemical charge models.

    PubMed

    Wang, Jui-Chih; Lin, Jung-Hsin; Chen, Chung-Ming; Perryman, Alex L; Olson, Arthur J

    2011-10-24

    Ordinary least-squares (OLS) regression has been used widely for constructing the scoring functions for protein-ligand interactions. However, OLS is very sensitive to the existence of outliers, and models constructed using it are easily affected by the outliers or even the choice of the data set. On the other hand, determination of atomic charges is regarded as of central importance, because the electrostatic interaction is known to be a key contributing factor for biomolecular association. In the development of the AutoDock4 scoring function, only OLS was conducted, and the simple Gasteiger method was adopted. It is therefore of considerable interest to see whether more rigorous charge models could improve the statistical performance of the AutoDock4 scoring function. In this study, we have employed two well-established quantum chemical approaches, namely the restrained electrostatic potential (RESP) and the Austin-model 1-bond charge correction (AM1-BCC) methods, to obtain atomic partial charges, and we have compared how different charge models affect the performance of AutoDock4 scoring functions. In combination with robust regression analysis and outlier exclusion, our new protein-ligand free energy regression model with AM1-BCC charges for ligands and Amber99SB charges for proteins achieves the lowest root-mean-squared error of 1.637 kcal/mol for the training set of 147 complexes and 2.176 kcal/mol for the external test set of 1427 complexes. The assessment of binding pose prediction with the 100 external decoy sets indicates a very high success rate of 87% with the criterion of a predicted root-mean-squared deviation of less than 2 Å. The success rates and statistical performance of our robust scoring functions are only weakly class-dependent (hydrophobic, hydrophilic, or mixed).
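
    The general idea of robust regression that down-weights outliers, not the AutoDock4 code itself, can be illustrated by comparing a Huber-loss fit against ordinary least squares; all data and coefficients below are synthetic.

```python
# Generic robust-regression sketch: Huber loss versus OLS on synthetic data with
# a handful of injected outlier "complexes".
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(8)
X = rng.normal(size=(150, 4))                 # hypothetical interaction terms
y = X @ np.array([1.2, -0.8, 0.5, 2.0]) + rng.normal(0, 0.3, 150)
y[:10] += rng.normal(0, 8, 10)                # inject a few outliers

ols = LinearRegression().fit(X, y)
huber = HuberRegressor().fit(X, y)
print("OLS coefficients:  ", np.round(ols.coef_, 2))
print("Huber coefficients:", np.round(huber.coef_, 2))
```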

  2. Quantified moving average strategy of crude oil futures market based on fuzzy logic rules and genetic algorithms

    NASA Astrophysics Data System (ADS)

    Liu, Xiaojia; An, Haizhong; Wang, Lijun; Guan, Qing

    2017-09-01

    The moving average strategy is a technical indicator that can generate trading signals to assist investment. While the trading signals tell traders when to buy or sell, the moving average cannot indicate the trading volume, which is a crucial factor for investment. This paper proposes a fuzzy moving average strategy, in which fuzzy logic rules are used to determine the strength of trading signals, i.e., the trading volume. To compose one fuzzy logic rule, we use four types of moving averages, the length of the moving average period, the fuzzy extent, and the recommended value. Ten fuzzy logic rules form a fuzzy set, which generates a rating level that decides the trading volume. In this process, we apply genetic algorithms to identify an optimal fuzzy logic rule set and utilize crude oil futures prices from the New York Mercantile Exchange (NYMEX) as the experiment data. Each experiment is repeated 20 times. The results show that, first, the fuzzy moving average strategy can obtain a more stable rate of return than the plain moving average strategies. Second, the holding-amount series is highly sensitive to the price series. Third, simple moving average methods are more efficient. Lastly, the fuzzy extents of extremely low, high, and very high are the most popular. These results are helpful in investment decisions.
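
    As a rough, non-fuzzy stand-in for the strategy above, the sketch below generates moving-average crossover signals and scales the trading volume by the normalized distance between the fast and slow averages; the window lengths, cap and prices are invented for illustration.

```python
# Illustration only: moving-average crossover signal whose trade volume is scaled
# by how far the fast average sits from the slow one (synthetic prices).
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
price = pd.Series(60 + np.cumsum(rng.normal(0, 0.8, 500)))   # synthetic futures prices

fast = price.rolling(10).mean()
slow = price.rolling(40).mean()
signal = np.sign(fast - slow)                                 # +1 buy regime, -1 sell regime
strength = ((fast - slow).abs() / slow).clip(upper=0.05) / 0.05   # 0..1 "rating level"
volume = (strength * 100).round()                             # contracts to trade, capped at 100

print(pd.DataFrame({"signal": signal, "volume": volume}).dropna().tail())
```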

  3. Associations of long-term fine particulate matter exposure with prevalent hypertension and increased blood pressure in older Americans.

    PubMed

    Honda, Trenton; Pun, Vivian C; Manjourides, Justin; Suh, Helen

    2018-07-01

    Hypertension is a highly prevalent cardiovascular risk factor. It is possible that air pollution, also an established cardiovascular risk factor, may contribute to cardiovascular disease through increasing blood pressure. Previous studies evaluating associations between air pollution and blood pressure have had mixed results. We examined the association between long-term (one-year moving average) air pollutant exposures, prevalent hypertension and blood pressure in 4121 older Americans (57+ years) enrolled in the National Social Life, Health, and Aging Project. We estimated exposures to PM2.5 using spatio-temporal models and used logistic regression accounting for repeated measures to evaluate the association between long-term average PM2.5 and prevalence odds of hypertension. We additionally used linear regression to evaluate the associations between air pollutants and systolic, diastolic, mean arterial, and pulse pressures. Health effect models were adjusted for a number of demographic, health and socioeconomic covariates. An interquartile range (3.91 μg/m3) increase in the one-year moving average of PM2.5 was associated with increased odds of prevalent hypertension (POR 1.24, 95% CI: 1.11, 1.38), systolic blood pressure (0.93 mm Hg, 95% CI: 0.05, 1.80) and pulse pressure (0.89 mm Hg, 95% CI: 0.21, 1.58). Dose-response relationships were also observed. PM2.5 was associated with increased odds of prevalent hypertension, and increased systolic pressure and pulse pressure in a cohort of older Americans. These findings add to the growing evidence that air pollution may be an important risk factor for hypertension and perturbations in blood pressure. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Open-Source Logic-Based Automated Sleep Scoring Software using Electrophysiological Recordings in Rats

    PubMed Central

    Gross, Brooks A.; Walsh, Christine M.; Turakhia, Apurva A.; Booth, Victoria; Mashour, George; Poe, Gina R.

    2009-01-01

    Manual state scoring of physiological recordings in sleep studies is time-consuming, resulting in a data backlog, research delays and increased personnel costs. We developed MATLAB-based software to automate scoring of sleep/waking states in rats, potentially extendable to other animals, from a variety of recording systems. The software contains two programs, Sleep Scorer and Auto-Scorer, for manual and automated scoring. Auto-Scorer is a logic-based program that displays power spectral densities of an electromyographic signal and σ, δ, and θ frequency bands of an electroencephalographic signal, along with the δ/θ ratio and σ ×θ, for every epoch. The user defines thresholds from the training file state definitions which the Auto-Scorer uses with logic to discriminate the state of every epoch in the file. Auto-Scorer was evaluated by comparing its output to manually scored files from 6 rats under 2 experimental conditions by 3 users. Each user generated a training file, set thresholds, and autoscored the 12 files into 4 states (waking, non-REM, transition-to-REM, and REM sleep) in ¼ the time required to manually score the file. Overall performance comparisons between Auto-Scorer and manual scoring resulted in a mean agreement of 80.24 +/− 7.87%, comparable to the average agreement among 3 manual scorers (83.03 +/− 4.00%). There was no significant difference between user-user and user-Auto-Scorer agreement ratios. These results support the use of our open-source Auto-Scorer, coupled with user review, to rapidly and accurately score sleep/waking states from rat recordings. PMID:19615408

  5. Age estimation using exfoliative cytology and radiovisiography: A comparative study

    PubMed Central

    Nallamala, Shilpa; Guttikonda, Venkateswara Rao; Manchikatla, Praveen Kumar; Taneeru, Sravya

    2017-01-01

    Introduction: Age estimation is one of the essential factors in establishing the identity of an individual. Among various methods, exfoliative cytology (EC) is a unique, noninvasive technique, involving simple, and pain-free collection of intact cells from the oral cavity for microscopic examination. Objective: The study was undertaken with an aim to estimate the age of an individual from the average cell size of their buccal smears calculated using image analysis morphometric software and the pulp–tooth area ratio in mandibular canine of the same individual using radiovisiography (RVG). Materials and Methods: Buccal smears were collected from 100 apparently healthy individuals. After fixation in 95% alcohol, the smears were stained using Papanicolaou stain. The average cell size was measured using image analysis software (Image-Pro Insight 8.0). The RVG images of mandibular canines were obtained, pulp and tooth areas were traced using AutoCAD 2010 software, and area ratio was calculated. The estimated age was then calculated using regression analysis. Results: The paired t-test between chronological age and estimated age by cell size and pulp–tooth area ratio was statistically nonsignificant (P > 0.05). Conclusion: In the present study, age estimated by pulp–tooth area ratio and EC yielded good results. PMID:29657491

  6. Age estimation using exfoliative cytology and radiovisiography: A comparative study.

    PubMed

    Nallamala, Shilpa; Guttikonda, Venkateswara Rao; Manchikatla, Praveen Kumar; Taneeru, Sravya

    2017-01-01

    Age estimation is one of the essential factors in establishing the identity of an individual. Among various methods, exfoliative cytology (EC) is a unique, noninvasive technique, involving simple, and pain-free collection of intact cells from the oral cavity for microscopic examination. The study was undertaken with an aim to estimate the age of an individual from the average cell size of their buccal smears calculated using image analysis morphometric software and the pulp-tooth area ratio in mandibular canine of the same individual using radiovisiography (RVG). Buccal smears were collected from 100 apparently healthy individuals. After fixation in 95% alcohol, the smears were stained using Papanicolaou stain. The average cell size was measured using image analysis software (Image-Pro Insight 8.0). The RVG images of mandibular canines were obtained, pulp and tooth areas were traced using AutoCAD 2010 software, and area ratio was calculated. The estimated age was then calculated using regression analysis. The paired t -test between chronological age and estimated age by cell size and pulp-tooth area ratio was statistically nonsignificant ( P > 0.05). In the present study, age estimated by pulp-tooth area ratio and EC yielded good results.

  7. Hemodynamic responses to external counterbalancing of auto-positive end-expiratory pressure in mechanically ventilated patients with chronic obstructive pulmonary disease.

    PubMed

    Baigorri, F; de Monte, A; Blanch, L; Fernández, R; Vallés, J; Mestre, J; Saura, P; Artigas, A

    1994-11-01

    To study the effect of positive end-expiratory pressure (PEEP) on right ventricular hemodynamics and ejection fraction in patients with chronic obstructive pulmonary disease and positive alveolar pressure throughout expiration by dynamic hyperinflation (auto-PEEP). Open, prospective, controlled trial. General intensive care unit of a community hospital. Ten patients sedated and paralyzed with an acute exacerbation of chronic obstructive pulmonary disease undergoing mechanical ventilation. Insertion of a pulmonary artery catheter modified with a rapid response thermistor and a radial arterial catheter. PEEP was then increased from 0 (PEEP 0) to auto-PEEP level (PEEP = auto-PEEP) and 5 cm H2O above that (PEEP = auto-PEEP +5). At each level of PEEP, airway pressures, flow and volume, hemodynamic variables (including right ventricular ejection fraction by thermodilution technique), and blood gas analyses were recorded. The mean auto-PEEP was 6.6 +/- 2.8 cm H2O and the total PEEP reached was 12.2 +/- 2.4 cm H2O. The degree of lung inflation induced by PEEP averaged 145 +/- 87 mL with PEEP = auto-PEEP and 495 +/- 133 mL with PEEP = auto-PEEP + 5. The PEEP = auto-PEEP caused a right ventricular end-diastolic pressure increase, but there was no other significant hemodynamic change. With PEEP = auto-PEEP + 5, there was a significant increase in intravascular pressures; this amount of PEEP reduced cardiac output (from 4.40 +/- 1.38 L/min at PEEP 0 to 4.13 +/- 1.48 L/min; p < .05). The cardiac output reduction induced by PEEP = auto-PEEP + 5 was > 10% in only five cases and this group of patients had significantly lower right ventricular volumes than the group with less cardiac output variation (right ventricular end-diastolic volume: 64 +/- 9 vs. 96 +/- 26 mL/m2; right ventricular end-systolic volume: 38 +/- 6 vs. 65 +/- 21 mL/m2; p < .05) without significant difference in the other variables that were measured. Neither right ventricular ejection fraction nor right ventricle volumes changed as PEEP increased, but there were marked interpatient differences and also pronounced changes in volume between stages in individual patients. In the study conditions, PEEP application up to values approaching auto-PEEP did not result in the impairment of right ventricular hemodynamics, while higher levels reduced cardiac output in selected patients.

  8. Using the Quantile Mapping to improve a weather generator

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Themessl, M.; Gobiet, A.

    2012-04-01

    We developed a weather generator (WG) by using statistical and stochastic methods, among them are quantile mapping (QM), Monte-Carlo, auto-regression, empirical orthogonal function (EOF). One of the important steps in the WG is using QM, through which all the variables, no matter what distribution they originally are, are transformed into normal distributed variables. Therefore, the WG can work on normally distributed variables, which greatly facilitates the treatment of random numbers in the WG. Monte-Carlo and auto-regression are used to generate the realization; EOFs are employed for preserving spatial relationships and the relationships between different meteorological variables. We have established a complete model named WGQM (weather generator and quantile mapping), which can be applied flexibly to generate daily or hourly time series. For example, with 30-year daily (hourly) data and 100-year monthly (daily) data as input, the 100-year daily (hourly) data would be relatively reasonably produced. Some evaluation experiments with WGQM have been carried out in the area of Austria and the evaluation results will be presented.
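
    The quantile-mapping step described above, transforming an arbitrarily distributed variable to a normal one through its empirical CDF and back, can be sketched as follows; the gamma-distributed "precipitation" series is a synthetic placeholder.

```python
# Minimal empirical quantile-mapping sketch (assumed form): map a skewed variable to
# a standard normal via its empirical CDF, then back-transform generated values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
precip = rng.gamma(shape=2.0, scale=3.0, size=1000)         # skewed daily precipitation

ranks = stats.rankdata(precip) / (len(precip) + 1)          # empirical CDF values in (0, 1)
precip_normal = stats.norm.ppf(ranks)                       # mapped to a standard normal

# Back-transform generated normal values to precipitation units via interpolation.
generated = rng.normal(size=5)
back = np.interp(stats.norm.cdf(generated), np.sort(ranks), np.sort(precip))
print(back)
```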

  9. Rapid and safe learning of robotic gastrectomy for gastric cancer: multidimensional analysis in a comparison with laparoscopic gastrectomy.

    PubMed

    Kim, H-I; Park, M S; Song, K J; Woo, Y; Hyung, W J

    2014-10-01

    The learning curve of robotic gastrectomy has not yet been evaluated in comparison with the laparoscopic approach. We compared the learning curves of robotic gastrectomy and laparoscopic gastrectomy based on operation time and surgical success. We analyzed 172 robotic and 481 laparoscopic distal gastrectomies performed by single surgeon from May 2003 to April 2009. The operation time was analyzed using a moving average and non-linear regression analysis. Surgical success was evaluated by a cumulative sum plot with a target failure rate of 10%. Surgical failure was defined as laparoscopic or open conversion, insufficient lymph node harvest for staging, resection margin involvement, postoperative morbidity, and mortality. Moving average and non-linear regression analyses indicated stable state for operation time at 95 and 121 cases in robotic gastrectomy, and 270 and 262 cases in laparoscopic gastrectomy, respectively. The cumulative sum plot identified no cut-off point for surgical success in robotic gastrectomy and 80 cases in laparoscopic gastrectomy. Excluding the initial 148 laparoscopic gastrectomies that were performed before the first robotic gastrectomy, the two groups showed similar number of cases to reach steady state in operation time, and showed no cut-off point in analysis of surgical success. The experience of laparoscopic surgery could affect the learning process of robotic gastrectomy. An experienced laparoscopic surgeon requires fewer cases of robotic gastrectomy to reach steady state. Moreover, the surgical outcomes of robotic gastrectomy were satisfactory. Copyright © 2013 Elsevier Ltd. All rights reserved.
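
    The two learning-curve tools mentioned above, a moving average of operation time and a cumulative sum (CUSUM) of surgical failures against a 10% target rate, can be sketched on synthetic data as follows; the case counts and failure rates are invented.

```python
# Sketch (hypothetical data): moving average of operation time and a simple CUSUM
# of surgical failures against a 10% target failure rate.
import numpy as np

rng = np.random.default_rng(11)
op_time = 240 - 60 * (1 - np.exp(-np.arange(172) / 50)) + rng.normal(0, 15, 172)
failure = rng.binomial(1, 0.08, 172)

window = 20
moving_avg = np.convolve(op_time, np.ones(window) / window, mode="valid")
cusum = np.cumsum(failure - 0.10)          # drifts downward once performance beats 10%

print("last moving-average operation time:", round(moving_avg[-1], 1))
print("final CUSUM value:", round(cusum[-1], 2))
```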

  10. Designing components using smartMOVE electroactive polymer technology

    NASA Astrophysics Data System (ADS)

    Rosenthal, Marcus; Weaber, Chris; Polyakov, Ilya; Zarrabi, Al; Gise, Peter

    2008-03-01

    Designing components using SmartMOVE TM electroactive polymer technology requires an understanding of the basic operation principles and the necessary design tools for integration into actuator, sensor and energy generation applications. Artificial Muscle, Inc. is collaborating with OEMs to develop customized solutions for their applications using smartMOVE. SmartMOVE is an advanced and elegant way to obtain almost any kind of movement using dielectric elastomer electroactive polymers. Integration of this technology offers the unique capability to create highly precise and customized motion for devices and systems that require actuation. Applications of SmartMOVE include linear actuators for medical, consumer and industrial applications, such as pumps, valves, optical or haptic devices. This paper will present design guidelines for selecting a smartMOVE actuator design to match the stroke, force, power, size, speed, environmental and reliability requirements for a range of applications. Power supply and controller design and selection will also be introduced. An overview of some of the most versatile configuration options will be presented with performance comparisons. A case example will include the selection, optimization, and performance overview of a smartMOVE actuator for the cell phone camera auto-focus and proportional valve applications.

  11. The Auto-Gopher: A Wireline Rotary-Percussive Deep Sampler

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Zacny, Kris; Badescu, Mircea; Lee, Hyeong Jae; Sherrit, Stewart; Bao, Xiaoqi; Paulsen, Gale L.; Beegle, Luther

    2016-01-01

    Accessing regions on planetary bodies that may have preserved biosignatures or are presently habitable is vital to meeting NASA solar system "Search for Life" exploration objectives. To address these objectives, a wireline deep rotary-percussive corer called Auto-Gopher was developed. The percussive action provides effective material fracturing and the rotation provides effective cuttings removal. To increase the drill's penetration rate, the percussive and rotary motions are operated simultaneously. Initially, the corer was designed as a percussive mechanism for sampling ice and was demonstrated in 2005 in Antarctica, reaching a depth of about 2 m. The lessons learned suggested the need to use a combination of rotation and hammering to maximize the penetration rate. This lesson was implemented in the Auto-Gopher-I deep drill, which was demonstrated to reach 3 m deep in gypsum. The average drilling power used was in the range of 100-150 W, while the penetration rate was approximately 2.4 m/hr. Recently, a task has started with the goal of developing the Auto-Gopher-II, which is equipped to execute all the necessary functions in a single drilling unit; these functions include core breaking, retention and ejection in addition to drilling. In this manuscript, the Auto-Gopher-II, its predecessors and their capabilities are described and discussed.

  12. Experimental investigation on the effect of swirling flow on combustion characteristics and performance of solid fuel ramjet

    NASA Astrophysics Data System (ADS)

    Musa, Omer; Weixuan, Li; Xiong, Chen; Lunkun, Gong; Wenhe, Liao

    2018-07-01

    A solid-fuel ramjet converts the thermal energy of combustion products into forward thrust without using any moving parts. Normally, it uses an air intake system to compress the incoming air, without a swirler. A new swirler design has been proposed and used in the current work. In this paper, a series of firing tests was carried out to investigate the impact of using swirl flow on the regression rate, combustion characteristics, and performance of solid-fuel ramjet engines. The influences of swirl intensity, solid fuel port diameter, and combustor length were studied and varied independently. A new technique, based on laser scanning, has been proposed for determining the time- and space-averaged regression rate of the high-density polyethylene solid fuel surface after the experiments. A code has been developed to reconstruct the data from the scanner and obtain the three-dimensional distribution of the regression rate. It is shown that increasing the swirl number increases the regression rate, thrust, and characteristic velocity, and decreases the air-fuel ratio, corner recirculation zone length, and specific impulse. Using swirl flow enhances flame stability but negatively affects the ignition process and specific impulse, although a significant reduction of combustion chamber length can be achieved when swirl flow is used. A power-fit correlation for the average regression rate was developed, taking into account the influence of swirl number. Furthermore, varying the port diameter and combustor length was found to influence the regression rate, combustion characteristics and performance of the solid-fuel ramjet.

  13. Spectral density mapping at multiple magnetic fields suitable for 13C NMR relaxation studies

    NASA Astrophysics Data System (ADS)

    Kadeřávek, Pavel; Zapletal, Vojtěch; Fiala, Radovan; Srb, Pavel; Padrta, Petr; Přecechtělová, Jana Pavlíková; Šoltésová, Mária; Kowalewski, Jozef; Widmalm, Göran; Chmelík, Josef; Sklenář, Vladimír; Žídek, Lukáš

    2016-05-01

    Standard spectral density mapping protocols, well suited for the analysis of 15N relaxation rates, introduce significant systematic errors when applied to 13C relaxation data, especially if the dynamics is dominated by motions with short correlation times (small molecules, dynamic residues of macromolecules). A possibility to improve the accuracy by employing cross-correlated relaxation rates and on measurements taken at several magnetic fields has been examined. A suite of protocols for analyzing such data has been developed and their performance tested. Applicability of the proposed protocols is documented in two case studies, spectral density mapping of a uniformly labeled RNA hairpin and of a selectively labeled disaccharide exhibiting highly anisotropic tumbling. Combination of auto- and cross-correlated relaxation data acquired at three magnetic fields was applied in the former case in order to separate effects of fast motions and conformational or chemical exchange. An approach using auto-correlated relaxation rates acquired at five magnetic fields, applicable to anisotropically moving molecules, was used in the latter case. The results were compared with a more advanced analysis of data obtained by interpolation of auto-correlated relaxation rates measured at seven magnetic fields, and with the spectral density mapping of cross-correlated relaxation rates. The results showed that sufficiently accurate values of auto- and cross-correlated spectral density functions at zero and 13C frequencies can be obtained from data acquired at three magnetic fields for uniformly 13C -labeled molecules with a moderate anisotropy of the rotational diffusion tensor. Analysis of auto-correlated relaxation rates at five magnetic fields represents an alternative for molecules undergoing highly anisotropic motions.

  14. Skin symptoms in bakery and auto body shop workers: associations with exposure and respiratory symptoms.

    PubMed

    Arrandale, Victoria; Meijster, Tim; Pronk, Anjoeka; Doekes, Gert; Redlich, Carrie A; Holness, D Linn; Heederik, Dick

    2013-02-01

    Despite the importance of skin exposure, studies of skin symptoms in relation to exposure and respiratory symptoms are rare. The goals of this study were to describe exposure-response relationships for skin symptoms, and to investigate associations between skin and respiratory symptoms in bakery and auto body shop workers. Data from previous studies of bakery and auto body shop workers were analyzed. Average exposure estimates for wheat allergen and isocyanates were used. Generalized linear models were constructed to describe the relationships between exposure and skin symptoms, as well as between skin and respiratory symptoms. Data from 723 bakery and 473 auto body shop workers were analyzed. In total, 5.3% of bakery and 6.1% of auto body shop workers were female; subjects' mean age was 39 and 38 years, respectively. Exposure-response relationships were observed in auto body shop workers for itchy or dry skin (PR 1.55, 95% CI 1.2-2.0) and work-related itchy skin (PR 1.97, 95% CI 1.2-3.3). A possible exposure-response relationship for work-related itchy skin in bakery workers did not reach statistical significance. In both groups, reporting skin symptoms was strongly and significantly associated with reporting respiratory symptoms, both work-related and non-work-related. Exposure-response relationships were observed for skin symptoms in auto body shop workers. The lack of significant exposure-response associations in bakery workers should be interpreted cautiously. Workers who reported skin symptoms were up to four times more likely to report respiratory symptoms. Improved awareness of both skin and respiratory outcomes in exposed workers is needed.

  15. Megagauss-level magnetic field production in cm-scale auto-magnetizing helical liners pulsed to 500 kA in 125 ns

    DOE PAGES

    Shipley, Gabriel A.; Awe, Thomas James; Hutsel, Brian Thomas; ...

    2018-05-03

    Auto-magnetizing (AutoMag) liners [Slutz et al., Phys. Plasmas 24, 012704 (2017)] are designed to generate up to 100 T of axial magnetic field in the fuel for Magnetized Liner Inertial Fusion [Slutz et al., Phys. Plasmas 17, 056303 (2010)] without the need for external field coils. AutoMag liners (cylindrical tubes) are composed of discrete metallic helical conduction paths separated by electrically insulating material. Initially, helical current in the AutoMag liner produces internal axial magnetic field during a long (100 to 300 ns) current prepulse with an average current rise rate dI/dt = 5 kA/ns. After the cold fuel is magnetized, a rapidly rising current (200 kA/ns) generates a calculated electric field of 64 MV/m between the helices. Such a field is sufficient to force dielectric breakdown of the insulating material, after which liner current is reoriented from helical to predominantly axial, which ceases the AutoMag axial magnetic field production mechanism, and the z-pinch liner implodes. Proof of concept experiments have been executed on the Mykonos linear transformer driver to measure the axial field produced by a variety of AutoMag liners and to evaluate what physical processes drive dielectric breakdown. A range of field strengths has been generated in various cm-scale liners in agreement with magnetic transient simulations, including a measured field above 90 T at I = 350 kA. By varying the helical pitch angle, insulator material, and insulator geometry, favorable liner designs have been identified for which breakdown occurs under predictable and reproducible field conditions.

  16. Megagauss-level magnetic field production in cm-scale auto-magnetizing helical liners pulsed to 500 kA in 125 ns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shipley, Gabriel A.; Awe, Thomas James; Hutsel, Brian Thomas

    Auto-magnetizing (AutoMag) liners [Slutz et al., Phys. Plasmas 24, 012704 (2017)] are designed to generate up to 100 T of axial magnetic field in the fuel for Magnetized Liner Inertial Fusion [Slutz et al., Phys. Plasmas 17, 056303 (2010)] without the need for external field coils. AutoMag liners (cylindrical tubes) are composed of discrete metallic helical conduction paths separated by electrically insulating material. Initially, helical current in the AutoMag liner produces internal axial magnetic field during a long (100 to 300 ns) current prepulse with an average current rise rate dI/dt = 5 kA/ns. After the cold fuel is magnetized, a rapidly rising current (200 kA/ns) generates a calculated electric field of 64 MV/m between the helices. Such a field is sufficient to force dielectric breakdown of the insulating material, after which liner current is reoriented from helical to predominantly axial; this ceases the AutoMag axial magnetic field production mechanism and the z-pinch liner implodes. Proof-of-concept experiments have been executed on the Mykonos linear transformer driver to measure the axial field produced by a variety of AutoMag liners and to evaluate what physical processes drive dielectric breakdown. A range of field strengths has been generated in various cm-scale liners in agreement with magnetic transient simulations, including a measured field above 90 T at I = 350 kA. By varying the helical pitch angle, insulator material, and insulator geometry, favorable liner designs have been identified for which breakdown occurs under predictable and reproducible field conditions.

  17. Megagauss-level magnetic field production in cm-scale auto-magnetizing helical liners pulsed to 500 kA in 125 ns

    NASA Astrophysics Data System (ADS)

    Shipley, G. A.; Awe, T. J.; Hutsel, B. T.; Slutz, S. A.; Lamppa, D. C.; Greenly, J. B.; Hutchinson, T. M.

    2018-05-01

    Auto-magnetizing (AutoMag) liners [Slutz et al., Phys. Plasmas 24, 012704 (2017)] are designed to generate up to 100 T of axial magnetic field in the fuel for Magnetized Liner Inertial Fusion [Slutz et al., Phys. Plasmas 17, 056303 (2010)] without the need for external field coils. AutoMag liners (cylindrical tubes) are composed of discrete metallic helical conduction paths separated by electrically insulating material. Initially, helical current in the AutoMag liner produces internal axial magnetic field during a long (100 to 300 ns) current prepulse with an average current rise rate dI/dt = 5 kA/ns. After the cold fuel is magnetized, a rapidly rising current (200 kA/ns) generates a calculated electric field of 64 MV/m between the helices. Such a field is sufficient to force dielectric breakdown of the insulating material, after which liner current is reoriented from helical to predominantly axial; this ceases the AutoMag axial magnetic field production mechanism and the z-pinch liner implodes. Proof-of-concept experiments have been executed on the Mykonos linear transformer driver to measure the axial field produced by a variety of AutoMag liners and to evaluate what physical processes drive dielectric breakdown. A range of field strengths has been generated in various cm-scale liners in agreement with magnetic transient simulations, including a measured field above 90 T at I = 350 kA. By varying the helical pitch angle, insulator material, and insulator geometry, favorable liner designs have been identified for which breakdown occurs under predictable and reproducible field conditions.
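
    As a rough back-of-the-envelope check on the field levels quoted above, the sketch below treats the helical current paths as an ideal solenoid, so the axial field is approximately B = mu0*I/h with h the helical pitch; the current and pitch values are assumptions chosen only to illustrate the order of magnitude, not parameters from these experiments.

```python
# Hypothetical illustration: ideal-solenoid estimate of the axial field
# produced by a helical current path of pitch h carrying current I.
# The values below (I = 350 kA, h = 5 mm) are assumptions for illustration only.
import math

MU_0 = 4e-7 * math.pi          # vacuum permeability, T*m/A

def axial_field_solenoid(current_a: float, pitch_m: float) -> float:
    """B ~ mu0 * n * I with n = 1/pitch turns per unit axial length."""
    return MU_0 * current_a / pitch_m

if __name__ == "__main__":
    b_axial = axial_field_solenoid(350e3, 5e-3)
    print(f"Estimated axial field: {b_axial:.1f} T")   # ~88 T for these inputs
```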

  18. Non-motor tasks improve adaptive brain-computer interface performance in users with severe motor impairment

    PubMed Central

    Faller, Josef; Scherer, Reinhold; Friedrich, Elisabeth V. C.; Costa, Ursula; Opisso, Eloy; Medina, Josep; Müller-Putz, Gernot R.

    2014-01-01

    Individuals with severe motor impairment can use event-related desynchronization (ERD) based BCIs as assistive technology. Auto-calibrating and adaptive ERD-based BCIs that users control with motor imagery tasks (“SMR-AdBCI”) have proven effective for healthy users. We aim to find an improved configuration of such an adaptive ERD-based BCI for individuals with severe motor impairment as a result of spinal cord injury (SCI) or stroke. We hypothesized that an adaptive ERD-based BCI that automatically selects a user-specific class combination from motor-related and non-motor-related mental tasks during initial auto-calibration (“Auto-AdBCI”) could allow for higher control performance than a conventional SMR-AdBCI. To answer this question we performed offline analyses on two sessions (21 data sets total) of cue-guided, five-class electroencephalography (EEG) data recorded from individuals with SCI or stroke. On data from the twelve individuals in Session 1, we first identified three bipolar derivations for the SMR-AdBCI. In a similar way, we determined three bipolar derivations and four mental tasks for the Auto-AdBCI. We then simulated both the SMR-AdBCI and the Auto-AdBCI configurations on the unseen data from the nine participants in Session 2 and compared the results. On the unseen data of Session 2 from individuals with SCI or stroke, we found that automatically selecting a user-specific class combination from motor-related and non-motor-related mental tasks during initial auto-calibration (Auto-AdBCI) significantly (p < 0.01) improved classification performance compared to an adaptive ERD-based BCI that only used motor imagery tasks (SMR-AdBCI; average accuracy of 75.7 vs. 66.3%). PMID:25368546

  19. Halo naevi and café au lait macule regression in a renal transplant patient on immunosuppression.

    PubMed

    Lolatgis, Helena; Varigos, George; Braue, Anna; Scardamaglia, Laura; Boyapati, Ann; Winship, Ingrid

    2015-11-01

    A case of halo naevi and café au lait macule regression in a renal transplant patient receiving long-term immunosuppressive therapy is described. We propose the direct transfer of an auto-reactive antibody, CD8 T-cells or tumour necrosis factor α from the transplant donor to the recipient as a possible cause. We have also considered insufficient immunosuppressive therapy as a possible mechanism. © 2014 The Australasian College of Dermatologists.

  20. Development and evaluation of an automated fall risk assessment system.

    PubMed

    Lee, Ju Young; Jin, Yinji; Piao, Jinshi; Lee, Sun-Mi

    2016-04-01

    Fall risk assessment is the first step toward prevention, and a risk assessment tool with high validity should be used. This study aimed to develop and validate an automated fall risk assessment system (Auto-FallRAS) to assess fall risks based on electronic medical records (EMRs) without additional data collected or entered by nurses. This study was conducted in a 1335-bed university hospital in Seoul, South Korea. The Auto-FallRAS was developed using 4211 fall-related clinical data extracted from EMRs. Participants included fall patients and non-fall patients (868 and 3472 for the development study; 752 and 3008 for the validation study; and 58 and 232 for validation after clinical application, respectively). The system was evaluated for predictive validity and concurrent validity. The final 10 predictors were included in the logistic regression model for the risk-scoring algorithm. The results of the Auto-FallRAS were shown as high/moderate/low risk on the EMR screen. The predictive validity analyzed after clinical application of the Auto-FallRAS was as follows: sensitivity = 0.95, NPV = 0.97 and Youden index = 0.44. The validity of the Morse Fall Scale assessed by nurses was as follows: sensitivity = 0.68, NPV = 0.88 and Youden index = 0.28. This study found that the Auto-FallRAS results were better than were the nurses' predictions. The advantage of the Auto-FallRAS is that it automatically analyzes information and shows patients' fall risk assessment results without requiring additional time from nurses. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
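
    For readers who want a concrete picture of the risk-scoring step described above, the sketch below fits a logistic regression on synthetic EMR-style predictors and bins the predicted probability into the three displayed categories; the feature count, thresholds and data are hypothetical and are not the published Auto-FallRAS model.

```python
# Illustrative sketch (not the published Auto-FallRAS model): a logistic
# regression risk score built from EMR-derived predictors, binned into
# low/moderate/high risk. Feature names, thresholds and data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(4340, 10))            # 10 EMR-derived predictors (synthetic)
y = rng.integers(0, 2, size=4340)          # 1 = fall, 0 = no fall (synthetic)

model = LogisticRegression(max_iter=1000).fit(X, y)
risk = model.predict_proba(X)[:, 1]        # predicted fall probability

# Map probabilities to the three display categories shown on the EMR screen.
category = np.select([risk >= 0.6, risk >= 0.3], ["high", "moderate"], "low")
print(dict(zip(*np.unique(category, return_counts=True))))
```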

  1. Application of unsteady flow rate evaluations to identify the dynamic transfer function of a cavitating Venturi

    NASA Astrophysics Data System (ADS)

    Marie-Magdeleine, A.; Fortes-Patella, R.; Lemoine, N.; Marchand, N.

    2012-11-01

    This study concerns the simulation of the implementation of the Kinetic Differential Pressure (KDP) method used for unsteady mass flow rate evaluation, in order to identify the dynamic transfer matrix of a cavitating Venturi. Firstly, the equations of the IZ code used for this simulation are introduced. Next, the methodology for evaluating unsteady pressures and mass flow rates at the inlet and the outlet of the cavitating Venturi and for identifying the dynamic transfer matrix is presented. Later, the robustness of the method towards measurement uncertainties, implemented as Gaussian white noise, is studied. The results of the numerical simulations allow us to estimate the system's linearity domain and to perform the Empirical Transfer Function Estimation (ETFE) on frequency-by-frequency inlet signals and on chirp signal tests. The pressure data obtained with the KDP method are then used for the identification procedure, carried out both by ETFE and by user-made Auto-Regressive Moving-Average eXogenous (ARMAX) algorithms, and the resulting transfer matrix coefficients are compared with those obtained from the simulated input and output data.
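
    The sketch below illustrates the ETFE idea referenced above on a toy system: a frequency response is estimated from cross- and auto-spectra of a chirp-excited input/output pair. The sampling rate, toy dynamics and noise level are assumptions; this is not the IZ code or the KDP processing chain.

```python
# Minimal ETFE-style sketch: estimate a frequency response between an input and
# an output record using cross- and auto-spectra. The signals are synthetic.
import numpy as np
from scipy import signal

fs = 1000.0                                 # sampling frequency, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
u = signal.chirp(t, f0=1, t1=10, f1=100)    # chirp excitation (as in the paper)
# Toy "system": first-order low-pass plus measurement noise.
b, a = signal.butter(1, 50, fs=fs)
y = signal.lfilter(b, a, u) + 0.01 * np.random.default_rng(1).normal(size=t.size)

f, Puy = signal.csd(u, y, fs=fs, nperseg=2048)
_, Puu = signal.welch(u, fs=fs, nperseg=2048)
H = Puy / Puu                               # empirical transfer function estimate

gain_db = 20 * np.log10(np.abs(H))
idx = np.argmin(np.abs(f - 50.0))           # inspect gain near the filter cutoff
print(f"|H| at {f[idx]:.1f} Hz: {gain_db[idx]:.1f} dB")   # roughly -3 dB expected
```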

  2. Following a trend with an exponential moving average: Analytical results for a Gaussian model

    NASA Astrophysics Data System (ADS)

    Grebenkov, Denis S.; Serror, Jeremy

    2014-01-01

    We investigate how price variations of a stock are transformed into profits and losses (P&Ls) of a trend following strategy. In the frame of a Gaussian model, we derive the probability distribution of P&Ls and analyze its moments (mean, variance, skewness and kurtosis) and asymptotic behavior (quantiles). We show that the asymmetry of the distribution (with often small losses and less frequent but significant profits) is reminiscent of trend following strategies and less dependent on peculiarities of price variations. At short times, trend following strategies admit larger losses than one may anticipate from standard Gaussian estimates, while smaller losses are ensured at longer times. Simple explicit formulas characterizing the distribution of P&Ls illustrate the basic mechanisms of momentum trading, while general matrix representations can be applied to arbitrary Gaussian models. We also compute explicitly the annualized risk-adjusted P&L and strategy turnover to account for transaction costs. We deduce the trend following optimal timescale and its dependence on both auto-correlation level and transaction costs. Theoretical results are illustrated on the Dow Jones index.
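
    A minimal numerical counterpart of the setting described above, assuming i.i.d. Gaussian returns and a simple EMA-of-returns trend signal (the paper works with an EMA of prices and derives the distribution analytically): it simulates many paths and reports the first four moments of the resulting P&L.

```python
# Numerical sketch: Gaussian price increments, an exponential moving average
# (EMA) trend signal, and the resulting P&L distribution. The timescale,
# horizon and return volatility are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_paths, n_steps, ema_span = 20000, 250, 20
alpha = 2.0 / (ema_span + 1)

returns = rng.normal(0.0, 0.01, size=(n_paths, n_steps))
pnl = np.zeros(n_paths)
ema = np.zeros(n_paths)
for t in range(1, n_steps):
    ema = (1 - alpha) * ema + alpha * returns[:, t - 1]   # trend estimate
    pnl += np.sign(ema) * returns[:, t]                   # follow the trend

print(f"mean={pnl.mean():.4f} std={pnl.std():.4f} "
      f"skew={stats.skew(pnl):.2f} kurt={stats.kurtosis(pnl):.2f}")
```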

  3. A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction

    NASA Astrophysics Data System (ADS)

    Danandeh Mehr, Ali; Kahya, Ercan

    2017-06-01

    Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives still remains a question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach that take us from observed data to a validated model are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing ingredient uses a simple moving average filter to diminish the lagged prediction effect of stand-alone data-driven models. The multigene ingredient of the model tends to identify the underlying nonlinear system with expressions simpler than classical monolithic GP and, eventually, the simplification component exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using the daily streamflow records from a station on Senoz Stream, Turkey. Compared with the efficiency results of stand-alone GP, MGGP, and conventional multiple linear regression prediction models used as benchmarks, the proposed Pareto-optimal MA-MGGP model puts forward a parsimonious solution, which is of noteworthy practical importance. In addition, the approach allows the user to bring human insight into the problem, examine evolved models, and pick out the best-performing programs for further analysis.
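
    The data pre-processing ingredient can be sketched in a few lines; the snippet below applies a centered moving average to a synthetic daily streamflow series and builds lagged inputs for a downstream model. The window length and lags are assumptions, not the values used in the paper.

```python
# Sketch of the pre-processing ingredient only: a simple moving average filter
# applied to a daily streamflow series before model identification.
# The 3-day window and the synthetic series are assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
flow = pd.Series(np.abs(rng.normal(50, 15, size=365)), name="daily_flow_m3s")

smoothed = flow.rolling(window=3, center=True, min_periods=1).mean()
lagged_inputs = pd.concat(
    {f"lag_{k}": smoothed.shift(k) for k in (1, 2, 3)}, axis=1
).dropna()
print(lagged_inputs.head())
```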

  4. A deep auto-encoder model for gene expression prediction.

    PubMed

    Xie, Rui; Wen, Jia; Quitadamo, Andrew; Cheng, Jianlin; Shi, Xinghua

    2017-11-17

    Gene expression is a key intermediate level through which genotypes lead to a particular trait. Gene expression is affected by various factors including genotypes of genetic variants. With the aim of delineating the genetic impact on gene expression, we build a deep auto-encoder model to assess how well genetic variants contribute to gene expression changes. This new deep learning model is a regression-based predictive model based on the MultiLayer Perceptron and Stacked Denoising Auto-encoder (MLP-SAE). The model is trained using a stacked denoising auto-encoder for feature selection and a multilayer perceptron framework for backpropagation. We further improve the model by introducing dropout to prevent overfitting and improve performance. To demonstrate the usage of this model, we apply MLP-SAE to a real genomic dataset with genotypes and gene expression profiles measured in yeast. Our results show that the MLP-SAE model with dropout outperforms other models including Lasso, Random Forests and the MLP-SAE model without dropout. Using the MLP-SAE model with dropout, we show that gene expression quantifications predicted by the model solely based on genotypes align well with true gene expression patterns. We provide a deep auto-encoder model for predicting gene expression from SNP genotypes. This study demonstrates that deep learning is appropriate for tackling another genomic problem, i.e., building predictive models to understand genotypes' contribution to gene expression. With the emerging availability of richer genomic data, we anticipate that deep learning models will play a bigger role in modeling and interpreting genomics.
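
    A compact, hypothetical sketch of the MLP-SAE idea (denoising auto-encoder front end, dropout-regularized regressor) is given below in PyTorch; the layer sizes, corruption level and the synthetic genotype/expression matrices are assumptions and do not reproduce the architecture or data of the study.

```python
# Hypothetical sketch in the spirit of MLP-SAE: a denoising auto-encoder whose
# code feeds a dropout-regularized MLP regressor, trained jointly on synthetic
# genotype/expression data. All sizes and hyper-parameters are assumed.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_samples, n_snps, n_genes = 256, 500, 100
genotypes = torch.randint(0, 3, (n_samples, n_snps)).float()   # SNP dosages 0/1/2
expression = torch.randn(n_samples, n_genes)                   # synthetic targets

class DenoisingAE(nn.Module):
    def __init__(self, d_in, d_hidden):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU())
        self.dec = nn.Linear(d_hidden, d_in)
    def forward(self, x):
        noisy = x + 0.1 * torch.randn_like(x)      # denoising corruption
        return self.dec(self.enc(noisy)), self.enc(x)

ae = DenoisingAE(n_snps, 64)
regressor = nn.Sequential(nn.Linear(64, 64), nn.ReLU(),
                          nn.Dropout(0.5), nn.Linear(64, n_genes))
opt = torch.optim.Adam(list(ae.parameters()) + list(regressor.parameters()), lr=1e-3)

for epoch in range(50):
    recon, code = ae(genotypes)
    pred = regressor(code)
    loss = (nn.functional.mse_loss(recon, genotypes)
            + nn.functional.mse_loss(pred, expression))
    opt.zero_grad(); loss.backward(); opt.step()
print(f"final joint loss: {loss.item():.3f}")
```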

  5. Strategic Implications of Emerging Threats to West African Countries

    DTIC Science & Technology

    2012-03-14

    deforestation, serious water and air pollution, irresponsible exploitation practices, plundering of resources by devious warlords and politicians all...AQIM's threats in the area is damageable to local economy and safety in general. For example, the decision to move the Paris-Dakar auto rally to...to deal with regional threat posed by this phenomenon. Additionally, the UN Security Council Resolution 2018 of 31 October 2011, “called upon

  6. The simulation of emergent dispatch of cars for intelligent driving autos

    NASA Astrophysics Data System (ADS)

    Zheng, Ziao

    2018-03-01

    It is widely acknowledged that broad acceptance by car users is important for the development of intelligent cars. While most intelligent cars have a system that monitors whether the car itself is in good condition to drive, it is also clear that studies are needed on how rescue vehicles should be dispatched for intelligent vehicles in an emergency. In this study, the writer focuses mainly on deriving a separate system that lets car-care teams arrive as soon as they receive the signal sent out by the intelligent driving autos. The simulation measures the time for the rescue team to arrive, the cost of reaching the site where the car problem occurs, and the length of the queue while the rescue vehicle waits to cross a road. This can be of great use when one car in a team of intelligent cars suddenly has a problem that stops it from moving, and it can be helpful in other situations as well. In this way, the interconnection of cars can be a safety net for drivers encountering difficulties at any time.

  7. The comparing analysis of simulation of emergent dispatch of cars for intelligent driving autos in crossroads

    NASA Astrophysics Data System (ADS)

    Zheng, Ziao

    2018-03-01

    It is widely acknowledged that broad acceptance by car users is important for the development of intelligent cars. While most intelligent cars have a system that monitors whether the car itself is in good condition to drive, it is also clear that studies are needed on how rescue vehicles should be dispatched for intelligent vehicles in an emergency. In this study, the writer focuses mainly on deriving a separate system that lets car-care teams arrive as soon as they receive the signal sent out by the intelligent driving autos. The simulation measures the time for the rescue team to arrive, the cost of reaching the site where the car problem occurs, and the length of the queue while the rescue vehicle waits to cross a road. This can be of great use when one car in a team of intelligent cars suddenly has a problem that stops it from moving, and it can be helpful in other situations as well. In this way, the interconnection of cars can be a safety net for drivers encountering difficulties at any time.

  8. Task-shifting of CD4 T cell count monitoring by the touchscreen-based Muse™ Auto CD4/CD4% single-platform system for CD4 T cell numeration: Implication for decentralization in resource-constrained settings.

    PubMed

    Kouabosso, André; Mossoro-Kpinde, Christian Diamant; Bouassa, Ralph-Sydney Mboumba; Longo, Jean De Dieu; Mbeko Simaleko, Marcel; Grésenguet, Gérard; Bélec, Laurent

    2018-04-01

    The accuracy of CD4 T cell monitoring by the recently developed flow cytometry-based CD4 T cell counting Muse™ Auto CD4/CD4% Assay analyzer (EMD Millipore Corporation, Merck Life Sciences, KGaA, Darmstadt, Germany) was evaluated in trained lay providers against laboratory technicians. After 2 days of training on the Muse™ Auto CD4/CD4% analyzer, EDTA-blood samples from 6 HIV-positive and 4 HIV-negative individuals were used for CD4 T cell counting in triplicate in parallel by 12 trained lay providers as compared to 10 lab technicians. The mean number of CD4 T cells in absolute number was 829 ± 380 cells/μl by lay providers and 794 ± 409 cells/μl by technicians (P > 0.05); and in percentage 36.2 ± 14.8%CD4 by lay providers and 36.1 ± 15.0%CD4 by laboratory technicians (P > 0.05). The unweighted linear regression and Passing-Bablok regression analyses on CD4 T cell results expressed in absolute count revealed moderate correlation between CD4 T cell counts obtained by lay providers and lab technicians. The mean absolute bias measured by Bland-Altman analysis between CD4 T cell/μl obtained by lay providers and lab technicians was -3.41 cells/μl. The intra-assay coefficient of variation (CV) of Muse™ Auto CD4/CD4% in absolute number was 10.1% by lay providers and 8.5% by lab technicians (P > 0.05), and in percentage 5.5% by lay providers and 4.4% by lab technicians (P > 0.05). The inter-assay CV of Muse™ Auto CD4/CD4% in absolute number was 13.4% by lay providers and 10.3% by lab technicians (P > 0.05), and in percentage 7.8% by lay providers and 6.9% by lab technicians (P > 0.05). The study demonstrates the feasibility of CD4 T cell counting using the alternative flow cytometer Muse™ Auto CD4/CD4% analyzer by trained lay providers and therefore the practical possibility of decentralizing CD4 T cell counting to community health centers. Copyright © 2018. Published by Elsevier B.V.
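
    The agreement statistics quoted above (Bland-Altman mean bias and intra-assay CV) can be reproduced on synthetic paired counts as follows; the numbers generated here are illustrative only and do not correspond to the study data.

```python
# Sketch of the agreement statistics reported above: Bland-Altman mean bias and
# limits of agreement, plus an intra-assay coefficient of variation.
# The paired CD4 counts below are synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(3)
cd4_lay = rng.normal(800, 380, size=30)           # lay providers, cells/uL
cd4_tech = cd4_lay + rng.normal(-3, 40, size=30)  # lab technicians, cells/uL

diff = cd4_lay - cd4_tech
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
print(f"Bland-Altman bias = {bias:.1f} cells/uL, 95% LoA = {loa}")

triplicates = rng.normal(800, 60, size=(10, 3))   # 10 samples measured 3 times
intra_cv = (triplicates.std(axis=1, ddof=1) / triplicates.mean(axis=1)).mean() * 100
print(f"intra-assay CV = {intra_cv:.1f}%")
```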

  9. Realization of the ergonomics design and automatic control of the fundus cameras

    NASA Astrophysics Data System (ADS)

    Zeng, Chi-liang; Xiao, Ze-xin; Deng, Shi-chao; Yu, Xin-ye

    2012-12-01

    The principle of ergonomic design in fundus cameras is to extend user comfort through automatic control. Firstly, a 3D positional numerical control system is designed for positioning the eye pupils of patients undergoing fundus examinations. This system consists of an electronically controlled chin bracket that moves up and down, lateral movement of the binocular assembly with the detector, and automatic refocusing on the edges of the eye pupils. Secondly, an auto-focusing device for the object plane of the patient's fundus is designed, which collects the patient's fundus images automatically whether the eyes are ametropic or not. Finally, a moving visual target is developed for expanding the fields of the fundus images.

  10. Error Estimation for the Linearized Auto-Localization Algorithm

    PubMed Central

    Guevara, Jorge; Jiménez, Antonio R.; Prieto, Jose Carlos; Seco, Fernando

    2012-01-01

    The Linearized Auto-Localization (LAL) algorithm estimates the position of beacon nodes in Local Positioning Systems (LPSs), using only the distance measurements to a mobile node whose position is also unknown. The LAL algorithm calculates the inter-beacon distances, used for the estimation of the beacons’ positions, from the linearized trilateration equations. In this paper we propose a method to estimate the propagation of the errors of the inter-beacon distances obtained with the LAL algorithm, based on a first order Taylor approximation of the equations. Since the method depends on such approximation, a confidence parameter τ is defined to measure the reliability of the estimated error. Field evaluations showed that by applying this information to an improved weighted-based auto-localization algorithm (WLAL), the standard deviation of the inter-beacon distances can be improved by more than 30% on average with respect to the original LAL method. PMID:22736965

  11. Estimating the population size and colony boundary of subterranean termites by using the density functions of directionally averaged capture probability.

    PubMed

    Su, Nan-Yao; Lee, Sang-Hee

    2008-04-01

    Marked termites were released in a linear-connected foraging arena, and the spatial heterogeneity of their capture probabilities was averaged for both directions at distance r from release point to obtain a symmetrical distribution, from which the density function of directionally averaged capture probability P(x) was derived. We hypothesized that as marked termites move into the population and given sufficient time, the directionally averaged capture probability may reach an equilibrium P(e) over the distance r and thus satisfy the equal mixing assumption of the mark-recapture protocol. The equilibrium capture probability P(e) was used to estimate the population size N. The hypothesis was tested in a 50-m extended foraging arena to simulate the distance factor of field colonies of subterranean termites. Over the 42-d test period, the density functions of directionally averaged capture probability P(x) exhibited four phases: exponential decline phase, linear decline phase, equilibrium phase, and postequilibrium phase. The equilibrium capture probability P(e), derived as the intercept of the linear regression during the equilibrium phase, correctly projected N estimates that were not significantly different from the known number of workers in the arena. Because the area beneath the probability density function is a constant (50% in this study), preequilibrium regression parameters and P(e) were used to estimate the population boundary distance l, which is the distance between the release point and the boundary beyond which the population is absent.

  12. Systemic Delivery of Atropine Sulfate by the MicroDose Dry-Powder Inhaler

    PubMed Central

    Venkataramanan, R.; Hoffman, R.M.; George, M.P.; Petrov, A.; Richards, T.; Zhang, S.; Choi, J.; Gao, Y.Y.; Oakum, C.D.; Cook, R.O.; Donahoe, M.

    2013-01-01

    Background: Inhaled atropine is being developed as a systemic and pulmonary treatment for the extended recovery period after chemical weapons exposure. We performed a pharmacokinetics study comparing inhaled atropine delivery using the MicroDose Therapeutx Dry Powder Inhaler (DPIA) with intramuscular (IM) atropine delivery via auto-injector (AUTO). Methods: The MicroDose DPIA utilizes a novel piezoelectric system to aerosolize drug and excipient from a foil dosing blister. Subjects inhaled a 1.95-mg atropine sulfate dose from the dry powder inhaler on one study day [5 doses×0.4 mg per dose (nominal) delivered over 12 min] and received a 2-mg IM injection via the AtroPen® auto-injector on another. Pharmacokinetics, pharmacodynamic response, and safety were studied for 12 hr. Results: A total of 17 subjects were enrolled. All subjects completed IM dosing. One subject did not perform inhaled delivery due to a skin reaction from the IM dose. Pharmacokinetic results were as follows: area under the curve concentration, DPIA=20.1±5.8, AUTO=23.7±4.9 ng hr/mL (means±SD); maximum concentration reached, DPIA=7.7±3.5, AUTO=11.0±3.8 ng/mL; time to reach maximum concentration, DPIA=0.25±0.47, AUTO=0.19±0.23 hr. Pharmacodynamic results were as follows: maximum increase in heart rate, DPIA=18±12, AUTO=23±13 beats/min; average change in 1-sec forced expiratory volume at 30 min, DPIA=0.16±0.22 L, AUTO=0.11±0.29 L. The relative bioavailability for DPIA was 87% (based on output dose). Two subjects demonstrated allergic responses: one to the first dose (AUTO), which was mild and transient, and one to the second dose (DPIA), which was moderate in severity, required treatment with oral and intravenous (IV) diphenhydramine and IV steroids, and lasted more than 7 days. Conclusions: Dry powder inhalation is a highly bioavailable route for attaining rapid and consistent systemic concentrations of atropine. PMID:22691110

  13. Systemic delivery of atropine sulfate by the MicroDose Dry-Powder Inhaler.

    PubMed

    Corcoran, T E; Venkataramanan, R; Hoffman, R M; George, M P; Petrov, A; Richards, T; Zhang, S; Choi, J; Gao, Y Y; Oakum, C D; Cook, R O; Donahoe, M

    2013-02-01

    Inhaled atropine is being developed as a systemic and pulmonary treatment for the extended recovery period after chemical weapons exposure. We performed a pharmacokinetics study comparing inhaled atropine delivery using the MicroDose Therapeutx Dry Powder Inhaler (DPIA) with intramuscular (IM) atropine delivery via auto-injector (AUTO). The MicroDose DPIA utilizes a novel piezoelectric system to aerosolize drug and excipient from a foil dosing blister. Subjects inhaled a 1.95-mg atropine sulfate dose from the dry powder inhaler on one study day [5 doses × 0.4 mg per dose (nominal) delivered over 12 min] and received a 2-mg IM injection via the AtroPen® auto-injector on another. Pharmacokinetics, pharmacodynamic response, and safety were studied for 12 hr. A total of 17 subjects were enrolled. All subjects completed IM dosing. One subject did not perform inhaled delivery due to a skin reaction from the IM dose. Pharmacokinetic results were as follows: area under the curve concentration, DPIA=20.1±5.8, AUTO=23.7±4.9 ng hr/mL (means±SD); maximum concentration reached, DPIA=7.7±3.5, AUTO=11.0±3.8 ng/mL; time to reach maximum concentration, DPIA=0.25±0.47, AUTO=0.19±0.23 hr. Pharmacodynamic results were as follows: maximum increase in heart rate, DPIA=18±12, AUTO=23±13 beats/min; average change in 1-sec forced expiratory volume at 30 min, DPIA=0.16±0.22 L, AUTO=0.11±0.29 L. The relative bioavailability for DPIA was 87% (based on output dose). Two subjects demonstrated allergic responses: one to the first dose (AUTO), which was mild and transient, and one to the second dose (DPIA), which was moderate in severity, required treatment with oral and intravenous (IV) diphenhydramine and IV steroids, and lasted more than 7 days. Dry powder inhalation is a highly bioavailable route for attaining rapid and consistent systemic concentrations of atropine.

  14. Autoantibody Profiles in Collagen Disease Patients with Interstitial Lung Disease (ILD): Antibodies to Major Histocompatibility Complex Class I-Related Chain A (MICA) as Markers of ILD

    PubMed Central

    Furukawa, Hiroshi; Oka, Shomi; Shimada, Kota; Masuo, Kiyoe; Nakajima, Fumiaki; Funano, Shunichi; Tanaka, Yuki; Komiya, Akiko; Fukui, Naoshi; Sawasaki, Tatsuya; Tadokoro, Kenji; Nose, Masato; Tsuchiya, Naoyuki; Tohma, Shigeto

    2015-01-01

    Interstitial lung disease (ILD) is frequently associated with collagen disease. It is then designated as collagen vascular disease-associated ILD (CVD-ILD), and influences patients’ prognosis. The prognosis of acute-onset diffuse ILD (AoDILD) occurring in patients with collagen disease is quite poor. Here, we report our investigation of auto-antibody (Ab) profiles to determine whether they may be useful in diagnosing CVD-ILD or AoDILD in collagen disease. Auto-Ab profiles were analyzed using the Lambda Array Beads Multi-Analyte System, granulocyte immunofluorescence test, Proto-Array Human Protein Microarray, AlphaScreen assay, and glutathione S-transferase capture enzyme-linked immunosorbent assay in 34 patients with rheumatoid arthritis (RA) with or without CVD-ILD and in 15 patients with collagen disease with AoDILD. The average anti-major histocompatibility complex class I-related chain A (MICA) Ab levels were higher in RA patients with CVD-ILD than in those without (P = 0.0013). The ratio of the average anti-MICA Ab level to the average anti-human leukocyte antigen class I Ab level (ie, MICA/Class I) was significantly higher in RA patients with CVD-ILD compared with those without (P = 4.47 × 10−5). To the best of our knowledge, this is the first report of auto-Ab profiles in CVD-ILD. The MICA/Class I ratio could be a better marker for diagnosing CVD-ILD than KL-6 (Krebs von den lungen-6). PMID:26327779

  15. Estimation of Covariance Matrix on Bi-Response Longitudinal Data Analysis with Penalized Spline Regression

    NASA Astrophysics Data System (ADS)

    Islamiyati, A.; Fatmawati; Chamidah, N.

    2018-03-01

    In longitudinal data with bi-response, correlation arises both between the subjects of observation and between the responses. This causes auto-correlation of the errors, which can be handled by using a covariance matrix. In this article, we estimate the covariance matrix based on the penalized spline regression model. Penalized splines involve knot points and smoothing parameters simultaneously in controlling the smoothness of the curve. Based on our simulation study, the estimated regression model of the weighted penalized spline with a covariance matrix gives a smaller error value than the model without a covariance matrix.

  16. Modelling space of spread Dengue Hemorrhagic Fever (DHF) in Central Java use spatial durbin model

    NASA Astrophysics Data System (ADS)

    Ispriyanti, Dwi; Prahutama, Alan; Taryono, Arkadina PN

    2018-05-01

    Dengue Hemorrhagic Fever (DHF) is one of the major public health problems in Indonesia. From year to year, DHF causes Extraordinary Events (outbreaks) in most parts of Indonesia, especially Central Java. Central Java consists of 35 districts or cities, each close to the others. Spatial regression is an analysis that assesses the influence of independent variables on the dependent variable while accounting for regional effects. Spatial regression modeling includes the spatial autoregressive model (SAR), the spatial error model (SEM) and the spatial autoregressive moving average model (SARMA). The spatial Durbin model (SDM) is an extension of SAR in which both the dependent and independent variables have spatial influence. In this research the dependent variable used is the number of DHF sufferers. The independent variables observed are population density, number of hospitals, number of residents, number of health centers, and mean years of schooling. From the multiple regression model test, the variables that significantly affect the spread of DHF are population and mean years of schooling. Using queen contiguity and rook contiguity, the best model produced is the SDM with queen contiguity because it has the smallest AIC value, 494.12. The factors that generally affect the spread of DHF in Central Java Province are the population and the mean years of schooling.

  17. A Case Study to Improve Emergency Room Patient Flow at Womack Army Medical Center

    DTIC Science & Technology

    2009-06-01

    use just the previous month, moving average 2-month period (MA2) uses the average from the previous two months, moving average 3-month period (MA3...ED prior to discharge by provider) MA2/MA3/MA4 - moving averages of 2-4 months in length MAD - mean absolute deviation (measure of accuracy for
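
    The moving-average forecasts and MAD mentioned in the snippet above amount to a few lines of code; the sketch below computes MA2/MA3/MA4 forecasts of a synthetic monthly volume series and their mean absolute deviation.

```python
# Sketch of the forecasting terms in the snippet above: k-month moving-average
# forecasts of monthly ED volume and their mean absolute deviation (MAD).
# The monthly series is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(11)
volume = pd.Series(4000 + rng.normal(0, 200, size=24))    # monthly ED visits

def ma_forecast(series: pd.Series, k: int) -> pd.Series:
    """Forecast each month as the mean of the previous k months."""
    return series.rolling(k).mean().shift(1)

for k in (2, 3, 4):                                       # MA2, MA3, MA4
    forecast = ma_forecast(volume, k)
    mad = (volume - forecast).abs().mean()                # accuracy measure
    print(f"MA{k}: MAD = {mad:.1f} visits")
```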

  18. Depression and auto-aggressiveness in adolescents in Zagreb.

    PubMed

    Tripković, Mara; Vuković, Iris Sarajlić; Frančišković, Tanja; Pisk, Sandra Vuk; Krnić, Silvana

    2014-12-01

    The aim of the study was to explore the frequency of depression among the general population of adolescents attending high school in the city of Zagreb. As depression is associated with increased suicidal risk, we wanted to check to what extent depression, as an emotional problem among youth, is associated with auto-aggression in the general population of adolescents. The study was conducted on a sample of high school students in Zagreb and included 701 students of both genders aged 14-19 years. Depression was tested with the Beck Depression Inventory (BDI) administered for youth between 11-18 years of age (Youth Self Report for ages 11-18), and auto-aggression with the Scale of Auto-destructiveness (SAD). The results show that about 20.7% of high school students have mild or borderline depressive disorders, while about 5% show moderate or severe depression; depression is significantly more frequent among girls, who on average report more symptoms of depression. A significant effect of depression level (F(2,423)=35.860, p<0.001) on auto-aggression was also shown in subjects of both genders. In both genders, the moderately depressed show more auto-destructiveness than those without depression symptoms (p<0.01). In the group of the heavily depressed (n=30), significantly higher self-destructiveness is shown by girls (p<0.01). The data suggest the importance of early recognition, understanding and treatment of depressive symptoms in adolescents in order to reduce the risk of subsequent chronic psychosocial damage.

  19. Parametric and Non-Parametric Vibration-Based Structural Identification Under Earthquake Excitation

    NASA Astrophysics Data System (ADS)

    Pentaris, Fragkiskos P.; Fouskitakis, George N.

    2014-05-01

    The problem of modal identification in civil structures is of crucial importance, and thus has been receiving increasing attention in recent years. Vibration-based methods are quite promising as they are capable of identifying the structure's global characteristics, they are relatively easy to implement and they tend to be time effective and less expensive than most alternatives [1]. This paper focuses on the off-line structural/modal identification of civil (concrete) structures subjected to low-level earthquake excitations, under which they remain within their linear operating regime. Earthquakes and their details are recorded and provided by the seismological network of Crete [2], which 'monitors' the broad region of the south Hellenic arc, an active seismic region which functions as a natural laboratory for earthquake engineering of this kind. A sufficient number of seismic events are analyzed in order to reveal the modal characteristics of the structures under study, which consist of the two concrete buildings of the School of Applied Sciences, Technological Education Institute of Crete, located in Chania, Crete, Hellas. Both buildings are equipped with high-sensitivity and accuracy seismographs - providing acceleration measurements - established at the basement (the structure's foundation), presently considered as the ground's acceleration (excitation), and at all levels (ground floor, 1st floor, 2nd floor and terrace). Further details regarding the instrumentation setup and data acquisition may be found in [3]. The present study invokes stochastic methods, both non-parametric (frequency-based) and parametric, for structural/modal identification (natural frequencies and/or damping ratios). Non-parametric methods include Welch-based spectrum and Frequency Response Function (FRF) estimation, while parametric methods include AutoRegressive (AR), AutoRegressive with eXogeneous input (ARX) and Autoregressive Moving-Average with eXogeneous input (ARMAX) models [4, 5]. Preliminary results indicate that parametric methods are capable of sufficiently providing the structural/modal characteristics such as natural frequencies and damping ratios. The study also aims - at a further level of investigation - to provide a reliable statistically-based methodology for structural health monitoring after major seismic events which potentially cause harming consequences in structures. Acknowledgments This work was supported by the State Scholarships Foundation of Hellas. References [1] J. S. Sakellariou and S. D. Fassois, "Stochastic output error vibration-based damage detection and assessment in structures under earthquake excitation," Journal of Sound and Vibration, vol. 297, pp. 1048-1067, 2006. [2] G. Hloupis, I. Papadopoulos, J. P. Makris, and F. Vallianatos, "The South Aegean seismological network - HSNC," Adv. Geosci., vol. 34, pp. 15-21, 2013. [3] F. P. Pentaris, J. Stonham, and J. P. Makris, "A review of the state-of-the-art of wireless SHM systems and an experimental set-up towards an improved design," presented at the EUROCON, 2013 IEEE, Zagreb, 2013. [4] S. D. Fassois, "Parametric Identification of Vibrating Structures," in Encyclopedia of Vibration, S. G. Braun, D. J. Ewins, and S. S. Rao, Eds., ed London: Academic Press, London, 2001. [5] S. D. Fassois and J. S. Sakellariou, "Time-series methods for fault detection and identification in vibrating structures," Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, vol. 365, pp. 411-448, February 15 2007.
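
    To make the parametric (AR-family) identification step concrete, the sketch below generates a synthetic two-mode acceleration-like record, fits an AR model by least squares, and converts the discrete-time poles into natural frequencies and damping ratios; the record, model order and mode parameters are assumptions, not data from the instrumented buildings.

```python
# Hedged sketch of AR-based modal identification: estimate AR coefficients by
# least squares and map the discrete-time poles to modal frequencies/damping.
# The synthetic modes, sampling rate and model order are assumed values.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(5)
dt = 0.01
modes = [(2.5, 0.02), (6.0, 0.03)]            # (natural frequency Hz, damping ratio)
poles = []
for fn, zeta in modes:
    wn = 2 * np.pi * fn
    s_pole = -zeta * wn + 1j * wn * np.sqrt(1 - zeta ** 2)   # continuous-time pole
    poles += [np.exp(s_pole * dt), np.exp(np.conj(s_pole) * dt)]
a = np.real(np.poly(poles))                   # AR polynomial with those poles
x = lfilter([1.0], a, rng.normal(size=8000))  # ambient-like synthetic response

p, n = 4, len(x)                              # AR model order (assumed)
A = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
coeffs, *_ = np.linalg.lstsq(A, x[p:], rcond=None)

est_poles = np.roots(np.r_[1.0, -coeffs])     # identified discrete-time poles
s = np.log(est_poles[est_poles.imag > 0]) / dt
for f_hz, z_est in sorted(zip(np.abs(s) / (2 * np.pi), -s.real / np.abs(s))):
    print(f"f = {f_hz:.2f} Hz, zeta = {z_est:.3f}")
```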

  20. AutoSyP: A Low-Cost, Low-Power Syringe Pump for Use in Low-Resource Settings.

    PubMed

    Juarez, Alexa; Maynard, Kelley; Skerrett, Erica; Molyneux, Elizabeth; Richards-Kortum, Rebecca; Dube, Queen; Oden, Z Maria

    2016-10-05

    This article describes the design and evaluation of AutoSyP, a low-cost, low-power syringe pump intended to deliver intravenous (IV) infusions in low-resource hospitals. A constant-force spring within the device provides mechanical energy to depress the syringe plunger. As a result, the device can run on rechargeable battery power for 66 hours, a critical feature for low-resource settings where the power grid may be unreliable. The device is designed to be used with 5- to 60-mL syringes and can deliver fluids at flow rates ranging from 3 to 60 mL/hour. The cost of goods to build one AutoSyP device is approximately $500. AutoSyP was tested in a laboratory setting and in a pilot clinical study. Laboratory accuracy was within 4% of the programmed flow rate. The device was used to deliver fluid to 10 healthy adult volunteers and 30 infants requiring IV fluid therapy at Queen Elizabeth Central Hospital in Blantyre, Malawi. The device delivered fluid with an average mean flow rate error of -2.3% ± 1.9% for flow rates ranging from 3 to 60 mL/hour. AutoSyP has the potential to improve the accuracy and safety of IV fluid delivery in low-resource settings. © The American Society of Tropical Medicine and Hygiene.

  1. Thermal and TEC anomalies detection using an intelligent hybrid system around the time of the Saravan, Iran, (Mw = 7.7) earthquake of 16 April 2013

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2014-02-01

    A powerful earthquake of Mw = 7.7 struck the Saravan region (28.107° N, 62.053° E) in Iran on 16 April 2013. To date, the selection of an automated anomaly detection method for the non-linear time series of earthquake precursors has been an attractive and challenging task. Artificial Neural Networks (ANN) and Particle Swarm Optimization (PSO) have revealed strong potential for accurate time series prediction. This paper presents the first study integrating the ANN and PSO methods in earthquake precursor research to detect the unusual variations of the thermal and total electron content (TEC) seismo-ionospheric anomalies induced by the strong Saravan earthquake. In this study, to overcome stagnation in local minima during ANN training, PSO is used as the optimization method instead of traditional algorithms for training the ANN. The proposed hybrid method detected a considerable number of anomalies 4 and 8 days preceding the earthquake. Since, in this case study, ionospheric TEC anomalies induced by seismic activity are confounded with background fluctuations due to solar activity, a multi-resolution time series processing technique based on the wavelet transform has been applied to the TEC signal variations. Because agreement among the final results of several robust methods is a convincing indication of a method's efficiency, the thermal and TEC anomalies detected using the ANN + PSO method were compared with the anomalies observed by implementing the mean, median, wavelet, Kalman filter, Auto-Regressive Integrated Moving Average (ARIMA), Support Vector Machine (SVM) and Genetic Algorithm (GA) methods. The results indicate that the ANN + PSO method is quite promising and deserves serious attention as a new tool for detecting thermal and TEC seismo-ionospheric anomalies.
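
    As a concrete illustration of the classical ARIMA baseline the hybrid method is compared against, the sketch below fits an ARIMA model to a synthetic TEC-like series and flags residuals beyond a fixed threshold; the series, model order and threshold are assumptions.

```python
# Sketch of an ARIMA residual-thresholding anomaly detector (one of the
# classical baselines listed above). The TEC-like series, ARIMA order and
# the 2-sigma bound are assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(9)
tec = 20 + 5 * np.sin(2 * np.pi * np.arange(120) / 27) + rng.normal(0, 1, 120)
tec[100] += 6.0                                  # injected "anomalous" day

model = ARIMA(tec, order=(2, 0, 1)).fit()
resid = model.resid
threshold = 2.0 * resid.std()                    # anomaly bound (assumed)
anomalies = np.where(np.abs(resid) > threshold)[0]
print("anomalous days:", anomalies)
```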

  2. Bayesian Analysis of Non-Gaussian Long-Range Dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, Timothy; Watkins, Nicholas; Franzke, Christian; Gramacy, Robert

    2013-04-01

    Recent studies [e.g. the Antarctic study of Franzke, J. Climate, 2010] have strongly suggested that surface temperatures exhibit long-range dependence (LRD). The presence of LRD would hamper the identification of deterministic trends and the quantification of their significance. It is well established that LRD processes exhibit stochastic trends over rather long periods of time. Thus, accurate methods for discriminating between physical processes that possess long memory and those that do not are an important adjunct to climate modeling. As we briefly review, the LRD idea originated at the same time as H-selfsimilarity, so it is often not realised that a model does not have to be H-self similar to show LRD [e.g. Watkins, GRL Frontiers, 2013]. We have used Markov Chain Monte Carlo algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average ARFIMA(p,d,q) processes, which are capable of modeling LRD. Our principal aim is to obtain inference about the long memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. Many physical processes, for example the Faraday Antarctic time series, are significantly non-Gaussian. We have therefore extended this work by weakening the Gaussianity assumption, assuming an alpha-stable distribution for the innovations, and performing joint inference on d and alpha. Such a modified FARIMA(p,d,q) process is a flexible, initial model for non-Gaussian processes with long memory. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other measures of d.
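
    The long-memory ingredient of ARFIMA can be illustrated with the fractional-differencing weights of (1 - B)^d; the sketch below computes them from the standard recursion and applies the truncated filter to a toy series, with d = 0.3 an assumed value.

```python
# Small sketch of the long-memory ingredient: the fractional differencing
# weights pi_k of (1 - B)^d used in ARFIMA(p, d, q) models, computed from the
# recursion pi_k = pi_{k-1} * (k - 1 - d) / k. d = 0.3 is an assumed value.
import numpy as np

def frac_diff_weights(d: float, n_lags: int) -> np.ndarray:
    w = np.empty(n_lags)
    w[0] = 1.0
    for k in range(1, n_lags):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(series: np.ndarray, d: float) -> np.ndarray:
    """Apply (1 - B)^d to a series by truncated convolution with the weights."""
    w = frac_diff_weights(d, len(series))
    return np.array([w[: t + 1][::-1] @ series[: t + 1] for t in range(len(series))])

x = np.cumsum(np.random.default_rng(2).normal(size=200))  # toy nonstationary series
print(frac_diff(x, d=0.3)[:5])
```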

  3. Essential/precursor chemicals and drug consumption: impacts of US sodium permanganate and Mexico pseudoephedrine controls on the numbers of US cocaine and methamphetamine users.

    PubMed

    Cunningham, James K; Liu, Lon-Mu; Callaghan, Russell C

    2016-11-01

    In December 2006 the United States regulated sodium permanganate, a cocaine essential chemical. In March 2007 Mexico, the United States' primary source for methamphetamine, closed a chemical company accused of illicitly importing 60+ tons of pseudoephedrine, a methamphetamine precursor chemical. US cocaine availability and methamphetamine availability, respectively, decreased in association. This study tested whether the controls had impacts upon the numbers of US cocaine users and methamphetamine users. The design was an auto-regressive integrated moving average (ARIMA) intervention time-series analysis; comparison series (heroin and marijuana users) were used. The setting was the United States, 2002-14, using the National Survey on Drug Use and Health (n = 723 283), a complex sample survey of the US civilian, non-institutionalized population. Estimates of the numbers of (1) past-year users and (2) past-month users were constructed for each calendar quarter from 2002 to 2014, providing each series with 52 time-periods. Downward shifts in cocaine users started at the time of the cocaine regulation. Past-year and past-month cocaine user series levels decreased by approximately 1 946 271 (-32%) (P < 0.05) and 694 770 (-29%) (P < 0.01), respectively; no apparent recovery occurred through 2014. Downward shifts in methamphetamine users started at the time of the chemical company closure. Past-year and past-month methamphetamine series levels decreased by 494 440 (-35%) [P < 0.01; 95% confidence interval (CI) = -771 897, -216 982] and 277 380 (-45%) (P < 0.05; CI = -554 073, -686), respectively; partial recovery possibly occurred in 2013. The comparison series changed little at the intervention times. Essential/precursor chemical controls in the United States (2006) and Mexico (2007) were associated with large, extended (7+ years) reductions in cocaine users and methamphetamine users in the United States. © 2016 Society for the Study of Addiction.
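
    An ARIMA intervention analysis of this kind can be sketched with a level-shift regressor in a regression-with-ARIMA-errors model; the snippet below uses statsmodels' SARIMAX with a step exogenous variable on a synthetic quarterly series (the series, shift size and model order are assumptions, not the study's data).

```python
# Sketch of an ARIMA intervention (step-change) analysis in the spirit of the
# study: a level-shift regressor dated at the policy quarter is added to a
# regression-with-ARIMA-errors model. The quarterly series is synthetic.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(4)
n, t0 = 52, 20                                   # 52 quarters, intervention at t0
vals = 6.0 + np.cumsum(rng.normal(0, 0.1, n))    # millions of past-year users
vals[t0:] -= 1.9                                 # built-in post-intervention drop
users = pd.Series(vals)

step = pd.Series((np.arange(n) >= t0).astype(float), name="step")
res = SARIMAX(users, exog=step, order=(1, 1, 1)).fit(disp=False)
print(res.params)   # the 'step' coefficient estimates the level shift (millions)
```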

  4. Disentangling the Long-term Effects of Climate Change and Forest Structure and Species Composition on Streamflow Across the Eastern US

    NASA Astrophysics Data System (ADS)

    Caldwell, P.; Elliott, K.; Hartsell, A.; Miniat, C.

    2016-12-01

    Climate change and disturbances are threatening the ability of forested watersheds to provide the clean, reliable, and abundant fresh water necessary to support aquatic ecosystems and a growing human population. Forested watersheds in the eastern US have undergone significant change over the 20th century due to natural and introduced disturbances and a legacy of land use. We hypothesize that changes in forest age and species composition (i.e., forest change) associated with these disturbances may have altered forest water use and thus streamflow (Q) due to inherent differences in transpiration among species and forest ages. To test this hypothesis, we quantified changes in Q from 1960 to 2012 in 202 US Geological Survey forested reference watersheds across the eastern US, and separated the effect of changes in climate from forest change using Auto-Regressive Integrated Moving Average (ARIMA) time series modeling. We linked changes in Q to forest disturbance, forest ages and species composition using the Landsat-based North American Forest Dynamics dataset and plot-level USDA Forest Service Forest Inventory and Analysis (FIA) data. We found that 172 of the 202 sites (85%) exhibited changes in Q not accounted for by climate that we attributed to forest change and/or land use change. Among these, 76 (44%) had declining Q due to forest change (mostly in the southeastern US) while 96 (56%) had increasing Q (mostly in the mid-Atlantic and northeastern US). Across the 172 sites with forest-related changes in Q, 34% had at least 10% of the watershed area disturbed at least once from 1986-2010. In a case study of three watersheds, FIA data indicated that changes in forest structure and species composition explained observed changes in Q beyond climate effects. Our results suggest that forest-related changes in Q may have significant implications for water supply in the region and may inform forest management strategies to mitigate climate change impacts on water resources.

  5. Identification of an internal combustion engine model by nonlinear multi-input multi-output system identification. Ph.D. Thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luh, G.C.

    1994-01-01

    This thesis presents the application of advanced modeling techniques to construct nonlinear forward and inverse models of internal combustion engines for the detection and isolation of incipient faults. The NARMAX (Nonlinear Auto-Regressive Moving Average modeling with eXogenous inputs) technique of system identification proposed by Leontaritis and Billings was used to derive the nonlinear model of an internal combustion engine, over operating conditions corresponding to the I/M240 cycle. The I/M240 cycle is a standard proposed by the United States Environmental Protection Agency to measure tailpipe emissions in inspection and maintenance programs and consists of a driving schedule developed for the purpose of testing compliance with federal vehicle emission standards for carbon monoxide, unburned hydrocarbons, and nitrogen oxides. The experimental work for model identification and validation was performed on a 3.0 liter V6 engine installed in an engine test cell at the Center for Automotive Research at The Ohio State University. In this thesis, different types of model structures were proposed to obtain multi-input multi-output (MIMO) nonlinear NARX models. A modification of the algorithm proposed by He and Asada was used to estimate the robust orders of the derived MIMO nonlinear models. A methodology for the analysis of the inverse NARX model was developed. Two methods were proposed to derive the inverse NARX model: (1) inversion from the forward NARX model; and (2) direct identification of the inverse model from the output-input data set. In this thesis, invertibility, the minimum-phase characteristic of zero dynamics, and stability analysis of the NARX forward model are also discussed. Stability in the sense of Lyapunov is also investigated to check the stability of the identified forward and inverse models. This application of the inverse problem leads to the estimation of unknown inputs and to actuator fault diagnosis.
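
    A minimal flavour of NARX identification (a simplified relative of the thesis' NARMAX engine models) is shown below: a small dictionary of polynomial regressors is fitted by least squares to a toy single-input single-output system; the system and the regressor set are assumptions.

```python
# Minimal polynomial NARX sketch (not the thesis' engine model): identify a
# nonlinear auto-regressive model with an exogenous input by least squares on
# a small dictionary of candidate terms. The toy system and terms are assumed.
import numpy as np

rng = np.random.default_rng(6)
n = 1000
u = rng.uniform(-1, 1, n)                         # exogenous input (e.g. throttle)
y = np.zeros(n)
for t in range(2, n):                             # "true" system to be identified
    y[t] = 0.5 * y[t - 1] - 0.2 * y[t - 2] ** 2 + 0.8 * u[t - 1] + 0.02 * rng.normal()

# Candidate NARX regressors: y(t-1), y(t-2)^2, u(t-1), y(t-1)*u(t-1)
Phi = np.column_stack([y[1:-1], y[:-2] ** 2, u[1:-1], y[1:-1] * u[1:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print("estimated coefficients:", np.round(theta, 3))   # ~[0.5, -0.2, 0.8, 0.0]
```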

  6. Statistical analysis of aerosols over the Gangetic-Himalayan region using ARIMA model based on long-term MODIS observations

    NASA Astrophysics Data System (ADS)

    Soni, Kirti; Kapoor, Sangeeta; Parmar, Kulwinder Singh; Kaskaoutis, Dimitris G.

    2014-11-01

    Long-term observations and modeling of aerosol loading over the Indo-Gangetic plains (IGP), the Indian desert region and Himalayan slopes are analyzed in the present study. The popular Box-Jenkins ARIMA (AutoRegressive Integrated Moving Average) model was applied to simulate the monthly-mean Terra MODIS (MODerate Resolution Imaging Spectroradiometer) Aerosol Optical Depth (AOD550 nm) over eight sites in the region covering a period of about 13 years (March 2000-May 2012). The autocorrelation structure has been analyzed, indicating a deterministic pattern in the time series, which regains its structure every 24-month period. The ARIMA models, namely ARIMA(2,0,12), ARIMA(1,0,6), ARIMA(3,0,0), ARIMA(2,0,13), ARIMA(0,0,12), ARIMA(2,0,2), ARIMA(1,0,12) and ARIMA(0,0,1), have been developed as the most suitable for simulating and forecasting the monthly-mean AOD over the eight selected locations. The Stationary R-squared, R-squared, Root Mean Square Error (RMSE) and Normalized BIC (Bayesian Information Criterion) are used to test the validity and applicability of the developed ARIMA models, revealing adequate accuracy in the model performance. The values of the Hurst Exponent, Fractal Dimension and Predictability Index for AODs are about 0.5, 1.5 and 0, respectively, suggesting that the AODs in all sites follow the Brownian time-series motion (true random walk). High AOD values (> 0.7) are observed over the industrialized and densely-populated IGP sites, associated with low ones over the foothills/slopes of the Himalayas. The trends in AOD during the ~ 13-year period differ depending on season and site. During the post-monsoon and winter, accumulation of aerosols and increasing trends are shown over the IGP sites; these are neutralized in the pre-monsoon and become slightly negative in the monsoon. The AOD over the Himalayan sites does not exhibit any significant trend and seems to be practically unaffected by the aerosol build-up over the IGP.
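
    The Hurst exponent diagnostic mentioned above can be estimated with a simple rescaled-range (R/S) analysis; the sketch below applies it to a white-noise stand-in for a monthly AOD series, where the estimate should come out roughly near 0.5 (window sizes and data are assumptions).

```python
# Sketch of a rescaled-range (R/S) estimate of the Hurst exponent for a
# monthly AOD-like series. White noise is used here, so the estimate should be
# roughly 0.5 (random-walk/Brownian behaviour of the cumulative sums).
import numpy as np

def hurst_rs(x: np.ndarray, window_sizes=(8, 16, 32, 64)) -> float:
    rs_means = []
    for w in window_sizes:
        rs = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            z = np.cumsum(seg - seg.mean())        # cumulative deviation profile
            r, s = z.max() - z.min(), seg.std(ddof=1)
            if s > 0:
                rs.append(r / s)
        rs_means.append(np.mean(rs))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return slope                                   # log-log slope ~ Hurst exponent

aod = np.random.default_rng(8).normal(0.5, 0.1, size=147)   # ~12 years, monthly
print(f"Hurst exponent estimate: {hurst_rs(aod):.2f}")
```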

  7. Web server for priority ordered multimedia services

    NASA Astrophysics Data System (ADS)

    Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund

    2001-10-01

    In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions for continuous media (CM) services. The types of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of the distributed network with load balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for an improved disk access and a higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS output is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority ordered buffering of the retrieved Web pages and CM data streams that are fed into an auto-regressive moving average (ARMA) based traffic shaping circuitry before being transmitted through the network.

  8. Machine Learning Based Evaluation of Reading and Writing Difficulties.

    PubMed

    Iwabuchi, Mamoru; Hirabayashi, Rumi; Nakamura, Kenryu; Dim, Nem Khan

    2017-01-01

    The possibility of automatic evaluation of reading and writing difficulties was investigated using a non-parametric machine learning (ML) regression technique on URAWSS (Understanding Reading and Writing Skills of Schoolchildren) [1] test data from 168 children in grades 1-9. The results showed that the ML approach gave better predictions than the ordinary rule-based decision.

  9. Kepler AutoRegressive Planet Search (KARPS)

    NASA Astrophysics Data System (ADS)

    Caceres, Gabriel

    2018-01-01

    One of the main obstacles in detecting faint planetary transits is the intrinsic stellar variability of the host star. The Kepler AutoRegressive Planet Search (KARPS) project implements statistical methodology associated with autoregressive processes (in particular, ARIMA and ARFIMA) to model stellar lightcurves in order to improve exoplanet transit detection. We also develop a novel Transit Comb Filter (TCF) applied to the AR residuals which provides a periodogram analogous to the standard Box-fitting Least Squares (BLS) periodogram. We train a random forest classifier on known Kepler Objects of Interest (KOIs) using select features from different stages of this analysis, and then use ROC curves to define and calibrate the criteria to recover the KOI planet candidates with high fidelity. These statistical methods are detailed in a contributed poster (Feigelson et al., this meeting). These procedures are applied to the full DR25 dataset of NASA’s Kepler mission. Using the classification criteria, a vast majority of known KOIs are recovered and dozens of new KARPS Candidate Planets (KCPs) discovered, including ultra-short period exoplanets. The KCPs will be briefly presented and discussed.
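
    A much-simplified analogue of the KARPS pipeline (not the project code): whiten a synthetic lightcurve with a least-squares AR fit, then search the residuals for a periodic signal with an ordinary periodogram rather than the TCF/BLS; cadence, AR order and the injected signal are assumptions.

```python
# Simplified analogue of the KARPS idea: AR whitening of a lightcurve followed
# by a residual periodogram. The cadence, AR order and injected transits are
# assumptions; this is not the KARPS implementation.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(12)
n, dt = 4000, 0.02                            # cadence in days (assumed)
t = np.arange(n) * dt
flux = 1 + 0.002 * np.sin(2 * np.pi * t / 3.7) + 0.001 * rng.normal(size=n)
flux[(t % 2.0) < 0.05] -= 0.004               # injected transit dips, period 2.0 d

p = 20                                        # AR order for stellar variability
A = np.column_stack([flux[p - k - 1:n - k - 1] for k in range(p)])
coeffs, *_ = np.linalg.lstsq(A, flux[p:], rcond=None)
resid = flux[p:] - A @ coeffs                 # whitened lightcurve

freqs, power = periodogram(resid, fs=1 / dt)
mask = freqs > 0.05                           # ignore very low frequencies
best = freqs[mask][np.argmax(power[mask])]
print(f"strongest residual period: {1 / best:.2f} d")
```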

  10. A new semiquantitative method for evaluation of metastasis progression.

    PubMed

    Volarevic, A; Ljujic, B; Volarevic, V; Milovanovic, M; Kanjevac, T; Lukic, A; Arsenijevic, N

    2012-01-01

    Although recent technical advancements are directed toward developing novel assays and methods for detection of micro and macro metastasis, there are still no reports of reliable, simple-to-use imaging software that could be used for the detection and quantification of metastasis in tissue sections. We herein report a new semiquantitative method for evaluation of metastasis progression in a well established 4T1 orthotopic mouse model of breast cancer metastasis. The new semiquantitative method presented here was implemented by using the Autodesk AutoCAD 2012 program, a computer-aided design program used primarily for preparing technical drawings in 2 dimensions. By using the Autodesk AutoCAD 2012 software-aided graphical evaluation we managed to detect each metastatic lesion and we precisely calculated the average percentage of lung and liver tissue parenchyma with metastasis in 4T1 tumor-bearing mice. The data were highly specific and relevant to descriptive histological analysis, confirming the reliability and accuracy of the AutoCAD 2012 software as a new method for quantification of metastatic lesions. The new semiquantitative method using AutoCAD 2012 software provides a novel approach for the estimation of metastatic progression in histological tissue sections.

  11. Motion prediction in MRI-guided radiotherapy based on interleaved orthogonal cine-MRI

    NASA Astrophysics Data System (ADS)

    Seregni, M.; Paganelli, C.; Lee, D.; Greer, P. B.; Baroni, G.; Keall, P. J.; Riboldi, M.

    2016-01-01

    In-room cine-MRI guidance can provide non-invasive target localization during radiotherapy treatment. However, in order to cope with finite imaging frequency and system latencies between target localization and dose delivery, tumour motion prediction is required. This work proposes a framework for motion prediction dedicated to cine-MRI guidance, aiming at quantifying the geometric uncertainties introduced by this process for both tumour tracking and beam gating. The tumour position, identified through scale invariant features detected in cine-MRI slices, is estimated at high frequency (25 Hz) using three independent predictors, one for each anatomical coordinate. Linear extrapolation, auto-regressive and support vector machine algorithms are compared against systems that use no prediction or surrogate-based motion estimation. Geometric uncertainties are reported as a function of image acquisition period and system latency. Average results show that the tracking error RMS can be decreased down to a [0.2; 1.2] mm range, for acquisition periods between 250 and 750 ms and system latencies between 50 and 300 ms. Except for the linear extrapolator, tracking and gating prediction errors were, on average, lower than those measured for surrogate-based motion estimation. This finding suggests that cine-MRI guidance, combined with appropriate prediction algorithms, could substantially decrease geometric uncertainties in motion-compensated treatments.
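
    A minimal sketch of the prediction step follows, assuming a 250 ms acquisition period, a 250 ms latency, four position lags and an RBF-kernel support vector regressor; none of these settings or the synthetic motion trace come from the paper. It compares an SVR predictor with simple linear extrapolation for estimating one tumour coordinate one latency ahead.

```python
# Hedged sketch: predict a tumour coordinate one system latency ahead from
# cine-MRI samples, comparing linear extrapolation with an SVR predictor.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
dt, latency = 0.25, 0.25                          # 250 ms acquisition, 250 ms latency (assumed)
t = np.arange(0, 120, dt)
pos = 8.0 * np.sin(2 * np.pi * t / 4.0) + rng.normal(0, 0.3, t.size)   # mm, synthetic trace

lags, ahead = 4, int(round(latency / dt))         # predict `ahead` samples into the future
m = len(pos) - lags - ahead + 1
X = np.column_stack([pos[i:i + m] for i in range(lags)])               # lagged positions
y = pos[lags - 1 + ahead:]                                             # position one latency later

split = int(0.7 * len(y))
svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X[:split], y[:split])
svr_pred = svr.predict(X[split:])
linear_pred = X[split:, -1] + ahead * (X[split:, -1] - X[split:, -2])  # linear extrapolation
truth = y[split:]
for name, pred in (("SVR", svr_pred), ("linear extrapolation", linear_pred)):
    rms = np.sqrt(np.mean((pred - truth) ** 2))
    print(f"{name}: RMS prediction error {rms:.2f} mm")
```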

  12. Lateral Information Processing by Spiking Neurons: A Theoretical Model of the Neural Correlate of Consciousness

    PubMed Central

    Ebner, Marc; Hameroff, Stuart

    2011-01-01

    Cognitive brain functions, for example, sensory perception, motor control and learning, are understood as computation by axonal-dendritic chemical synapses in networks of integrate-and-fire neurons. Cognitive brain functions may occur either consciously or nonconsciously (on “autopilot”). Conscious cognition is marked by gamma synchrony EEG, mediated largely by dendritic-dendritic gap junctions, sideways connections in input/integration layers. Gap-junction-connected neurons define a sub-network within a larger neural network. A theoretical model (the “conscious pilot”) suggests that as gap junctions open and close, a gamma-synchronized subnetwork, or zone moves through the brain as an executive agent, converting nonconscious “auto-pilot” cognition to consciousness, and enhancing computation by coherent processing and collective integration. In this study we implemented sideways “gap junctions” in a single-layer artificial neural network to perform figure/ground separation. The set of neurons connected through gap junctions form a reconfigurable resistive grid or sub-network zone. In the model, outgoing spikes are temporally integrated and spatially averaged using the fixed resistive grid set up by neurons of similar function which are connected through gap-junctions. This spatial average, essentially a feedback signal from the neuron's output, determines whether particular gap junctions between neurons will open or close. Neurons connected through open gap junctions synchronize their output spikes. We have tested our gap-junction-defined sub-network in a one-layer neural network on artificial retinal inputs using real-world images. Our system is able to perform figure/ground separation where the laterally connected sub-network of neurons represents a perceived object. Even though we only show results for visual stimuli, our approach should generalize to other modalities. The system demonstrates a moving sub-network zone of synchrony, within which the contents of perception are represented and contained. This mobile zone can be viewed as a model of the neural correlate of consciousness in the brain. PMID:22046178

  13. Lateral information processing by spiking neurons: a theoretical model of the neural correlate of consciousness.

    PubMed

    Ebner, Marc; Hameroff, Stuart

    2011-01-01

    Cognitive brain functions, for example, sensory perception, motor control and learning, are understood as computation by axonal-dendritic chemical synapses in networks of integrate-and-fire neurons. Cognitive brain functions may occur either consciously or nonconsciously (on "autopilot"). Conscious cognition is marked by gamma synchrony EEG, mediated largely by dendritic-dendritic gap junctions, sideways connections in input/integration layers. Gap-junction-connected neurons define a sub-network within a larger neural network. A theoretical model (the "conscious pilot") suggests that as gap junctions open and close, a gamma-synchronized subnetwork, or zone moves through the brain as an executive agent, converting nonconscious "auto-pilot" cognition to consciousness, and enhancing computation by coherent processing and collective integration. In this study we implemented sideways "gap junctions" in a single-layer artificial neural network to perform figure/ground separation. The set of neurons connected through gap junctions form a reconfigurable resistive grid or sub-network zone. In the model, outgoing spikes are temporally integrated and spatially averaged using the fixed resistive grid set up by neurons of similar function which are connected through gap-junctions. This spatial average, essentially a feedback signal from the neuron's output, determines whether particular gap junctions between neurons will open or close. Neurons connected through open gap junctions synchronize their output spikes. We have tested our gap-junction-defined sub-network in a one-layer neural network on artificial retinal inputs using real-world images. Our system is able to perform figure/ground separation where the laterally connected sub-network of neurons represents a perceived object. Even though we only show results for visual stimuli, our approach should generalize to other modalities. The system demonstrates a moving sub-network zone of synchrony, within which the contents of perception are represented and contained. This mobile zone can be viewed as a model of the neural correlate of consciousness in the brain.

  14. Conditional Poisson models: a flexible alternative to conditional logistic case cross-over analysis.

    PubMed

    Armstrong, Ben G; Gasparrini, Antonio; Tobias, Aurelio

    2014-11-24

    The time stratified case cross-over approach is a popular alternative to conventional time series regression for analysing associations between time series of environmental exposures (air pollution, weather) and counts of health outcomes. These are almost always analyzed using conditional logistic regression on data expanded to case-control (case crossover) format, but this has some limitations. In particular, adjusting for overdispersion and auto-correlation in the counts is not possible. It has been established that a Poisson model for counts with stratum indicators gives identical estimates to those from conditional logistic regression and does not have these limitations, but it is little used, probably because of the overheads in estimating many stratum parameters. The conditional Poisson model avoids estimating stratum parameters by conditioning on the total event count in each stratum, thus simplifying the computing and increasing the number of strata for which fitting is feasible compared with the standard unconditional Poisson model. Unlike the conditional logistic model, the conditional Poisson model does not require expanding the data, and can adjust for overdispersion and auto-correlation. It is available in Stata, R, and other packages. By applying to some real data and using simulations, we demonstrate that conditional Poisson models were simpler to code and shorter to run than conditional logistic analyses and can be fitted to larger data sets than is possible with standard Poisson models. Allowing for overdispersion or autocorrelation was possible with the conditional Poisson model, but when not required this model gave identical estimates to those from conditional logistic regression. Conditional Poisson regression models provide an alternative to case crossover analysis of stratified time series data with some advantages. The conditional Poisson model can also be used in other contexts in which primary control for confounding is by fine stratification.
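
    The abstract notes that a Poisson model with stratum indicators reproduces the conditional logistic estimates; the sketch below shows that indicator version on made-up data (the conditional Poisson model itself conditions on the stratum totals and is available in the packages the authors mention). The simplified year-month stratification, variable names and effect size are illustrative assumptions, not the paper's analysis.

```python
# Hedged sketch: Poisson regression with stratum indicators for a time-stratified
# case cross-over style analysis. Data, variable names and effect size are made up.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
days = 730
stratum = pd.date_range("2012-01-01", periods=days, freq="D").strftime("%Y-%m")
pm10 = rng.gamma(shape=4, scale=10, size=days)
lam = np.exp(1.5 + 0.004 * pm10)                    # true rate ratio ~ exp(0.004) per unit
deaths = rng.poisson(lam)
df = pd.DataFrame({"deaths": deaths, "pm10": pm10, "stratum": stratum})

# Year-month indicators absorb slow seasonal/long-term confounding; conditioning
# on stratum totals would avoid estimating them while giving the same exposure effect.
fit = smf.glm("deaths ~ pm10 + C(stratum)", data=df,
              family=sm.families.Poisson()).fit()
print("rate ratio per unit PM10:", round(float(np.exp(fit.params["pm10"])), 4))
```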

  15. Acute effects of PM2.5 on lung function parameters in schoolchildren in Nanjing, China: a panel study.

    PubMed

    Xu, Dandan; Zhang, Yi; Zhou, Lian; Li, Tiantian

    2018-03-17

    The association between exposure to ambient particulate matter (PM) and reduced lung function parameters has been reported in many studies. However, few studies have been conducted in developing countries with high levels of air pollution like China, and little attention has been paid to the acute effects of short-term exposure to air pollution on lung function. The study design consisted of a panel comprising 86 children from the same school in Nanjing, China. Four measurements of lung function were performed. A mixed-effects regression model with study participant as a random effect was used to investigate the relationship between PM2.5 and lung function. An increase in the current-day, 1-day and 2-day moving average PM2.5 concentration was associated with decreases in lung function indicators. The greatest effect of PM2.5 on lung function was detected at the 1-day moving average PM2.5 exposure. An increase of 10 μg/m3 in the 1-day moving average PM2.5 concentration was associated with a 23.22 mL decrease (95% CI: 13.19, 33.25) in Forced Vital Capacity (FVC), an 18.93 mL decrease (95% CI: 9.34, 28.52) in 1-s Forced Expiratory Volume (FEV1), a 29.38 mL/s decrease (95% CI: -0.40, 59.15) in Peak Expiratory Flow (PEF), and a 27.21 mL/s decrease (95% CI: 8.38, 46.04) in forced expiratory flow 25-75% (FEF25-75%). The effects of PM2.5 on lung function had significant lag effects. After an air pollution event, the health effects last for several days and we still need to pay attention to health protection.
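
    A minimal sketch of the panel analysis described above, assuming a linear mixed model with a random intercept per child and the 1-day moving average PM2.5 as the exposure; the simulated data, column names and effect size are assumptions, not the study's data.

```python
# Hedged sketch of the panel analysis: a linear mixed model with a random
# intercept per child, regressing FVC on the 1-day moving average PM2.5.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
children, visits = 86, 4
df = pd.DataFrame({
    "child": np.repeat(np.arange(children), visits),
    "pm25_ma1": rng.normal(75, 25, children * visits),   # 1-day moving average, µg/m3
})
child_effect = rng.normal(0, 150, children)[df["child"]]  # between-child variation
df["fvc"] = 2300 - 2.3 * df["pm25_ma1"] + child_effect + rng.normal(0, 80, len(df))

fit = smf.mixedlm("fvc ~ pm25_ma1", data=df, groups=df["child"]).fit()
# Scale the slope to a 10 µg/m3 increment, the unit reported in the abstract.
print("FVC change per 10 µg/m3 PM2.5:", round(10 * fit.params["pm25_ma1"], 1), "mL")
```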

  16. Short-Term Exposure to Air Pollution and Biomarkers of Oxidative Stress: The Framingham Heart Study.

    PubMed

    Li, Wenyuan; Wilker, Elissa H; Dorans, Kirsten S; Rice, Mary B; Schwartz, Joel; Coull, Brent A; Koutrakis, Petros; Gold, Diane R; Keaney, John F; Lin, Honghuang; Vasan, Ramachandran S; Benjamin, Emelia J; Mittleman, Murray A

    2016-04-28

    Short-term exposure to elevated air pollution has been associated with higher risk of acute cardiovascular diseases, with systemic oxidative stress induced by air pollution hypothesized as an important underlying mechanism. However, few community-based studies have assessed this association. Two thousand thirty-five Framingham Offspring Cohort participants living within 50 km of the Harvard Boston Supersite who were not current smokers were included. We assessed circulating biomarkers of oxidative stress including blood myeloperoxidase at the seventh examination (1998-2001) and urinary creatinine-indexed 8-epi-prostaglandin F2α (8-epi-PGF2α) at the seventh and eighth (2005-2008) examinations. We measured fine particulate matter (PM2.5), black carbon, sulfate, nitrogen oxides, and ozone at the Supersite and calculated 1-, 2-, 3-, 5-, and 7-day moving averages of each pollutant. Measured myeloperoxidase and 8-epi-PGF2α were loge transformed. We used linear regression models and linear mixed-effects models with random intercepts for myeloperoxidase and indexed 8-epi-PGF2α, respectively. Models were adjusted for demographic variables, individual- and area-level measures of socioeconomic position, clinical and lifestyle factors, weather, and temporal trend. We found positive associations of PM2.5 and black carbon with myeloperoxidase across multiple moving averages. Additionally, 2- to 7-day moving averages of PM2.5 and sulfate were consistently positively associated with 8-epi-PGF2α. Stronger positive associations of black carbon and sulfate with myeloperoxidase were observed among participants with diabetes than in those without. Our community-based investigation supports an association of select markers of ambient air pollution with circulating biomarkers of oxidative stress. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  17. Experimental investigation of a moving averaging algorithm for motion perpendicular to the leaf travel direction in dynamic MLC target tracking.

    PubMed

    Yoon, Jai-Woong; Sawant, Amit; Suh, Yelin; Cho, Byung-Chul; Suh, Tae-Suk; Keall, Paul

    2011-07-01

    In dynamic multileaf collimator (MLC) motion tracking with complex intensity-modulated radiation therapy (IMRT) fields, target motion perpendicular to the MLC leaf travel direction can cause beam holds, which increase beam delivery time by up to a factor of 4. As a means to balance delivery efficiency and accuracy, a moving average algorithm was incorporated into a dynamic MLC motion tracking system (i.e., moving average tracking) to account for target motion perpendicular to the MLC leaf travel direction. The experimental investigation of the moving average algorithm compared with real-time tracking and no compensation beam delivery is described. The properties of the moving average algorithm were measured and compared with those of real-time tracking (dynamic MLC motion tracking accounting for both target motion parallel and perpendicular to the leaf travel direction) and no compensation beam delivery. The algorithm was investigated using a synthetic motion trace with a baseline drift and four patient-measured 3D tumor motion traces representing regular and irregular motions with varying baseline drifts. Each motion trace was reproduced by a moving platform. The delivery efficiency, geometric accuracy, and dosimetric accuracy were evaluated for conformal, step-and-shoot IMRT, and dynamic sliding window IMRT treatment plans using the synthetic and patient motion traces. The dosimetric accuracy was quantified via a gamma-test with a 3%/3 mm criterion. The delivery efficiency ranged from 89 to 100% for moving average tracking, 26%-100% for real-time tracking, and 100% (by definition) for no compensation. The root-mean-square geometric error ranged from 3.2 to 4.0 mm for moving average tracking, 0.7-1.1 mm for real-time tracking, and 3.7-7.2 mm for no compensation. The percentage of dosimetric points failing the gamma-test ranged from 4 to 30% for moving average tracking, 0%-23% for real-time tracking, and 10%-47% for no compensation. The delivery efficiency of moving average tracking was up to four times higher than that of real-time tracking and approached the efficiency of no compensation for all cases. The geometric accuracy and dosimetric accuracy of the moving average algorithm were between real-time tracking and no compensation, with approximately half the percentage of dosimetric points failing the gamma-test compared with no compensation.
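
    A toy numerical sketch of the moving-average idea (not the authors' implementation): the coordinate perpendicular to leaf travel is smoothed with a trailing running mean, which limits the leaf-fitting adjustments that trigger beam holds at the cost of a tracking lag. The window length and synthetic motion trace are assumptions.

```python
# Hedged sketch: trailing moving average of the perpendicular target coordinate,
# compared against no compensation (aperture fixed at the mean position).
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(0, 60, 0.05)                      # 20 Hz target position samples (assumed)
perp = 4 * np.sin(2 * np.pi * t / 4) + 0.05 * t + rng.normal(0, 0.2, t.size)  # mm, with drift

window = 40                                     # ~2 s trailing window (assumed)
kernel = np.ones(window) / window
smoothed = np.convolve(perp, kernel, mode="valid")   # trailing moving average
actual = perp[window - 1:]                           # positions the smoother estimates

def rms(err):
    return float(np.sqrt(np.mean(err ** 2)))

print("moving-average tracking RMS error:", round(rms(smoothed - actual), 2), "mm")
print("no-compensation RMS error        :", round(rms(actual - actual.mean()), 2), "mm")
```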

  18. TU-AB-BRC-11: Moving a GPU-OpenCL-Based Monte Carlo (MC) Dose Engine Towards Routine Clinical Use: Automatic Beam Commissioning and Efficient Source Sampling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Z; Folkerts, M; Jiang, S

    Purpose: We have previously developed a GPU-OpenCL-based MC dose engine named goMC with a built-in analytical linac beam model. To move goMC towards routine clinical use, we have developed an automatic beam-commissioning method, and an efficient source sampling strategy to facilitate dose calculations for real treatment plans. Methods: Our commissioning method is to automatically adjust the relative weights among the sub-sources, through an optimization process minimizing the discrepancies between calculated dose and measurements. Six models built for Varian Truebeam linac photon beams (6MV, 10MV, 15MV, 18MV, 6MVFFF, 10MVFFF) were commissioned using measurement data acquired at our institution. To facilitate dose calculations for real treatment plans, we employed an inverse sampling method to efficiently incorporate MLC leaf-sequencing into source sampling. Specifically, instead of sampling source particles control-point by control-point and rejecting the particles blocked by MLC, we assigned a control-point index to each sampled source particle, according to the MLC leaf-open duration of each control-point at the pixel where the particle intersects the iso-center plane. Results: Our auto-commissioning method decreased the distance-to-agreement (DTA) of depth dose at build-up regions by 36.2% on average, bringing it to within 1 mm. Lateral profiles were better matched for all beams, with the biggest improvement found at 15MV, for which the root-mean-square difference was reduced from 1.44% to 0.50%. Maximum differences of output factors were reduced to less than 0.7% for all beams, with the largest decrease, from 1.70% to 0.37%, found at 10FFF. Our new sampling strategy was tested on a Head & Neck VMAT patient case. Achieving clinically acceptable accuracy, the new strategy could reduce the required history number by a factor of ~2.8 at a given statistical uncertainty level and hence achieve a similar speed-up factor. Conclusion: Our studies have demonstrated the feasibility and effectiveness of our auto-commissioning approach and new efficient source sampling strategy, implying the potential of our GPU-based MC dose engine goMC for routine clinical use.

  19. Genetic and Environmental Contributions to the Development of Childhood Aggression

    ERIC Educational Resources Information Center

    Lubke, Gitta H.; McArtor, Daniel B.; Boomsma, Dorret I.; Bartels, Meike

    2018-01-01

    Longitudinal data from a large sample of twins participating in the Netherlands Twin Register (n = 42,827, age range 3-16) were analyzed to investigate the genetic and environmental contributions to childhood aggression. Genetic auto-regressive (simplex) models were used to assess whether the same genes are involved or whether new genes come into…

  20. Latent Transition Analysis of Pre-Service Teachers' Efficacy in Mathematics and Science

    ERIC Educational Resources Information Center

    Ward, Elizabeth Kennedy

    2009-01-01

    This study modeled changes in pre-service teacher efficacy in mathematics and science over the course of the final year of teacher preparation using latent transition analysis (LTA), a longitudinal form of analysis that builds on two modeling traditions (latent class analysis (LCA) and auto-regressive modeling). Data were collected using the…

  1. Model variations in predicting incidence of Plasmodium falciparum malaria using 1998-2007 morbidity and meteorological data from south Ethiopia

    PubMed Central

    2010-01-01

    Background Malaria transmission is complex and is believed to be associated with local climate changes. However, simple attempts to extrapolate malaria incidence rates from averaged regional meteorological conditions have proven unsuccessful. Therefore, the objective of this study was to determine if variations in specific meteorological factors are able to consistently predict P. falciparum malaria incidence at different locations in south Ethiopia. Methods Retrospective data from 42 locations were collected including P. falciparum malaria incidence for the period of 1998-2007 and meteorological variables such as monthly rainfall (all locations), temperature (17 locations), and relative humidity (three locations). Thirty-five data sets qualified for the analysis. The Ljung-Box Q statistic was used for model diagnosis, and R squared or stationary R squared was taken as the goodness-of-fit measure. Time series modelling was carried out using Transfer Function (TF) models and univariate auto-regressive integrated moving average (ARIMA) models when there was no significant predictor meteorological variable. Results Of 35 models, five were discarded because of significant values of the Ljung-Box Q statistic. Past P. falciparum malaria incidence alone (17 locations) or when coupled with meteorological variables (four locations) was able to predict P. falciparum malaria incidence within statistical significance. All seasonal ARIMA orders were from locations at altitudes above 1742 m. Monthly rainfall, minimum temperature and maximum temperature were able to predict incidence at four, five and two locations, respectively. In contrast, relative humidity was not able to predict P. falciparum malaria incidence. The R squared values for the models ranged from 16% to 97%, with the exception of one model which had a negative value. Models with seasonal ARIMA orders were found to perform better. However, the models for predicting P. falciparum malaria incidence varied from location to location, and among lagged effects, data transformation forms, ARIMA and TF orders. Conclusions This study describes P. falciparum malaria incidence models linked with meteorological data. Variability in the models was principally attributed to regional differences, and a single model was not found that fits all locations. Past P. falciparum malaria incidence appeared to be a superior predictor to meteorology. Future efforts in malaria modelling may benefit from inclusion of non-meteorological factors. PMID:20553590
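
    A hedged sketch of the kind of model fitted per location: a seasonal ARIMA for monthly incidence with lagged rainfall as an exogenous input, which is a simplified stand-in for the study's transfer function models. The synthetic series, the two-month lag and the model orders are illustrative assumptions, not the study's fitted values.

```python
# Hedged sketch: seasonal ARIMA with lagged rainfall as an exogenous predictor
# (a simplified transfer-function-style model). Data and orders are illustrative.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(6)
months = pd.date_range("1998-01-01", periods=120, freq="MS")
rain = 80 + 60 * np.sin(2 * np.pi * (np.arange(120) % 12) / 12) + rng.normal(0, 15, 120)
incidence = 30 + 0.2 * np.roll(rain, 2) + rng.normal(0, 5, 120)   # 2-month lagged effect

y = pd.Series(incidence, index=months)
x = pd.Series(rain, index=months).shift(2)                         # rainfall lagged 2 months
y, x = y.iloc[2:], x.iloc[2:]

fit = SARIMAX(y, exog=x, order=(1, 0, 1), seasonal_order=(1, 0, 0, 12)).fit(disp=False)
print(fit.summary().tables[1])                                     # coefficient table
next_month = fit.forecast(steps=1, exog=np.array([[x.iloc[-1]]]))
print("next-month incidence forecast:", round(float(next_month.iloc[0]), 1))
```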

  2. Exposure to carbon monoxide, methyl-tertiary butyl ether (MTBE), and benzene levels inside vehicles traveling on an urban area in Korea.

    PubMed

    Jo, W K; Park, K H

    1998-01-01

    This study was designed to allow systematic comparison of exposure on public (40-seater buses) and private (four passenger cars) transport modes for carbon monoxide (CO), methyl-tertiary butyl ether (MTBE), and benzene by carrying out simultaneous measurements along the same routes. There were statistically significant differences (p < 0.05) in the concentrations of all target compounds among the three microenvironments: inside autos, inside buses, and in ambient air. The target compounds were significantly correlated for all three environments, with at least p < 0.05. The in-vehicle concentrations of MTBE and benzene were significantly higher (p < 0.0001), on average 3.5 times higher, in the car with a carbureted engine than in the other three electronic fuel-injected cars. On the other hand, the CO concentrations were not significantly different among the four cars. The in-auto MTBE levels (48.5 micrograms/m3 as a median) measured during commutes in this study were 2-3 times higher than results from New Jersey and Connecticut. The in-auto concentration of CO (4.8 ppm as a median) in this study was comparable with those in later studies in some American cities, but much lower than those in earlier studies in other American cities. The in-bus CO concentration was 3.6 ppm as a median. As a median, the in-auto concentration of benzene was 44.9 micrograms/m3, while the in-bus concentration was 17.0 micrograms/m3. The in-auto/in-bus exposure ratios for all the target compounds were 31-40% higher than the corresponding concentration ratios, due to the higher travel speed of buses on the specified commute route as compared to the autos.

  3. Decay of the supersonic turbulent wakes from micro-ramps

    NASA Astrophysics Data System (ADS)

    Sun, Z.; Schrijer, F. F. J.; Scarano, F.; van Oudheusden, B. W.

    2014-02-01

    The wakes resulting from micro-ramps immersed in a supersonic turbulent boundary layer at Ma = 2.0 are investigated by means of particle image velocimetry. Two micro-ramps are investigated with heights of 60% and 80% of the undisturbed boundary layer thickness, respectively. The measurement domain is placed at the symmetry plane of the ramp and encompasses the range from 10 to 32 ramp heights downstream of the ramp. The decay of the flow field properties is evaluated in terms of time-averaged and root-mean-square (RMS) statistics. In the time-averaged flow field, the recovery from the imparted momentum deficit and the decay of upwash motion are analyzed. The RMS fluctuations of the velocity components exhibit strong anisotropy at the most upstream location and develop into a more isotropic regime downstream. The self-similarity properties of the velocity and fluctuation components along the wall-normal direction are examined. The investigation of the unsteady large-scale motion is carried out by means of snapshot analysis and by a statistical approach based on the spatial auto-correlation function. The Kelvin-Helmholtz (K-H) instability at the upper shear layer is observed to develop further with the onset of vortex pairing. The average distance between vortices is statistically estimated using the spatial auto-correlation. A marked transition, with an increase in wavelength, is observed across the pairing regime. The K-H instability, initially observed only at the upper shear layer, also begins to appear in the lower shear layer as soon as the wake is elevated sufficiently off the wall. The auto-correlation statistics confirm the coherence of counter-rotating vortices from the upper and lower sides, indicating the formation of vortex rings downstream of the pairing region.

  4. Concentrations of fine, ultrafine, and black carbon particles in auto-rickshaws in New Delhi, India

    NASA Astrophysics Data System (ADS)

    Apte, Joshua S.; Kirchstetter, Thomas W.; Reich, Alexander H.; Deshpande, Shyam J.; Kaushik, Geetanjali; Chel, Arvind; Marshall, Julian D.; Nazaroff, William W.

    2011-08-01

    Concentrations of air pollutants from vehicles are elevated along roadways, indicating that human exposure in transportation microenvironments may not be adequately characterized by centrally located monitors. We report results from ~180 h of real-time measurements of fine particle and black carbon mass concentration (PM2.5, BC) and ultrafine particle number concentration (PN) inside a common vehicle, the auto-rickshaw, in New Delhi, India. Measured exposure concentrations are much higher in this study (geometric mean for ~60 trip-averaged concentrations: 190 μg m⁻³ PM2.5, 42 μg m⁻³ BC, 280 × 10³ particles cm⁻³; GSD ~1.3 for all three pollutants) than reported for transportation microenvironments in other megacities. In-vehicle concentrations exceeded simultaneously measured ambient levels by 1.5× for PM2.5, 3.6× for BC, and 8.4× for PN. Short-duration peak concentrations (averaging time: 10 s), attributable to exhaust plumes of nearby vehicles, were greater than 300 μg m⁻³ for PM2.5, 85 μg m⁻³ for BC, and 650 × 10³ particles cm⁻³ for PN. The incremental increase of within-vehicle concentration above ambient levels, which we attribute to in- and near-roadway emission sources, accounted for 30%, 68% and 86% of time-averaged in-vehicle PM2.5, BC and PN concentrations, respectively. Based on these results, we estimate that one's exposure during a daily commute by auto-rickshaw in Delhi is at least as large as the full-day exposures experienced by urban residents of many high-income countries. This study illuminates an environmental health concern that may be common in many populous, low-income cities.

  5. The impact of household cooking and heating with solid fuels on ambient PM2.5 in peri-urban Beijing

    NASA Astrophysics Data System (ADS)

    Liao, Jiawen; Zimmermann Jin, Anna; Chafe, Zoë A.; Pillarisetti, Ajay; Yu, Tao; Shan, Ming; Yang, Xudong; Li, Haixi; Liu, Guangqing; Smith, Kirk R.

    2017-09-01

    Household cooking and space heating with biomass and coal have adverse impacts on both indoor and outdoor air quality and are associated with a significant health burden. Though household heating with biomass and coal is common in northern China, the contribution of space heating to ambient air pollution is not well studied. We investigated the impact of space heating on ambient air pollution in a village 40 km southwest of central Beijing during the winter heating season, from January to March 2013. Ambient PM2.5 concentrations and meteorological conditions were measured continuously at rooftop sites in the village during two winter months in 2013. The use of coal- and biomass-burning cookstoves and space heating devices was measured over time with Stove Use Monitors (SUMs) in 33 households and was coupled with fuel consumption data from household surveys to estimate hourly household PM2.5 emissions from cooking and space heating over the same period. We developed a multivariate linear regression model to assess the relationship between household PM2.5 emissions and the hourly average ambient PM2.5 concentration, and a time series autoregressive integrated moving average (ARIMA) regression model to account for autocorrelation. During the heating season, the average hourly ambient PM2.5 concentration was 139 ± 107 μg/m3 (mean ± SD) with strong autocorrelation in hourly concentration. The average primary PM2.5 emission per hour from village household space heating was 0.736 ± 0.138 kg/hour. The linear multivariate regression model indicated that during the heating season - after adjusting for meteorological effects - 39% (95% CI: 26%, 54%) of hourly averaged ambient PM2.5 was associated with household space heating emissions from the previous hour. Our study suggests that a comprehensive pollution control strategy for northern China, including Beijing, should address uncontrolled emissions from household solid fuel combustion in surrounding areas, particularly during the winter heating season.

  6. Correlation among auto-refractor, wavefront aberration, and subjective manual refraction

    NASA Astrophysics Data System (ADS)

    Li, Qi; Ren, Qiushi

    2005-01-01

    Three optometry methods, auto-refractor, wavefront aberrometer and subjective manual refraction, were studied and compared in measuring low-order aberrations of 117 normal eyes of 60 people. Paired t-tests and linear regression were used to study the relationship among these three methods when measuring myopia with astigmatism. In order to make the analysis clearer, we divided the 117 normal eyes into different groups according to their subjective manual refraction and repeated the statistical analysis. Correlations among the three methods were significant for sphere, cylinder and axis in all groups, with sphere's correlation coefficients the largest (R>0.98, P<0.01) and cylinder's the smallest (0.900.01). The auto-refractor differed significantly from the other two methods when measuring cylinder (P<0.01). The results after grouping differed a little from the analysis of all subjects. Although the three methods showed significant differences from each other in certain parameters, the magnitudes of these differences were not large, which indicated that the agreement of auto-refractor, wavefront aberrometer and subjective refraction is good. However, we suggest that wavefront aberration measurement could be a good starting point for optometry, while subjective refraction is still necessary for refinement.

  7. Quantitation of mandibular ramus volume as a source of bone grafting.

    PubMed

    Verdugo, Fernando; Simonian, Krikor; Smith McDonald, Roberto; Nowzari, Hessam

    2009-10-01

    When alveolar atrophy impairs dental implant placement, ridge augmentation using mandibular ramus graft may be considered. In live patients, however, an accurate calculation of the amount of bone that can be safely harvested from the ramus has not been reported. The use of a software program to perform these calculations can aid in preventing surgical complications. The aim of the present study was to intra-surgically quantify the volume of the ramus bone graft that can be safely harvested in live patients, and compare it to presurgical computerized tomographic calculations. The AutoCAD software program quantified ramus bone graft in 40 consecutive patients from computerized tomographies. Direct intra-surgical measurements were recorded thereafter and compared to software data (n = 10). In these 10 patients, the bone volume was also measured at the recipient sites 6 months post-sinus augmentation. The mandibular second and third molar areas provided the thickest cortical graft averaging 2.8 +/- 0.6 mm. The thinnest bone was immediately posterior to the third molar (1.9 +/- 0.3 mm). The volume of ramus bone graft measured by AutoCAD averaged 0.8 mL (standard deviation [SD] 0.2 mL, range: 0.4-1.2 mL). The volume of bone graft measured intra-surgically averaged 2.5 mL (SD 0.4 mL, range: 1.8-3.0 mL). The difference between the two measurement methods was significant (p < 0.001). The bone volume measured 6 months post-sinus augmentation averaged 2.2 mL (SD 0.4 mL, range: 1.6-2.8 mL) with a mean loss of 0.3 mL in volume. The mandibular second molar area provided the thickest cortical graft. A cortical plate of 2.8 mm in average at combined second and third molar areas provided 2.5 mL particulated volume. The use of a design software program can improve surgical treatment planning prior to ramus bone grafting. The AutoCAD software program did not overestimate the volume of bone that can be safely harvested from the mandibular ramus.

  8. Kepler AutoRegressive Planet Search

    NASA Astrophysics Data System (ADS)

    Caceres, Gabriel Antonio; Feigelson, Eric

    2016-01-01

    The Kepler AutoRegressive Planet Search (KARPS) project uses statistical methodology associated with autoregressive (AR) processes to model Kepler lightcurves in order to improve exoplanet transit detection in systems with high stellar variability. We also introduce a planet-search algorithm to detect transits in time-series residuals after application of the AR models. One of the main obstacles in detecting faint planetary transits is the intrinsic stellar variability of the host star. The variability displayed by many stars may have autoregressive properties, wherein later flux values are correlated with previous ones in some manner. Our analysis procedure consists of three steps: pre-processing of the data to remove discontinuities, gaps and outliers; AR-type model selection and fitting; and transit signal search of the residuals using a new Transit Comb Filter (TCF) that replaces traditional box-finding algorithms. The analysis procedures of the project are applied to a portion of the publicly available Kepler light curve data for the full 4-year mission duration. Tests of the methods have been made on a subset of Kepler Objects of Interest (KOI) systems, classified both as planetary `candidates' and `false positives' by the Kepler Team, as well as a random sample of unclassified systems. We find that the ARMA-type modeling successfully reduces the stellar variability, by a factor of 10 or more in active stars and by smaller factors in more quiescent stars. A typical quiescent Kepler star has an interquartile range (IQR) of ~10 e-/sec, which may improve slightly after modeling, while those with IQR ranging from 20 to 50 e-/sec have improvements from 20% up to 70%. High activity stars (IQR exceeding 100) markedly improve. A periodogram based on the TCF is constructed to concentrate the signal of these periodic spikes. When a periodic transit is found, the model is displayed on a standard period-folded averaged light curve. Our findings to date on real-data tests of the KARPS methodology will be discussed, including confirmation of some Kepler Team `candidate' planets. We also present cases of new possible planetary signals.

  9. Horses Auto-Recruit Their Lungs by Inspiratory Breath Holding Following Recovery from General Anaesthesia

    PubMed Central

    Mosing, Martina; Waldmann, Andreas D.; MacFarlane, Paul; Iff, Samuel; Auer, Ulrike; Bohm, Stephan H.; Bettschart-Wolfensberger, Regula; Bardell, David

    2016-01-01

    This study evaluated the breathing pattern and distribution of ventilation in horses prior to and following recovery from general anaesthesia using electrical impedance tomography (EIT). Six horses were anaesthetised for 6 hours in dorsal recumbency. Arterial blood gas and EIT measurements were performed 24 hours before (baseline) and 1, 2, 3, 4, 5 and 6 hours after horses stood following anaesthesia. At each time point 4 representative spontaneous breaths were analysed. The percentage of the total breath length during which impedance remained greater than 50% of the maximum inspiratory impedance change (breath holding), the fraction of total tidal ventilation within each of four stacked regions of interest (ROI) (distribution of ventilation) and the filling time and inflation period of seven ROI evenly distributed over the dorso-ventral height of the lungs were calculated. Mixed effects multi-linear regression and linear regression were used and significance was set at p<0.05. All horses demonstrated inspiratory breath holding until 5 hours after standing. No change from baseline was seen for the distribution of ventilation during inspiration. Filling time and inflation period were, respectively, more rapid and shorter in ventral ROI and slower and longer in the most dorsal ROI compared to baseline. In a mixed effects multi-linear regression, breath holding was significantly correlated with PaCO2 in both the univariate and multivariate regression. Following recovery from anaesthesia, horses showed inspiratory breath holding during which gas redistributed from ventral into dorsal regions of the lungs. This suggests auto-recruitment of lung tissue which would have been dependent and likely atelectatic during anaesthesia. PMID:27331910

  10. Detection and characterization of stomach cancer and atrophic gastritis with fluorescence and Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Li, Xiaozhou; Lin, Junxiu; Jia, Chunde; Wang, Rong

    2003-12-01

    In this paper, we attempt to find a valid method to distinguish gastric cancer from atrophic gastritis. Laser-induced (514.5 nm and 488.0 nm) auto-fluorescence and Raman spectra of serum were measured. The serum spectra differ between normal and cancer samples. For normal serum, the average red shift is less than 12 nm and the relative intensity of Raman peak C excited at 514.5 nm is stronger than that at 488.0 nm. For gastric cancer, the average red shift is greater than 12 nm and the relative intensity of Raman peak C excited at 514.5 nm is weaker than that at 488.0 nm. For atrophic gastritis, the distribution of Raman peaks is similar to that of normal serum, while the shape of the auto-fluorescence spectrum is similar to that of gastric cancer; its average Raman red shift is greater than 12 nm and the relative intensity of peak C excited at 514.5 nm is stronger than that at 488.0 nm. Using these criteria, we obtained an accuracy of 85.6% for the diagnosis of gastric cancer compared with the result of clinical diagnosis.

  11. ArF scanner performance improvement by using track integrated CD optimization

    NASA Astrophysics Data System (ADS)

    Huang, Jacky; Yu, Shinn-Sheng; Ke, Chih-Ming; Wu, Timothy; Wang, Yu-Hsi; Gau, Tsai-Sheng; Wang, Dennis; Li, Allen; Yang, Wenge; Kaoru, Araki

    2006-03-01

    In advanced semiconductor processing, shrinking CD is one of the main objectives when moving to the next generation technology. Improving CD uniformity (CDU) with shrinking CD is one of the biggest challenges. From ArF lithography CD error budget analysis, PEB (post exposure bake) contributes more than 40% CD variations. It turns out that hot plate performance such as CD matching and within-plate temperature control play key roles in litho cell wafer per hour (WPH). Traditionally wired or wireless thermal sensor wafers were used to match and optimize hot plates. However, sensor-to-sensor matching and sensor data quality vs. sensor lifetime or sensor thermal history are still unknown. These concerns make sensor wafers more suitable for coarse mean-temperature adjustment. For precise temperature adjustment, especially within-hot-plate temperature uniformity, using CD instead of sensor wafer temperature is a better and more straightforward metrology to calibrate hot plates. In this study, we evaluated TEL clean track integrated optical CD metrology (IM) combined with TEL CD Optimizer (CDO) software to improve 193-nm resist within-wafer and wafer-to-wafer CD uniformity. Within-wafer CD uniformity is mainly affected by the temperature non-uniformity on the PEB hot plate. Based on CD and PEB sensitivity of photo resists, a physical model has been established to control the CD uniformity through fine-tuning PEB temperature settings. CD data collected by track integrated CD metrology was fed into this model, and the adjustment of PEB setting was calculated and executed through track internal APC system. This auto measurement, auto feed forward, auto calibration and auto adjustment system can reduce the engineer key-in error and improve the hot plate calibration cycle time. And this PEB auto calibration system can easily bring hot-plate-to-hot-plate CD matching to within 0.5nm and within-wafer CDU (3σ) to less than 1.5nm.

  12. Auto-positioning ultrasonic transducer system

    NASA Technical Reports Server (NTRS)

    Buchanan, Randy K. (Inventor)

    2010-01-01

    An ultrasonic transducer apparatus and process for determining the optimal transducer position for flow measurement along a conduit outer surface. The apparatus includes a transmitting transducer for transmitting an ultrasonic signal, said transducer affixed to a conduit outer surface; a guide rail attached to a receiving transducer for guiding movement of a receiving transducer along the conduit outer surface, wherein the receiving transducer receives an ultrasonic signal from the transmitting transducer and sends a signal to a data acquisition system; and a motor for moving the receiving transducer along the guide rail, wherein the motor is controlled by a controller. The method includes affixing a transmitting transducer to an outer surface of a conduit; moving a receiving transducer on the conduit outer surface, wherein the receiving transducer is moved along a guide rail by a motor; transmitting an ultrasonic signal from the transmitting transducer that is received by the receiving transducer; communicating the signal received by the receiving transducer to a data acquisition and control system; and repeating the moving, transmitting, and communicating along a length of the conduit.

  13. QuickVina: accelerating AutoDock Vina using gradient-based heuristics for global optimization.

    PubMed

    Handoko, Stephanus Daniel; Ouyang, Xuchang; Su, Chinh Tran To; Kwoh, Chee Keong; Ong, Yew Soon

    2012-01-01

    Predicting binding between a macromolecule and a small molecule is a crucial phase in the field of rational drug design. AutoDock Vina, one of the most widely used docking software packages, released in 2009, uses an empirical scoring function to evaluate the binding affinity between the molecules and employs the iterated local search global optimizer for global optimization, achieving significantly improved speed and better accuracy of binding mode prediction compared with its predecessor, AutoDock 4. In this paper, we propose a further improvement in the local search algorithm of Vina by heuristically preventing some intermediate points from undergoing local search. Our improved version of Vina, dubbed QVina, achieved a maximum acceleration of about 25 times with an average speed-up of 8.34 times compared to the original Vina when tested on a set of 231 protein-ligand complexes, while keeping the optimal scores mostly identical.

  14. A Simple Introduction to Moving Least Squares and Local Regression Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garimella, Rao Veerabhadra

    In this brief note, a highly simplified introduction to estimating functions over a set of particles is presented. The note starts from Global Least Squares fitting, going on to Moving Least Squares estimation (MLS) and finally, Local Regression Estimation (LRE).
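
    A brief sketch of the Moving Least Squares / local regression idea in one dimension, assuming a Gaussian weight function, a linear basis and synthetic particle data: at each query point a weighted linear fit is solved with weights centred on that point, and the fit's value at the point is the local estimate. The kernel width and sample data are illustrative choices, not from the note.

```python
# Hedged sketch of 1-D Moving Least Squares estimation over scattered particles.
import numpy as np

rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 2 * np.pi, 80))        # scattered "particle" locations
f = np.sin(x) + rng.normal(0, 0.05, x.size)       # noisy samples of a field

def mls_estimate(xq, x, f, h=0.5):
    """Weighted linear fit around xq; returns the local estimate at xq."""
    w = np.exp(-((x - xq) / h) ** 2)              # moving (query-centred) Gaussian weights
    A = np.column_stack([np.ones_like(x), x - xq])  # linear basis [1, (x - xq)]
    WA = A * w[:, None]
    coef, *_ = np.linalg.lstsq(WA.T @ A, WA.T @ f, rcond=None)  # weighted normal equations
    return coef[0]                                 # value of the local fit at xq

for xq in np.linspace(0.5, 5.5, 6):
    print(f"u({xq:.1f}) ≈ {mls_estimate(xq, x, f):.3f}  (true {np.sin(xq):.3f})")
```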

  15. Notes sur les mouvements recursifs (Notes on Regressive Moves).

    ERIC Educational Resources Information Center

    Auchlin, Antoine; And Others

    1981-01-01

    Examines the phenomenon of regressive moves (retro-interpretation) in the light of a hypothesis according to which the formation of complex and hierarchically organized conversation units is subordinated to the linearity of discourse. Analyzes a transactional exchange, describing the interplay of integration, anticipation, and retro-interpretation…

  16. The effect of airline deregulation on automobile fatalities.

    PubMed

    Bylow, L F; Savage, I

    1991-10-01

    This paper attempts to quantify the effects of airline deregulation in the United States on intercity automobile travel and consequently on the number of highway fatalities. A demand model is constructed for auto travel, which includes variables representing the price and availability of air service. A reduced form model of the airline market is then estimated. Finding that deregulation has decreased airfares and increased flights, it is estimated that auto travel has been reduced by 2.2% per year on average. Given assumptions on the characteristics of drivers switching modes and the types of roads they drove on, the number of automobile fatalities averted since 1978 is estimated to be in the range 200-300 per year.

  17. Analyzing spatial coherence using a single mobile field sensor.

    PubMed

    Fridman, Peter

    2007-04-01

    According to the Van Cittert-Zernike theorem, the intensity distribution of a spatially incoherent source and the mutual coherence function of the light impinging on two wave sensors are related. This paper calculates the comparable relationship for a single mobile sensor moving at a certain velocity relative to the source. The auto-correlation function of the electric field at the sensor contains information about the intensity distribution. This expression could be employed in aperture synthesis.

  18. The economic impact of a smoke-free bylaw on restaurant and bar sales in Ottawa, Canada.

    PubMed

    Luk, Rita; Ferrence, Roberta; Gmel, Gerhard

    2006-05-01

    On 1 August 2001, the City of Ottawa (Canada's Capital) implemented a smoke-free bylaw that completely prohibited smoking in work-places and public places, including restaurants and bars, with no exemption for separately ventilated smoking rooms. This paper evaluates the effects of this bylaw on restaurant and bar sales. DATA AND MEASURES: We used retail sales tax data from March 1998 to June 2002 to construct two outcome measures: the ratio of licensed restaurant and bar sales to total retail sales and the ratio of unlicensed restaurant sales to total retail sales. Restaurant and bar sales were subtracted from total retail sales in the denominator of these measures. We employed an interrupted time-series design. Autoregressive integrated moving average (ARIMA) intervention analysis was used to test for three possible impacts that the bylaw might have on the sales of restaurants and bars. We repeated the analysis using regression with autoregressive moving average (ARMA) errors method to triangulate our results. Outcome measures showed declining trends at baseline before the bylaw went into effect. Results from ARIMA intervention and regression analyses did not support the hypotheses that the smoke-free bylaw had an impact that resulted in (1) abrupt permanent, (2) gradual permanent or (3) abrupt temporary changes in restaurant and bar sales. While a large body of research has found no significant adverse impact of smoke-free legislation on restaurant and bar sales in the United States, Australia and elsewhere, our study confirms these results in a northern region with a bilingual population, which has important implications for impending policy in Europe and other areas.
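
    A hedged sketch of the interrupted time-series test for the abrupt-permanent hypothesis: a seasonal ARIMA with a step regressor switched on from the bylaw month, fitted with statsmodels' SARIMAX. The synthetic sales-ratio series and the model orders are assumptions, not the study's data or specification.

```python
# Hedged sketch: ARIMA intervention analysis with a step (abrupt, permanent) regressor.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(8)
months = pd.date_range("1998-03-01", periods=52, freq="MS")        # March 1998 - June 2002
ratio = pd.Series(0.06 - 0.0002 * np.arange(52) + rng.normal(0, 0.002, 52),
                  index=months, name="bar_sales_ratio")            # declining baseline trend
step = (months >= "2001-08-01").astype(float)                      # bylaw in force from Aug 2001

fit = SARIMAX(ratio, exog=step, order=(1, 1, 1)).fit(disp=False)   # regression with ARIMA errors
coef, pval = fit.params["x1"], fit.pvalues["x1"]
print(f"estimated level shift after bylaw: {coef:.4f} (p = {pval:.2f})")
# A gradual-permanent or abrupt-temporary impact would use a ramp or pulse
# regressor (or a transfer function of the pulse) instead of the step.
```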

  19. Application of a Combined Model with Autoregressive Integrated Moving Average (ARIMA) and Generalized Regression Neural Network (GRNN) in Forecasting Hepatitis Incidence in Heng County, China

    PubMed Central

    Liang, Hao; Gao, Lian; Liang, Bingyu; Huang, Jiegang; Zang, Ning; Liao, Yanyan; Yu, Jun; Lai, Jingzhen; Qin, Fengxiang; Su, Jinming; Ye, Li; Chen, Hui

    2016-01-01

    Background Hepatitis is a serious public health problem with increasing cases and property damage in Heng County. It is necessary to develop a model to predict the hepatitis epidemic that could be useful for preventing this disease. Methods The autoregressive integrated moving average (ARIMA) model and the generalized regression neural network (GRNN) model were used to fit the incidence data from the Heng County CDC (Center for Disease Control and Prevention) from January 2005 to December 2012. Then, the ARIMA-GRNN hybrid model was developed. The incidence data from January 2013 to December 2013 were used to validate the models. Several parameters, including mean absolute error (MAE), root mean square error (RMSE), mean absolute percentage error (MAPE) and mean square error (MSE), were used to compare the performance among the three models. Results The morbidity of hepatitis from Jan 2005 to Dec 2012 showed seasonal variation and a slightly rising trend. The ARIMA(0,1,2)(1,1,1)12 model was the most appropriate one, with the residual test showing a white-noise sequence. The smoothing factors of the basic GRNN model and the combined model were 1.8 and 0.07, respectively. The four parameters of the hybrid model were lower than those of the two single models in the validation. The parameter values of the GRNN model were the lowest in the fitting of the three models. Conclusions The hybrid ARIMA-GRNN model showed better hepatitis incidence forecasting in Heng County than the single ARIMA model and the basic GRNN model. It is a potential decision-supportive tool for controlling hepatitis in Heng County. PMID:27258555
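
    A hedged sketch of the ARIMA-GRNN combination on synthetic monthly incidence: an ARIMA model is fitted to the training years, a GRNN (implemented here as Gaussian-kernel Nadaraya-Watson regression with smoothing factor sigma) learns the mapping from ARIMA outputs to observed incidence, and the held-out year is compared by MAE. The data, model orders and sigma are illustrative assumptions, not the paper's values.

```python
# Hedged sketch of an ARIMA-GRNN hybrid: GRNN corrects ARIMA outputs.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(9)
n = 96
season = 3 + 1.5 * np.sin(2 * np.pi * np.arange(n) / 12)
incidence = season + 0.01 * np.arange(n) + rng.normal(0, 0.4, n)   # cases per 100,000

split = n - 12                           # hold out the final year, as in the paper's design
train, test = incidence[:split], incidence[split:]

arima_fit = ARIMA(train, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit()
fitted = arima_fit.fittedvalues          # in-sample one-step-ahead fits
forecast = arima_fit.forecast(steps=12)  # out-of-sample ARIMA forecasts

def grnn_predict(x_train, y_train, x_new, sigma=0.5):
    """GRNN = Nadaraya-Watson kernel regression with smoothing factor sigma."""
    d2 = (np.asarray(x_new)[:, None] - np.asarray(x_train)[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / np.clip(w.sum(axis=1), 1e-12, None)

# Hybrid step: GRNN learns the ARIMA-output-to-observation mapping, then
# corrects the ARIMA forecasts for the held-out year.
hybrid = grnn_predict(fitted[12:], train[12:], forecast)

mae = lambda a, b: float(np.mean(np.abs(a - b)))
print("ARIMA  MAE on held-out year:", round(mae(forecast, test), 3))
print("Hybrid MAE on held-out year:", round(mae(hybrid, test), 3))
```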

  20. Examining spatial-temporal variability and prediction of rainfall in North-eastern Nigeria

    NASA Astrophysics Data System (ADS)

    Muhammed, B. U.; Kaduk, J.; Balzter, H.

    2012-12-01

    In the last 50 years rainfall in North-eastern Nigeria under the influence of the West African Monsoon (WAM) has been characterised by large annual variations, with severe droughts recorded in 1967-1973 and 1983-1987. This variability in rainfall has a large impact on the region's agricultural output, economy and security, where the majority of the people depend on subsistence agriculture. In the 1990s there was a sign of recovery, with higher annual rainfall totals compared to the 1961-1990 period, but annual totals were slightly above the long-term mean for the century. In this study we examine how significant this recovery is by analysing medium-term (1980-2006) rainfall of the region using the Climate Research Unit (CRU) and National Centre for Environment Prediction (NCEP) precipitation ½ degree, 6 hourly reanalysis data set. The percentage coefficient of variation increases northwards for annual rainfall (10%-35%) and the number of rainy days (10%-50%). The standardized precipitation index (SPI) of the area shows 7 years during the period as very wet (1996, 1999, 2003 and 2004) with SPI ≥ 1.5 and moderately wet (1993, 1998, and 2006) with values of 1.0 ≤ SPI ≤ 1.49. Annual rainfall indicates a recovery from the 1990s onwards, but significant increases (in the amount of rainfall and number of days recorded with rainfall) occur only during the peak of the monsoon season in the months of August and September (p<0.05), with no significant increases in the months following the onset of rainfall. Forecasting of monthly rainfall was made using the Auto Regressive Integrated Moving Average (ARIMA) model. The model is further evaluated using 24 months of rainfall data, yielding r=0.79 (regression slope=0.8; p<0.0001) in the sub-humid part of the study area and r=0.65 (regression slope=0.59, p<0.0001) in the northern semi-arid part. The results suggest that despite the positive changes in rainfall (without significant increases in the months following the onset of the monsoon), the area has not fully recovered from the drought years of the 1960s, 70s, and 80s. These findings also highlight the implications of the current recovery for rain-fed agriculture and water resources in the study area. The strong correlation and a root mean square error of 64.8 mm between the ARIMA model and the rainfall data used for this study indicate that the model can be satisfactorily used in forecasting rainfall in the sub-humid part of North-eastern Nigeria over a 24-month period.

  1. Indication of advanced orthokeratology as an additional treatment after refractive surgeries

    NASA Astrophysics Data System (ADS)

    Mitsui, Iwane; Yamada, Yoshida

    2005-04-01

    Ortho-K was indicated for twenty-three eyes of thirteen patients after refractive surgeries such as RK(1), PRK(2), and LASIK(3). The average of their uncorrected visual acuity (UCVA) after surgery was 20/30 or worse, and the mean spherical equivalent (SE) was -2.42D. They were followed for at least two years while wearing Advanced Ortho-K lenses at night. Auto-refraction, auto-keratometry, uncorrected and corrected visual acuity, intra-ocular pressure, corneal endothelium, corneal thickness, corneal curvature, and corneal shape were examined for more than two years. 95% of the patients improved in UCVA to 20/20 or better, 86% of them improved to 20/15 or better, and 76% of them improved to 20/10. The mean SEs improved to -1.20+/-1.02D during six months, -1.03+/-0.83D during one year, and -0.73+/-0.64D during two years. Astigmatism also slightly decreased. Ophthalmologic examinations showed no abnormalities, including flap formation, intra-ocular pressure, and endothelium. Among refractive surgeries, including RK and PRK, LASIK has become the most widely used worldwide. However, patients' quality of vision is not always satisfactory during and/or after refractive surgery, because of several complications such as unstable flap formation, unexpected keratectasia, diffuse lamellar keratitis, epithelial ingrowth, and corneal surface irregularity causing myopia regression. In such cases, additional surgical procedures should not be indicated lightly. However, Ortho-K is safe and effective enough to correct refractive errors that remain or re-appear after refractive surgeries. It can restore the irregular cornea to the ideal shape.

  2. Latent transition analysis of pre-service teachers' efficacy in mathematics and science

    NASA Astrophysics Data System (ADS)

    Ward, Elizabeth Kennedy

    This study modeled changes in pre-service teacher efficacy in mathematics and science over the course of the final year of teacher preparation using latent transition analysis (LTA), a longitudinal form of analysis that builds on two modeling traditions (latent class analysis (LCA) and auto-regressive modeling). Data were collected using the STEBI-B, MTEBI-r, and the ABNTMS instruments. The findings suggest that LTA is a viable technique for use in teacher efficacy research. Teacher efficacy is modeled as a construct with two dimensions: personal teaching efficacy (PTE) and outcome expectancy (OE). Findings suggest that the mathematics and science teaching efficacy (PTE) of pre-service teachers is a multi-class phenomena. The analyses revealed a four-class model of PTE at the beginning and end of the final year of teacher training. Results indicate that when pre-service teachers transition between classes, they tend to move from a lower efficacy class into a higher efficacy class. In addition, the findings suggest that time-varying variables (attitudes and beliefs) and time-invariant variables (previous coursework, previous experiences, and teacher perceptions) are statistically significant predictors of efficacy class membership. Further, analyses suggest that the measures used to assess outcome expectancy are not suitable for LCA and LTA procedures.

  3. Technical Efficiency of Automotive Industry Cluster in Chennai

    NASA Astrophysics Data System (ADS)

    Bhaskaran, E.

    2012-07-01

    Chennai is also called the Detroit of India due to its automotive industry presence, producing over 40% of India's vehicles and components. During 2001-2002, a diagnostic study was conducted on the Automotive Component Industries (ACI) in Ambattur Industrial Estate, Chennai, and a SWOT analysis found that they faced problems in infrastructure, technology, procurement, production and marketing. In 2004-2005, under the cluster development approach (CDA), they formed the Chennai auto cluster as a public-private partnership, receiving grants from the Government of India, the Government of Tamil Nadu and Ambattur Municipality, bank loans, and stakeholder contributions. This resulted in the development of infrastructure, technology, procurement, production and marketing interrelationships among the ACI. The objective is to determine the correlation coefficient, regression equation, technical efficiency, peer weights, slack variables and returns to scale of the cluster before and after the CDA. The methodology adopted is the collection of primary data from the ACI and analysis using an input-oriented Banker-Charnes-Cooper data envelopment analysis (DEA) model. There is a significant increase in the correlation coefficient, and the regression analysis reveals that for a one percent increase in employment and net worth, the gross output increases significantly after the CDA. The DEA solver gives the technical efficiency of the ACI by taking shift, employment and net worth as input data and quality, gross output and export ratio as output data. From the technical scores and ranking of the ACI, it is found that there is a significant increase in the technical efficiency of the ACI compared to before the CDA. The slack variables obtained clearly reveal the excess employment and net worth and no shortage of gross output. To conclude, there is an increase in the technical efficiency not only of the Chennai auto cluster in general but also of the Chennai auto component industries in particular.

  4. Fix-It Careers: Jobs in Repair

    ERIC Educational Resources Information Center

    Torpey, Elka Maria

    2010-01-01

    From auto mechanics to HVAC technicians, many occupations require repair skills. For jobseekers with the right skills, there are many advantages to a repair career. Repair work provides millions of jobs throughout the United States. Wages are often higher than average. And in many occupations, the employment outlook is bright. Plus, most repair…

  5. SU-E-J-129: Atlas Development for Cardiac Automatic Contouring Using Multi-Atlas Segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, R; Yang, J; Pan, T

    Purpose: To develop a set of atlases for automatic contouring of cardiac structures to determine heart radiation dose and the associated toxicity. Methods: Six thoracic cancer patients with both contrast and non-contrast CT images were acquired for this study. Eight radiation oncologists manually and independently delineated cardiac contours on the non-contrast CT by referring to the fused contrast CT and following the RTOG 1106 atlas contouring guideline. Fifteen regions of interest (ROIs) were delineated, including heart, four chambers, four coronary arteries, pulmonary artery and vein, inferior and superior vena cava, and ascending and descending aorta. Individual expert contours were fused using the simultaneous truth and performance level estimation (STAPLE) algorithm for each ROI and each patient. The fused contours became atlases for an in-house multi-atlas segmentation. Using a leave-one-out test, we generated auto-segmented contours for each ROI and each patient. The auto-segmented contours were compared with the fused contours using the Dice similarity coefficient (DSC) and the mean surface distance (MSD). Results: Inter-observer variability was not obvious for the heart, chambers, and aorta but was large for other structures that were not clearly distinguishable on the CT image. The average DSC between individual expert contours and the fused contours was less than 50% for the coronary arteries and pulmonary vein, and the average MSD was greater than 4.0 mm. The largest MSD of expert contours deviating from the fused contours was 2.5 cm. The mean DSC and MSD of auto-segmented contours were within one standard deviation of expert contouring variability except for the right coronary artery. The coronary arteries, vena cava, and pulmonary vein had DSC<70% and MSD>3.0 mm. Conclusion: A set of cardiac atlases was created for cardiac automatic contouring, the accuracy of which was comparable to the variability in expert contouring. However, substantial modification may be needed for auto-segmented contours of indistinguishable small structures.
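
    A minimal sketch of the two comparison metrics named above (DSC and MSD) for binary masks and surface point sets, assuming numpy/scipy; the toy masks are illustrative, not study data:

        import numpy as np
        from scipy.spatial.distance import cdist

        def dice(a, b):
            """Dice similarity coefficient between two boolean masks."""
            a, b = a.astype(bool), b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        def mean_surface_distance(pts_a, pts_b):
            """Symmetric mean of nearest-neighbour distances between two point sets."""
            d = cdist(pts_a, pts_b)  # pairwise Euclidean distances
            return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

        # Toy masks; here all mask voxels stand in for the contour surface points.
        auto = np.zeros((10, 10), bool); auto[2:7, 2:7] = True
        ref = np.zeros((10, 10), bool); ref[3:8, 3:8] = True
        print(f"DSC = {dice(auto, ref):.2f}")
        print(f"MSD = {mean_surface_distance(np.argwhere(auto), np.argwhere(ref)):.2f} (voxel units)")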

  6. Changing patterns of meat consumption and greenhouse gas emissions in Australia: Will kangaroo meat make a difference?

    PubMed Central

    Ratnasiri, Shyama; Bandara, Jayatilleke

    2017-01-01

    The Australian per capita consumption of ruminant meat such as beef and lamb has declined over the last two decades. Over the same period, however, per capita consumption of non-ruminant meat such as chicken and pork has continued to increase. Furthermore, the human consumption of kangaroo meat is now observed to be on the rise. This study investigates the implications of these changes in meat consumption patterns for greenhouse gas (GHG) emission mitigation in Australia using a Vector Auto Regression (VAR) forecasting approach. Our results suggest that non-ruminant meat consumption will continue to increase and that this will not only offset the decline in ruminant meat consumption, but will also raise overall per capita meat consumption by approximately 1% annually. Per capita GHG emissions will likely decrease by approximately 2.3% per annum due to the inclusion of non-ruminant meat in Australian diets. GHG emissions can be reduced further if the average Australian consumer partially replaces ruminant meat with kangaroo meat. PMID:28196141

  7. Changing patterns of meat consumption and greenhouse gas emissions in Australia: Will kangaroo meat make a difference?

    PubMed

    Ratnasiri, Shyama; Bandara, Jayatilleke

    2017-01-01

    The Australian per capita consumption of ruminant meat such as beef and lamb has declined over the last two decades. Over the same period, however, per capita consumption of non-ruminant meat such as chicken and pork has continued to increase. Furthermore, the human consumption of kangaroo meat is now observed to be on the rise. This study investigates the implications of these changes in meat consumption patterns for greenhouse gas (GHG) emission mitigation in Australia using a Vector Auto Regression (VAR) forecasting approach. Our results suggest that non-ruminant meat consumption will continue to increase and that this will not only offset the decline in ruminant meat consumption, but will also raise overall per capita meat consumption by approximately 1% annually. Per capita GHG emissions will likely decrease by approximately 2.3% per annum due to the inclusion of non-ruminant meat in Australian diets. GHG emissions can be reduced further if the average Australian consumer partially replaces ruminant meat with kangaroo meat.
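
    A minimal sketch of the VAR forecasting step described in this record, using statsmodels with synthetic per-capita consumption series; the variable names, lag order and differencing are illustrative assumptions, not the study's specification:

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        # Synthetic annual per-capita consumption series (kg/person); stand-ins only.
        rng = np.random.default_rng(1)
        years = pd.date_range("1990", periods=25, freq="YS")
        data = pd.DataFrame({
            "ruminant": 45 - 0.5 * np.arange(len(years)) + rng.normal(0, 1, len(years)),
            "non_ruminant": 30 + 0.8 * np.arange(len(years)) + rng.normal(0, 1, len(years)),
        }, index=years)

        # Fit a VAR on first differences (the trending levels are not stationary).
        diffed = data.diff().dropna()
        res = VAR(diffed).fit(2)  # lag order fixed at 2 for illustration

        # Forecast the next 5 annual changes and rebuild the implied consumption levels.
        fc = res.forecast(diffed.values[-res.k_ar:], steps=5)
        future = data.iloc[-1].values + np.cumsum(fc, axis=0)
        print(pd.DataFrame(future, columns=data.columns))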

  8. Mechanically assisted liquid lens zoom system for mobile phone cameras

    NASA Astrophysics Data System (ADS)

    Wippermann, F. C.; Schreiber, P.; Bräuer, A.; Berge, B.

    2006-08-01

    Camera systems with a small form factor are an integral part of today's mobile phones, which recently feature auto focus functionality. Ready-to-market solutions without moving parts have been developed using electrowetting technology. Besides virtually no deterioration, easy control electronics, and simple and therefore cost-effective fabrication, this type of liquid lens enables extremely fast settling times compared to mechanical approaches. As a next evolutionary step, mobile phone cameras will be equipped with zoom functionality. We present first-order considerations for the optical design of a miniaturized zoom system based on liquid lenses and compare it to its mechanical counterpart. We propose a design for a zoom lens with a zoom factor of 2.5, considering state-of-the-art commercially available liquid lens products. The lens possesses auto focus capability and is based on liquid lenses and one additional mechanical actuator. The combination of liquid lenses and a single mechanical actuator enables extremely short settling times of about 20 ms for the auto focus and a simplified mechanical system design, leading to lower production cost and longer lifetime. The camera system has a mechanical outline of 24 mm in length and 8 mm in diameter. The lens with f/# 3.5 provides market-relevant optical performance and is designed for an image circle of 6.25 mm (1/2.8" format sensor).

  9. A New Mathematical Framework for Design Under Uncertainty

    DTIC Science & Technology

    2016-05-05

    blending multiple information sources via auto-regressive stochastic modeling. A computationally efficient machine learning framework is developed based on...sion and machine learning approaches; see Fig. 1. This will lead to a comprehensive description of system performance with less uncertainty than in the...Bayesian optimization of super-cavitating hydrofoils The goal of this study is to demonstrate the capabilities of statistical learning and

  10. Ultra-Short-Term Wind Power Prediction Using a Hybrid Model

    NASA Astrophysics Data System (ADS)

    Mohammed, E.; Wang, S.; Yu, J.

    2017-05-01

    This paper aims to develop and apply a hybrid model of two data analytical methods, multiple linear regression and least squares (MLR&LS), for ultra-short-term wind power prediction (WPP), taking the electricity demand of Northeast China as an example. The data were obtained from historical records of wind power from an offshore region and from a wind farm of the wind power plant in the area. The WPP is achieved in two stages: first, the ratios of wind power are forecasted using the proposed hybrid method, and then these ratios are transformed to obtain the forecasted values. The hybrid model combines the persistence method, MLR and LS. The proposed method includes two prediction types, multi-point prediction and single-point prediction. The WPP is tested against different models such as autoregressive moving average (ARMA), autoregressive integrated moving average (ARIMA) and artificial neural network (ANN) models. By comparing the results of the above models, the validity of the proposed hybrid model is confirmed in terms of error and correlation coefficient. The comparison of results confirmed that the proposed method works effectively. Additionally, forecasting errors were computed and compared to improve understanding of how to depict highly variable WPP and the correlations between actual and predicted wind power.

  11. Neighborhood greenspace and health in a large urban center

    NASA Astrophysics Data System (ADS)

    Kardan, Omid; Gozdyra, Peter; Misic, Bratislav; Moola, Faisal; Palmer, Lyle J.; Paus, Tomáš; Berman, Marc G.

    2015-07-01

    Studies have shown that natural environments can enhance health and here we build upon that work by examining the associations between comprehensive greenspace metrics and health. We focused on a large urban population center (Toronto, Canada) and related the two domains by combining high-resolution satellite imagery and individual tree data from Toronto with questionnaire-based self-reports of general health perception, cardio-metabolic conditions and mental illnesses from the Ontario Health Study. Results from multiple regressions and multivariate canonical correlation analyses suggest that people who live in neighborhoods with a higher density of trees on their streets report significantly higher health perception and significantly less cardio-metabolic conditions (controlling for socio-economic and demographic factors). We find that having 10 more trees in a city block, on average, improves health perception in ways comparable to an increase in annual personal income of $10,000 and moving to a neighborhood with $10,000 higher median income or being 7 years younger. We also find that having 11 more trees in a city block, on average, decreases cardio-metabolic conditions in ways comparable to an increase in annual personal income of $20,000 and moving to a neighborhood with $20,000 higher median income or being 1.4 years younger.

  12. SU-E-J-208: Fast and Accurate Auto-Segmentation of Abdominal Organs at Risk for Online Adaptive Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gupta, V; Wang, Y; Romero, A

    2014-06-01

    Purpose: Various studies have demonstrated that online adaptive radiotherapy by real-time re-optimization of the treatment plan can improve organs-at-risk (OARs) sparing in the abdominal region. Its clinical implementation, however, requires fast and accurate auto-segmentation of OARs in CT scans acquired just before each treatment fraction. Auto-segmentation is particularly challenging in the abdominal region due to the frequently observed large deformations. We present a clinical validation of a new auto-segmentation method that uses fully automated non-rigid registration for propagating abdominal OAR contours from planning to daily treatment CT scans. Methods: OARs were manually contoured by an expert panel to obtain ground truth contours for repeat CT scans (3 per patient) of 10 patients. For the non-rigid alignment, we used a new non-rigid registration method that estimates the deformation field by optimizing the local normalized correlation coefficient with smoothness regularization. This field was used to propagate planning contours to repeat CTs. To quantify the performance of the auto-segmentation, we compared the propagated and ground truth contours using two widely used metrics: the Dice coefficient (Dc) and the Hausdorff distance (Hd). The proposed method was benchmarked against translation- and rigid-alignment-based auto-segmentation. Results: For all organs, the auto-segmentation performed better than the baseline (translation), with an average processing time of 15 s per fraction CT. The overall improvements ranged from 2% (heart) to 32% (pancreas) in Dc, and 27% (heart) to 62% (spinal cord) in Hd. For liver, kidneys, gall bladder, stomach, spinal cord and heart, Dc above 0.85 was achieved. Duodenum and pancreas were the most challenging organs, both showing relatively larger spreads and medians of 0.79 and 2.1 mm for Dc and Hd, respectively. Conclusion: Based on the achieved accuracy and computational time we conclude that the investigated auto-segmentation method overcomes an important hurdle to the clinical implementation of online adaptive radiotherapy. Partial funding for this work was provided by Accuray Incorporated as part of a research collaboration with Erasmus MC Cancer Institute.
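
    A minimal sketch of the Hausdorff distance (Hd) metric used above, based on scipy's directed Hausdorff routine; the point sets are synthetic stand-ins for propagated and ground-truth contours:

        import numpy as np
        from scipy.spatial.distance import directed_hausdorff

        def hausdorff(pts_a, pts_b):
            """Symmetric Hausdorff distance between two contour point sets (N x 3)."""
            return max(directed_hausdorff(pts_a, pts_b)[0],
                       directed_hausdorff(pts_b, pts_a)[0])

        # Illustrative contour samples (mm coordinates); stand-ins for real contours.
        rng = np.random.default_rng(0)
        propagated = rng.normal(size=(200, 3)) * 10
        ground_truth = propagated + rng.normal(scale=1.5, size=(200, 3))
        print(f"Hd = {hausdorff(propagated, ground_truth):.1f} mm")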

  13. Auto Correlation Analysis of Coda Waves from Local Earthquakes for Detecting Temporal Changes in Shallow Subsurface Structures: the 2011 Tohoku-Oki, Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Nakahara, Hisashi

    2015-02-01

    For monitoring temporal changes in subsurface structures I propose to use auto correlation functions of coda waves from local earthquakes recorded at surface receivers, which probably contain more body waves than surface waves. The use of coda waves requires earthquakes, resulting in decreased time resolution for monitoring. Nonetheless, it may be possible to monitor subsurface structures with sufficient time resolution in regions with high seismicity. In studying the 2011 Tohoku-Oki, Japan earthquake (Mw 9.0), for which velocity changes have been previously reported, I try to validate the method. KiK-net stations in northern Honshu are used in this analysis. For each moderate earthquake, normalized auto correlation functions of surface records are stacked with respect to time windows in the S-wave coda. Aligning the stacked, normalized auto correlation functions with time, I search for changes in phase arrival times. The phases at lag times of <1 s are studied because the focus is on changes at shallow depths. Temporal variations in the arrival times are measured at the stations based on the stretching method. Clear phase delays are found to be associated with the mainshock and to gradually recover with time. The amounts of the phase delays are 10% on average, with a maximum of about 50% at some stations. A deconvolution analysis using surface and subsurface records at the same stations is conducted for validation. The results show that the phase delays from the deconvolution analysis are slightly smaller than those from the auto correlation analysis, which implies that the phases on the auto correlations are caused by larger velocity changes at shallower depths. The auto correlation analysis seems to have an accuracy of about several percent, which is much coarser than that of methods using earthquake doublets and borehole array data, so this analysis might only be applicable for detecting larger changes. In spite of these disadvantages, this analysis is still attractive because it can be applied to many records on the surface in regions where no boreholes are available.
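
    A minimal sketch of the stretching measurement mentioned above: a current autocorrelation trace is resampled with trial stretch factors and correlated against a reference trace, and the factor giving the highest correlation estimates the relative delay. The traces and the stretch grid are synthetic illustrations, not KiK-net data:

        import numpy as np

        def stretching_delay(ref, cur, dt, eps_grid):
            """Return the stretch factor (and correlation) that best maps `cur` onto `ref`.

            A positive factor means `cur` is delayed (slower medium) relative to `ref`.
            """
            t = np.arange(len(ref)) * dt
            best_eps, best_cc = 0.0, -np.inf
            for eps in eps_grid:
                stretched = np.interp(t * (1.0 + eps), t, cur)  # evaluate cur on a stretched time axis
                cc = np.corrcoef(ref, stretched)[0, 1]
                if cc > best_cc:
                    best_eps, best_cc = eps, cc
            return best_eps, best_cc

        # Synthetic example: a decaying oscillation and a copy slowed down by 5%.
        dt = 0.01
        t = np.arange(0, 1.0, dt)
        ref = np.exp(-3 * t) * np.sin(2 * np.pi * 8 * t)
        cur = np.exp(-3 * t / 1.05) * np.sin(2 * np.pi * 8 * t / 1.05)
        eps, cc = stretching_delay(ref, cur, dt, np.linspace(-0.1, 0.1, 201))
        print(f"estimated stretch = {eps:.3f}, correlation = {cc:.2f}")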

  14. Reconstruction of missing daily streamflow data using dynamic regression models

    NASA Astrophysics Data System (ADS)

    Tencaliec, Patricia; Favre, Anne-Catherine; Prieur, Clémentine; Mathevet, Thibault

    2015-12-01

    River discharge is one of the most important quantities in hydrology. It provides fundamental records for water resources management and climate change monitoring. Even very short data gaps in this information can lead to markedly different analysis outputs. Therefore, reconstructing the missing data of incomplete data sets is an important step for the performance of environmental models, engineering, and research applications, and thus presents a great challenge. The objective of this paper is to introduce an effective technique for reconstructing missing daily discharge data when one has access only to daily streamflow data. The proposed procedure uses a combination of regression and autoregressive integrated moving average (ARIMA) models called a dynamic regression model. This model uses the linear relationship between neighbouring, correlated stations and then adjusts the residual term by fitting an ARIMA structure. Application of the model to eight daily streamflow series for the Durance river watershed showed that the model yields reliable estimates for the missing data in the time series. Simulation studies were also conducted to evaluate the performance of the procedure.
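
    A minimal sketch of the dynamic regression idea (a regression on a neighbouring station with ARIMA-structured residuals), written with statsmodels SARIMAX; the stations, the gap location, and the (1,0,1) residual order are synthetic illustrations, not the Durance configuration:

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        # Synthetic stand-ins: a target station and a correlated neighbouring station.
        rng = np.random.default_rng(2)
        n = 730
        neighbour = 50 + 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
        target = 0.8 * neighbour + 5 + np.cumsum(rng.normal(0, 0.3, n))  # correlated, with a drifting residual

        # Pretend a 30-day block of the target record is missing.
        y = pd.Series(target.copy())
        y[400:430] = np.nan

        # Dynamic regression: a linear term on the neighbour plus ARIMA(1,0,1) residuals.
        model = SARIMAX(y, exog=neighbour, order=(1, 0, 1)).fit(disp=False)
        filled = model.predict(start=400, end=429)  # in-sample predictions fill the gap
        print(filled.head())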

  15. 25 CFR 700.173 - Average net earnings of business or farm.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 2 2011-04-01 2011-04-01 false Average net earnings of business or farm. 700.173 Section... PROCEDURES Moving and Related Expenses, Temporary Emergency Moves § 700.173 Average net earnings of business or farm. (a) Computing net earnings. For purposes of this subpart, the average annual net earnings of...

  16. 25 CFR 700.173 - Average net earnings of business or farm.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false Average net earnings of business or farm. 700.173 Section... PROCEDURES Moving and Related Expenses, Temporary Emergency Moves § 700.173 Average net earnings of business or farm. (a) Computing net earnings. For purposes of this subpart, the average annual net earnings of...

  17. Identification of moving vehicle forces on bridge structures via moving average Tikhonov regularization

    NASA Astrophysics Data System (ADS)

    Pan, Chu-Dong; Yu, Ling; Liu, Huan-Lin

    2017-08-01

    Traffic-induced moving force identification (MFI) is a typical inverse problem in the field of bridge structural health monitoring. Many regularization-based methods have been proposed for MFI. However, the MFI accuracy obtained from the existing methods is low when the moving forces enter and exit the bridge deck, due to the low sensitivity of structural responses to the forces at these zones. To overcome this shortcoming, a novel moving average Tikhonov regularization method is proposed for MFI by combining it with moving average concepts. Firstly, the bridge-vehicle interaction moving force is assumed to be a discrete finite signal with stable average value (DFS-SAV). Secondly, the reasonable signal feature of the DFS-SAV is quantified and introduced to improve the penalty function (||x||₂²) defined in the classical Tikhonov regularization. Then, a feasible two-step strategy is proposed for selecting the regularization parameter and the balance coefficient defined in the improved penalty function. Finally, both numerical simulations on a simply-supported beam and laboratory experiments on a hollow tube beam are performed to assess the accuracy and feasibility of the proposed method. The results show that the moving forces can be accurately identified with strong robustness. Some related issues, such as the selection of the moving window length, the effect of different penalty functions, and the effect of different car speeds, are discussed as well.
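
    A minimal sketch of the classical Tikhonov step that the proposed method builds on, solving argmin ||Ax-b||² + λ||x||₂² in closed form; the system matrix here is a synthetic ill-conditioned stand-in for the bridge response-to-force mapping, and the paper's improved moving-average penalty and two-step parameter selection are not reproduced:

        import numpy as np

        def tikhonov(A, b, lam):
            """Classical Tikhonov solution of argmin ||A x - b||^2 + lam * ||x||^2."""
            n = A.shape[1]
            return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

        # Synthetic ill-conditioned system standing in for the response-to-force mapping.
        rng = np.random.default_rng(3)
        A = rng.normal(size=(200, 50))
        A[:, 1:] += 0.95 * A[:, :-1]  # make neighbouring columns nearly collinear
        x_true = np.sin(np.linspace(0, 3 * np.pi, 50))
        b = A @ x_true + rng.normal(scale=0.5, size=200)

        for lam in (1e-4, 1e-1, 1e2):
            x_hat = tikhonov(A, b, lam)
            err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
            print(f"lambda={lam:g}: relative error {err:.2f}")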

  18. Assessing the Efficacy of Adjustable Moving Averages Using ASEAN-5 Currencies.

    PubMed

    Chan Phooi M'ng, Jacinta; Zainudin, Rozaimah

    2016-01-01

    The objective of this research is to examine the trends in the exchange rate markets of the ASEAN-5 countries (Indonesia (IDR), Malaysia (MYR), the Philippines (PHP), Singapore (SGD), and Thailand (THB)) through the application of dynamic moving average trading systems. This research offers evidence of the usefulness of the time-varying volatility technical analysis indicator, Adjustable Moving Average (AMA') in deciphering trends in these ASEAN-5 exchange rate markets. This time-varying volatility factor, referred to as the Efficacy Ratio in this paper, is embedded in AMA'. The Efficacy Ratio adjusts the AMA' to the prevailing market conditions by avoiding whipsaws (losses due, in part, to acting on wrong trading signals, which generally occur when there is no general direction in the market) in range trading and by entering early into new trends in trend trading. The efficacy of AMA' is assessed against other popular moving-average rules. Based on the January 2005 to December 2014 dataset, our findings show that the moving averages and AMA' are superior to the passive buy-and-hold strategy. Specifically, AMA' outperforms the other models for the United States Dollar against PHP (USD/PHP) and USD/THB currency pairs. The results show that different length moving averages perform better in different periods for the five currencies. This is consistent with our hypothesis that a dynamic adjustable technical indicator is needed to cater for different periods in different markets.
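
    A minimal sketch of a fixed-length dual moving-average rule of the kind AMA' is benchmarked against; the price series is synthetic, the (5, 50) window lengths are illustrative, and the AMA' Efficacy Ratio itself is not reproduced here:

        import numpy as np
        import pandas as pd

        # Synthetic daily exchange-rate series standing in for one of the ASEAN-5 pairs.
        rng = np.random.default_rng(4)
        price = pd.Series(np.exp(np.cumsum(rng.normal(0, 0.004, 2500))) * 40.0)

        def ma_rule_return(price, short=5, long=50):
            """Cumulative log return of a simple dual moving-average crossover rule."""
            fast = price.rolling(short).mean()
            slow = price.rolling(long).mean()
            position = (fast > slow).astype(int).shift(1).fillna(0)  # long when the fast MA is above the slow MA
            daily_log_ret = np.log(price / price.shift(1)).fillna(0.0)
            return float((position * daily_log_ret).sum())

        buy_and_hold = float(np.log(price.iloc[-1] / price.iloc[0]))
        print(f"MA(5,50) rule: {ma_rule_return(price):.3f}   buy-and-hold: {buy_and_hold:.3f}")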

  19. Effect of parameters in moving average method for event detection enhancement using phase sensitive OTDR

    NASA Astrophysics Data System (ADS)

    Kwon, Yong-Seok; Naeem, Khurram; Jeon, Min Yong; Kwon, Il-bum

    2017-04-01

    We analyze the relations of the parameters in the moving average method to enhance the event detectability of a phase sensitive optical time domain reflectometer (OTDR). If the external events have a unique vibration frequency, then the control parameters of the moving average method should be optimized in order to detect these events efficiently. A phase sensitive OTDR was implemented with a pulsed light source, which is composed of a laser diode, a semiconductor optical amplifier, an erbium-doped fiber amplifier and a fiber Bragg grating filter, and a light receiving part, which has a photo-detector and a high speed data acquisition system. The moving average method is operated with the control parameters: total number of raw traces, M, number of averaged traces, N, and step size of moving, n. The raw traces are obtained by the phase sensitive OTDR with sound signals generated by a speaker. Using these trace data, the relation of the control parameters is analyzed. As a result, if the event signal has one frequency, then optimal values of N and n exist for detecting the event efficiently.
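
    A minimal sketch of the trace-averaging step controlled by the parameters named above (M raw traces, N traces per average, step size n); the traces are synthetic noise with an artificial vibration signature, not measured OTDR data:

        import numpy as np

        def moving_average_traces(traces, N, n):
            """Average N consecutive traces, sliding the window by n traces each step.

            `traces` has shape (M, samples); returns shape (number_of_windows, samples).
            """
            M = traces.shape[0]
            starts = range(0, M - N + 1, n)
            return np.stack([traces[s:s + N].mean(axis=0) for s in starts])

        # Synthetic phase-OTDR-like traces: noise plus a weak vibration at one fiber position.
        rng = np.random.default_rng(5)
        M, samples = 200, 1000
        raw = rng.normal(0, 1.0, (M, samples))
        raw[:, 500] += 0.5 * np.sin(2 * np.pi * 20 * np.arange(M) / M)  # 20 vibration cycles across traces

        averaged = moving_average_traces(raw, N=10, n=2)
        print(raw.shape, "->", averaged.shape)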

  20. [Multiple myeloma: Maintenance therapy after autologous hematopoietic stem cell transplantation, depending on minimal residual disease].

    PubMed

    Solovyev, M V; Mendeleeva, L P; Pokrovskaya, O S; Nareyko, M V; Firsova, M V; Galtseva, I V; Davydova, Yu O; Kapranov, N M; Kuzmina, L A; Gemdzhian, E G; Savchenko, V G

    To determine the efficiency of maintenance therapy with bortezomib in patients with multiple myeloma (MM) who have achieved complete remission (CR) after autologous hematopoietic stem cell transplantation (auto-HSCT), depending on the presence of minimal residual disease (MRD). From January 2014 to February 2016, fifty-two MM patients (19 men and 33 women) aged 24 to 66 years (median 54 years), who had achieved CR after auto-HSCT, were randomized to receive maintenance therapy with bortezomib for one year. On day 100 after auto-HSCT, all the patients underwent immunophenotyping of bone marrow plasma cells by 6-color flow cytometry to detect MRD. Relapse-free survival (RFS) was chosen as the criterion for evaluating the efficiency of maintenance therapy. After auto-HSCT, MRD-negative patients had a statistically significantly higher 2-year RFS rate than MRD-positive patients: 52.9% (95% confidence interval (CI), 35.5 to 70.5%) versus 37.2% (95% CI, 25.4 to 49.3%) (p=0.05). The presence of MRD statistically significantly increased the risk of relapse (odds ratio 1.7; 95% CI, 1.2 to 3.4; p=0.05). The two-year cumulative risk of relapse (estimated using the Kaplan-Meier method) after auto-HSCT did not differ statistically significantly between MRD-negative patients receiving (n=15) and not receiving (n=10) maintenance therapy with bortezomib (p=0.58). After completion of maintenance treatment, 42% of the MRD-positive patients achieved a negative status. In the MRD-positive patients who had received maintenance therapy, the average time to recurrence was 5 months longer than that in the naïve patients: 17.3 versus 12.3 months. The MRD status determined in MM patients who have achieved CR after auto-HSCT is an important factor for deciding on the use of maintenance therapy.

  1. HEMATOPOIETIC PROGENITOR CELL CONTENT OF VERTEBRAL BODY MARROW USED FOR COMBINED SOLID ORGAN AND BONE MARROW TRANSPLANTATION

    PubMed Central

    Rybka, Witold B.; Fontes, Paulo A.; Rao, Abdul S.; Winkelstein, Alan; Ricordi, Camillo; Ball, Edward D.; Starzl, Thomas E.

    2010-01-01

    While cadaveric vertebral bodies (VB) have long been proposed as a suitable source of bone marrow (BM) for transplantation (BMT), they have rarely been used for this purpose. We have infused VB BM immediately following whole organ (WO) transplantation to augment donor cell chimerism. We quantified the hematopoietic progenitor cell (HPC) content of VB BM as well as BM obtained from the iliac crests (IC) of normal allogeneic donors (ALLO) and from patients with malignancy undergoing autologous marrow harvest (AUTO). Patients undergoing WO/BM transplantation also had AUTO BM harvested in the event that subsequent lymphohematopoietic reconstitution was required. Twenty-four VB BM, 24 IC BM-ALLO, 31 IC AUTO, and 24 IC WO-AUTO were harvested. VB BM was tested 12 to 72 hr after procurement and infused after completion of WO grafting. IC BM was tested and then used or cryopreserved immediately. HPC were quantified by clonal assay measuring CFU-GM, BFU-E, and CFU-GEMM, and by flow cytometry for CD34+ progenitor cells. On average, 9 VB were processed during each harvest, and despite an extended processing time the number of viable nucleated cells obtained was significantly higher than that from IC. Furthermore, by HPC content, VB BM was equivalent to IC BM, which is routinely used for BMT. We conclude that VB BM is a clinically valuable source of BM for allogeneic transplantation. PMID:7701582

  2. On Identifying Clusters Within the C-type Asteroids of the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Poole, Renae; Ziffer, J.; Harvell, T.

    2012-10-01

    We applied AutoClass, a data mining technique based upon Bayesian Classification, to C-group asteroid colors in the Sloan Digital Sky Survey (SDSS). Previous taxonomic studies relied mostly on Principal Component Analysis (PCA) to differentiate asteroids within the C-group (e.g. B, G, F, Ch, Cg and Cb). AutoClass's advantage is that it calculates the most probable classification for us, removing the human factor from this part of the analysis. In our results, AutoClass divided the C-groups into two large classes and six smaller classes. The two large classes (n=4974 and 2033, respectively) display distinct regions with some overlap in color-vs-color plots. Each cluster's average spectrum is compared to 'typical' spectra of the C-group subtypes as defined by Tholen (1989) and each cluster's members are evaluated for consistency with previous taxonomies. Of the 117 asteroids classified as B-type in previous taxonomies, only 12 were found with SDSS colors that matched our criteria of having less than 0.1 magnitude error in u and 0.05 magnitude error in g, r, i, and z colors. Although this is a relatively small group, 11 of the 12 B-types were placed by AutoClass in the same cluster. By determining the C-group sub-classifications in the large SDSS database, this research furthers our understanding of the stratigraphy and composition of the main-belt.

  3. Validation of a classification system to grade fractionation in atrial fibrillation and correlation with automated detection systems.

    PubMed

    Hunter, Ross J; Diab, Ihab; Thomas, Glyn; Duncan, Edward; Abrams, Dominic; Dhinoja, Mehul; Sporton, Simon; Earley, Mark J; Schilling, Richard J

    2009-12-01

    We tested application of a grading system describing complex fractionated electrograms (CFE) in atrial fibrillation (AF) and used it to validate automated CFE detection (AUTO). Ten-second bipolar electrograms were classified by visual inspection (VI) during ablation of persistent AF and the result compared with offline manual measurement (MM) by a second blinded operator: Grade 1, uninterrupted fractionated activity (defined as segments ≥70 ms) for ≥70% of the recording and uninterrupted for ≥1 s; Grade 2, interrupted fractionated activity ≥70% of the recording; Grade 3, intermittent fractionated activity 30-70%; Grade 4, discrete (<70 ms) complex electrograms (≥5 direction changes); Grade 5, discrete simple electrograms (≤4 direction changes); Grade 6, scar. Grade by VI and MM for 100 electrograms agreed in 89%. Five hundred electrograms were graded on Carto and NavX by VI to validate AUTO in (i) detection of CFE (grades 1-4 considered CFE), and (ii) assessing the degree of fractionation by correlating grade and score by AUTO (data shown as sensitivity, specificity, r): NavX 'CFE mean' 92%, 91%, 0.56; Carto 'interval confidence level' using factory settings 89%, 62%, -0.72, and other published settings 80%, 74%, -0.65; Carto 'shortest confidence interval' 74%, 70%, 0.43; Carto 'average confidence interval' 86%, 66%, 0.53. Grading CFE by VI is accurate and correlates with AUTO.

  4. Spatial and temporal predictions of agricultural land prices using DSM techniques.

    NASA Astrophysics Data System (ADS)

    Carré, F.; Grandgirard, D.; Diafas, I.; Reuter, H. I.; Julien, V.; Lemercier, B.

    2009-04-01

    Agricultural land prices highly impact land accessibility for farmers and, by consequence, the evolution of agricultural landscapes (crop changes, land conversion to urban infrastructures…), which can turn into irreversible soil degradation. The economic value of agricultural land has been studied spatially, in every one of the 374 French Agricultural Counties, and temporally, from 1995 to 2007, using data from the SAFER Institute. To this aim, agricultural land price was considered as a digital soil property. The spatial and temporal predictions were done using Digital Soil Mapping techniques combined with tools mainly used for studying temporal financial behaviour. For both predictions, a first classification of the Agricultural Counties was done for the 1995-2006 period (2007 was excluded and served as the date of prediction) using fuzzy k-means clustering. The Agricultural Counties were then aggregated according to land price at the different times. The clustering allows the counties to be characterized by their memberships to each class centroid. The memberships were used for the spatial prediction, whereas the centroids were used for the temporal prediction. For the spatial prediction, of the 374 Agricultural Counties, three-fourths were used for modeling and one-fourth for validation. Random sampling was done by class to ensure that all classes were represented by at least one county in the modeling and validation datasets. The prediction was done for each class by testing the relationships between the memberships and the following factors: (i) a soil variable (organic matter from the French BDAT database), (ii) soil covariates (land use classes from CORINE LANDCOVER, bioclimatic zones from the WorldClim database, landform attributes and landform classes from the SRTM, major road and hydrographic densities from EUROSTAT, average field sizes estimated by automatic classification of remotely sensed images) and (iii) socio-economic factors (population density, gross domestic product and its combination with population density, obtained from EUROSTAT). Linear (Generalized Linear Models) and non-linear models (neural networks) were used for building the relationships. For the validation, the relationships were applied to the validation datasets. The RMSE and the coefficient of determination (from a linear regression) between predicted and actual memberships, and the contingency table between the predicted and actual allocation classes, were used as validation criteria. The temporal prediction was done for the year 2007 from the centroid land prices characterizing the 1995-2006 period. For each class, the land prices of the 1995-2006 time series were modeled using an Auto-Regressive Moving Average approach. For the validation, the models were applied to the year 2007. The RMSE between predicted and actual prices was used as the validation criterion. We then discuss the methods and the results of the spatial and temporal validation. Based on this methodology, an extrapolation will be tested on another European country with a land price market similar to that of France (to be determined).

  5. Neuropsychological correlates of cognitive, emotional-affective and auto-activation apathy in Alzheimer's disease.

    PubMed

    Perri, Roberta; Turchetta, Chiara Stella; Caruso, Giulia; Fadda, Lucia; Caltagirone, Carlo; Carlesimo, Augusto Giovanni

    2018-01-31

    Apathy symptoms include different dimensions: cognitive (C), emotional-affective (E-Aff) and auto-activation; they have been related to dysfunctions of the dorsolateral prefrontal cortex, the orbito-basal prefrontal cortex and the subcortical frontal connections to the basal ganglia, respectively. In Alzheimer's disease (AD), an association has been found between apathy severity and both executive deficits and atrophy of the dorsolateral prefrontal cortex; however, it is not clear whether these associations concern only the cognitive aspects of apathy. Furthermore, whether there is an association in AD between E-Aff apathy and theory of mind (ToM), the cognitive functions subsumed by the orbito-basal prefrontal cortex, has not been investigated. The aim of the study was to investigate the relationship between C, E-Aff and auto-activation apathy and performance on tasks investigating executive and ToM cognitive functions in AD. For this purpose, 20 AD patients with apathy and 20 matched controls were submitted to an executive and ToM neuropsychological assessment. Apathy was assessed with a weekly diary (ApD) created specifically to assist caregivers in quantifying the C, E-Aff and auto-activation symptomatology of apathy. Correlational analyses showed that AD patients' scores on the Modified Card Sorting Test (MCST) and Emotion Attribution tasks were correlated with most ApD scores. However, regression analyses showed that C diary scores were predicted by MCST performance, E-Aff diary scores were predicted by performance on the Emotion Attribution task, and ApD scores measuring auto-activation apathy were predicted by both the MCST and the Emotion Attribution scores. These results confirm the co-occurrence of apathy and executive-function deficits in AD and suggest a specific association between AD patients' executive deficits and the cognitive component of apathy. Furthermore, they document, for the first time, an association between poor performance on tests assessing ToM abilities and the emotional-affective component of apathy in AD patients. Finally, these results are in line with the view that auto-activation apathy reflects the sum of emotional and cognitive processing deficits. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. The Economic Impact of Malignant Catarrhal Fever on Pastoralist Livelihoods

    PubMed Central

    Lankester, Felix; Lugelo, Ahmed; Kazwala, Rudovick; Keyyu, Julius; Cleaveland, Sarah; Yoder, Jonathan

    2015-01-01

    This study is the first to partially quantify the potential economic benefits that a vaccine effective at protecting cattle against malignant catarrhal fever (MCF) could bring to pastoralists living in East Africa. The benefits would result from the removal of household resource and management costs that are traditionally incurred avoiding the disease. MCF, a fatal disease of cattle caused by a virus transmitted from wildebeest calves, has plagued Maasai communities in East Africa for generations. The threat of the disease forces the Maasai to move cattle to less productive grazing areas to avoid wildebeest during the calving season, when forage quality is critical. To assess the management and resource costs associated with moving, we used household survey data. To estimate the costs associated with changes in livestock body condition that result from being herded away from wildebeest calving grounds, we exploited an ongoing MCF vaccine field trial and used a hedonic price regression, a statistical model that allows estimation of the marginal contribution of a good's attributes to its market price. We found that 90 percent of households move, on average, 82 percent of all cattle away from home to avoid MCF. In doing so, a herd's productive contributions to the household were reduced, with 64 percent of milk being unavailable for sale or consumption by the family members remaining at the boma (the children, women, and the elderly). In contrast, cattle that remained on the wildebeest calving grounds during the calving season (and survived MCF) remained fully productive for the family and gained body condition compared to cattle that moved away. This gain was, however, short-lived. We estimated the market value of these condition gains and losses using hedonic regression. The value of a vaccine for MCF is the removal of the costs incurred in avoiding the disease. PMID:25629896

  7. Learning investment indicators through data extension

    NASA Astrophysics Data System (ADS)

    Dvořák, Marek

    2017-07-01

    Stock prices in the form of time series were analysed using univariate and multivariate statistical methods. After simple data preprocessing in the form of logarithmic differences, we augmented this univariate time series to a multivariate representation. This method makes use of sliding windows to calculate several dozen new variables using simple statistical tools such as first and second moments, as well as more complicated statistics such as auto-regression coefficients and residual analysis, followed by an optional quadratic transformation used for further data extension. These were used as explanatory variables in a regularized LASSO logistic regression that tried to estimate a Buy-Sell Index (BSI) from real stock market data.
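
    A minimal sketch of the sliding-window data extension followed by L1-regularized logistic regression, using pandas and scikit-learn; the return series, window length, feature set and the binary stand-in for the Buy-Sell Index are all illustrative assumptions:

        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        # Synthetic log-return series standing in for the preprocessed stock prices.
        rng = np.random.default_rng(6)
        ret = pd.Series(rng.normal(0, 0.01, 2000))

        # Sliding-window feature extension: moments and a lag-1 autocorrelation per window.
        win = 20
        feats = pd.DataFrame({
            "mean": ret.rolling(win).mean(),
            "std": ret.rolling(win).std(),
            "skew": ret.rolling(win).skew(),
            "autocorr1": ret.rolling(win).apply(lambda x: x.autocorr(lag=1), raw=False),
        }).dropna()

        # Illustrative binary "buy" label: next-day return positive (a stand-in for a BSI).
        label = (ret.shift(-1).loc[feats.index] > 0).astype(int)

        model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
        model.fit(feats, label)
        print("non-zero coefficients:", int((model.coef_ != 0).sum()))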

  8. The Healthy LifeWorks Project: a pilot study of the economic analysis of a comprehensive workplace wellness program in a Canadian government department.

    PubMed

    Makrides, Lydia; Smith, Steven; Allt, Jane; Farquharson, Jane; Szpilfogel, Claudine; Curwin, Sandra; Veinot, Paula; Wang, Feifei; Edington, Dee

    2011-07-01

    To examine the relationship between health risks and absenteeism and drug costs vis-a-vis a comprehensive workplace wellness program. Eleven health risks and changes in drug claims, short-term illness and general illness were calculated across four risk-change groups. Wellness scores were examined using the Wilcoxon test and a regression model for cost change. The results showed 31% of employees at risk; 9 of 11 risks were associated with higher drug costs. Employees moving from low to high risk showed the highest relative increase (81%) in drug costs; those moving from high to low risk had the lowest (24%). The low-to-high group had the highest increase in absenteeism costs (160%). With each additional risk, absenteeism costs increased by $CDN248 per year (P < 0.05); an average decrease of 0.07 risk factors yielded savings of $CDN6979 per year. Both high-risk reduction and low-risk maintenance are important to contain drug costs. Only low-risk maintenance also avoids the absenteeism costs associated with high risks.

  9. Optimal camera exposure for video surveillance systems by predictive control of shutter speed, aperture, and gain

    NASA Astrophysics Data System (ADS)

    Torres, Juan; Menéndez, José Manuel

    2015-02-01

    This paper establishes a real-time auto-exposure method to guarantee that surveillance cameras in uncontrolled light conditions take advantage of their whole dynamic range while providing neither under- nor overexposed images. State-of-the-art auto-exposure methods base their control on the brightness of the image measured in a limited region where the foreground objects are mostly located. Unlike these methods, the proposed algorithm establishes a set of indicators based on the image histogram that define its shape and position. Furthermore, the location of the objects to be inspected is likely unknown in surveillance applications; thus, the whole image is monitored in this approach. To control the camera settings, we defined a parameter function (Ef) that depends linearly on the shutter speed and the electronic gain and is inversely proportional to the square of the lens aperture diameter. When the currently acquired image is not overexposed, our algorithm computes the value of Ef that would move the histogram to the maximum value that does not overexpose the capture. When the currently acquired image is overexposed, it computes the value of Ef that would move the histogram to a value that does not underexpose the capture and remains close to the overexposed region. If the image is both under- and overexposed, the whole dynamic range of the camera is therefore used, and a default value of Ef that does not overexpose the capture is selected. This decision follows the idea that getting underexposed images is better than getting overexposed ones, because the noise produced in the lower regions of the histogram can be removed in a post-processing step, while the saturated pixels of the higher regions cannot be recovered. The proposed algorithm was tested on a video surveillance camera placed in an outdoor parking lot surrounded by buildings and trees, which produce moving shadows on the ground. During the daytime over seven days, the algorithm was run alternately with a representative auto-exposure algorithm from the recent literature. Besides the sunrises and the nightfalls, multiple weather conditions occurred which produced light changes in the scene: sunny hours that produced sharp shadows and highlights; cloud cover that softened the shadows; and cloudy and rainy hours that dimmed the scene. Several indicators were used to measure the performance of the algorithms. They provided the objective quality as regards the time the algorithms take to recover from under- or overexposure, the brightness stability, and the deviation from the optimal exposure. The results demonstrated that our algorithm reacts faster to all the light changes than the selected state-of-the-art algorithm. It is also capable of acquiring well-exposed images and keeping the brightness stable for longer. Summing up the results, we conclude that the proposed algorithm provides a fast and stable auto-exposure method that maintains an optimal exposure for video surveillance applications. Future work will involve the evaluation of this algorithm in robotics.

  10. Rechargeable Battery Auto-Cycler Requiring Lower Power and Dissipating Reduced Waste Heat

    NASA Technical Reports Server (NTRS)

    Hanson, Thomas David (Inventor)

    2018-01-01

    A battery charger system includes a power supply and a switch connected to the power supply wherein the switch has a first switch half and a second switch half. First and second batteries are selectively connected to the power supply via the switch. The first and second switch halves are moved between a plurality of operational positions to fully charge the first battery, discharge the first battery into the second battery, discharge the second battery into the first battery, and fully charge the second battery.

  11. Short-Term Exposure to Ambient Air Pollution and Biomarkers of Systemic Inflammation: The Framingham Heart Study.

    PubMed

    Li, Wenyuan; Dorans, Kirsten S; Wilker, Elissa H; Rice, Mary B; Ljungman, Petter L; Schwartz, Joel D; Coull, Brent A; Koutrakis, Petros; Gold, Diane R; Keaney, John F; Vasan, Ramachandran S; Benjamin, Emelia J; Mittleman, Murray A

    2017-09-01

    The objective of this study is to examine associations between short-term exposure to ambient air pollution and circulating biomarkers of systemic inflammation in participants from the Framingham Offspring and Third Generation cohorts in the greater Boston area. We included 3996 participants who were not current smokers (mean age, 53.6 years; 54% women), who lived within 50 km of a central air pollution monitoring site in Boston, MA, and calculated the 1- to 7-day moving averages of fine particulate matter (diameter <2.5 µm), black carbon, sulfate, nitrogen oxides, and ozone before the examination visits. We used linear mixed effects models for C-reactive protein and tumor necrosis factor receptor 2, which were measured up to twice for each participant; we used linear regression models for interleukin-6, fibrinogen, and tumor necrosis factor α, which were measured once. We adjusted for demographics, socioeconomic position, lifestyle, time, and weather. The 3- to 7-day moving averages of fine particulate matter (diameter <2.5 µm) and sulfate were positively associated with C-reactive protein concentrations. A 5 µg/m³ higher 5-day moving average of fine particulate matter (diameter <2.5 µm) was associated with 4.2% (95% confidence interval: 0.8, 7.6) higher circulating C-reactive protein. Positive associations were also observed for nitrogen oxides with interleukin-6 and for black carbon, sulfate, and ozone with tumor necrosis factor receptor 2. However, black carbon, sulfate, and nitrogen oxides were negatively associated with fibrinogen, and sulfate was negatively associated with tumor necrosis factor α. Higher short-term exposure to relatively low levels of ambient air pollution was associated with higher levels of C-reactive protein, interleukin-6, and tumor necrosis factor receptor 2 but not fibrinogen or tumor necrosis factor α in individuals residing in the greater Boston area. © 2017 American Heart Association, Inc.

  12. First experience with THE AUTOLAP™ SYSTEM: an image-based robotic camera steering device.

    PubMed

    Wijsman, Paul J M; Broeders, Ivo A M J; Brenkman, Hylke J; Szold, Amir; Forgione, Antonello; Schreuder, Henk W R; Consten, Esther C J; Draaisma, Werner A; Verheijen, Paul M; Ruurda, Jelle P; Kaufman, Yuval

    2018-05-01

    Robotic camera holders for endoscopic surgery have been available for 20 years, but market penetration is low. The current camera holders are controlled by voice, joystick, eyeball tracking, or head movements, and this type of steering has proven to be successful, but excessive disturbance of the surgical workflow has blocked widespread introduction. The AutoLap™ system (MST, Israel) uses a radically different steering concept based on image analysis. This may improve acceptance through smooth, interactive, and fast steering. These two studies were conducted to prove safe and efficient performance of the core technology. A total of 66 laparoscopic procedures of various types were performed with the AutoLap™ by nine experienced surgeons in two multi-center studies: 41 cholecystectomies, 13 fundoplications including hiatal hernia repair, 4 endometriosis surgeries, 2 inguinal hernia repairs, and 6 (bilateral) salpingo-oophorectomies. The use of the AutoLap™ system was evaluated in terms of safety, image stability, setup and procedural time, accuracy of image-based movements, and user satisfaction. Surgical procedures were completed with the AutoLap™ system in 64 cases (97%). The mean overall setup time of the AutoLap™ system was 4 min (04:08 ± 0.10). Procedure times were not prolonged by the use of the system when compared to literature averages. The reported user satisfaction was 3.85 and 3.96 on a scale of 1 to 5 in the two studies. More than 90% of the image-based movements were accurate. No system-related adverse events were recorded while using the system. Safe and efficient use of the core technology of the AutoLap™ system was demonstrated with high image stability and good surgeon satisfaction. The results support further clinical studies that will focus on usability, improved ergonomics and additional image-based features.

  13. A Bayesian Approach for Summarizing and Modeling Time-Series Exposure Data with Left Censoring.

    PubMed

    Houseman, E Andres; Virji, M Abbas

    2017-08-01

    Direct reading instruments are valuable tools for measuring exposure as they provide real-time measurements for rapid decision making. However, their use is limited to general survey applications, in part due to issues related to their performance. Moreover, statistical analysis of real-time data is complicated by autocorrelation among successive measurements, non-stationary time series, and the presence of left-censoring due to the limit of detection (LOD). A Bayesian framework is proposed that accounts for non-stationary autocorrelation and LOD issues in exposure time-series data in order to model workplace factors that affect exposure and estimate summary statistics for tasks or other covariates of interest. A spline-based approach is used to model non-stationary autocorrelation with relatively few assumptions about the autocorrelation structure. Left-censoring is addressed by integrating over the left tail of the distribution. The model is fit using Markov chain Monte Carlo within a Bayesian paradigm. The method can flexibly account for hierarchical relationships, random effects and fixed effects of covariates. The method is implemented using the rjags package in R, and is illustrated by applying it to real-time exposure data. Estimates for task means and covariates from the Bayesian model are compared to those from conventional frequentist models including linear regression, mixed-effects, and time-series models with different autocorrelation structures. Simulation studies are also conducted to evaluate method performance. Simulation studies with the percentage of measurements below the LOD ranging from 0 to 50% showed the lowest root mean squared errors for task means and the least biased standard deviations for the Bayesian model compared to the frequentist models across all levels of LOD. In the application, task means from the Bayesian model were similar to means from the frequentist models, while the standard deviations were different. Parameter estimates for covariates were significant in some frequentist models, but in the Bayesian model their credible intervals contained zero; such discrepancies were observed in multiple datasets. Variance components from the Bayesian model reflected substantial autocorrelation, consistent with the frequentist models, except for the auto-regressive moving average model. Plots of means from the Bayesian model showed good fit to the observed data. The proposed Bayesian model provides an approach for modeling non-stationary autocorrelation in a hierarchical modeling framework to estimate task means, standard deviations, quantiles, and parameter estimates for covariates that are less biased and have better performance characteristics than some of the contemporary methods. Published by Oxford University Press on behalf of the British Occupational Hygiene Society 2017.
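
    A minimal frequentist illustration of the "integrate over the left tail" treatment of values below the LOD, fitting a log-normal by maximum likelihood with scipy; this is not the paper's Bayesian spline model, and the data are synthetic:

        import numpy as np
        from scipy import stats, optimize

        # Synthetic log-normal exposure data with a limit of detection (LOD).
        rng = np.random.default_rng(7)
        true_mu, true_sigma, lod = 1.0, 0.8, 1.5
        x = rng.lognormal(true_mu, true_sigma, 300)
        observed = np.where(x >= lod, x, np.nan)  # values below the LOD are reported as censored
        censored = np.isnan(observed)

        def neg_log_lik(params):
            mu, log_sigma = params
            sigma = np.exp(log_sigma)
            ll_obs = stats.norm.logpdf(np.log(observed[~censored]), mu, sigma).sum()
            # Censored observations contribute the probability mass below log(LOD).
            ll_cens = censored.sum() * stats.norm.logcdf(np.log(lod), mu, sigma)
            return -(ll_obs + ll_cens)

        res = optimize.minimize(neg_log_lik, x0=[0.0, 0.0])
        mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
        print(f"estimated mu={mu_hat:.2f}, sigma={sigma_hat:.2f} (true {true_mu}, {true_sigma})")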

  14. Fouling resistance prediction using artificial neural network nonlinear auto-regressive with exogenous input model based on operating conditions and fluid properties correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biyanto, Totok R.

    Fouling in heat exchangers in the Crude Preheat Train (CPT) of a refinery is an unsolved problem that reduces plant efficiency and increases fuel consumption and CO₂ emissions. The fouling resistance behavior is very complex. It is difficult to develop a model using first-principle equations to predict the fouling resistance due to different operating conditions and different crude blends. In this paper, an Artificial Neural Network (ANN) MultiLayer Perceptron (MLP) with an input structure using Nonlinear Auto-Regressive with eXogenous inputs (NARX) is utilized to build the fouling resistance model of a shell and tube heat exchanger (STHX). The input data of the model are the flow rates and temperatures of the streams of the heat exchanger, physical properties of the product, and crude blend data. This model serves as a predicting tool to optimize operating conditions and preventive maintenance of the STHX. The results show that the model can capture the complexity of fouling characteristics in the heat exchanger due to thermodynamic conditions and variations in crude oil properties (blends). It was found from the Root Mean Square Error (RMSE) that the model captures the nonlinearity and complexity of the STHX fouling resistance during the training and validation phases.
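
    A minimal sketch of a NARX-style input structure (lagged fouling resistance plus lagged exogenous operating variables) feeding a small neural network, using scikit-learn's MLPRegressor as a stand-in for the paper's MLP; the data, lag depth and network size are illustrative assumptions:

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.metrics import mean_squared_error

        # Synthetic operating data standing in for flow rates / temperatures (exogenous inputs)
        # and a slowly growing fouling resistance influenced by them.
        rng = np.random.default_rng(8)
        n = 1000
        u = rng.normal(size=(n, 2))
        r = np.zeros(n)
        for t in range(1, n):
            r[t] = 0.98 * r[t - 1] + 0.01 * (1 + 0.3 * u[t, 0] - 0.2 * u[t, 1]) + rng.normal(0, 0.002)

        # NARX-style regressors: lagged outputs r[t-1..t-3] and lagged exogenous inputs u[t-1..t-3].
        lags = 3
        X = np.column_stack([r[lags - k - 1:n - k - 1] for k in range(lags)] +
                            [u[lags - k - 1:n - k - 1, :] for k in range(lags)])
        y = r[lags:]

        split = int(0.8 * len(y))
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        model.fit(X[:split], y[:split])
        rmse = mean_squared_error(y[split:], model.predict(X[split:])) ** 0.5
        print(f"validation RMSE = {rmse:.4f}")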

  15. Non-linear auto-regressive models for cross-frequency coupling in neural time series

    PubMed Central

    Tallot, Lucille; Grabot, Laetitia; Doyère, Valérie; Grenier, Yves; Gramfort, Alexandre

    2017-01-01

    We address the issue of reliably detecting and quantifying cross-frequency coupling (CFC) in neural time series. Based on non-linear auto-regressive models, the proposed method provides a generative and parametric model of the time-varying spectral content of the signals. As this method models the entire spectrum simultaneously, it avoids the pitfalls related to incorrect filtering or the use of the Hilbert transform on wide-band signals. As the model is probabilistic, it also provides a score of the model “goodness of fit” via the likelihood, enabling easy and legitimate model selection and parameter comparison; this data-driven feature is unique to our model-based approach. Using three datasets obtained with invasive neurophysiological recordings in humans and rodents, we demonstrate that these models are able to replicate previous results obtained with other metrics, but also reveal new insights such as the influence of the amplitude of the slow oscillation. Using simulations, we demonstrate that our parametric method can reveal neural couplings with shorter signals than non-parametric methods. We also show how the likelihood can be used to find optimal filtering parameters, suggesting new properties on the spectrum of the driving signal, but also to estimate the optimal delay between the coupled signals, enabling a directionality estimation in the coupling. PMID:29227989

  16. Assessing the Efficacy of Adjustable Moving Averages Using ASEAN-5 Currencies

    PubMed Central

    2016-01-01

    The objective of this research is to examine the trends in the exchange rate markets of the ASEAN-5 countries (Indonesia (IDR), Malaysia (MYR), the Philippines (PHP), Singapore (SGD), and Thailand (THB)) through the application of dynamic moving average trading systems. This research offers evidence of the usefulness of the time-varying volatility technical analysis indicator, Adjustable Moving Average (AMA′) in deciphering trends in these ASEAN-5 exchange rate markets. This time-varying volatility factor, referred to as the Efficacy Ratio in this paper, is embedded in AMA′. The Efficacy Ratio adjusts the AMA′ to the prevailing market conditions by avoiding whipsaws (losses due, in part, to acting on wrong trading signals, which generally occur when there is no general direction in the market) in range trading and by entering early into new trends in trend trading. The efficacy of AMA′ is assessed against other popular moving-average rules. Based on the January 2005 to December 2014 dataset, our findings show that the moving averages and AMA′ are superior to the passive buy-and-hold strategy. Specifically, AMA′ outperforms the other models for the United States Dollar against PHP (USD/PHP) and USD/THB currency pairs. The results show that different length moving averages perform better in different periods for the five currencies. This is consistent with our hypothesis that a dynamic adjustable technical indicator is needed to cater for different periods in different markets. PMID:27574972

  17. Pre-Drinking and the Temporal Gradient of Intoxication in a New Zealand Nightlife Environment.

    PubMed

    Cameron, Michael P; Roskruge, Matthew J; Droste, Nic; Miller, Peter G

    2018-01-01

    We measured changes in the average level of intoxication over time in the nighttime economy and identified the factors associated with intoxication, including pre-drinking. A random intercept sample of 320 pedestrians (105 women; 215 men) was interviewed and received breath alcohol analysis in the nighttime economy of Hamilton, New Zealand. Data were collected over a five-night period, between 7 P.M. and 2:30 A.M. Data were analyzed by plotting the moving average breath alcohol concentration (BrAC) over time and using linear regression models to identify the factors associated with BrAC. Mean BrAC was 241.5 mcg/L for the full sample; 179.7 for women and 271.7 for men, which is a statistically significant difference. Mean BrAC was also significantly higher among those who engaged in pre-drinking than those who did not. In the regression models, time of night and pre-drinking were significantly associated with higher BrAC. The effect of pre-drinking on BrAC was larger for women than for men. The average level of intoxication increases throughout the night. However, this masks a potentially important gender difference, in that women's intoxication levels stop increasing after midnight, whereas men's increase continuously through the night. Similarly, intoxication of pre-drinkers stops increasing from 11 P.M., although remaining higher than non-pre-drinkers throughout the night. Analysis of BrAC provides a more nuanced understanding of intoxication levels in the nighttime economy.

  18. Timescale Halo: Average-Speed Targets Elicit More Positive and Less Negative Attributions than Slow or Fast Targets

    PubMed Central

    Hernandez, Ivan; Preston, Jesse Lee; Hepler, Justin

    2014-01-01

    Research on the timescale bias has found that observers perceive more capacity for mind in targets moving at an average speed, relative to slow or fast moving targets. The present research revisited the timescale bias as a type of halo effect, where normal-speed people elicit positive evaluations and abnormal-speed (slow and fast) people elicit negative evaluations. In two studies, participants viewed videos of people walking at a slow, average, or fast speed. We find evidence for a timescale halo effect: people walking at an average-speed were attributed more positive mental traits, but fewer negative mental traits, relative to slow or fast moving people. These effects held across both cognitive and emotional dimensions of mind and were mediated by overall positive/negative ratings of the person. These results suggest that, rather than eliciting greater perceptions of general mind, the timescale bias may reflect a generalized positivity toward average speed people relative to slow or fast moving people. PMID:24421882

  19. An intervention to improve spontaneous adverse drug reaction reporting by hospital physicians: a time series analysis in Spain.

    PubMed

    Pedrós, Consuelo; Vallano, Antoni; Cereza, Gloria; Mendoza-Aran, Gemma; Agustí, Antònia; Aguilera, Cristina; Danés, Immaculada; Vidal, Xavier; Arnau, Josep M

    2009-01-01

    Spontaneous reporting of adverse drug reactions (ADRs) in hospitals is scarce and several obstacles to such reporting have been identified previously. To assess the effectiveness of a multifaceted intervention based on healthcare management agreements for improving spontaneous reporting of ADRs by physicians in a hospital setting. In 2003, the spontaneous reporting of ADRs was included as one of the objectives of hospital physicians at the Vall d'Hebron Hospital, Barcelona, Spain, within the context of management agreements between clinical services and hospital managers. A continuous intervention related to these management agreements, including periodic educational meetings and economic incentives, was then initiated. We carried out an ecological time series analysis and assessed the change in the total number of spontaneous reports of ADRs, and the number of serious ADRs, unexpected ADRs, and ADRs associated with new drugs between a period previous to the intervention (from 1998 to 2002) and the period during the intervention (from 2003 to 2005). A time series analysis with ARIMA (Auto-Regressive Integrated Moving Average) models was performed. The median number of reported ADRs per year increased from 40 (range 23-55) in the first period to 224 (range 98-248) in the second period. In the first period, the monthly number of reported ADRs was stable (3.47 per month; 95% CI 1.90, 5.03), but in the second period the number increased progressively (increase of 0.74 per month; 95% CI 0.62, 0.86). In the second period, the proportion of reported serious ADRs increased nearly 2-fold (63.1% vs 32.5% in the first period). The absolute number of previously unknown or poorly known ADRs increased 4-fold in the second period (54 vs 13 in the first period). There was also an increase in the absolute number of suspected pharmacological exposures to new drugs (97 vs 28) and in the number of different new drugs suspected of causing ADRs (50 vs 19). A continuous intervention based on healthcare management agreements with economic incentives and educational activities is associated with a quantitative and qualitative improvement of spontaneous reporting of ADRs by hospital physicians.
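
    A minimal, hedged sketch of an ARIMA-based interrupted time-series analysis in the spirit of the study: the monthly counts, intervention date, level/slope terms and ARIMA order below are invented for illustration and do not reproduce the published analysis.

```python
# Interrupted time-series (intervention) analysis with an ARIMA-type model.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(2)
idx = pd.date_range("1998-01-01", "2005-12-01", freq="MS")
pre = idx < "2003-01-01"
reports = rng.poisson(3.5, pre.sum()).tolist() + \
          list(np.round(3.5 + 0.74 * np.arange((~pre).sum()) + rng.normal(0, 2, (~pre).sum())))
y = pd.Series(reports, index=idx)                              # simulated monthly ADR reports

step = (~pre).astype(float)                                    # level change after the intervention
ramp = np.where(~pre, np.arange(len(idx)) - pre.sum() + 1, 0.0)  # slope change after the intervention
exog = pd.DataFrame({"step": step, "ramp": ramp}, index=idx)

model = SARIMAX(y, exog=exog, order=(1, 0, 0)).fit(disp=False)
print(model.summary().tables[1])                               # intervention terms and AR coefficient
```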

  20. The impact of nonreferral outpatient co-payment on medical care utilization and expenditures in Taiwan.

    PubMed

    Chen, Li-Chia; Schafheutle, Ellen I; Noyce, Peter R

    2009-09-01

    Taiwan's National Health Insurance's (NHI) generous coverage and patients' freedom to access different tiers of medical facilities have resulted in accelerating outpatient care utilization and costs. To deter nonessential visits and encourage initial contact in primary care (physician clinics), a differential co-payment was introduced on 15th July 2005. Under this, patients pay more for outpatient consultations at "higher tiers" of medical facilities (local community hospitals, regional hospitals, medical centers), particularly if accessed without referral. This study explored the impact of this policy on outpatient medical activities and expenditures, different co-payment groups, and tiers of medical facilities. A segmented time-series analysis on regional weekly outpatient medical claims (January 2004 to July 2006) was conducted. Outcome variables (number of visits, number of outpatients, total cost of outpatient care) and variables for cost structure were stratified by tiers of medical facilities and co-payment groups. Analysis used the auto-regressive integrated moving-average model in STATA 9.0. The overall number of outpatient visits significantly decreased after policy implementation due to a reduction in the number of patients using outpatient facilities, but total costs of care remained unchanged. The policy had its greatest impact on the number of visits to regional and local community hospitals but had no influence on those to the medical centers. Medical utilization in physician clinics decreased due to an audit of reimbursement declarations. Overall, the policy failed to encourage referrals from primary care to higher tiers because there was no obvious shift of medical utilization and costs in the reverse direction. The differential co-payment policy decreased total medication utilization but not costs to NHI. The results suggest that the increased level of co-payment charge and the strategy of a single cost-sharing policy are not sufficient to promote referrals within the system. To achieve an effective co-payment policy, further research is needed to explore how patients' out-of-pocket payment affects medical utilization and which forces (not susceptible to co-payment) act in tertiary facilities.

  1. Aboveground Biomass Estimation Using Reconstructed Feature of Airborne Discrete-Return LIDAR by Auto-Encoder Neural Network

    NASA Astrophysics Data System (ADS)

    Li, T.; Wang, Z.; Peng, J.

    2018-04-01

    Aboveground biomass (AGB) estimation is critical for quantifying carbon stocks and essential for evaluating the carbon cycle. In recent years, airborne LiDAR has shown great ability for high-precision AGB estimation. Most studies estimate AGB from feature metrics extracted from the canopy height distribution of the point cloud, which is calculated from a precise digital terrain model (DTM). However, when forest canopy density is high, the probability of the LiDAR signal penetrating the canopy is low, so there may not be enough ground points to establish a DTM. The distribution of forest canopy height is then imprecise, and critical feature metrics that correlate strongly with biomass, such as percentiles, maximums, means and standard deviations of the canopy point cloud, can hardly be extracted correctly. To address this issue, we propose a strategy of first reconstructing the LiDAR feature metrics with an Auto-Encoder neural network and then using the reconstructed feature metrics to estimate AGB. To assess the predictive ability of the reconstructed feature metrics, both the original and the reconstructed feature metrics were regressed against field-observed AGB using multiple stepwise regression (MS) and partial least squares regression (PLS), respectively. The results showed that the estimation models using reconstructed feature metrics improved R2 by 5.44 % and 18.09 %, decreased RMSE by 10.06 % and 22.13 %, and reduced RMSEcv by 10.00 % and 21.70 % for AGB, respectively. Therefore, reconstructing LiDAR point feature metrics has potential for addressing the AGB estimation challenge in dense canopy areas.
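
    A minimal sketch of the reconstruction idea under assumed data shapes: a small bottlenecked MLP is trained to reproduce the (noisy) feature metrics, and the reconstructed metrics are then regressed against AGB. The metric count, noise level and linear AGB relation are illustrative assumptions, and a plain linear regression stands in for the stepwise/PLS models used in the paper.

```python
# Auto-encoder-style reconstruction of LiDAR feature metrics before AGB regression.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n_plots, n_metrics = 200, 12                             # e.g. height percentiles, mean, std ...
clean = rng.normal(size=(n_plots, n_metrics))
noisy = clean + rng.normal(0, 0.5, size=clean.shape)     # degraded metrics under dense canopy
agb = clean @ rng.normal(size=n_metrics) + rng.normal(0, 0.3, n_plots)

scaler = StandardScaler().fit(noisy)
X = scaler.transform(noisy)
ae = MLPRegressor(hidden_layer_sizes=(6,), max_iter=5000, random_state=0)   # bottleneck of 6 units
ae.fit(X, X)                                             # reconstruct the metrics themselves
X_rec = ae.predict(X)

for name, feats in [("original", X), ("reconstructed", X_rec)]:
    r2 = LinearRegression().fit(feats, agb).score(feats, agb)
    print(f"{name} metrics: in-sample R^2 = {r2:.3f}")
```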

  2. Aggregating the response in time series regression models, applied to weather-related cardiovascular mortality

    NASA Astrophysics Data System (ADS)

    Masselot, Pierre; Chebana, Fateh; Bélanger, Diane; St-Hilaire, André; Abdous, Belkacem; Gosselin, Pierre; Ouarda, Taha B. M. J.

    2018-07-01

    In environmental epidemiology studies, health response data (e.g. hospitalization or mortality) are often noisy because of hospital organization and other social factors. The noise in the data can hide the true signal related to the exposure. The signal can be unveiled by performing a temporal aggregation on health data and then using it as the response in regression analysis. From aggregated series, a general methodology is introduced to account for the particularities of an aggregated response in a regression setting. This methodology can be used with usually applied regression models in weather-related health studies, such as generalized additive models (GAM) and distributed lag nonlinear models (DLNM). In particular, the residuals are modelled using an autoregressive-moving average (ARMA) model to account for the temporal dependence. The proposed methodology is illustrated by modelling the influence of temperature on cardiovascular mortality in Canada. A comparison with classical DLNMs is provided and several aggregation methods are compared. Results show that there is an increase in the fit quality when the response is aggregated, and that the estimated relationship focuses more on the outcome over several days than the classical DLNM. More precisely, among various investigated aggregation schemes, it was found that an aggregation with an asymmetric Epanechnikov kernel is more suited for studying the temperature-mortality relationship.
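
    As a hedged sketch of the aggregation idea, the code below averages a noisy simulated mortality series with a one-sided Epanechnikov kernel and then regresses the aggregated response on temperature with ARMA(1,1) errors. A single linear exposure term and SARIMAX stand in for the GAM/DLNM machinery actually used in the paper, and the kernel width is an arbitrary choice.

```python
# Kernel-aggregated response regressed on temperature with ARMA errors.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def epanechnikov_aggregate(y, width=7):
    u = np.arange(width) / width
    w = 0.75 * (1 - u ** 2)                      # one-sided (asymmetric) Epanechnikov weights
    w /= w.sum()
    y = np.asarray(y, float)
    agg = np.full(len(y), np.nan)
    for t in range(len(y) - width + 1):
        agg[t] = np.dot(w, y[t:t + width])       # outcome over day t and the following days
    return agg

rng = np.random.default_rng(4)
n = 730
temp = 10 + 12 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
deaths = 30 - 0.4 * temp + rng.normal(0, 4, n)   # noisy daily mortality

agg = epanechnikov_aggregate(deaths)
mask = ~np.isnan(agg)
fit = SARIMAX(agg[mask], exog=temp[mask], order=(1, 0, 1)).fit(disp=False)
print(fit.params)                                # exposure coefficient plus ARMA(1,1) terms
```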

  3. Static and moving solid/gas interface modeling in a hybrid rocket engine

    NASA Astrophysics Data System (ADS)

    Mangeot, Alexandre; William-Louis, Mame; Gillard, Philippe

    2018-07-01

    A numerical model was developed with CFD-ACE software to study the working conditions of an oxygen-nitrogen/polyethylene hybrid rocket combustor. As a first approach, a simplified numerical model is presented. It includes a compressible transient gas phase in which a two-step combustion mechanism is implemented, coupled to a radiative model. The solid phase from the fuel grain is a semi-opaque material whose degradation process is modeled by an Arrhenius-type law. Two versions of the model were tested. The first considers the solid/gas interface with a static grid, while the second uses grid deformation during the computation to follow the asymmetrical regression. The numerical results are obtained with two different regression kinetics, originating from ThermoGravimetry Analysis and from test bench results. In each case, the fuel surface temperature is retrieved within a range of 5% error. However, good results are only found using kinetics from the test bench. The regression rate is found within 0.03 mm s-1, and the average combustor pressure and its variation over time have the same intensity as the measurements conducted on the test bench. The simulation that uses grid deformation to follow the regression shows good stability over a 10 s simulated time.

  4. Examination of the Armagh Observatory Annual Mean Temperature Record, 1844-2004

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.; Hathaway, David H.

    2006-01-01

    The long-term annual mean temperature record (1844-2004) of the Armagh Observatory (Armagh, Northern Ireland, United Kingdom) is examined for evidence of systematic variation, in particular as related to solar/geomagnetic forcing and secular variation. Indeed, both are apparent in the temperature record. Ten-year moving averages of temperature are found to correlate highly with both 10-year moving averages of the aa-geomagnetic index and sunspot number, having correlation coefficients of approx. 0.7, implying that nearly half the variance in the 10-year moving average of temperature can be explained by solar/geomagnetic forcing. The residuals appear episodic in nature, with cooling seen in the 1880s and again near 1980. Seven of the last 10 years of the temperature record have exceeded 10 C, unprecedented in the overall record. Variation of sunspot cyclic averages and 2-cycle moving averages of temperature strongly associate with similar averages for the solar/geomagnetic cycle, with the residuals displaying an apparent 9-cycle variation and a steep rise in temperature associated with cycle 23. Hale cycle averages of temperature for even-odd pairs of sunspot cycles correlate with similar averages for the solar/geomagnetic cycle and, especially, with the length of the Hale cycle. Indications are that the annual mean temperature will likely exceed 10 C over the next decade.
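
    A minimal sketch of the 10-year moving-average comparison, with synthetic annual series standing in for the Armagh temperature and sunspot/aa-index records (the real data are not reproduced here).

```python
# Correlation of 10-year moving averages of temperature and a solar activity proxy.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
years = np.arange(1844, 2005)
solar = 50 + 30 * np.sin(2 * np.pi * (years - 1844) / 11) + rng.normal(0, 10, len(years))
temp = 9.0 + 0.004 * (years - 1844) + 0.01 * solar + rng.normal(0, 0.3, len(years))

df = pd.DataFrame({"temp": temp, "sunspots": solar}, index=years)
smooth = df.rolling(window=10).mean().dropna()          # 10-year moving averages
r = smooth["temp"].corr(smooth["sunspots"])
print(f"correlation of 10-year moving averages: {r:.2f}  (variance explained ~ {r**2:.2f})")
```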

  5. Short-term electric power demand forecasting based on economic-electricity transmission model

    NASA Astrophysics Data System (ADS)

    Li, Wenfeng; Bai, Hongkun; Liu, Wei; Liu, Yongmin; Wang, Yubin Mao; Wang, Jiangbo; He, Dandan

    2018-04-01

    Short-term electricity demand forecasting is basic work to ensure the safe operation of the power system. In this paper, a practical economic-electricity transmission model (EETM) is built. With the intelligent adaptive modeling capabilities of Prognoz Platform 7.2, an econometric model comprising three industries' added value and income levels is first built, and the electricity demand transmission model is also built. By combining multiple regression, moving averages and seasonal decomposition, the problem of multiple correlations between variables is effectively overcome in EETM. The validity of EETM is proved by comparison with actual values for Henan Province. Finally, the EETM model is used to forecast electricity consumption for the first to fourth quarters of 2018.

  6. The bifoil photodyne: a photonic crystal oscillator.

    PubMed

    Lugo, J E; Doti, R; Sanchez, N; de la Mora, M B; del Rio, J A; Faubert, J

    2014-01-15

    Optical tweezers are an example of how to use light to generate a physical force. They have been used to levitate viruses, bacteria, cells, and subcellular organisms. Nonetheless, it would be beneficial to use such a force to develop new kinds of applications. However, the radiation pressure is usually too small to move larger objects. Currently, some research is investigating novel photonic working principles to generate a higher force. Here, we studied theoretically and experimentally the induction of electromagnetic forces in one-dimensional photonic crystals when light impinges in the off-axis direction. The photonic structure consists of a micro-cavity-like structure formed of two one-dimensional photonic crystals made of free-standing porous silicon, separated by a variable air gap; the working wavelength is 633 nm. We show experimental evidence of this force when the photonic structure undergoes auto-oscillations and forced oscillations. We measured peak displacements and velocities ranging from 2 up to 35 microns and from 0.4 up to 2.1 mm/s with a power of 13 mW. Recent evidence showed that giant resonant light forces could induce average velocity values of 0.45 mm/s in microspheres embedded in water with 43 mW light power.

  7. The bifoil photodyne: a photonic crystal oscillator

    PubMed Central

    Lugo, J. E.; Doti, R.; Sanchez, N.; de la Mora, M. B.; del Rio, J. A.; Faubert, J.

    2014-01-01

    Optical tweezers are an example of how to use light to generate a physical force. They have been used to levitate viruses, bacteria, cells, and subcellular organisms. Nonetheless, it would be beneficial to use such a force to develop new kinds of applications. However, the radiation pressure is usually too small to move larger objects. Currently, some research is investigating novel photonic working principles to generate a higher force. Here, we studied theoretically and experimentally the induction of electromagnetic forces in one-dimensional photonic crystals when light impinges in the off-axis direction. The photonic structure consists of a micro-cavity-like structure formed of two one-dimensional photonic crystals made of free-standing porous silicon, separated by a variable air gap; the working wavelength is 633 nm. We show experimental evidence of this force when the photonic structure undergoes auto-oscillations and forced oscillations. We measured peak displacements and velocities ranging from 2 up to 35 microns and from 0.4 up to 2.1 mm/s with a power of 13 mW. Recent evidence showed that giant resonant light forces could induce average velocity values of 0.45 mm/s in microspheres embedded in water with 43 mW light power. PMID:24423985

  8. Auto-adjustable pin tool for friction stir welding

    NASA Technical Reports Server (NTRS)

    Ding, R. Jeffrey (Inventor); Oelgoetz, Peter A. (Inventor)

    1999-01-01

    An auto-adjusting pin tool for friction stir welding is presented wherein the pin tool automatically adjusts for welding materials of varying thicknesses, and the pin can be incrementally withdrawn from the workpieces, thus eliminating any crater or keyhole in the weld. The inventive apparatus is comprised of a welding head housing a motor connected to a controller instrument package and an arbor supported by bearings. The arbor forms an interior cylinder and is encircled by a stationary slip ring through which hydraulic passageways are ported into the interior cylinder of the arbor, such that a piston housed therein may be moved axially. Coupled to the piston is a pin tool which is threaded on its lower end and which is moveably seated in, and extends through, a shoulder housing having a concave lower face. When welding, the rotating threaded end of the pin enters and stirs the workpieces while the lower face of the shoulder housing compacts the workpieces. As the welding head traverses, the controller senses any rising pressure on the lower face of the shoulder housing and withdraws the arbor to keep the pressure constant. At the same time, the piston moves towards the workpieces, thus extending the pin further from the shoulder. This keeps the pin at a proper depth in the workpieces regardless of their thicknesses. As the weld terminates, this same operation can be used to incrementally withdraw the pin during the final part of the traverse, thus eliminating any keyhole or crater that would otherwise be created.

  9. First clinical experience in carbon ion scanning beam therapy: retrospective analysis of patient positional accuracy.

    PubMed

    Mori, Shinichiro; Shibayama, Kouichi; Tanimoto, Katsuyuki; Kumagai, Motoki; Matsuzaki, Yuka; Furukawa, Takuji; Inaniwa, Taku; Shirai, Toshiyuki; Noda, Koji; Tsuji, Hiroshi; Kamada, Tadashi

    2012-09-01

    Our institute has constructed a new treatment facility for carbon ion scanning beam therapy. The first clinical trials were successfully completed at the end of November 2011. To evaluate patient setup accuracy, positional errors between the reference Computed Tomography (CT) scan and final patient setup images were calculated using 2D-3D registration software. Eleven patients with tumors of the head and neck, prostate and pelvis receiving carbon ion scanning beam treatment participated. The patient setup process takes orthogonal X-ray flat panel detector (FPD) images and the therapists adjust the patient table position in six degrees of freedom to register the reference position by manual or auto- (or both) registration functions. We calculated residual positional errors with the 2D-3D auto-registration function using the final patient setup orthogonal FPD images and treatment planning CT data. Residual error averaged over all patients in each fraction decreased from the initial to the last treatment fraction [1.09 mm/0.76° (averaged in the 1st and 2nd fractions) to 0.77 mm/0.61° (averaged in the 15th and 16th fractions)]. 2D-3D registration calculation time was 8.0 s on average throughout the treatment course. Residual errors in translation and rotation averaged over all patients as a function of date decreased with the passage of time (1.6 mm/1.2° in May 2011 to 0.4 mm/0.2° in December 2011). This retrospective residual positional error analysis shows that the accuracy of patient setup during the first clinical trials of carbon ion beam scanning therapy was good and improved with increasing therapist experience.

  10. A walk through the planned CS building. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Khorramabadi, Delnaz

    1991-01-01

    Using the architectural plan views of our future computer science building as test objects, we have completed the first stage of a Building walkthrough system. The inputs to our system are AutoCAD files. An AutoCAD converter translates the geometrical information in these files into a format suitable for 3D rendering. Major model errors, such as incorrect polygon intersections and random face orientations, are detected and fixed automatically. Interactive viewing and editing tools are provided to view the results, to modify and clean the model and to change surface attributes. Our display system provides a simple-to-use user interface for interactive exploration of buildings. Using only the mouse buttons, the user can move inside and outside the building and change floors. Several viewing and rendering options are provided, such as restricting the viewing frustum, avoiding wall collisions, and selecting different rendering algorithms. A plan view of the current floor, with the position of the eye point and viewing direction on it, is displayed at all times. The scene illumination can be manipulated, by interactively controlling intensity values for 5 light sources.

  11. Remote gaze tracking system on a large display.

    PubMed

    Lee, Hyeon Chang; Lee, Won Oh; Cho, Chul Woo; Gwon, Su Yeong; Park, Kang Ryoung; Lee, Heekyung; Cha, Jihun

    2013-10-07

    We propose a new remote gaze tracking system as an intelligent TV interface. Our research is novel in the following three ways: first, because a user can sit at various positions in front of a large display, the capture volume of the gaze tracking system should be greater, so the proposed system includes two cameras which can be moved simultaneously by panning and tilting mechanisms, a wide view camera (WVC) for detecting eye position and an auto-focusing narrow view camera (NVC) for capturing enlarged eye images. Second, in order to remove the complicated calibration between the WVC and NVC and to enhance the capture speed of the NVC, these two cameras are combined in a parallel structure. Third, the auto-focusing of the NVC is achieved on the basis of both the user's facial width in the WVC image and a focus score calculated on the eye image of the NVC. Experimental results showed that the proposed system can be operated with a gaze tracking accuracy of ±0.737°~±0.775° and a speed of 5~10 frames/s.

  12. Remote Gaze Tracking System on a Large Display

    PubMed Central

    Lee, Hyeon Chang; Lee, Won Oh; Cho, Chul Woo; Gwon, Su Yeong; Park, Kang Ryoung; Lee, Heekyung; Cha, Jihun

    2013-01-01

    We propose a new remote gaze tracking system as an intelligent TV interface. Our research is novel in the following three ways: first, because a user can sit at various positions in front of a large display, the capture volume of the gaze tracking system should be greater, so the proposed system includes two cameras which can be moved simultaneously by panning and tilting mechanisms, a wide view camera (WVC) for detecting eye position and an auto-focusing narrow view camera (NVC) for capturing enlarged eye images. Second, in order to remove the complicated calibration between the WVC and NVC and to enhance the capture speed of the NVC, these two cameras are combined in a parallel structure. Third, the auto-focusing of the NVC is achieved on the basis of both the user's facial width in the WVC image and a focus score calculated on the eye image of the NVC. Experimental results showed that the proposed system can be operated with a gaze tracking accuracy of ±0.737°∼±0.775° and a speed of 5∼10 frames/s. PMID:24105351

  13. Potentiation of Prostate Cancer Radiotherapy using Combined Antiangiogenic and Antitumor Therapies

    DTIC Science & Technology

    2006-10-01

    frozen sections based on the overall intensity of the fluorescently conjugated antibody to EF5. Combination treated groups are shown paired with day...different pericyte markers were evaluated (one of which, NG2, is being recut to improve the auto-segmentation process). The expectation was that the...studies of vascular regression in the retina following hyperoxia, and in tumors following VEGF withdrawal, have shown that vessels covered by α

  14. [Comparisons of manual and automatic refractometry with subjective results].

    PubMed

    Wübbolt, I S; von Alven, S; Hülssner, O; Erb, C

    2006-11-01

    Refractometry is very important in everyday clinical practice. The aim of this study was to compare the precision of three objective methods of refractometry with subjective dioptometry (Phoropter) and to identify the objective method with the smallest deviation from the subjective refractometry results. The objective methods/instruments used were retinoscopy, the Prism Refractometer PR 60 (Rodenstock) and the Auto Refractometer RM-A 7000 (Topcon). The results of monocular dioptometry (sphere, cylinder and axis) for each objective method were compared to the results of the subjective method. The examination was carried out on 178 eyes, which were divided into 3 age-related groups: 6-12 years (103 eyes), 13-18 years (38 eyes) and older than 18 years (37 eyes). All measurements were made in cycloplegia. The smallest standard deviation of the measurement error was found for the Auto Refractometer RM-A 7000; both the PR 60 and retinoscopy had a clearly higher standard deviation. Furthermore, the RM-A 7000 showed a significant bias in the measurement error in three of the nine comparisons, and retinoscopy in four. The Auto Refractometer provides measurements with the smallest deviation from the subjective method, although it has to be taken into account that the measurements for the sphere have an average deviation of +0.2 dpt. In comparison to retinoscopy, the examination of children with the RM-A 7000 is difficult. An advantage of the Auto Refractometer is its fast and easy handling, so that measurements can be performed by medical staff.

  15. Auto correlation analysis of coda waves from local earthquakes for detecting temporal changes in shallow subsurface structures - The 2011 Tohoku-Oki, Japan, earthquake -

    NASA Astrophysics Data System (ADS)

    Nakahara, H.

    2013-12-01

    For monitoring temporal changes in subsurface structures, I propose to use auto correlation functions of coda waves from local earthquakes recorded at surface receivers, which probably contain more body waves than surface waves. Because the use of coda waves requires earthquakes, the time resolution for monitoring decreases; but in regions with high seismicity it may be possible to monitor subsurface structures with sufficient time resolution. Studying the 2011 Tohoku-Oki (Mw 9.0), Japan, earthquake, for which velocity changes have already been reported by previous studies, I try to validate the method. KiK-net stations in northern Honshu are used in the analysis. For each moderate earthquake, normalized auto correlation functions of surface records are stacked over time windows in the S-wave coda. Aligning the stacked normalized auto correlation functions with time, I search for changes in the arrival times of phases. Phases at lag times of less than 1 s are studied because changes at shallow depths are the focus. Based on the stretching method, temporal variations in the arrival times are measured at the stations. Clear phase delays are found to be associated with the mainshock and to gradually recover with time. The phase delays are of the order of 10% on average, with a maximum of about 50% at some stations. For validation, a deconvolution analysis using surface and subsurface records at the same stations was conducted. The results show that the phase delays from the deconvolution analysis are slightly smaller than those from the auto correlation analysis, which implies that the phases on the auto correlations are caused by larger velocity changes at shallower depths. The auto correlation analysis seems to have an accuracy of about several percent, which is much coarser than methods using earthquake doublets and borehole array data, so this analysis might only be applicable to detecting larger changes. In spite of these disadvantages, the analysis is still attractive because it can be applied to many records on the surface in regions where no boreholes are available. Acknowledgements: Seismograms recorded by KiK-net, managed by the National Research Institute for Earth Science and Disaster Prevention (NIED), were used in this study. This study was partially supported by the JST J-RAPID program and JSPS KAKENHI Grant Numbers 24540449 and 23540449.
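
    A hedged sketch of the stretching measurement on synthetic waveforms: a "current" autocorrelation is a stretched copy of a reference trace, and the relative delay is recovered by scanning stretch factors and maximizing the correlation coefficient. Sampling rate, waveform and stretch grid are illustrative assumptions.

```python
# Stretching-method estimate of a relative phase delay between two coda traces.
import numpy as np

def stretching(ref, cur, t, eps_grid):
    """Return the stretch factor eps maximizing correlation of cur(t) with ref(t/(1+eps))."""
    best_eps, best_cc = 0.0, -np.inf
    for eps in eps_grid:
        stretched = np.interp(t, t * (1 + eps), ref)   # resample reference on a stretched time axis
        cc = np.corrcoef(stretched, cur)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc

fs = 100.0
t = np.arange(0, 1.0, 1 / fs)                          # lag times < 1 s, as in the study
ref = np.sin(2 * np.pi * 8 * t) * np.exp(-3 * t)       # reference autocorrelation trace
true_delay = 0.05                                      # 5 % velocity decrease -> ~5 % phase delay
cur = np.interp(t, t * (1 + true_delay), ref)          # "current" trace with delayed phases
eps, cc = stretching(ref, cur, t, np.linspace(-0.1, 0.1, 401))
print(f"estimated relative delay: {eps:.3f} (cc = {cc:.2f})")
```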

  16. A Novel Therapy for Chronic Sleep-Onset Insomnia: A Retrospective, Nonrandomized Controlled Study of Auto-Adjusting, Dual-Level, Positive Airway Pressure Technology.

    PubMed

    Krakow, Barry; Ulibarri, Victor A; McIver, Natalia D; Nadorff, Michael R

    2016-09-29

    Evidence indicates that behavioral or drug therapy may not target underlying pathophysiologic mechanisms for chronic insomnia, possibly due to previously unrecognized high rates (30%-90%) of sleep apnea in chronic insomnia patients. Although treatment studies with positive airway pressure (PAP) demonstrate decreased severity of chronic sleep maintenance insomnia in patients with co-occurring sleep apnea, sleep-onset insomnia has not shown similar results. We hypothesized advanced PAP technology would be associated with decreased sleep-onset insomnia severity in a sample of predominantly psychiatric patients with comorbid sleep apnea. We reviewed charts of 74 severe sleep-onset insomnia patients seen from March 2011 to August 2015, all meeting American Academy of Sleep Medicine Work Group criteria for a chronic insomnia disorder and all affirming behavioral and psychological origins for insomnia (averaging 10 of 18 indicators/patient), as well as averaging 2 or more psychiatric symptoms or conditions: depression (65.2%), anxiety (41.9%), traumatic exposure (35.1%), claustrophobia (29.7%), panic attacks (28.4%), and posttraumatic stress disorder (20.3%). All patients failed continuous or bilevel PAP and were manually titrated with auto-adjusting PAP modes (auto-bilevel and adaptive-servo ventilation). At 1-year follow-up, patients were compared through nonrandom assignment on the basis of a PAP compliance metric of > 20 h/wk (56 PAP users) versus < 20 h/wk (18 partial PAP users). PAP users showed significantly greater decreases in global insomnia severity (Hedges' g = 1.72) and sleep-onset insomnia (g = 2.07) compared to partial users (g = 1.04 and 0.91, respectively). Both global and sleep-onset insomnia severity decreased below moderate levels in PAP users compared to partial users whose outcomes persisted at moderately severe levels. In a nonrandomized controlled retrospective study, advanced PAP technology (both auto-bilevel and adaptive servo-ventilation) was associated with large decreases in insomnia severity for sleep-onset insomnia patients who strongly believed psychological factors caused their sleeplessness. PAP treatment of sleep-onset insomnia merits further investigation. © Copyright 2016 Physicians Postgraduate Press, Inc.

  17. Increased Risk of Paroxysmal Atrial Fibrillation Episodes Associated with Acute Increases in Ambient Air Pollution

    PubMed Central

    Rich, David Q.; Mittleman, Murray A.; Link, Mark S.; Schwartz, Joel; Luttmann-Gibson, Heike; Catalano, Paul J.; Speizer, Frank E.; Gold, Diane R.; Dockery, Douglas W.

    2006-01-01

    Objectives: We reported previously that 24-hr moving average ambient air pollution concentrations were positively associated with ventricular arrhythmias detected by implantable cardioverter defibrillators (ICDs). ICDs also detect paroxysmal atrial fibrillation episodes (PAF) that result in rapid ventricular rates. In this same cohort of ICD patients, we assessed the association between ambient air pollution and episodes of PAF. Design: We performed a case–crossover study. Participants: Patients who lived in the Boston, Massachusetts, metropolitan area and who had ICDs implanted between June 1995 and December 1999 (n = 203) were followed until July 2002. Evaluations/Measurements: We used conditional logistic regression to explore the association between community air pollution and 91 electrophysiologist-confirmed episodes of PAF among 29 subjects. Results: We found a statistically significant positive association between episodes of PAF and increased ozone concentration (22 ppb) in the hour before the arrhythmia (odds ratio = 2.08; 95% confidence interval = 1.22, 3.54; p = 0.001). The risk estimate for a longer (24-hr) moving average was smaller, thus suggesting an immediate effect. Positive but not statistically significant risks were associated with fine particles, nitrogen dioxide, and black carbon. Conclusions: Increased ambient O3 pollution was associated with increased risk of episodes of rapid ventricular response due to PAF, thereby suggesting that community air pollution may be a precipitant of these events. PMID:16393668

  18. Prototype of an auto-calibrating, context-aware, hybrid brain-computer interface.

    PubMed

    Faller, J; Torrellas, S; Miralles, F; Holzner, C; Kapeller, C; Guger, C; Bund, J; Müller-Putz, G R; Scherer, R

    2012-01-01

    We present the prototype of a context-aware framework that allows users to control smart home devices and to access internet services via a Hybrid BCI system of an auto-calibrating sensorimotor rhythm (SMR) based BCI and another assistive device (Integra Mouse mouth joystick). While there is extensive literature that describes the merit of Hybrid BCIs, auto-calibrating and co-adaptive ERD BCI training paradigms, specialized BCI user interfaces, context-awareness and smart home control, there is up to now, no system that includes all these concepts in one integrated easy-to-use framework that can truly benefit individuals with severe functional disabilities by increasing independence and social inclusion. Here we integrate all these technologies in a prototype framework that does not require expert knowledge or excess time for calibration. In a first pilot-study, 3 healthy volunteers successfully operated the system using input signals from an ERD BCI and an Integra Mouse and reached average positive predictive values (PPV) of 72 and 98% respectively. Based on what we learned here we are planning to improve the system for a test with a larger number of healthy volunteers so we can soon bring the system to benefit individuals with severe functional disability.

  19. Forecasting coconut production in the Philippines with ARIMA model

    NASA Astrophysics Data System (ADS)

    Lim, Cristina Teresa

    2015-02-01

    The study aimed to depict the situation of the coconut industry in the Philippines in future years by applying the Autoregressive Integrated Moving Average (ARIMA) method. Data on coconut production, one of the major industrial crops of the country, for the period 1990 to 2012 were analyzed using time-series methods. Autocorrelation (ACF) and partial autocorrelation functions (PACF) were calculated for the data, and an appropriate Box-Jenkins autoregressive moving average model was fitted. The validity of the model was tested using standard statistical techniques. The forecasting power of the autoregressive moving average (ARMA) model was then used to forecast coconut production for the next eight years.
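
    A minimal, hedged sketch of the Box-Jenkins workflow outlined above (ACF/PACF inspection, ARIMA fitting, multi-year forecasting) on a synthetic annual production series; the order (1, 1, 1) and the data are placeholders, not the study's fitted model.

```python
# Box-Jenkins style ARIMA fit and eight-year forecast on a toy annual series.
import numpy as np
import pandas as pd
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
years = pd.period_range("1990", "2012", freq="Y")
prod = pd.Series(12 + 0.15 * np.arange(len(years)) + rng.normal(0, 0.5, len(years)), index=years)

plot_acf(prod.diff().dropna())       # inspect ACF of the differenced series
plot_pacf(prod.diff().dropna())      # ... and PACF, to suggest candidate (p, q)

fit = ARIMA(prod, order=(1, 1, 1)).fit()
print(fit.forecast(steps=8))         # forecasts for the next eight years
```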

  20. Hyperglycemic clamp and oral glucose tolerance test for 3-year prediction of clinical onset in persistently autoantibody-positive offspring and siblings of type 1 diabetic patients.

    PubMed

    Balti, Eric V; Vandemeulebroucke, Evy; Weets, Ilse; Van De Velde, Ursule; Van Dalem, Annelien; Demeester, Simke; Verhaeghen, Katrijn; Gillard, Pieter; De Block, Christophe; Ruige, Johannes; Keymeulen, Bart; Pipeleers, Daniel G; Decochez, Katelijn; Gorus, Frans K

    2015-02-01

    In preparation of future prevention trials, we aimed to identify predictors of 3-year diabetes onset among oral glucose tolerance test (OGTT)- and hyperglycemic clamp-derived metabolic markers in persistently islet autoantibody positive (autoAb(+)) offspring and siblings of patients with type 1 diabetes (T1D). The design is a registry-based study. Functional tests were performed in a hospital setting. Persistently autoAb(+) first-degree relatives of patients with T1D (n = 81; age 5-39 years). We assessed 3-year predictive ability of OGTT- and clamp-derived markers using receiver operating characteristics (ROC) and Cox regression analysis. Area under the curve of clamp-derived first-phase C-peptide release (AUC(5-10 min); min 5-10) was determined in all relatives and second-phase release (AUC(120-150 min); min 120-150) in those aged 12-39 years (n = 62). Overall, the predictive ability of AUC(5-10 min) was better than that of peak C-peptide, the best predictor among OGTT-derived parameters (ROC-AUC [95%CI]: 0.89 [0.80-0.98] vs 0.81 [0.70-0.93]). Fasting blood glucose (FBG) and AUC(5-10 min) provided the best combination of markers for prediction of diabetes within 3 years; (ROC-AUC [95%CI]: 0.92 [0.84-1.00]). In multivariate Cox regression analysis, AUC(5-10 min) (P = .001) was the strongest independent predictor and interacted significantly with all tested OGTT-derived parameters. AUC(5-10 min) below percentile 10 of controls was associated with 50-70% progression to T1D regardless of age. Similar results were obtained for AUC(120-150 min). Clamp-derived first-phase C-peptide release can be used as an efficient and simple screening strategy in persistently autoAb(+) offspring and siblings of T1D patients to predict impending diabetes.

  1. Mutual information estimation for irregularly sampled time series

    NASA Astrophysics Data System (ADS)

    Rehfeld, K.; Marwan, N.; Heitzig, J.; Kurths, J.

    2012-04-01

    For the automated, objective and joint analysis of time series, similarity measures are crucial. Used in the analysis of climate records, they allow for a complementary, unbiased view onto sparse datasets. The irregular sampling of many of these time series, however, makes it necessary to either perform signal reconstruction (e.g. interpolation) or to develop and use adapted measures. Standard linear interpolation comes with an inevitable loss of information and bias effects. We have recently developed a Gaussian kernel-based correlation algorithm with which the interpolation error can be substantially lowered, but this would not work should the functional relationship in a bivariate setting be non-linear. We therefore propose an algorithm to estimate lagged auto and cross mutual information from irregularly sampled time series. We have extended the standard and adaptive binning histogram estimators and use Gaussian distributed weights in the estimation of the (joint) probabilities. To test our method we have simulated linear and nonlinear auto-regressive processes with Gamma-distributed inter-sampling intervals. We have then performed a sensitivity analysis for the estimation of actual coupling length, the lag of coupling and the decorrelation time in the synthetic time series and contrast our results to the performance of a signal reconstruction scheme. Finally we applied our estimator to speleothem records. We compare the estimated memory (or decorrelation time) to that from a least-squares estimator based on fitting an auto-regressive process of order 1. The calculated (cross) mutual information results are compared for the different estimators (standard or adaptive binning) and contrasted with results from signal reconstruction. We find that the kernel-based estimator has a significantly lower root mean square error and less systematic sampling bias than the interpolation-based method. It is possible that these encouraging results could be further improved by using non-histogram mutual information estimators, like k-Nearest Neighbor or Kernel-Density estimators, but for short (<1000 points) and irregularly sampled datasets the proposed algorithm is already a great improvement.
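
    A hedged sketch of the core idea (not the authors' implementation): every pair of samples contributes to a binned joint histogram with a Gaussian weight that depends on how close its time separation is to the target lag, and mutual information is read off the weighted joint distribution. Bin count, kernel width and the toy irregular series are arbitrary choices.

```python
# Gaussian-kernel-weighted lagged mutual information for irregularly sampled series.
import numpy as np

def kernel_lagged_mi(tx, x, ty, y, lag, h, bins=8):
    # rank-transform to [0, 1] so equal-width bins behave like adaptive bins
    xr = np.argsort(np.argsort(x)) / (len(x) - 1)
    yr = np.argsort(np.argsort(y)) / (len(y) - 1)
    ix = np.minimum((xr * bins).astype(int), bins - 1)
    iy = np.minimum((yr * bins).astype(int), bins - 1)
    d = ty[None, :] - tx[:, None] - lag              # deviation of each pair from the target lag
    w = np.exp(-0.5 * (d / h) ** 2)                  # Gaussian weights for all sample pairs
    joint = np.zeros((bins, bins))
    for i in range(len(x)):
        for j in range(len(y)):
            joint[ix[i], iy[j]] += w[i, j]
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / (px[:, None] * py[None, :])[nz])))

rng = np.random.default_rng(7)
tx = np.cumsum(rng.gamma(2.0, 0.5, 300))             # Gamma-distributed inter-sampling intervals
ty = np.cumsum(rng.gamma(2.0, 0.5, 300))
x = np.sin(2 * np.pi * tx / 20) + 0.2 * rng.normal(size=len(tx))
y = np.sin(2 * np.pi * (ty - 5.0) / 20) + 0.2 * rng.normal(size=len(ty))   # y lags x by ~5
for lag in (0.0, 5.0, 10.0):
    print(lag, round(kernel_lagged_mi(tx, x, ty, y, lag, h=1.0), 3))       # MI peaks near lag 5
```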

  2. Factors associated with residential mobility during pregnancy.

    PubMed

    Amoah, Doris K; Nolan, Vikki; Relyea, George; Gurney, James G; Yu, Xinhua; Tylavsky, Frances A; Mzayek, Fawaz

    2017-09-18

    Our objective was to determine the factors associated with residential moving during pregnancy, as it may increase stress during pregnancy and affect birth outcomes. Data were obtained from the Conditions Affecting Neurocognitive Development and Learning in Early Childhood (CANDLE) study. Participants were recruited from December 2006 to June 2011 and included 1,448 pregnant women. The average gestational age at enrollment was 23 weeks. The primary outcome of residential mobility was defined as any change in address during pregnancy. Multivariate regression was used to assess the adjusted associations of factors with residential mobility. Out of 1,448 participants, approximately 9 percent moved between baseline (enrollment) and delivery. After adjusting for covariates, mothers with lower educational attainment [less than high school (adjusted odds ratio [aOR] = 3.74, 95% confidence interval [CI] = 1.78, 7.85) and high school/technical school (aOR = 3.57, 95% CI = 2.01, 6.32) compared to college degree or higher], and shorter length of residence in neighborhood were more likely to have moved compared to other mothers. Length of residence was protective of mobility (aOR = 0.91, 95% CI = 0.86, 0.96 per year). Increased understanding of residential mobility during pregnancy may help improve the health of mothers and their children.

  3. Gbm.auto: A software tool to simplify spatial modelling and Marine Protected Area planning

    PubMed Central

    Officer, Rick; Clarke, Maurice; Reid, David G.; Brophy, Deirdre

    2017-01-01

    Boosted Regression Trees: excellent for data-poor spatial management, but hard to use. Marine resource managers and scientists often advocate spatial approaches to manage data-poor species. Existing spatial prediction and management techniques are either insufficiently robust, struggle with sparse input data, or make suboptimal use of multiple explanatory variables. Boosted Regression Trees feature excellent performance and are well suited to modelling the distribution of data-limited species, but are extremely complicated and time-consuming to learn and use, hindering access for a wide potential user base and therefore limiting uptake and usage. BRTs automated and simplified for accessible general use, with a rich feature set. We have built a software suite in R which integrates pre-existing functions with new tailor-made functions to automate the processing and predictive mapping of species abundance data: by automating and greatly simplifying Boosted Regression Tree spatial modelling, the gbm.auto R package suite makes this powerful statistical modelling technique more accessible to potential users in the ecological and modelling communities. The package and its documentation allow the user to generate maps of predicted abundance, visualise the representativeness of those abundance maps and plot the relative influence of explanatory variables and their relationship to the response variables. Databases of the processed model objects and a report explaining all the steps taken within the model are also generated. The package includes a previously unavailable Decision Support Tool which combines estimated escapement biomass (the percentage of an exploited population which must be retained each year to conserve it) with the predicted abundance maps to generate maps showing the location and size of habitat that should be protected to conserve the target stocks (candidate MPAs), based on stakeholder priorities such as the minimisation of fishing effort displacement. Gbm.auto for management in various settings. By bridging the gap between advanced statistical methods for species distribution modelling and conservation science, management and policy, these tools can allow improved spatial abundance predictions, and therefore better management, decision-making, and conservation. Although this package was built to support spatial management of a data-limited marine elasmobranch fishery, it should be equally applicable to spatial abundance modelling, area protection, and stakeholder engagement in various scenarios. PMID:29216310
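
    The sketch below is not the gbm.auto package and does not call its functions; it is only a hedged Python analogue of the underlying idea, fitting boosted regression trees to entirely invented gridded explanatory variables and reporting relative influence and a predicted-abundance surface.

```python
# Conceptual analogue of boosted-regression-tree spatial abundance modelling.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(8)
n = 1000
grid = pd.DataFrame({
    "lat": rng.uniform(51, 56, n),
    "lon": rng.uniform(-11, -5, n),
    "depth": rng.uniform(20, 200, n),
    "temperature": rng.uniform(8, 14, n),
})
abundance = (np.exp(-(grid["depth"] - 80) ** 2 / 2000)
             * np.exp(-(grid["temperature"] - 11) ** 2 / 4)
             * 50 + rng.poisson(1, n))

features = ["lat", "lon", "depth", "temperature"]
brt = GradientBoostingRegressor(n_estimators=500, learning_rate=0.01, max_depth=3, subsample=0.75)
brt.fit(grid[features], abundance)
grid["predicted"] = brt.predict(grid[features])                 # predicted abundance surface
print(dict(zip(features, brt.feature_importances_.round(2))))   # relative influence of variables
```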

  4. Physics-based coastal current tomographic tracking using a Kalman filter.

    PubMed

    Wang, Tongchen; Zhang, Ying; Yang, T C; Chen, Huifang; Xu, Wen

    2018-05-01

    Ocean acoustic tomography can be used, based on measurements of two-way travel-time differences between nodes deployed on the perimeter of a surveying area, to invert/map the ocean current inside the area. Data at different times can be related using a Kalman filter, and given an ocean circulation model one can in principle nowcast and even forecast the current distribution given an initial distribution and/or the travel-time difference data on the boundary. However, an ocean circulation model requires many inputs (many of them often not available) and is impractical for estimation of the current field. A simplified form of the discretized Navier-Stokes equation is used to show that the future velocity state is just a weighted spatial average of the current state. These weights could be obtained from an ocean circulation model, but here, in a data-driven approach, auto-regressive methods are used to obtain the time- and space-dependent weights from the data. It is shown, based on simulated data, that the current field tracked using a Kalman filter (with an arbitrary initial condition) is more accurate than that estimated by standard methods where data at different times are treated independently. Real data are also examined.
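
    A hedged sketch of the tracking idea on a toy problem: the state transition matrix plays the role of the weighted spatial average (here simply assumed rather than fitted by the paper's auto-regressive procedure), and a handful of path-averaged observations stand in for travel-time differences. All matrices and noise levels are invented for illustration.

```python
# Kalman-filter tracking of a gridded current field from path-averaged observations.
import numpy as np

rng = np.random.default_rng(9)
n = 9                                    # 3x3 grid of current values
# transition: each cell relaxes toward a weighted spatial average of all cells
A = np.full((n, n), 0.05) + 0.55 * np.eye(n)
H = rng.uniform(0, 1, size=(6, n)); H /= H.sum(1, keepdims=True)   # 6 "ray paths" averaging cells
Q, R = 0.02 * np.eye(n), 0.05 * np.eye(6)

# simulate a true field and noisy travel-time-derived observations
x_true = rng.normal(0, 1, n)
obs = []
for _ in range(50):
    x_true = A @ x_true + rng.multivariate_normal(np.zeros(n), Q)
    obs.append(H @ x_true + rng.multivariate_normal(np.zeros(6), R))

# Kalman filter started from an arbitrary initial condition
x, P = np.zeros(n), np.eye(n)
for z in obs:
    x, P = A @ x, A @ P @ A.T + Q                        # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                       # Kalman gain
    x, P = x + K @ (z - H @ x), (np.eye(n) - K @ H) @ P  # update

print("RMS error of tracked field:", round(float(np.sqrt(np.mean((x - x_true) ** 2))), 3))
```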

  5. Heart rate measurement based on a time-lapse image.

    PubMed

    Takano, Chihiro; Ohta, Yuji

    2007-10-01

    Using time-lapse images acquired from a CCD camera, we developed a non-contact and non-invasive device which can measure both the respiratory and pulse rate simultaneously. The time-lapse image of a part of the subject's skin was captured consecutively, and the changes in the average image brightness of the region of interest (ROI) were measured for 30 s. The brightness data were processed by a series of operations: interpolation, a first-order derivative, a low-pass filter at 2 Hz, and a sixth-order auto-regressive (AR) spectral analysis. Fourteen healthy female subjects (22-27 years of age) participated in the experiments. Each subject was told to keep a relaxed seated posture with no physical restriction. At the same time, heart rate was measured by a pulse oximeter and respiratory rate was measured by a thermistor placed at the external naris. Using AR spectral analysis, two clear peaks could be detected at approximately 0.3 and 1.2 Hz, which were thought to correspond to the respiratory rate and the heart rate. Correlation coefficients of 0.90 and 0.93 were obtained for the measurement of heart rate and respiratory rate, respectively.
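
    A hedged sketch of the described processing chain on a synthetic brightness trace (the camera frame rate, filter design and signal are assumptions): first-order difference, a 2 Hz low-pass filter, a sixth-order AR fit, and an AR spectral estimate whose peaks are read off near the respiratory and pulse frequencies.

```python
# First-difference, 2 Hz low-pass, and sixth-order AR spectrum of a brightness trace.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks
from statsmodels.tsa.ar_model import AutoReg

fs, T = 30.0, 30.0                                     # assumed frame rate (Hz) and 30 s record
t = np.arange(0, T, 1 / fs)
rng = np.random.default_rng(11)
brightness = 0.5 * np.sin(2 * np.pi * 0.3 * t) + 0.2 * np.sin(2 * np.pi * 1.2 * t) \
             + 0.05 * rng.normal(size=len(t))

deriv = np.diff(brightness)                            # first-order derivative
b, a = butter(4, 2.0 / (fs / 2), btype="low")
filtered = filtfilt(b, a, deriv)                       # 2 Hz low-pass filter

ar = AutoReg(filtered, lags=6).fit()                   # sixth-order AR model
freqs = np.linspace(0.05, 2.0, 400)
z = np.exp(-2j * np.pi * freqs / fs)
denom = 1 - sum(ar.params[k + 1] * z ** (k + 1) for k in range(6))
psd = ar.sigma2 / np.abs(denom) ** 2                   # AR spectral estimate
peaks, _ = find_peaks(psd)
print("spectral peaks (Hz):", freqs[peaks])            # expect peaks near 0.3 and 1.2 Hz
```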

  6. THE VELOCITY DISTRIBUTION OF NEARBY STARS FROM HIPPARCOS DATA. II. THE NATURE OF THE LOW-VELOCITY MOVING GROUPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bovy, Jo; Hogg, David W., E-mail: jo.bovy@nyu.ed

    2010-07-10

    The velocity distribution of nearby stars (≲100 pc) contains many overdensities or 'moving groups', clumps of comoving stars, that are inconsistent with the standard assumption of an axisymmetric, time-independent, and steady-state Galaxy. We study the age and metallicity properties of the low-velocity moving groups based on the reconstruction of the local velocity distribution in Paper I of this series. We perform stringent, conservative hypothesis testing to establish for each of these moving groups whether it could conceivably consist of a coeval population of stars. We conclude that they do not: the moving groups are neither trivially associated with their eponymous open clusters nor with any other inhomogeneous star formation event. Concerning a possible dynamical origin of the moving groups, we test whether any of the moving groups has a higher or lower metallicity than the background population of thin disk stars, as would generically be the case if the moving groups are associated with resonances of the bar or spiral structure. We find clear evidence that the Hyades moving group has higher than average metallicity and weak evidence that the Sirius moving group has lower than average metallicity, which could indicate that these two groups are related to the inner Lindblad resonance of the spiral structure. Further, we find weak evidence that the Hercules moving group has higher than average metallicity, as would be the case if it is associated with the bar's outer Lindblad resonance. The Pleiades moving group shows no clear metallicity anomaly, arguing against a common dynamical origin for the Hyades and Pleiades groups. Overall, however, the moving groups are barely distinguishable from the background population of stars, raising the likelihood that the moving groups are associated with transient perturbations.

  7. A Secondary Antibody-Detecting Molecular Weight Marker with Mouse and Rabbit IgG Fc Linear Epitopes for Western Blot Analysis

    PubMed Central

    Cheng, Ta-Chun; Tung, Yi-Ching; Chu, Pei-Yu; Chuang, Chih-Hung; Hsieh, Yuan-Chin; Huang, Chien-Chiao; Wang, Yeng-Tseng; Kao, Chien-Han; Roffler, Steve R.; Cheng, Tian-Lu

    2016-01-01

    Molecular weight markers that can tolerate denaturing conditions and be auto-detected by secondary antibodies offer great efficacy and convenience for Western Blotting. Here, we describe M&R LE protein markers which contain linear epitopes derived from the heavy chain constant regions of mouse and rabbit immunoglobulin G (IgG Fc LE). These markers can be directly recognized and stained by a wide range of anti-mouse and anti-rabbit secondary antibodies. We selected three mouse (M1, M2 and M3) linear IgG1 and three rabbit (R1, R2 and R3) linear IgG heavy chain epitope candidates based on their respective crystal structures. Western blot analysis indicated that M2 and R2 linear epitopes are effectively recognized by anti-mouse and anti-rabbit secondary antibodies, respectively. We fused the M2 and R2 epitopes (M&R LE) and incorporated the polypeptide in a range of 15–120 kDa auto-detecting markers (M&R LE protein marker). The M&R LE protein marker can be auto-detected by anti-mouse and anti-rabbit IgG secondary antibodies in standard immunoblots. Linear regression analysis of the M&R LE protein marker plotted as gel mobility versus the log of the marker molecular weights revealed good linearity with a correlation coefficient R2 value of 0.9965, indicating that the M&R LE protein marker displays high accuracy for determining protein molecular weights. This accurate, regular and auto-detected M&R LE protein marker may provide a simple, efficient and economical tool for protein analysis. PMID:27494183

  8. A Secondary Antibody-Detecting Molecular Weight Marker with Mouse and Rabbit IgG Fc Linear Epitopes for Western Blot Analysis.

    PubMed

    Lin, Wen-Wei; Chen, I-Ju; Cheng, Ta-Chun; Tung, Yi-Ching; Chu, Pei-Yu; Chuang, Chih-Hung; Hsieh, Yuan-Chin; Huang, Chien-Chiao; Wang, Yeng-Tseng; Kao, Chien-Han; Roffler, Steve R; Cheng, Tian-Lu

    2016-01-01

    Molecular weight markers that can tolerate denaturing conditions and be auto-detected by secondary antibodies offer great efficacy and convenience for Western Blotting. Here, we describe M&R LE protein markers which contain linear epitopes derived from the heavy chain constant regions of mouse and rabbit immunoglobulin G (IgG Fc LE). These markers can be directly recognized and stained by a wide range of anti-mouse and anti-rabbit secondary antibodies. We selected three mouse (M1, M2 and M3) linear IgG1 and three rabbit (R1, R2 and R3) linear IgG heavy chain epitope candidates based on their respective crystal structures. Western blot analysis indicated that M2 and R2 linear epitopes are effectively recognized by anti-mouse and anti-rabbit secondary antibodies, respectively. We fused the M2 and R2 epitopes (M&R LE) and incorporated the polypeptide in a range of 15-120 kDa auto-detecting markers (M&R LE protein marker). The M&R LE protein marker can be auto-detected by anti-mouse and anti-rabbit IgG secondary antibodies in standard immunoblots. Linear regression analysis of the M&R LE protein marker plotted as gel mobility versus the log of the marker molecular weights revealed good linearity with a correlation coefficient R2 value of 0.9965, indicating that the M&R LE protein marker displays high accuracy for determining protein molecular weights. This accurate, regular and auto-detected M&R LE protein marker may provide a simple, efficient and economical tool for protein analysis.

  9. Autologous/Allogeneic Hematopoietic Cell Transplantation versus Tandem Autologous Transplantation for Multiple Myeloma: Comparison of Long-Term Postrelapse Survival.

    PubMed

    Htut, Myo; D'Souza, Anita; Krishnan, Amrita; Bruno, Benedetto; Zhang, Mei-Jie; Fei, Mingwei; Diaz, Miguel Angel; Copelan, Edward; Ganguly, Siddhartha; Hamadani, Mehdi; Kharfan-Dabaja, Mohamed; Lazarus, Hillard; Lee, Cindy; Meehan, Kenneth; Nishihori, Taiga; Saad, Ayman; Seo, Sachiko; Ramanathan, Muthalagu; Usmani, Saad Z; Gasparetto, Christina; Mark, Tomer M; Nieto, Yago; Hari, Parameswaran

    2018-03-01

    We compared postrelapse overall survival (OS) after autologous/allogeneic (auto/allo) versus tandem autologous (auto/auto) hematopoietic cell transplantation (HCT) in patients with multiple myeloma (MM). Postrelapse survival of patients receiving an auto/auto or auto/allo HCT for MM and prospectively reported to the Center for International Blood and Marrow Transplant Research between 2000 and 2010 was analyzed. Relapse occurred in 404 patients (72.4%) in the auto/auto group and in 178 patients (67.4%) in the auto/allo group after a median follow-up of 8.5 years. Relapse occurred before 6 months after a second HCT in 46% of the auto/allo patients, compared with 26% of the auto/auto patients. The 6-year postrelapse survival was better in the auto/allo group compared with the auto/auto group (44% versus 35%; P = .05). Mortality due to MM was 69% (n = 101) in the auto/allo group and 83% (n = 229) in the auto/auto group. In multivariate analysis, both cohorts had a similar risk of death in the first year after relapse (hazard ratio [HR], .72; P = .12); however, for time points beyond 12 months after relapse, overall survival was superior in the auto/allo cohort (HR for death in auto/auto = 1.55; P = .005). Other factors associated with superior survival were enrollment in a clinical trial for HCT, male sex, and use of novel agents at induction before HCT. Our findings show superior survival after relapse in auto/allo HCT recipients compared with auto/auto HCT recipients. This likely reflects a better response to salvage therapy, such as immunomodulatory drugs, potentiated by a donor-derived immunologic milieu. Further augmentation of the post-allo-HCT immune system with new immunotherapies, such as monoclonal antibodies, checkpoint inhibitors, and others, merits investigation. Copyright © 2017 The American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.

  10. Factors Influencing Willingness to Move: An Examination of Nonmetropolitan Residents.

    ERIC Educational Resources Information Center

    Swanson, Louis E., Jr.; And Others

    1979-01-01

    Examining the relationships of social restraints and economic incentives to individuals' willingness to move, special attention was given to labor force participation relative to social factors. Regression analysis found that age and community tenure correlated negatively with willingness to move; people who were employed or not yet retired showed…

  11. Skin exposure to aliphatic polyisocyanates in the auto body repair and refinishing industry: III. A personal exposure algorithm.

    PubMed

    Liu, Youcheng; Stowe, Meredith H; Bello, Dhimiter; Sparer, Judy; Gore, Rebecca J; Cullen, Mark R; Redlich, Carrie A; Woskie, Susan R

    2009-01-01

    Isocyanate skin exposure may play an important role in sensitization and the development of isocyanate asthma, but such exposures are frequently intermittent and difficult to assess. Exposure metrics are needed to better estimate isocyanate skin exposures. The goal of this study was to develop a semiquantitative algorithm to estimate personal skin exposures in auto body shop workers using task-based skin exposure data and daily work diaries. The relationship between skin and respiratory exposure metrics was also evaluated. The development and results of respiratory exposure metrics were previously reported. Using the task-based data obtained with a colorimetric skin exposure indicator and a daily work diary, we developed a skin exposure algorithm to estimate a skin exposure index (SEI) for each worker. This algorithm considered the type of personal protective equipment (PPE) used, the percentage of skin area covered by PPE and skin exposures without and underneath the PPE. The SEI was summed across the day (daily SEI) and survey week (weekly average SEI) for each worker, compared among the job title categories and also compared with the respiratory exposure metrics. A total of 893 person-days was calculated for 232 workers (49 painters, 118 technicians and 65 office workers) from 33 auto body shops. The median (10th-90th percentile, maximum) daily SEI was 0 (0-0, 1.0), 0 (0-1.9, 4.8) and 1.6 (0-3.5, 6.1) and weekly average SEI was 0 (0-0.0, 0.7), 0.3 (0-1.6, 4.2) and 1.9 (0.4-3.0, 3.6) for office workers, technicians and painters, respectively, which were significantly different (P < 0.0001). The median (10th-90th percentile, maximum) daily SEI was 0 (0-2.4, 6.1) and weekly average SEI was 0.2 (0-2.3, 4.2) for all workers. A relatively weak positive Spearman correlation was found between daily SEI and time-weighted average (TWA) respiratory exposure metrics (microg NCO m(-3)) (r = 0.380, n = 893, P < 0.0001) and between weekly SEI and TWA respiratory exposure metrics (r = 0.482, n = 232, P < 0.0001). The skin exposure algorithm developed in this study provides task-based personal daily and weekly average skin exposure indices that are adjusted for the use of PPE. These skin exposure indices can be used to assess isocyanate exposure-response relationships.
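
    As a rough illustration of this kind of task-based bookkeeping, the sketch below sums PPE-adjusted task scores into daily and weekly-average indices and correlates them with a respiratory metric; the column names, weighting scheme, and values are assumptions, not the study's actual algorithm or data.

        # Hedged sketch of a task-based skin exposure index; everything here is illustrative.
        import pandas as pd
        from scipy.stats import spearmanr

        tasks = pd.DataFrame({
            "worker": ["A", "A", "A", "B"],
            "day": ["2004-06-01", "2004-06-01", "2004-06-02", "2004-06-01"],
            "task_exposure": [1.2, 0.8, 0.5, 0.0],          # colorimetric indicator score per task
            "fraction_unprotected": [0.3, 0.5, 1.0, 1.0],    # skin area not covered by PPE (assumed)
        })

        # Task-level contribution: exposure weighted by the unprotected skin fraction.
        tasks["sei"] = tasks["task_exposure"] * tasks["fraction_unprotected"]

        daily_sei = tasks.groupby(["worker", "day"])["sei"].sum()   # daily SEI per worker
        weekly_avg_sei = daily_sei.groupby("worker").mean()         # weekly average SEI per worker

        # Correlate daily SEI with a matching (assumed) series of TWA respiratory exposures.
        twa_respiratory = pd.Series([15.0, 8.0, 2.0], index=daily_sei.index)  # microg NCO per m^3
        rho, p_value = spearmanr(daily_sei, twa_respiratory)
        print(daily_sei, weekly_avg_sei, rho, p_value, sep="\n")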

  12. A vector auto-regressive model for onshore and offshore wind synthesis incorporating meteorological model information

    NASA Astrophysics Data System (ADS)

    Hill, D.; Bell, K. R. W.; McMillan, D.; Infield, D.

    2014-05-01

    Wind power production in the electricity portfolio is growing to meet ambitious targets, set for example by the EU, of reducing greenhouse gas emissions by 20% by 2020. Huge investments are now being made in new offshore wind farms around UK coastal waters that will have a major impact on the GB electrical supply. Syntheses of the UK wind field that capture the inherent structure and correlations between different locations, including offshore sites, are therefore required. Here, Vector Auto-Regressive (VAR) models are presented and extended in a novel way to incorporate offshore time series from a pan-European meteorological model called COSMO, with onshore wind speeds from the MIDAS dataset provided by the British Atmospheric Data Centre. Onshore forecasting ability is shown to improve when the offshore sites are included, with reductions of up to 25% in RMS error at 6 h ahead. In addition, the VAR model is used to synthesise time series of wind at each offshore site, which are then used to estimate wind farm capacity factors at the sites in question. These are then compared with estimates of capacity factors derived from the work of Hawkins et al. (2011). A good degree of agreement is established, indicating that this synthesis tool should be useful in power system impact studies.
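
    For readers unfamiliar with VAR modelling, the sketch below fits a small vector auto-regression to synthetic onshore and offshore wind-speed series with statsmodels and issues a 6-step-ahead forecast; the data and lag order are illustrative stand-ins, not the MIDAS/COSMO series or the model of the paper.

        # Hedged sketch of a two-site VAR fit and forecast with statsmodels.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        rng = np.random.default_rng(0)
        n = 500
        onshore = pd.Series(rng.normal(8.0, 2.0, n)).rolling(3, min_periods=1).mean()
        offshore = 0.7 * onshore + rng.normal(0.0, 1.0, n)      # correlated second site
        data = pd.DataFrame({"onshore": onshore, "offshore": offshore})

        model = VAR(data)
        results = model.fit(2)                                  # VAR(2); an information criterion could pick the lag instead
        forecast = results.forecast(data.values[-results.k_ar:], steps=6)
        print(forecast[:, 0])                                   # 6-step-ahead onshore values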

  13. On the Relationship between Solar Wind Speed, Earthward-Directed Coronal Mass Ejections, Geomagnetic Activity, and the Sunspot Cycle Using 12-Month Moving Averages

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.; Hathaway, David H.

    2008-01-01

    For 1996-2006 (cycle 23), 12-month moving averages of the aa geomagnetic index strongly correlate (r = 0.92) with 12-month moving averages of solar wind speed, and 12-month moving averages of the number of coronal mass ejections (CMEs) (halo and partial halo events) strongly correlate (r = 0.87) with 12-month moving averages of sunspot number. In particular, the minimum (15.8, September/October 1997) and maximum (38.0, August 2003) values of the aa geomagnetic index occur simultaneously with the minimum (376 km/s) and maximum (547 km/s) solar wind speeds, both being strongly correlated with the following recurrent component (due to high-speed streams). The large peak of aa geomagnetic activity in cycle 23, the largest on record, spans the interval late 2002 to mid 2004 and is associated with a decreased number of halo and partial halo CMEs, whereas the smaller secondary peak of early 2005 seems to be associated with a slight rebound in the number of halo and partial halo CMEs. Based on the observed aaM during the declining portion of cycle 23, RM for cycle 24 is predicted to be larger than average, being about 168+/-60 (the 90% prediction interval), whereas based on the expected aam for cycle 24 (greater than or equal to 14.6), RM for cycle 24 should measure greater than or equal to 118+/-30, yielding an overlap of about 128+/-20.
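
    A minimal sketch of the moving-average correlation analysis is given below, assuming synthetic monthly series in place of the aa index and solar wind speed reported in the abstract.

        # Hedged sketch: centered 12-month moving averages and their correlation.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(1)
        months = pd.date_range("1996-01", "2006-12", freq="MS")
        aa_index = pd.Series(20 + rng.normal(0, 5, len(months)), index=months)
        wind_speed = pd.Series(450 + 10 * (aa_index - 20) + rng.normal(0, 20, len(months)),
                               index=months)

        aa_smooth = aa_index.rolling(12, center=True).mean()
        wind_smooth = wind_speed.rolling(12, center=True).mean()
        r = aa_smooth.corr(wind_smooth)   # Pearson r; the abstract reports r = 0.92 for the real data
        print(f"correlation of 12-month moving averages: {r:.2f}")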

  14. Low-flow characteristics of Virginia streams

    USGS Publications Warehouse

    Austin, Samuel H.; Krstolic, Jennifer L.; Wiegand, Ute

    2011-01-01

    Low-flow annual non-exceedance probabilities (ANEP), called probability-percent chance (P-percent chance) flow estimates, regional regression equations, and transfer methods are provided describing the low-flow characteristics of Virginia streams. Statistical methods are used to evaluate streamflow data. Analysis of Virginia streamflow data collected from 1895 through 2007 is summarized. Methods are provided for estimating low-flow characteristics of gaged and ungaged streams. The 1-, 4-, 7-, and 30-day average streamgaging station low-flow characteristics for 290 long-term, continuous-record, streamgaging stations are determined, adjusted for instances of zero flow using a conditional probability adjustment method, and presented for non-exceedance probabilities of 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05, 0.02, 0.01, and 0.005. Stream basin characteristics computed using spatial data and a geographic information system are used as explanatory variables in regional regression equations to estimate annual non-exceedance probabilities at gaged and ungaged sites and are summarized for 290 long-term, continuous-record streamgaging stations, 136 short-term, continuous-record streamgaging stations, and 613 partial-record streamgaging stations. Regional regression equations for six physiographic regions use basin characteristics to estimate 1-, 4-, 7-, and 30-day average low-flow annual non-exceedance probabilities at gaged and ungaged sites. Weighted low-flow values that combine computed streamgaging station low-flow characteristics and annual non-exceedance probabilities from regional regression equations provide improved low-flow estimates. Regression equations developed using the Maintenance of Variance with Extension (MOVE.1) method describe the line of organic correlation (LOC) with an appropriate index site for low-flow characteristics at 136 short-term, continuous-record streamgaging stations and 613 partial-record streamgaging stations. Monthly streamflow statistics computed on the individual daily mean streamflows of selected continuous-record streamgaging stations and curves describing flow-duration are presented. Text, figures, and lists are provided summarizing low-flow estimates, selected low-flow sites, delineated physiographic regions, basin characteristics, regression equations, error estimates, definitions, and data sources. This study supersedes previous studies of low flows in Virginia.

  15. Comparison of Conventional and ANN Models for River Flow Forecasting

    NASA Astrophysics Data System (ADS)

    Jain, A.; Ganti, R.

    2011-12-01

    Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydro power generation, water supply, erosion and sediment control, etc. Estimates of runoff are needed in many water resources planning, design development, operation and maintenance activities. River flow is generally estimated using time series or rainfall-runoff models. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been extensively adopted in operational hydrological forecasts. There is a strong need to develop ANN models based on real catchment data and compare them with the conventional models. In this paper, a comparative study has been carried out for river flow forecasting using the conventional and ANN models. Among the conventional models, multiple linear and nonlinear regression models and auto-regressive (AR) time series models have been developed. A feed-forward neural network structure trained using the back-propagation algorithm, a gradient search method, was adopted. Daily river flow data from the Godavari Basin at Polavaram, Andhra Pradesh, India have been employed to develop all the models included here. Two inputs, the flows at the two previous time steps (Q(t-1) and Q(t-2)), were selected using partial autocorrelation analysis for forecasting the flow at time t, Q(t). A wide range of error statistics have been used to evaluate the performance of all the models developed in this study. It has been found that the regression and AR models performed comparably, and that the ANN model performed the best amongst all the models investigated in this study. It is concluded that the ANN model should be adopted in real catchments for hydrological modeling and forecasting.
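
    A minimal sketch of this kind of comparison, assuming a synthetic flow series rather than the Godavari data, is shown below: an AR-style linear model and a small feed-forward network both predict Q(t) from Q(t-1) and Q(t-2), and the two are compared by root-mean-square error.

        # Hedged sketch comparing an AR-style linear model with a small ANN on lagged flows.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.neural_network import MLPRegressor
        from sklearn.metrics import mean_squared_error

        rng = np.random.default_rng(2)
        q = np.zeros(600)
        for t in range(2, 600):                    # synthetic AR(2)-like river flow
            q[t] = 0.6 * q[t - 1] + 0.3 * q[t - 2] + rng.normal(0, 1)

        X = np.column_stack([q[1:-1], q[:-2]])     # inputs Q(t-1), Q(t-2)
        y = q[2:]                                  # target Q(t)
        X_train, X_test, y_train, y_test = X[:500], X[500:], y[:500], y[500:]

        ar = LinearRegression().fit(X_train, y_train)
        ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X_train, y_train)

        for name, m in [("AR", ar), ("ANN", ann)]:
            print(name, mean_squared_error(y_test, m.predict(X_test)) ** 0.5)   # RMSE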

  16. Parametric output-only identification of time-varying structures using a kernel recursive extended least squares TARMA approach

    NASA Astrophysics Data System (ADS)

    Ma, Zhi-Sai; Liu, Li; Zhou, Si-Da; Yu, Lei; Naets, Frank; Heylen, Ward; Desmet, Wim

    2018-01-01

    The problem of parametric output-only identification of time-varying structures in a recursive manner is considered. A kernelized time-dependent autoregressive moving average (TARMA) model is proposed by expanding the time-varying model parameters onto the basis set of kernel functions in a reproducing kernel Hilbert space. An exponentially weighted kernel recursive extended least squares TARMA identification scheme is proposed, and a sliding-window technique is subsequently applied to fix the computational complexity for each consecutive update, allowing the method to operate online in time-varying environments. The proposed sliding-window exponentially weighted kernel recursive extended least squares TARMA method is employed for the identification of a laboratory time-varying structure consisting of a simply supported beam and a moving mass sliding on it. The proposed method is comparatively assessed against an existing recursive pseudo-linear regression TARMA method via Monte Carlo experiments and shown to be capable of accurately tracking the time-varying dynamics. Furthermore, the comparisons demonstrate the superior achievable accuracy, lower computational complexity and enhanced online identification capability of the proposed kernel recursive extended least squares TARMA approach.
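
    The core of such recursive identification is an exponentially weighted least squares update. The sketch below applies that basic update to a toy time-varying AR(2) system; the regressor vector, forgetting factor, and data are illustrative assumptions, and the kernelized TARMA machinery of the paper is not reproduced.

        # Hedged sketch of exponentially weighted recursive least squares on a drifting AR(2).
        import numpy as np

        def ewrls_step(theta, P, phi, y, lam=0.98):
            """One exponentially weighted recursive least squares update."""
            phi = phi.reshape(-1, 1)
            gain = P @ phi / (lam + (phi.T @ P @ phi).item())
            err = y - (phi.T @ theta).item()           # one-step prediction error
            theta = theta + gain * err                 # parameter update
            P = (P - gain @ phi.T @ P) / lam           # covariance update with forgetting
            return theta, P

        rng = np.random.default_rng(3)
        n, theta_hat, P = 400, np.zeros((2, 1)), 1000.0 * np.eye(2)
        y = np.zeros(n)
        for t in range(2, n):
            a1 = 1.2 - 0.4 * t / n                     # slowly drifting AR coefficient
            y[t] = a1 * y[t - 1] - 0.5 * y[t - 2] + rng.normal(0, 0.1)
            theta_hat, P = ewrls_step(theta_hat, P, np.array([y[t - 1], y[t - 2]]), y[t])
        print(theta_hat.ravel())                       # final estimates of (a1, a2)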

  17. SU-F-P-30: Clinical Assessment of Auto Beam-Hold Triggered by Fiducial Localization During Prostate RapidArc Delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atkinson, P; Chen, Q

    2016-06-15

    Purpose: To assess the clinical efficacy of auto beam hold during prostate RapidArc delivery, triggered by fiducial localization on kV imaging with a Varian True Beam. Methods: Prostate patients with four gold fiducials were candidates in this study. Daily setup was accomplished by aligning to fiducials using orthogonal kV imaging. During RapidArc delivery, a kV image was automatically acquired with a momentary beam hold every 60 degrees of gantry rotation. The position of each fiducial was identified by a search algorithm and compared to a predetermined 1.4 cm diameter target area. Treatment continued if all the fiducials were within the target area. If any fiducial was outside the target area the beam hold was not released, and the operators determined if the patient needed re-alignment using the daily setup method. Results: Four patients were initially selected. For three patients, the auto beam hold performed seamlessly. In one instance, the system correctly identified misaligned fiducials, stopped treatment, and the patient was re-positioned. The fourth patient had a prosthetic hip which sometimes blocked the fiducials and caused the fiducial search algorithm to fail. The auto beam hold was disabled for this patient and the therapists manually monitored the fiducial positions during treatment. Average delivery time for a 2-arc fraction was increased by 59 seconds. Phantom studies indicated the dose discrepancy related to multiple beam holds is <0.1%. For a plan with 43 fractions, the additional imaging increased dose by an estimated 68 cGy. Conclusion: Automated intrafraction kV imaging can effectively perform auto beam holds due to patient movement, with the exception of prosthetic hip patients. The additional imaging dose and delivery time are clinically acceptable. It may be a cost-effective alternative to Calypso in RapidArc prostate patient delivery. Further study is warranted to explore its feasibility under various clinical conditions.

  18. Nutritional Intake and Nutritional Status by the Type of Hematopoietic Stem Cell Transplantation

    PubMed Central

    Lee, Ji Sun; Kim, Jee Yeon

    2012-01-01

    The aim of this study was to investigate the changes in nutritional intake and nutritional status and to analyze the association between them during hematopoietic stem cell transplantation. This was a retrospective cross-sectional study of 36 patients (9 in the autologous transplantation group and 27 in the allogeneic transplantation group) undergoing hematopoietic stem cell transplantation at The Catholic University of Korea, Seoul St. Mary's Hospital from May to August 2010. To assess oral intake and parenteral nutrition intake, a 24-hour recall method and a review of patients' charts were performed. Nutritional status was measured with the scored patient-generated subjective global assessment (PG-SGA). The subjects consisted of 6 (66.7%) males and 3 (33.3%) females in the autologous transplantation group (auto), and 12 (44.4%) males and 15 (55.6%) females in the allogeneic transplantation group (allo). The mean age was 40.9 ± 13.6 years (auto) and 37.8 ± 11.0 years (allo). The average hospitalization period was 25.2 ± 3.5 days (auto) and 31.6 ± 6.6 days (allo), which were significantly different (p < 0.05). Nutritional intake was lowest at Post+1wk in both groups. In addition, calorie intake by oral diet relative to recommended intake at Post+2wk was low (20.8% auto and 20.5% allo), but there were no significant differences in the change of nutritional intake over time (Admission, Pre-1day, Post+1wk, Post+2wk) between the auto group and the allo group by repeated-measures ANOVA. The result of nutritional assessment through PG-SGA was significantly different at Pre-1day only (p < 0.01). There was a significant negative correlation between the nutritional status during Post+2wk and the oral calorie/protein intake relative to the recommended amount measured during Post+1wk and Post+2wk (p < 0.01). These results could be used to establish evidence-based nutritional care guidelines for patients during hematopoietic stem cell transplantation. PMID:23430590

  19. Performance of univariate forecasting on seasonal diseases: the case of tuberculosis.

    PubMed

    Permanasari, Adhistya Erna; Rambli, Dayang Rohaya Awang; Dominic, P Dhanapal Durai

    2011-01-01

    Predicting annual disease incidence worldwide is desirable for setting appropriate policy to prevent disease outbreaks. This chapter considers the performance of different forecasting methods in predicting the future number of disease incidents, especially for seasonal diseases. Six forecasting methods, namely linear regression, moving average, decomposition, Holt-Winter's, ARIMA, and artificial neural network (ANN), were used for disease forecasting on monthly tuberculosis data. The derived models met the requirements of a time series with a seasonal pattern and a downward trend. Forecasting performance was compared using the same error measures on the basis of the forecasts for the last 5 years. The findings indicate that the ARIMA model was the most appropriate, since it obtained a relatively lower error than the other models.
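
    A minimal sketch of fitting a seasonal ARIMA model to a monthly incidence series with statsmodels follows; the (1, 1, 1)x(1, 1, 1, 12) order and the synthetic counts are illustrative choices, not the configuration or data used in the chapter.

        # Hedged sketch: seasonal ARIMA on a monthly series with seasonality and a downward trend.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(4)
        idx = pd.date_range("2000-01", periods=120, freq="MS")
        seasonal = 20 * np.sin(2 * np.pi * np.arange(120) / 12)
        trend = np.linspace(200, 150, 120)                      # downward trend
        cases = pd.Series(trend + seasonal + rng.normal(0, 5, 120), index=idx)

        model = SARIMAX(cases, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12))
        fit = model.fit(disp=False)
        print(fit.forecast(steps=12).round(1))                  # next 12 months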

  20. Evaluation of a commercial automatic treatment planning system for liver stereotactic body radiation therapy treatments.

    PubMed

    Gallio, Elena; Giglioli, Francesca Romana; Girardi, Andrea; Guarneri, Alessia; Ricardi, Umberto; Ropolo, Roberto; Ragona, Riccardo; Fiandra, Christian

    2018-02-01

    Automated treatment planning is a new frontier in radiotherapy. The Auto-Planning module of the Pinnacle 3 treatment planning system (TPS) was evaluated for liver stereotactic body radiation therapy treatments. Ten cases were included in the study. Six plans were generated for each case by four medical physics experts. The first two planned with the Pinnacle TPS, each using both the manual module (MP) and the Auto-Planning module (AP). The other two physicists generated two plans with the Monaco TPS (VM). Treatment plan comparisons were then carried out on the various dosimetric parameters of the target and organs at risk, monitor units, number of segments, plan complexity metrics and human planning time. The user dependency of Auto-Planning was also tested, and the plans were evaluated by a trained physician. Statistically significant differences (ANOVA test) were observed for spinal cord doses, plan average beam irregularity, number of segments, monitor units and human planning time. The Fisher-Hayter test applied to these parameters showed statistically significant differences between AP and MP for spinal cord doses and human planning time; between MP and VM for monitor units, number of segments and plan irregularity; and between AP and VM for all of these parameters. The two plans created by different planners with AP were similar to each other. The plans created with Auto-Planning were comparable to the manually generated plans. The time saved in planning enables the planner to commit more resources to more complex cases. Independence from the planner makes it possible to standardize plan quality. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  1. High average power magnetic modulator for metal vapor lasers

    DOEpatents

    Ball, Don G.; Birx, Daniel L.; Cook, Edward G.; Miller, John L.

    1994-01-01

    A three-stage magnetic modulator utilizing magnetic pulse compression designed to provide a 60 kV pulse to a copper vapor laser at a 4.5 kHz repetition rate is disclosed. This modulator operates at 34 kW input power. The circuit includes a step up auto transformer and utilizes a rod and plate stack construction technique to achieve a high packing factor.

  2. Associations between air pollution and perceived stress: the Veterans Administration Normative Aging Study.

    PubMed

    Mehta, Amar J; Kubzansky, Laura D; Coull, Brent A; Kloog, Itai; Koutrakis, Petros; Sparrow, David; Spiro, Avron; Vokonas, Pantel; Schwartz, Joel

    2015-01-27

    There is mixed evidence suggesting that air pollution may be associated with increased risk of developing psychiatric disorders. We aimed to investigate the association between air pollution and non-specific perceived stress, often a precursor to the development of affective psychiatric disorders. This longitudinal analysis consisted of 987 older men participating in at least one visit for the Veterans Administration Normative Aging Study between 1995 and 2007 (n = 2,244 visits). At each visit, participants were administered the 14-item Perceived Stress Scale (PSS), which quantifies stress experienced in the previous week. Scores ranged from 0-56, with higher scores indicating increased stress. Differences in PSS score per interquartile range increase in moving averages (1, 2, and 4 weeks) of air pollution exposures were estimated using linear mixed-effects regression after adjustment for age, race, education, physical activity, anti-depressant medication use, seasonality, meteorology, and day of week. We also evaluated effect modification by season (April-September and October-March for the warm and cold seasons, respectively). Fine particles (PM2.5), black carbon (BC), nitrogen dioxide, and particle number counts (PNC) at moving averages of 1, 2, and 4 weeks were associated with higher perceived stress ratings. The strongest associations were observed for PNC; for example, a 15,997 counts/cm(3) interquartile range increase in 1-week average PNC was associated with a 3.2 point (95%CI: 2.1-4.3) increase in PSS score. Season modified the associations for specific pollutants; higher PSS scores in association with PM2.5, BC, and sulfate were observed mainly in colder months. Air pollution was associated with higher levels of perceived stress in this sample of older men, particularly in colder months for specific pollutants.
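
    The sketch below shows the general shape of such an analysis: a linear mixed-effects model of PSS score on a 1-week moving-average exposure with a random intercept per participant, scaled to an interquartile-range increase. All variable names and data are illustrative assumptions, not the study's.

        # Hedged sketch of a mixed-effects regression of stress score on an exposure moving average.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)
        n_subj, n_visits = 100, 3
        df = pd.DataFrame({
            "subject": np.repeat(np.arange(n_subj), n_visits),
            "pnc_1wk": rng.normal(20000, 8000, n_subj * n_visits),   # particle number counts (assumed)
            "age": np.repeat(rng.normal(70, 7, n_subj), n_visits),
        })
        df["pss"] = 25 + 0.0002 * df["pnc_1wk"] + 0.05 * df["age"] + rng.normal(0, 4, len(df))

        model = smf.mixedlm("pss ~ pnc_1wk + age", df, groups=df["subject"])  # random intercept per subject
        result = model.fit()
        iqr = df["pnc_1wk"].quantile(0.75) - df["pnc_1wk"].quantile(0.25)
        print(result.params["pnc_1wk"] * iqr)   # change in PSS per IQR increase in exposure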

  3. Potential of pedestrian protection systems--a parameter study using finite element models of pedestrian dummy and generic passenger vehicles.

    PubMed

    Fredriksson, Rikard; Shin, Jaeho; Untaroiu, Costin D

    2011-08-01

    To study the potential of active, passive, and integrated (combined active and passive) safety systems in reducing pedestrian upper body loading in typical impact configurations. Finite element simulations using models of generic sedan car fronts and the Polar II pedestrian dummy were performed for 3 impact configurations at 2 impact speeds. Chest contact force, head injury criterion (HIC(15)), head angular acceleration, and the cumulative strain damage measure (CSDM(0.25)) were employed as injury parameters. Further, 3 countermeasures were modeled: an active autonomous braking system, a passive deployable countermeasure, and an integrated system combining the active and passive systems. The auto-brake system was modeled by reducing impact speed by 10 km/h (equivalent to ideal full braking over 0.3 s) and introducing a pitch of 1 degree and in-crash deceleration of 1 g. The deployable system consisted of a deployable hood, lifting 100 mm in the rear, and a lower windshield air bag. All 3 countermeasures showed benefit in a majority of impact configurations in terms of injury prevention. The auto-brake system reduced chest force in a majority of the configurations and decreased HIC(15), head angular acceleration, and CSDM in all configurations. Averaging all impact configurations, the auto-brake system showed reductions of injury predictors from 20 percent (chest force) to 82 percent (HIC). The passive deployable countermeasure reduced chest force and HIC(15) in a majority of configurations and head angular acceleration and CSDM in all configurations, although the CSDM decrease in 2 configurations was minimal. On average a reduction from 20 percent (CSDM) to 58 percent (HIC) was recorded in the passive deployable countermeasures. Finally, the integrated system evaluated in this study reduced all injury assessment parameters in all configurations compared to the reference situations. The average reductions achieved by the integrated system ranged from 56 percent (CSDM) to 85 percent (HIC). Both the active (autonomous braking) and passive deployable system studied had a potential to decrease pedestrian upper body loading. An integrated pedestrian safety system combining the active and passive systems increased the potential of the individual systems in reducing pedestrian head and chest loading.

  4. Time series regression model for infectious disease and weather.

    PubMed

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
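
    A minimal sketch of one of the modifications discussed above, assuming synthetic data, is given below: a quasi-Poisson GLM of daily counts on temperature, the logarithm of lagged cases, and simple seasonal terms, with the dispersion estimated from the Pearson chi-square.

        # Hedged sketch: quasi-Poisson time series regression with a lagged-log-cases term.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(6)
        n = 365
        t = np.arange(n)
        temp = 20 + 8 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, n)
        cases = np.maximum(1, np.round(30 + 0.8 * temp + rng.normal(0, 6, n))).astype(int)
        df = pd.DataFrame({"cases": cases, "temp": temp, "doy": t % 365})
        df["log_lag_cases"] = np.log(df["cases"].shift(1))   # proxy for contagion-driven autocorrelation
        df = df.dropna()

        # Quasi-Poisson: Poisson family with the dispersion (scale) estimated from the data.
        model = smf.glm("cases ~ temp + log_lag_cases + np.cos(2*np.pi*doy/365) + np.sin(2*np.pi*doy/365)",
                        data=df, family=sm.families.Poisson())
        result = model.fit(scale="X2")                        # Pearson chi-square scale estimate
        print(result.summary().tables[1])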

  5. Aggregating the response in time series regression models, applied to weather-related cardiovascular mortality.

    PubMed

    Masselot, Pierre; Chebana, Fateh; Bélanger, Diane; St-Hilaire, André; Abdous, Belkacem; Gosselin, Pierre; Ouarda, Taha B M J

    2018-07-01

    In environmental epidemiology studies, health response data (e.g. hospitalization or mortality) are often noisy because of hospital organization and other social factors. The noise in the data can hide the true signal related to the exposure. The signal can be unveiled by performing a temporal aggregation on health data and then using it as the response in regression analysis. From aggregated series, a general methodology is introduced to account for the particularities of an aggregated response in a regression setting. This methodology can be used with usually applied regression models in weather-related health studies, such as generalized additive models (GAM) and distributed lag nonlinear models (DLNM). In particular, the residuals are modelled using an autoregressive-moving average (ARMA) model to account for the temporal dependence. The proposed methodology is illustrated by modelling the influence of temperature on cardiovascular mortality in Canada. A comparison with classical DLNMs is provided and several aggregation methods are compared. Results show that there is an increase in the fit quality when the response is aggregated, and that the estimated relationship focuses more on the outcome over several days than the classical DLNM. More precisely, among various investigated aggregation schemes, it was found that an aggregation with an asymmetric Epanechnikov kernel is more suited for studying the temperature-mortality relationship. Copyright © 2018. Published by Elsevier B.V.

  6. Reconstructing latent dynamical noise for better forecasting observables

    NASA Astrophysics Data System (ADS)

    Hirata, Yoshito

    2018-03-01

    I propose a method for reconstructing multi-dimensional dynamical noise inspired by the embedding theorem of Muldoon et al. [Dyn. Stab. Syst. 13, 175 (1998)] by regarding multiple predictions as different observables. Then, applying the embedding theorem by Stark et al. [J. Nonlinear Sci. 13, 519 (2003)] for a forced system, I produce time series forecasts by supplying the reconstructed past dynamical noise as auxiliary information. I demonstrate the proposed method on toy models driven by auto-regressive models or independent Gaussian noise.

  7. The Use of Shrinkage Techniques in the Estimation of Attrition Rates for Large Scale Manpower Models

    DTIC Science & Technology

    1988-07-27

    auto regressive model combined with a linear program that solves for the coefficients using MAD. But this success has diminished with time (Rowe... 'Harrison-Stevens Forecasting and the Multiprocess Dynamic Linear Model', The American Statistician, v. 40, pp. 129-135, 1986. 8. Box, G. E. P. and... 1950. 40. McCullagh, P. and Nelder, J., Generalized Linear Models, Chapman and Hall, 1983. 41. McKenzie, E., General Exponential Smoothing and the

  8. Sol-gel auto-combustion synthesis and properties of Co2Z-type hexagonal ferrite ultrafine powders

    NASA Astrophysics Data System (ADS)

    Liu, Junliang; Yang, Min; Wang, Shengyun; Lv, Jingqing; Li, Yuqing; Zhang, Ming

    2018-05-01

    Z-type hexagonal ferrite ultrafine powders with chemical formulations of (BaxSr1-x)3Co2Fe24O41 (x varied from 0.0 to 1.0) have been synthesized by a sol-gel auto-combustion technique. The average particle sizes of the synthesized powders ranged from 2 to 5 μm. The partial substitution of Ba2+ by Sr2+ led to shrinkage of the crystal lattice and resulted in changes in the magnetic sub-lattices, which tailored the static and dynamic magnetic properties of the as-synthesized powders. As the substitution ratio of Ba2+ by Sr2+ increased, the saturation magnetization of the synthesized powders increased almost consistently from 43.3 to 56.1 emu/g, while the real part of the permeability approached a relatively high value of about 2.2 owing to the balance of the saturation magnetization and the magnetic anisotropy field.

  9. Voice Based City Panic Button System

    NASA Astrophysics Data System (ADS)

    Febriansyah; Zainuddin, Zahir; Bachtiar Nappu, M.

    2018-03-01

    The voice-activated panic button application was developed to provide faster early notification of hazardous conditions in the community to the nearest police by using speech as the detector; current applications still rely on touch combinations on the screen and on coordination of orders from a control center, so early notification takes longer. The methods used in this research were voice recognition to detect the user's voice and the haversine formula to compare distances and find the closest police officer to the user. The application also included auto SMS, which sent notifications to the victim's relatives, and was integrated with the Google Maps application (GMaps) to map the route to the victim's location. The results show that voice registration on the application reaches 100%, incident detection using speech recognition while the application is running averages 94.67%, and auto SMS to the victim's relatives reaches 100%.
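
    The haversine distance mentioned above can be computed as in the following sketch; the coordinates are made-up examples.

        # Hedged sketch of the haversine great-circle distance and nearest-officer lookup.
        from math import asin, cos, radians, sin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2, earth_radius_km=6371.0):
            """Great-circle distance between two (lat, lon) points in kilometres."""
            phi1, phi2 = radians(lat1), radians(lat2)
            dphi = radians(lat2 - lat1)
            dlmb = radians(lon2 - lon1)
            a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
            return 2 * earth_radius_km * asin(sqrt(a))

        user = (-5.135, 119.423)                                       # hypothetical victim location
        police = {"officer_A": (-5.140, 119.430), "officer_B": (-5.120, 119.400)}
        nearest = min(police, key=lambda k: haversine_km(*user, *police[k]))
        print(nearest, round(haversine_km(*user, *police[nearest]), 2), "km")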

  10. Interpreting Bivariate Regression Coefficients: Going beyond the Average

    ERIC Educational Resources Information Center

    Halcoussis, Dennis; Phillips, G. Michael

    2010-01-01

    Statistics, econometrics, investment analysis, and data analysis classes often review the calculation of several types of averages, including the arithmetic mean, geometric mean, harmonic mean, and various weighted averages. This note shows how each of these can be computed using a basic regression framework. By recognizing when a regression model…
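
    One simple instance of the idea is that intercept-only regressions recover several of these averages, as in the sketch below; this is an illustrative example and not the note's own derivation.

        # Hedged sketch: averages recovered from intercept-only regressions.
        import numpy as np
        import statsmodels.api as sm

        x = np.array([2.0, 4.0, 8.0, 16.0])
        w = np.array([1.0, 2.0, 3.0, 4.0])
        ones = np.ones_like(x)

        arith = sm.OLS(x, ones).fit().params[0]                  # arithmetic mean
        weighted = sm.WLS(x, ones, weights=w).fit().params[0]    # weighted average
        geom = np.exp(sm.OLS(np.log(x), ones).fit().params[0])   # geometric mean

        print(arith, weighted, geom)
        print(x.mean(), np.average(x, weights=w), x.prod() ** (1 / len(x)))  # direct checks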

  11. Sol–gel auto-combustion synthesis of PVP/CoFe{sub 2}O{sub 4} nanocomposite and its magnetic characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtan, U.; Topkaya, R., E-mail: rtopkaya@gyte.edu.tr; Baykal, A.

    2013-11-15

    Highlights: • The Poly(vinyl pyrrolidone) (PVP) was used as a surface capping agent. • PVP/CoFe{sub 2}O{sub 4} nanocomposite was synthesized by a sol-gel auto-combustion method. • The existence of the spin-disordered surface layer was established. Abstract: Poly(vinyl pyrrolidone)/CoFe{sub 2}O{sub 4} nanocomposite has been fabricated by a sol–gel auto-combustion method. Poly(vinyl pyrrolidone) was used as a reducing agent as well as a surface capping agent to prevent particle aggregation and stabilize the particles. The average crystallite size estimated from X-ray line profile fitting was found to be 20 ± 7 nm. The high field irreversibility and unsaturated magnetization behaviours indicate the presence of the core–shell structure in the sample. The exchange bias effect observed at 10 K suggests the existence of the magnetically aligned core surrounded by spin-disordered surface layer. The reduced remanent magnetization value of 0.6 at 10 K (higher than the theoretical value of 0.5) shows the PVP/CoFe{sub 2}O{sub 4} nanocomposite to have cubic magnetocrystalline anisotropy according to the Stoner–Wohlfarth model.

  12. Tree-ring-based estimates of long-term seasonal precipitation in the Souris River Region of Saskatchewan, North Dakota and Manitoba

    USGS Publications Warehouse

    Ryberg, Karen R.; Vecchia, Aldo V.; Akyüz, F. Adnan; Lin, Wei

    2016-01-01

    Historically unprecedented flooding occurred in the Souris River Basin of Saskatchewan, North Dakota and Manitoba in 2011, during a longer term period of wet conditions in the basin. In order to develop a model of future flows, there is a need to evaluate effects of past multidecadal climate variability and/or possible climate change on precipitation. In this study, tree-ring chronologies and historical precipitation data in a four-degree buffer around the Souris River Basin were analyzed to develop regression models that can be used for predicting long-term variations of precipitation. To focus on longer term variability, 12-year moving average precipitation was modeled in five subregions (determined through cluster analysis of measures of precipitation) of the study area over three seasons (November–February, March–June and July–October). The models used multiresolution decomposition (an additive decomposition based on powers of two using a discrete wavelet transform) of tree-ring chronologies from Canada and the US and seasonal 12-year moving average precipitation based on Adjusted and Homogenized Canadian Climate Data and US Historical Climatology Network data. Results show that precipitation varies on long-term (multidecadal) time scales of 16, 32 and 64 years. Past extended pluvial and drought events, which can vary greatly with season and subregion, were highlighted by the models. Results suggest that the recent wet period may be a part of natural variability on a very long time scale.
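
    As a rough illustration of the two main ingredients described above, the sketch below computes a 12-year moving average of a synthetic seasonal precipitation series and one additive wavelet component of a synthetic tree-ring chronology using PyWavelets; the series, wavelet, and decomposition level are assumptions, not the study's data or settings.

        # Hedged sketch: 12-year moving average and a discrete-wavelet multiresolution component.
        import numpy as np
        import pandas as pd
        import pywt

        rng = np.random.default_rng(7)
        years = np.arange(1800, 2011)
        precip = pd.Series(400 + 60 * np.sin(2 * np.pi * years / 64) + rng.normal(0, 30, len(years)),
                           index=years)
        precip_12yr = precip.rolling(12, center=True).mean()      # 12-year moving average

        chronology = 1.0 + 0.2 * np.sin(2 * np.pi * years / 32) + rng.normal(0, 0.1, len(years))
        coeffs = pywt.wavedec(chronology, "db4", level=4)          # [cA4, cD4, cD3, cD2, cD1]
        # Rebuild one additive component (the level-4 detail) by zeroing the other levels.
        only_d4 = [c if i == 1 else np.zeros_like(c) for i, c in enumerate(coeffs)]
        component = pywt.waverec(only_d4, "db4")[: len(years)]
        print(precip_12yr.dropna().head(), component[:5])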

  13. Tandem Autologous versus Single Autologous Transplantation Followed by Allogeneic Hematopoietic Cell Transplantation for Patients with Multiple Myeloma: Results from the Blood and Marrow Transplant Clinical Trials Network (BMT CTN) 0102 Trial

    PubMed Central

    Krishnan, Amrita; Pasquini, Marcelo C.; Logan, Brent; Stadtmauer, Edward A.; Vesole, David H.; Alyea, Edwin; Antin, Joseph H.; Comenzo, Raymond; Goodman, Stacey; Hari, Parameswaran; Laport, Ginna; Qazilbash, Muzaffar H.; Rowley, Scott; Sahebi, Firoozeh; Somlo, George; Vogl, Dan T.; Weisdorf, Daniel; Ewell, Marian; Wu, Juan; Geller, Nancy L.; Horowitz, Mary M.; Giralt, Sergio; Maloney, David G.

    2012-01-01

    Background Autologous hematopoietic cell transplantation (HCT) improves survival in patients with multiple myeloma, but disease progression remains a challenge. Allogeneic HCT (alloHCT) has the potential to reduce disease progression through graft-versus-myeloma effects. The aim of the BMT CTN 0102 trial was to compare outcomes of autologous HCT (autoHCT) followed by alloHCT with non-myeloablative conditioning (auto-allo) to tandem autoHCT (auto-auto) in patients with standard risk myeloma. Patients in the auto-auto arm were randomized to one year of thalidomide and dexamethasone (Thal-Dex) maintenance therapy or observation (Obs). Methods Patients with multiple myeloma within 10 months from initiation of induction therapy were classified as standard (SRD) or high risk (HRD) disease based on cytogenetics and beta-2-microglobulin levels. Assignment to auto-allo HCT was based on availability of an HLA-matched sibling donor. The primary endpoint was three-year progression-free survival (PFS) according to intent-to-treat analysis. Results 710 patients were enrolled and completed a minimum of 3 years of follow-up. Among 625 SRD patients, 189 and 436 were assigned to auto-allo and auto-auto, respectively. Seventeen percent (33/189) of SR patients in the auto-allo arm and 16% (70/436) in the auto-auto arm did not receive a second transplant. Thal-Dex was not completed in 77% (168/217) of assigned patients. PFS and overall survival (OS) did not differ between the Thal-Dex (49%, 80%) and Obs (41%, 81%) cohorts, and these two arms were pooled for analysis. Three-year PFS was 43% and 46% (p=0·671) and three-year OS was 77% and 80% (p=0·191) with auto-allo and auto-auto, respectively. Corresponding progression/relapse rates were 46% and 50% (p=0·402); treatment-related mortality rates were 11% and 4% (p<0·001), respectively. Auto/allo patients with chronic graft-vs-host disease had a decreased risk of relapse. The most common grade 3 to 5 adverse event in the auto-allo arm was hyperbilirubinemia (21/189), and in the auto-auto arm it was peripheral neuropathy (52/436). Among 85 HRD patients (37 auto-allo), three-year PFS was 40% and 33% (p=0·743) and three-year OS was 59% and 67% (p=0·460) with auto-allo and auto-auto, respectively. Conclusion Thal-Dex maintenance was associated with poor compliance and did not improve PFS or OS. At three years there was no improvement in PFS or OS with auto-allo compared to auto-auto transplantation in patients with standard risk myeloma. Decisions to proceed with alloHCT after an autoHCT in patients with standard risk myeloma should take into consideration the results of the current trial. Future investigation of alloHCT in myeloma should focus on minimizing TRM and maximizing graft-versus-myeloma effects. This trial was registered in Clinicaltrials.gov (NCT00075829) and was funded by the National Heart, Lung and Blood Institute and National Cancer Institute. PMID:21962393

  14. Annual forest inventory estimates based on the moving average

    Treesearch

    Francis A. Roesch; James R. Steinman; Michael T. Thompson

    2002-01-01

    Three interpretations of the simple moving average estimator, as applied to the USDA Forest Service's annual forest inventory design, are presented. A corresponding approach to composite estimation over arbitrarily defined land areas and time intervals is given for each interpretation, under the assumption that the investigator is armed with only the spatial/...

  15. 78 FR 26879 - Medicare Program; Inpatient Rehabilitation Facility Prospective Payment System for Federal Fiscal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-08

    ...: Centers for Medicare & Medicaid Services (CMS), HHS. ACTION: Proposed rule. SUMMARY: This proposed rule..., especially the teaching status adjustment factor. Therefore, we implemented a 3-year moving average approach... moving average to calculate the facility-level adjustment factors. For FY 2011, we issued a notice to...

  16. Vehicle Integrated Photovoltaics for Compression Ignition Vehicles: An Experimental Investigation of Solar Alkaline Water Electrolysis for Improving Diesel Combustion and a Solar Charging System for Reducing Auxiliary Engine Loads

    NASA Astrophysics Data System (ADS)

    Negroni, Garry Inocentes

    Vehicle-integrated photovoltaic electricity can be applied towards aspiration of hydrogen-oxygen-steam gas produced through alkaline electrolysis and towards reductions in auxiliary alternator load, both for reducing hydrocarbon emissions in low nitrogen oxide indirect-injection compression-ignition engines. Aspiration of 0.516 ± 0.007 liters per minute of gas produced through alkaline electrolysis of potassium hydroxide 2 wt.% improves full-load performance; however, part-load performance decreases due to auto-ignition of the aspirated gas prior to top dead center. Alternator load reductions offer improved part-load and full-load performance, with practical limitations resulting from accessory electrical loads. In an additive approach, solar electrolysis can electrochemically convert solar photovoltaic electricity into a gas composed of stoichiometric hydrogen and oxygen. Aspiration of this hydrogen-oxygen gas enhances combustion properties, decreasing emissions and increasing combustion efficiency in light-duty diesel vehicles. The 316L stainless steel (SS) electrolyser plates are arranged with two anodes and three cathodes spaced with four bipolar plates, delineating four stacks in parallel with five cells per stack. The electrolyser was tested using potassium hydroxide 2 wt.% and hydronium 3 wt.% at measured voltage and current inputs. The flow rate output from the reservoir cell was measured in parallel with the voltage and current inputs, producing a regression model correlating current input to flow rate. KOH 2 wt.% produced 0.005 LPM/W, while H9O44 3 wt.% produced less at 0.00126 LPM/W. In a subtractive approach, solar energy can be used to charge a larger energy storage device, as with plug-in electric vehicles, in order to relieve the engine of the mechanical load placed upon it through the alternator by the vehicle's electrical accessories. Solar electrolysis can improve part-load emissions and full-load performance. The average solar-to-battery efficiency based on the OEM rated efficiency was 11.4%. The average voltage efficiency of the electrolyser during dynamometer testing was 69.16%, producing a solar-to-electrolysis efficiency of 7.88%. HC emissions decreased by an average of 54.4% at multiple engine speeds at part load, while CO2 increased by 2.54% due to oxygen enrichment of the intake air. However, the auto-ignition of a small amount of hydrogen (0.0035% of diesel fuel energy) had a negative impact on part-load power (-3.671%) and torque (-3.296%). Full-load sweep testing showed an increase in peak power (1.562%) and peak torque (2.608%). Solar electrolysis gas aspiration reduced soot opacity by 31.5%. The alternator-less part-load step tests show that HC and CO2 emissions decreased on average by 25.05% and 1.14%, respectively. The tests also indicate an increase in average part-load power (1.57%) and torque (2.12%). Alternator-less operation can reduce soot opacity by 56.76%. Full-load testing of the vehicle with the alternator unplugged indicates that the alternator load upon the engine increases with engine speed even with no electrical load and no pilot excitation. The performance and emissions improvements from eliminating alternator load should be considered; however, practical limitations exist in winter-night and summer-midday scenarios and for longer durations of operation.

  17. Comparison of modeling methods to predict the spatial distribution of deep-sea coral and sponge in the Gulf of Alaska

    NASA Astrophysics Data System (ADS)

    Rooper, Christopher N.; Zimmermann, Mark; Prescott, Megan M.

    2017-08-01

    Deep-sea coral and sponge ecosystems are widespread throughout most of Alaska's marine waters, and are associated with many different species of fishes and invertebrates. These ecosystems are vulnerable to the effects of commercial fishing activities and climate change. We compared four commonly used species distribution models (general linear models, generalized additive models, boosted regression trees and random forest models) and an ensemble model to predict the presence or absence and abundance of six groups of benthic invertebrate taxa in the Gulf of Alaska. All four model types performed adequately on training data for predicting presence and absence, with random forest models having the best overall performance measured by the area under the receiver-operating-curve (AUC). The models also performed well on the test data for presence and absence with average AUCs ranging from 0.66 to 0.82. For the test data, ensemble models performed the best. For abundance data, there was an obvious demarcation in performance between the two regression-based methods (general linear models and generalized additive models) and the tree-based models. The boosted regression tree and random forest models out-performed the other models by a wide margin on both the training and testing data. However, there was a significant drop-off in performance for all models of invertebrate abundance (approximately 50%) when moving from the training data to the testing data. Ensemble model performance was between the tree-based and regression-based methods. The maps of predictions from the models for both presence and abundance agreed very well across model types, with an increase in variability in predictions for the abundance data. We conclude that where data conform well to the modeled distribution (such as the presence-absence data and binomial distribution in this study), the four types of models will provide similar results, although the regression-type models may be more consistent with biological theory. For data with highly zero-inflated and non-normal distributions, such as the abundance data from this study, the tree-based methods performed better. Ensemble models that averaged predictions across the four model types performed better than the GLM or GAM models but slightly poorer than the tree-based methods, suggesting ensemble models might be more robust to overfitting than tree methods, while mitigating some of the disadvantages in predictive performance of regression methods.
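
    A minimal sketch of this kind of comparison on synthetic presence/absence data follows, using scikit-learn models scored by AUC and a simple averaged ensemble; a GAM is omitted for brevity, and nothing here reproduces the Gulf of Alaska analysis.

        # Hedged sketch: GLM vs. boosted trees vs. random forest vs. an averaged ensemble, scored by AUC.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        models = {
            "GLM": LogisticRegression(max_iter=1000),
            "BRT": GradientBoostingClassifier(random_state=0),
            "RF": RandomForestClassifier(n_estimators=300, random_state=0),
        }
        probs = {}
        for name, model in models.items():
            probs[name] = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
            print(name, round(roc_auc_score(y_te, probs[name]), 3))

        ensemble = np.mean(list(probs.values()), axis=0)          # average of the predictions
        print("Ensemble", round(roc_auc_score(y_te, ensemble), 3))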

  18. Spread Spectrum Signal Characteristic Estimation Using Exponential Averaging and an AD-HOC Chip rate Estimator

    DTIC Science & Technology

    2007-03-01

    Quadrature QPSK Quadrature Phase-Shift Keying RV Random Variable SHAC Single-Hop-Observation Auto-Correlation SINR Signal-to-Interference...The fast Fourier transform (FFT) accumulation method and the strip spectral correlation algorithm subdivide the support region in the bi-frequency...diamond shapes, while the strip spectral correlation algorithm subdivides the region into strips. Each strip covers a number of the FFT accumulation

  19. Femoral articular shape and geometry. A three-dimensional computerized analysis of the knee.

    PubMed

    Siu, D; Rudan, J; Wevers, H W; Griffiths, P

    1996-02-01

    An average, three-dimensional anatomic shape and geometry of the distal femur were generated from x-ray computed tomography data of five fresh asymptomatic cadaver knees using AutoCAD (AutoDesk, Sausalito, CA), a computer-aided design and drafting software. Each femur model was graphically repositioned to a standardized orientation using a series of alignment templates and scaled to a nominal size of 85 mm in mediolateral and 73 mm in anteroposterior dimensions. An average generic shape of the distal femur was synthesized by combining these pseudosolid models and reslicing the composite structure at different elevations using clipping and smoothing techniques in interactive computer graphics. The resulting distal femoral geometry was imported into a computer-aided manufacturing system, and anatomic prototypes of the distal femur were produced. Quantitative geometric analyses of the generic femur in the coronal and transverse planes revealed definite condylar camber (3 degrees-6 degrees) and toe-in (8 degrees-10 degrees) with an oblique patellofemoral groove (15 degrees) with respect to the mechanical axis of the femur. In the sagittal plane, each condyle could be approximated by three concatenated circular arcs (anterior, distal, and posterior) with slope continuity and a single arc for the patellofemoral groove. The results of this study may have important implications in future femoral prosthesis design and clinical applications.

  20. Auto-ignitions of a methane/air mixture at high and intermediate temperatures

    NASA Astrophysics Data System (ADS)

    Leschevich, V. V.; Martynenko, V. V.; Penyazkov, O. G.; Sevrouk, K. L.; Shabunya, S. I.

    2016-09-01

    A rapid compression machine (RCM) and a shock tube (ST) have been employed to study ignition delay times of homogeneous methane/air mixtures at intermediate-to-high temperatures. Both facilities allow measurements to be made at temperatures of 900-2000 K, at pressures of 0.38-2.23 MPa, and at equivalence ratios of 0.5, 1.0, and 2.0. In ST experiments, nitrogen served as a diluent gas, whereas in RCM runs the diluent gas composition ranged from pure nitrogen to pure argon. Recording pressure, UV, and visible emissions identified the evolution of chemical reactions. Correlations of ignition delay time were generated from the data for each facility. At temperatures below 1300 K, a significant reduction of average activation energy from 53 to 15.3 kcal/mol was obtained. Moreover, the RCM data showed significant scatter that dramatically increased with decreasing temperature. An explanation for the abnormal scatter in the data was proposed based on the high-speed visualization of auto-ignition phenomena and experiments performed with oxygen-free and fuel-free mixtures. It is proposed that the main reason for such a significant reduction of average activation energy is attributable to the premature ignition of ultrafine particles in the reactive mixture.

  1. User Interactive Software for Analysis of Human Physiological Data

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.; Toscano, William; Taylor, Bruce C.; Acharya, Soumydipta

    2006-01-01

    Ambulatory physiological monitoring has been used to study human health and performance in space and in a variety of Earth-based environments (e.g., military aircraft, armored vehicles, small groups in isolation, and patients). Large, multi-channel data files are typically recorded in these environments, and these files often require the removal of contaminated data prior to processing and analyses. Physiological data processing can now be performed with user-friendly, interactive software developed by the Ames Psychophysiology Research Laboratory. This software, which runs on a Windows platform, contains various signal-processing routines for both time- and frequency-domain data analyses (e.g., peak detection, differentiation and integration, digital filtering, adaptive thresholds, Fast Fourier Transform power spectrum, auto-correlation, etc.). Data acquired with any ambulatory monitoring system that provides a text or binary file format are easily imported to the processing software. The application provides a graphical user interface where one can manually select and correct data artifacts utilizing linear and zero interpolation and adding trigger points for missed peaks. Block and moving average routines are also provided for data reduction. Processed data in numeric and graphic format can be exported to Excel. This software, PostProc (for post-processing), requires the Dadisp engineering spreadsheet (DSP Development Corp), or equivalent, for implementation. Specific processing routines were written for electrocardiography, electroencephalography, electromyography, blood pressure, skin conductance level, impedance cardiography (cardiac output, stroke volume, thoracic fluid volume), temperature, and respiration.
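
    Two of the routines named above, a moving-average reduction and a power-spectrum estimate, can be sketched as follows on a synthetic heart-rate-like signal; this is only an illustration and not the PostProc/Dadisp implementation.

        # Hedged sketch: moving-average data reduction and a Welch power spectrum (SciPy).
        import numpy as np
        from scipy.signal import welch

        fs = 4.0                                    # samples per second (assumed)
        t = np.arange(0, 300, 1 / fs)
        rng = np.random.default_rng(8)
        hr = 70 + 3 * np.sin(2 * np.pi * 0.1 * t) + rng.normal(0, 1, t.size)   # 0.1 Hz oscillation

        window = int(10 * fs)                       # 10-second moving average for data reduction
        moving_avg = np.convolve(hr, np.ones(window) / window, mode="valid")

        freqs, power = welch(hr - hr.mean(), fs=fs, nperseg=512)
        print(moving_avg[:3], round(freqs[np.argmax(power)], 2), "Hz")   # dominant frequency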

  2. Correcting for day of the week and public holiday effects: improving a national daily syndromic surveillance service for detecting public health threats.

    PubMed

    Buckingham-Jeffery, Elizabeth; Morbey, Roger; House, Thomas; Elliot, Alex J; Harcourt, Sally; Smith, Gillian E

    2017-05-19

    As service provision and patient behaviour varies by day, healthcare data used for public health surveillance can exhibit large day of the week effects. These regular effects are further complicated by the impact of public holidays. Real-time syndromic surveillance requires the daily analysis of a range of healthcare data sources, including family doctor consultations (called general practitioners, or GPs, in the UK). Failure to adjust for such reporting biases during analysis of syndromic GP surveillance data could lead to misinterpretations including false alarms or delays in the detection of outbreaks. The simplest smoothing method to remove a day of the week effect from daily time series data is a 7-day moving average. Public Health England developed the working day moving average in an attempt also to remove public holiday effects from daily GP data. However, neither of these methods adequately account for the combination of day of the week and public holiday effects. The extended working day moving average was developed. This is a further data-driven method for adding a smooth trend curve to a time series graph of daily healthcare data, that aims to take both public holiday and day of the week effects into account. It is based on the assumption that the number of people seeking healthcare services is a combination of illness levels/severity and the ability or desire of patients to seek healthcare each day. The extended working day moving average was compared to the seven-day and working day moving averages through application to data from two syndromic indicators from the GP in-hours syndromic surveillance system managed by Public Health England. The extended working day moving average successfully smoothed the syndromic healthcare data by taking into account the combined day of the week and public holiday effects. In comparison, the seven-day and working day moving averages were unable to account for all these effects, which led to misleading smoothing curves. The results from this study make it possible to identify trends and unusual activity in syndromic surveillance data from GP services in real-time independently of the effects caused by day of the week and public holidays, thereby improving the public health action resulting from the analysis of these data.
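
    The baseline 7-day centered moving average is straightforward to compute, as the sketch below shows on synthetic daily counts with a weekend dip; the extended working day moving average itself is not reproduced here.

        # Hedged sketch: 7-day centered moving average of daily syndromic counts.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(9)
        days = pd.date_range("2016-01-01", periods=120, freq="D")
        weekday_effect = np.where(days.dayofweek < 5, 1.0, 0.4)   # fewer GP contacts at weekends
        counts = pd.Series(np.round(200 * weekday_effect + rng.normal(0, 10, len(days))), index=days)

        smooth_7day = counts.rolling(7, center=True).mean()        # removes the day-of-week cycle
        print(smooth_7day.dropna().head())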

  3. Moving in the Right Direction: Helping Children Cope with a Relocation

    ERIC Educational Resources Information Center

    Kruse, Tricia

    2012-01-01

    According to national figures, 37.1 million people moved in 2009 (U.S. Census Bureau, 2010). In fact, the average American will move 11.7 times in their lifetime. Why are Americans moving so much? There are a variety of reasons. Regardless of the reason, moving is a common experience for children. If one looks at the developmental characteristics…

  4. What is new about covered interest parity condition in the European Union? Evidence from fractal cross-correlation regressions

    NASA Astrophysics Data System (ADS)

    Ferreira, Paulo; Kristoufek, Ladislav

    2017-11-01

    We analyse the covered interest parity (CIP) using two novel regression frameworks based on cross-correlation analysis (detrended cross-correlation analysis and detrending moving-average cross-correlation analysis), which allow the relationships to be studied at different scales and work well under non-stationarity and heavy tails. CIP is a measure of capital mobility commonly used to analyse financial integration, which remains an interesting feature to study in the context of the European Union. The importance of this feature is related to the fact that the adoption of a common currency is associated with some benefits for countries, but also involves some risks, such as the loss of economic instruments to face possible asymmetric shocks. While studying the Eurozone members could explain some problems in the common currency, studying the non-Euro countries is important to analyse whether they are fit to reap the possible benefits. Our results point to verification of CIP mainly in the Central European countries, while in the remaining countries the parity is verified only residually.

  5. Spatio-temporal prediction of daily temperatures using time-series of MODIS LST images

    NASA Astrophysics Data System (ADS)

    Hengl, Tomislav; Heuvelink, Gerard B. M.; Perčec Tadić, Melita; Pebesma, Edzer J.

    2012-01-01

    A computational framework to generate daily temperature maps using time-series of publicly available MODIS MOD11A2 product Land Surface Temperature (LST) images (1 km resolution; 8-day composites) is illustrated using temperature measurements from the national network of meteorological stations (159) in Croatia. The input data set contains 57,282 ground measurements of daily temperature for the year 2008. Temperature was modeled as a function of latitude, longitude, distance from the sea, elevation, time, insolation, and the MODIS LST images. The original rasters were first converted to principal components to reduce noise and filter missing pixels in the LST images. The residuals were next analyzed for spatio-temporal auto-correlation; sum-metric separable variograms were fitted to account for zonal and geometric space-time anisotropy. The final predictions were generated for time-slices of a 3D space-time cube, constructed in the R environment for statistical computing. The results show that the space-time regression model can explain a significant part of the variation in station-data (84%). MODIS LST 8-day (cloud-free) images are an unbiased estimator of the daily temperature, but with relatively low precision (±4.1°C); however, their added value is that they systematically improve detection of local changes in land surface temperature due to local meteorological conditions and/or active heat sources (urban areas, land cover classes). The results of 10-fold cross-validation show that use of spatio-temporal regression-kriging and incorporation of time-series of remote sensing images leads to significantly more accurate maps of temperature than if plain spatial techniques were used. The average (global) accuracy of mapping temperature was ±2.4°C. The regression-kriging explained 91% of variability in daily temperatures, compared to 44% for ordinary kriging. Further software advancements (interactive space-time variogram exploration and automated retrieval, resampling, and filtering of MODIS images) are anticipated.

  6. A comparison of several techniques for imputing tree level data

    Treesearch

    David Gartner

    2002-01-01

    As Forest Inventory and Analysis (FIA) changes from periodic surveys to the multipanel annual survey, new analytical methods become available. The current official statistic is the moving average. One alternative is an updated moving average. Several methods of updating plot per acre volume have been discussed previously. However, these methods may not be appropriate...

  7. The Effect of Non-Normal Distributions on the Integrated Moving Average Model of Time-Series Analysis.

    ERIC Educational Resources Information Center

    Doerann-George, Judith

    The Integrated Moving Average (IMA) model of time series, and the analysis of intervention effects based on it, assume random shocks which are normally distributed. To determine the robustness of the analysis to violations of this assumption, empirical sampling methods were employed. Samples were generated from three populations: normal,…

  8. Implementation hurdles of an interactive, integrated, point-of-care computerised decision support system for hospital antibiotic prescription.

    PubMed

    Chow, A L; Ang, A; Chow, C Z; Ng, T M; Teng, C; Ling, L M; Ang, B S; Lye, D C

    2016-02-01

    Antimicrobial stewardship is used to combat antimicrobial resistance. In Singapore, a tertiary hospital has integrated a computerised decision support system, called Antibiotic Resistance Utilisation and Surveillance-Control (ARUSC), into the electronic inpatient prescribing system. ARUSC is launched either by the physician to seek guidance for an infectious disease condition or via auto-trigger when restricted antibiotics are prescribed. This paper describes the implementation of ARUSC over three phases from 1 May 2011 to 30 April 2013, compares factors between ARUSC launches via auto-trigger and for guidance, examines factors associated with acceptance of ARUSC recommendations, and assesses user acceptability. During the study period, a monthly average of 9072 antibiotic prescriptions was made, of which 2370 (26.1%) involved ARUSC launches. Launches via auto-trigger comprised 48.1% of ARUSC launches. In phase 1, 23% of ARUSC launches were completed. This rose to 38% in phase 2, then 87% in phase 3, as escapes from the ARUSC programme were sequentially disabled. Amongst completed launches for guidance, 89% of ARUSC recommendations were accepted versus 40% amongst completed launches via auto-trigger. Amongst ARUSC launches for guidance, being from a medical department [adjusted odds ratio (aOR)=1.20, 95% confidence interval (CI) 1.04-1.37] and ARUSC launch during on-call (aOR=1.81, 95% CI 1.61-2.05) were independently associated with acceptance of ARUSC recommendations. Junior physicians found ARUSC useful. Senior physicians found ARUSC reliable but admitted to having preferences for antibiotics that may conflict with ARUSC. Hospital-wide implementation of ARUSC encountered hurdles from physicians. With modifications, the completion rate improved. Copyright © 2015 Elsevier B.V. and the International Society of Chemotherapy. All rights reserved.

  9. Cervical screening programmes: can automation help? Evidence from systematic reviews, an economic analysis and a simulation modelling exercise applied to the UK.

    PubMed

    Willis, B H; Barton, P; Pearmain, P; Bryan, S; Hyde, C

    2005-03-01

    To assess the effectiveness and cost-effectiveness of adding automated image analysis to cervical screening programmes. Searching of all major electronic databases to the end of 2000 was supplemented by a detailed survey for unpublished UK literature. Four systematic reviews were conducted according to recognised guidance. The review of 'clinical effectiveness' included studies assessing reproducibility and impact on health outcomes and processes in addition to evaluations of test accuracy. A discrete event simulation model was developed, although the economic evaluation ultimately relied on a cost-minimisation analysis. The predominant finding from the systematic reviews was the very limited amount of rigorous primary research. None of the included studies refers to the only commercially available automated image analysis device in 2002, the AutoPap Guided Screening (GS) System. The results of the included studies were debatably most compatible with automated image analysis being equivalent in test performance to manual screening. Concerning process, there was evidence that automation does lead to reductions in average slide processing times. In the PRISMATIC trial this was reduced from 10.4 to 3.9 minutes, a statistically significant and practically important difference. The economic evaluation tentatively suggested that the AutoPap GS System may be efficient. The key proviso is that credible data become available to support that the AutoPap GS System has test performance and processing times equivalent to those obtained for PAPNET. The available evidence is still insufficient to recommend implementation of automated image analysis systems. The priority for action remains further research, particularly the 'clinical effectiveness' of the AutoPap GS System. Assessing the cost-effectiveness of introducing automation alongside other approaches is also a priority.

  10. Efficiency and cost analysis of cell saver auto transfusion system in total knee arthroplasty.

    PubMed

    Bilgili, Mustafa Gökhan; Erçin, Ersin; Peker, Gökhan; Kural, Cemal; Başaran, Serdar Hakan; Duramaz, Altuğ; Avkan, Cevdet

    2014-06-01

    Blood loss and replacement are still a controversial issue in major orthopaedic surgery. Allogenic blood transfusion may cause legal problems and concerns regarding the transmission of transfusion-related diseases. Cellsaver Systems (CSS) were developed as an alternative to allogenic transfusion but CSS transfusion may cause coagulation, infection and haemodynamic instability. Our aim was to analyse the efficiency and cost of a cell saver auto-transfusion system in the total knee arthroplasty procedure. Retrospective comparative study. Those patients who were operated on by unilateral, cemented total knee arthroplasty (TKA) were retrospectively evaluated. Group 1 included 37 patients who were treated using the cell saver system, and Group 2 involved 39 patients who were treated by allogenic blood transfusion. The groups were compared in terms of preoperative haemoglobin and haematocrit levels, blood loss and transfusion amount, whether allogenic transfusion was made, degree of deformity, body mass index and cost. No significant results could be obtained in the statistical comparisons made in terms of the demographic properties, deformity properties, preoperative laboratory values, transfusion amount and length of hospital stay of the groups. Average blood loss was calculated to be less in Group 1 (p<0.05) and cost was higher in Group 1 (p<0.05). Cell saver systems do not decrease the amount of allogenic blood transfusion and cost more. Therefore, the routine usage of the auto-transfusion systems is a controversial issue. Cell saver system usage does not affect allogenic blood transfusion incidence or allogenic blood transfusion volume. It was found that preoperative haemoglobin and body mass index rates may affect allogenic blood transfusion. Therefore, it is foreseen that auto-transfusion systems could be useful in patients with low haemoglobin level and body mass index.

  11. Clinical assessment of auto-positive end-expiratory pressure by diaphragmatic electrical activity during pressure support and neurally adjusted ventilatory assist.

    PubMed

    Bellani, Giacomo; Coppadoro, Andrea; Patroniti, Nicolò; Turella, Marta; Arrigoni Marocco, Stefano; Grasselli, Giacomo; Mauri, Tommaso; Pesenti, Antonio

    2014-09-01

    Auto-positive end-expiratory pressure (auto-PEEP) may substantially increase the inspiratory effort during assisted mechanical ventilation. The purpose of this study was to assess whether the electrical activity of the diaphragm (EAdi) signal can be reliably used to estimate auto-PEEP in patients undergoing pressure support ventilation and neurally adjusted ventilatory assist (NAVA) and whether NAVA was beneficial in comparison with pressure support ventilation in patients affected by auto-PEEP. In 10 patients with a clinical suspicion of auto-PEEP, the authors simultaneously recorded EAdi, airway, esophageal pressure, and flow during pressure support and NAVA, while external PEEP was increased from 2 to 14 cm H2O. Tracings were analyzed to measure apparent "dynamic" auto-PEEP (decrease in esophageal pressure to generate inspiratory flow), auto-EAdi (EAdi value at the onset of inspiratory flow), and IDEAdi (inspiratory delay between the onset of EAdi and the inspiratory flow). The pressure necessary to overcome auto-PEEP, auto-EAdi, and IDEAdi were significantly lower in NAVA as compared with pressure support ventilation, and decreased with increasing external PEEP, although the effect of external PEEP was less pronounced in NAVA. Both auto-EAdi and IDEAdi were tightly correlated with auto-PEEP (r = 0.94 and r = 0.75, respectively). In the presence of auto-PEEP at lower external PEEP levels, NAVA showed a characteristic shape of the airway pressure. In patients with auto-PEEP, NAVA, compared with pressure support ventilation, led to a decrease in the pressure necessary to overcome auto-PEEP, which could be reliably monitored by the electrical activity of the diaphragm before inspiratory flow onset (auto-EAdi).

  12. Using Time-Series Regression to Predict Academic Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Four methods were used to forecast monthly circulation totals in 15 midwestern academic libraries: dummy time-series regression, lagged time-series regression, simple average (straight-line forecasting), and monthly average (naive forecasting). In tests of forecasting accuracy, the dummy regression method and the monthly mean method exhibited the smallest average…
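
    The two baseline forecasts named here are easy to sketch. Below is a hypothetical illustration (invented circulation figures, not the study's data) of the "simple average" forecast, which predicts the overall historical mean for every future month, and the "monthly average" (naive seasonal) forecast, which predicts the mean of the same calendar month across past years.

    ```python
    # Hypothetical monthly circulation totals for three past years (Jan..Dec).
    history = [
        [3200, 3400, 3600, 3100, 2900, 2400, 2100, 2200, 3500, 3700, 3600, 3000],
        [3300, 3500, 3700, 3200, 3000, 2500, 2200, 2300, 3600, 3800, 3700, 3100],
        [3400, 3600, 3800, 3300, 3100, 2600, 2300, 2400, 3700, 3900, 3800, 3200],
    ]

    def simple_average_forecast(history):
        """Straight-line forecast: every future month gets the overall historical mean."""
        flat = [value for year in history for value in year]
        return sum(flat) / len(flat)

    def monthly_average_forecast(history, month_index):
        """Naive seasonal forecast: mean of the same calendar month across past years."""
        values = [year[month_index] for year in history]
        return sum(values) / len(values)

    print(round(simple_average_forecast(history)))      # one value for all future months
    print(round(monthly_average_forecast(history, 0)))  # forecast for January
    ```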

  13. Geohydrology and simulation of ground-water flow in the aquifer system near Calvert City, Kentucky

    USGS Publications Warehouse

    Starn, J.J.; Arihood, L.D.; Rose, M.F.

    1995-01-01

    The U.S. Geological Survey, in cooperation with the Kentucky Natural Resources and Environmental Protection Cabinet, constructed a two-dimensional, steady-state ground-water-flow model to estimate hydraulic properties, contributing areas to discharge boundaries, and the average linear velocity at selected locations in an aquifer system near Calvert City, Ky. Nonlinear regression was used to estimate values of model parameters and the reliability of the parameter estimates. The regression minimizes the weighted difference between observed and calculated hydraulic heads and rates of flow. The calibrated model generally was better than alternative models considered, and although adding transmissive faults in the bedrock produced a slightly better model, fault transmissivity was not estimated reliably. The average transmissivity of the aquifer was 20,000 feet squared per day. Recharge rates to two outcrop areas, the McNairy Formation of Cretaceous age and the alluvium of Quaternary age, were 0.00269 feet per day (11.8 inches per year) and 0.000484 feet per day (2.1 inches per year), respectively. Contributing areas to wells at the Calvert City Water Company in 1992 did not include the Calvert City Industrial Complex. Since completing the fieldwork for this study in 1992, the Calvert City Water Company discontinued use of their wells and began withdrawing water from new wells that were located 4.5 miles east-southeast of the previous location; the contributing area moved farther from the industrial complex. The extent of the alluvium contributing water to wells was limited by the overlying lacustrine deposits. The average linear ground-water velocity at the industrial complex ranged from 0.90 feet per day to 4.47 feet per day with a mean of 1.98 feet per day.

  14. Genetic Analysis of Milk Yield in First-Lactation Holstein Friesian in Ethiopia: A Lactation Average vs Random Regression Test-Day Model Analysis

    PubMed Central

    Meseret, S.; Tamir, B.; Gebreyohannes, G.; Lidauer, M.; Negussie, E.

    2015-01-01

    The development of effective genetic evaluations and selection of sires requires accurate estimates of genetic parameters for all economically important traits in the breeding goal. The main objective of this study was to assess the relative performance of the traditional lactation average model (LAM) against the random regression test-day model (RRM) in the estimation of genetic parameters and prediction of breeding values for Holstein Friesian herds in Ethiopia. The data used consisted of 6,500 test-day (TD) records from 800 first-lactation Holstein Friesian cows that calved between 1997 and 2013. Co-variance components were estimated using the average information restricted maximum likelihood method under a single-trait animal model. The estimate of heritability for first-lactation milk yield was 0.30 from LAM whilst estimates from the RRM model ranged from 0.17 to 0.29 for the different stages of lactation. Genetic correlations between different TDs in first-lactation Holstein Friesian ranged from 0.37 to 0.99. The observed genetic correlation was less than unity between milk yields at different TDs, which indicated that the assumption of LAM may not be optimal for accurate evaluation of the genetic merit of animals. A close look at estimated breeding values from both models showed that RRM had a higher standard deviation compared to LAM, indicating that the TD model makes efficient utilization of TD information. Correlations of breeding values between models ranged from 0.90 to 0.96 for different groups of sires and cows, and marked re-rankings were observed in top sires and cows in moving from the traditional LAM to RRM evaluations. PMID:26194217

  15. Automated VMAT planning for postoperative adjuvant treatment of advanced gastric cancer.

    PubMed

    Sharfo, Abdul Wahab M; Stieler, Florian; Kupfer, Oskar; Heijmen, Ben J M; Dirkx, Maarten L P; Breedveld, Sebastiaan; Wenz, Frederik; Lohr, Frank; Boda-Heggemann, Judit; Buergy, Daniel

    2018-04-23

    Postoperative/adjuvant radiotherapy of advanced gastric cancer involves a large planning target volume (PTV) with multi-concave shapes which presents a challenge for volumetric modulated arc therapy (VMAT) planning. This study investigates the advantages of automated VMAT planning for this site compared to manual VMAT planning by expert planners. For 20 gastric cancer patients in the postoperative/adjuvant setting, dual-arc VMAT plans were generated using fully automated multi-criterial treatment planning (autoVMAT), and compared to manually generated VMAT plans (manVMAT). Both automated and manual plans were created to deliver a median dose of 45 Gy to the PTV using identical planning and segmentation parameters. Plans were evaluated by two expert radiation oncologists for clinical acceptability. AutoVMAT and manVMAT plans were also compared based on dose-volume histogram (DVH) and predicted normal tissue complication probability (NTCP) analysis. Both manVMAT and autoVMAT plans were considered clinically acceptable. Target coverage was similar (manVMAT: 96.6 ± 1.6%, autoVMAT: 97.4 ± 1.0%, p = 0.085). With autoVMAT, median kidney dose was reduced on average by > 25%; (for left kidney from 11.3 ± 2.1 Gy to 8.9 ± 3.5 Gy (p = 0.002); for right kidney from 9.2 ± 2.2 Gy to 6.1 ± 1.3 Gy (p <  0.001)). Median dose to the liver was lower as well (18.8 ± 2.3 Gy vs. 17.1 ± 3.6 Gy, p = 0.048). In addition, Dmax of the spinal cord was significantly reduced (38.3 ± 3.7 Gy vs. 31.6 ± 2.6 Gy, p <  0.001). Substantial improvements in dose conformity and integral dose were achieved with autoVMAT plans (4.2% and 9.1%, respectively; p <  0.001). Due to the better OAR sparing in the autoVMAT plans compared to manVMAT plans, the predicted NTCPs for the left and right kidney and the liver-PTV were significantly reduced by 11.3%, 12.8%, 7%, respectively (p ≤ 0.001). Delivery time and total number of monitor units were increased in autoVMAT plans (from 168 ± 19 s to 207 ± 26 s, p = 0.006) and (from 781 ± 168 MU to 1001 ± 134 MU, p = 0.003), respectively. For postoperative/adjuvant radiotherapy of advanced gastric cancer, involving a complex target shape, automated VMAT planning is feasible and can substantially reduce the dose to the kidneys and the liver, without compromising the target dose delivery.

  16. Energy consumption model on WiMAX subscriber station

    NASA Astrophysics Data System (ADS)

    Mubarakah, N.; Suherman; Al-Hakim, M. Y.; Warman, E.

    2018-02-01

    Mobile communication technologies are moving toward miniaturization, and a mobile device's energy supply depends on its battery endurance: the smaller the mobile device, the slower its battery is expected to drain. Reducing energy consumption in mobile devices has therefore been of interest to researchers, and in order to optimize energy consumption, its usage should be predictable. This paper proposes a model that predicts the amount of energy consumed by a WiMAX subscriber station using regression analysis of the active WiMAX states and their durations. The proposed model was assessed using NS-2 simulation of more than a hundred thousand recorded energy consumption values across the WiMAX states. The assessment shows a small average deviation between predicted and measured energy consumption: about 0.18% for training data, and 0.187% and 0.191% for test data.
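
    The regression idea described here (total energy modelled as a function of time spent in each radio state) can be sketched as an ordinary least-squares fit of per-state power coefficients. The state set, durations, and energies below are invented for illustration and are not the paper's data or its exact model form.

    ```python
    import numpy as np

    # Hypothetical records: seconds spent in three states (e.g. active, idle, sleep)
    # and the measured total energy (J) for each record.
    durations = np.array([
        [12.0, 40.0,  8.0],
        [20.0, 30.0, 10.0],
        [ 5.0, 55.0,  2.0],
        [15.0, 35.0, 12.0],
        [25.0, 20.0, 15.0],
    ])
    energy = np.array([9.6, 12.4, 5.1, 11.0, 14.9])

    # Least-squares estimate of per-state power (W): energy is approximated by durations @ power.
    power, *_ = np.linalg.lstsq(durations, energy, rcond=None)

    predicted = durations @ power
    percent_deviation = np.abs(predicted - energy) / energy * 100
    print("estimated per-state power (W):", np.round(power, 3))
    print("mean absolute percent deviation:", round(percent_deviation.mean(), 3))
    ```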

  17. Myocarditis in auto-immune or auto-inflammatory diseases.

    PubMed

    Comarmond, Cloé; Cacoub, Patrice

    2017-08-01

    Myocarditis is a major cause of heart disease in young patients and a common precursor of heart failure due to dilated cardiomyopathy. Some auto-immune and/or auto-inflammatory diseases may be accompanied by myocarditis, such as sarcoidosis, Behçet's disease, eosinophilic granulomatosis with polyangiitis, myositis, and systemic lupus erythematosus. However, data concerning myocarditis in such auto-immune and/or auto-inflammatory diseases are sparse. New therapeutic strategies should better target the modulation of the immune system, depending on the phase of the disease and the type of underlying auto-immune and/or auto-inflammatory disease. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Numerical Models of Human Circulatory System under Altered Gravity: Brain Circulation

    NASA Technical Reports Server (NTRS)

    Kim, Chang Sung; Kiris, Cetin; Kwak, Dochan; David, Tim

    2003-01-01

    A computational fluid dynamics (CFD) approach is presented to model the blood flow through the human circulatory system under altered gravity conditions. Models required for CFD simulation relevant to major hemodynamic issues are introduced such as non-Newtonian flow models governed by red blood cells, a model for arterial wall motion due to fluid-wall interactions, a vascular bed model for outflow boundary conditions, and a model for auto-regulation mechanism. The three-dimensional unsteady incompressible Navier-Stokes equations coupled with these models are solved iteratively using the pseudocompressibility method and dual time stepping. Moving wall boundary conditions from the first-order fluid-wall interaction model are used to study the influence of arterial wall distensibility on flow patterns and wall shear stresses during the heart pulse. A vascular bed modeling utilizing the analogy with electric circuits is coupled with an auto-regulation algorithm for multiple outflow boundaries. For the treatment of complex geometry, a chimera overset grid technique is adopted to obtain connectivity between arterial branches. For code validation, computed results are compared with experimental data for steady and unsteady non-Newtonian flows. Good agreement is obtained for both cases. In sin-type Gravity Benchmark Problems, gravity source terms are added to the Navier-Stokes equations to study the effect of gravitational variation on the human circulatory system. This computational approach is then applied to localized blood flows through a realistic carotid bifurcation and two Circle of Willis models, one using an idealized geometry and the other model using an anatomical data set. A three-dimensional anatomical Circle of Willis configuration is reconstructed from human-specific magnetic resonance images using an image segmentation method. The blood flow through these Circle of Willis models is simulated to provide means for studying gravitational effects on the brain circulation under auto-regulation.

  19. 'Le moment de la lune'. an auto-ethnographic tale of practice about menarche in a children's hospital.

    PubMed

    Denshire, Sally

    2011-08-01

    Auto-ethnographic accounts can highlight unsaid moments of professional practice. In this case, my auto-ethnographic tale 'Le moment de la lune' re-inscribes subjugated knowledge about menstruation and occupational therapy practice in the era before adolescent wards. This fictional tale is written in direct dialogue with an article that was published in this journal at a particular point in my own career as an occupational therapist. In the tale I am 'writing in' what was not written about in my article and in occupational therapy generally. This 'writing-in'/re-inscribing is the research method. My previous article 'Normal spaces' published in this journal in 1985, was organised around principles and generalities of youth-specific practice. The original article had little locating the personal or evoking the body and a heavy reliance on the literature. Issues of gender and culture were largely absent, or, perhaps, 'written out'. The corresponding tale of embodied sexuality, 'Le moment de la lune', articulates something of local complex practice and the particularity of individual therapeutic work to do with menstruation in self-care. Points of tension in 'Normal spaces' are elaborated and I explain how 'Le moment de la lune' problematises supporting menarche in a children's hospital. Now practice has moved on with dedicated adolescent wards in all major children's hospitals. Nevertheless, occupational therapy practice around issues of menstrual self-management is still under-documented. Writing about unspoken moments of practice can have ethical implications for expanding the ways occupational therapy practice can be written and understood. © 2011 The Author. Australian Occupational Therapy Journal © 2011 Occupational Therapy Australia.

  20. Relationship of hemoglobin to occupational exposure to motor vehicle exhaust.

    PubMed

    Potula, V; Hu, H

    1996-01-01

    To study the relationship of hemoglobin to exposure to motor vehicle exhaust. Survey. Traffic police, bus drivers, and auto-shop workers (all exposed to auto exhaust in Madras, India) and unexposed office workers. We measured levels of blood lead (by graphite furnace atomic absorption spectrophotometry), and hemoglobin. Information also was collected on age, employment duration, smoking status, alcohol ingestion, and diet type (vegetarian or nonvegetarian). Increasing exposure to motor vehicle exhaust, as reflected by job category, was significantly associated with lower levels of hemoglobin (p < 0.01). A final multivariate regression model was constructed that began with indicator variables for each job (with office workers as the reference category) and included age, duration of employment, blood lead level, alcohol ingestion, dietary type, and smoking status. After a backward-elimination procedure, employment duration as an auto-shop worker or bus driver remained as significant correlates of lower hemoglobin level, and current smoking and long employment duration as significant correlates of higher hemoglobin level. Occupational exposure to automobile exhaust may be a risk factor for decreased hemoglobin level in Madras. This effect appears to be independent of blood lead level and may represent hematopoietic suppression incurred by benzene or accumulated lead burden (which is not well reflected by blood lead levels). Smoking probably increased hemoglobin level through the chronic effects of exposure to carbon monoxide. In this study, a long employment duration may have served as a proxy for better socioeconomic and, therefore, better nutritional status.

  1. Annual replenishment of bed material by sediment transport in the Wind River near Riverton, Wyoming

    USGS Publications Warehouse

    Smalley, M.L.; Emmett, W.W.; Wacker, A.M.

    1994-01-01

    The U.S. Geological Survey, in cooperation with the Wyoming Department of Transportation, conducted a study during 1985-87 to determine the annual replenishment of sand and gravel along a point bar in the Wind River near Riverton, Wyoming. Hydraulic-geometry relations determined from streamflow measurements; streamflow characteristics determined from 45 years of record at the study site; and analyses of suspended-sediment, bedload, and bed-material samples were used to describe river transport characteristics and to estimate the annual replenishment of sand and gravel. The Wind River is a perennial, snowmelt-fed stream. Average daily discharge at the study site is about 734 cubic feet per second, and bankfull discharge (recurrence interval about 1.5 years) is about 5,000 cubic feet per second. At bankfull discharge, the river is about 136 feet wide and has an average depth of about 5.5 feet and average velocity of about 6.7 feet per second. Stream slope is about 0.0010 foot per foot. Bed material sampled on the point bar before the 1986 high flows ranged from sand to cobbles, with a median diameter of about 22 millimeters. Data for sediment samples collected during water year 1986 were used to develop regression equations between suspended-sediment load and water discharge and between bedload and water discharge. Average annual suspended-sediment load was computed to be about 561,000 tons per year using the regression equation in combination with flow-duration data. The regression equation for estimating bedload was not used; instead, average annual bedload was computed as 1.5 percent of average annual suspended load, or about 8,410 tons per year. This amount of bedload material is estimated to be in temporary storage along a reach containing seven riffles--a length of approximately 1 river mile. On the basis of bedload material sampled during the 1986 high flows, about 75 percent (by weight) is sand (2 millimeters in diameter or finer); median particle size is about 0.5 millimeter. About 20 percent (by weight) is medium gravel to small cobbles--12.7 millimeters (0.5 inch) or coarser. The bedload moves slowly (about 0.03 percent of the water speed) and briefly (about 10 percent of the time). The average travel distance of a median-sized particle is about 1 river mile per year. The study results indicate that the average replenishment rate of bedload material coarser than 12.7 millimeters is about 1,500 to 2,000 tons (less than 1,500 cubic yards) per year. Finer material (0.075 to 6.4 millimeters in diameter) is replenished at about 4,500 to 5,000 cubic yards per year. The total volume of potentially usable material would average about 6,000 cubic yards per year.

  2. Quantifying rapid changes in cardiovascular state with a moving ensemble average.

    PubMed

    Cieslak, Matthew; Ryan, William S; Babenko, Viktoriya; Erro, Hannah; Rathbun, Zoe M; Meiring, Wendy; Kelsey, Robert M; Blascovich, Jim; Grafton, Scott T

    2018-04-01

    MEAP, the moving ensemble analysis pipeline, is a new open-source tool designed to perform multisubject preprocessing and analysis of cardiovascular data, including electrocardiogram (ECG), impedance cardiogram (ICG), and continuous blood pressure (BP). In addition to traditional ensemble averaging, MEAP implements a moving ensemble averaging method that allows for the continuous estimation of indices related to cardiovascular state, including cardiac output, preejection period, heart rate variability, and total peripheral resistance, among others. Here, we define the moving ensemble technique mathematically, highlighting its differences from fixed-window ensemble averaging. We describe MEAP's interface and features for signal processing, artifact correction, and cardiovascular-based fMRI analysis. We demonstrate the accuracy of MEAP's novel B point detection algorithm on a large collection of hand-labeled ICG waveforms. As a proof of concept, two subjects completed a series of four physical and cognitive tasks (cold pressor, Valsalva maneuver, video game, random dot kinetogram) on 3 separate days while ECG, ICG, and BP were recorded. Critically, the moving ensemble method reliably captures the rapid cyclical cardiovascular changes related to the baroreflex during the Valsalva maneuver and the classic cold pressor response. Cardiovascular measures were seen to vary considerably within repetitions of the same cognitive task for each individual, suggesting that a carefully designed paradigm could be used to capture fast-acting event-related changes in cardiovascular state. © 2017 Society for Psychophysiological Research.
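
    The contrast drawn above between traditional ensemble averaging and moving ensemble averaging can be sketched in a few lines: the fixed approach yields one summary value per recording, while the moving approach re-estimates the mean within a sliding window of neighbouring beats. The beat series below is simulated and the window width is an arbitrary choice; this is not MEAP's code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Simulated beat-to-beat values (e.g. pre-ejection period in ms) with a slow drift plus noise.
    n_beats = 300
    beats = 100 + 10 * np.sin(np.linspace(0, 3 * np.pi, n_beats)) + rng.normal(0, 3, n_beats)

    # Fixed-window ensemble average: a single summary value for the whole recording.
    fixed_ensemble = beats.mean()

    def moving_ensemble(values, half_window=15):
        """Moving ensemble average: mean over a window of neighbouring beats, centred on each beat."""
        smoothed = np.empty_like(values)
        for i in range(len(values)):
            lo = max(0, i - half_window)
            hi = min(len(values), i + half_window + 1)
            smoothed[i] = values[lo:hi].mean()
        return smoothed

    continuous_estimate = moving_ensemble(beats)
    print(round(fixed_ensemble, 1), round(continuous_estimate[0], 1), round(continuous_estimate[150], 1))
    ```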

  3. The Cost of Commonality: Assessing Value in Joint Programs

    DTIC Science & Technology

    2015-12-01

    …economies of scale in order to provide cheaper goods. When those economies of scale are not realized, as was the case with the U.S. auto market “Big Three…

  4. Least Squares Moving-Window Spectral Analysis.

    PubMed

    Lee, Young Jong

    2017-08-01

    Least squares regression is proposed as a moving-windows method for analysis of a series of spectra acquired as a function of external perturbation. The least squares moving-window (LSMW) method can be considered an extended form of the Savitzky-Golay differentiation for nonuniform perturbation spacing. LSMW is characterized in terms of moving-window size, perturbation spacing type, and intensity noise. Simulation results from LSMW are compared with results from other numerical differentiation methods, such as single-interval differentiation, autocorrelation moving-window, and perturbation correlation moving-window methods. It is demonstrated that this simple LSMW method can be useful for quantitative analysis of nonuniformly spaced spectral data with high frequency noise.
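
    The central operation described here (fitting a least-squares line within a moving window and using its slope as the local derivative, which remains valid for nonuniform perturbation spacing) can be sketched as follows. This is an illustrative reimplementation of the general idea with an arbitrary window size and simulated data, not the authors' software.

    ```python
    import numpy as np

    def lsmw_slope(x, y, window=5):
        """Moving-window least-squares slope of y with respect to x.

        A straight line is fitted by least squares within each window of
        consecutive points and its slope is taken as the local derivative;
        the perturbation axis x may be nonuniformly spaced.
        """
        half = window // 2
        slopes = np.full(len(x), np.nan)
        for i in range(half, len(x) - half):
            xs = x[i - half:i + half + 1]
            ys = y[i - half:i + half + 1]
            slope, _intercept = np.polyfit(xs, ys, 1)
            slopes[i] = slope
        return slopes

    # Example: nonuniformly spaced perturbation values and a noisy smooth response.
    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0, 10, 50))
    y = np.sin(x) + rng.normal(0, 0.02, 50)
    print(np.round(lsmw_slope(x, y)[5:10], 2))  # should roughly track cos(x)
    ```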

  5. The Performance of Multilevel Growth Curve Models under an Autoregressive Moving Average Process

    ERIC Educational Resources Information Center

    Murphy, Daniel L.; Pituch, Keenan A.

    2009-01-01

    The authors examined the robustness of multilevel linear growth curve modeling to misspecification of an autoregressive moving average process. As previous research has shown (J. Ferron, R. Dailey, & Q. Yi, 2002; O. Kwok, S. G. West, & S. B. Green, 2007; S. Sivo, X. Fan, & L. Witta, 2005), estimates of the fixed effects were unbiased, and Type I…

  6. Multi-scale observations of the variability of magmatic CO2 emissions, Mammoth Mountain, CA, USA

    NASA Astrophysics Data System (ADS)

    Lewicki, J. L.; Hilley, G. E.

    2014-09-01

    One of the primary indicators of volcanic unrest at Mammoth Mountain is diffuse emission of magmatic CO2, which can effectively track this unrest if its variability in space and time and relationship to near-surface meteorological and hydrologic phenomena versus those occurring at depth beneath the mountain are understood. In June-October 2013, we conducted accumulation chamber soil CO2 flux surveys and made half-hourly CO2 flux measurements with automated eddy covariance and accumulation chamber (auto-chamber) instrumentation at the largest area of diffuse CO2 degassing on Mammoth Mountain (Horseshoe Lake tree kill; HLTK). Estimated CO2 emission rates for HLTK based on 20 June, 30 July, and 24-25 October soil CO2 flux surveys were 165, 172, and 231 t d-1, respectively. The average (June-October) CO2 emission rate estimated for this area was 123 t d-1 based on an inversion of 4527 eddy covariance CO2 flux measurements and corresponding modeled source weight functions. Average daily eddy covariance and auto-chamber CO2 fluxes consistently declined over the four-month observation time. Wavelet analysis of auto-chamber CO2 flux and environmental parameter time series was used to evaluate the periodicity of, and local correlation between these variables in time-frequency space. Overall, CO2 emissions at HLTK were highly dynamic, displaying short-term (hourly to weekly) temporal variability related to meteorological and hydrologic changes, as well as long-term (monthly to multi-year) variations related to migration of CO2-rich magmatic fluids beneath the volcano. Accumulation chamber soil CO2 flux surveys were also conducted in the four additional areas of diffuse CO2 degassing on Mammoth Mountain in July-August 2013. Summing CO2 emission rates for all five areas yielded a total for the mountain of 311 t d-1, which may suggest that emissions returned to 1998-2009 levels, following an increase from 2009 to 2011.

  7. Multi-scale observations of the variability of magmatic CO2 emissions, Mammoth Mountain, CA, USA

    USGS Publications Warehouse

    Lewicki, Jennifer L.; Hilley, George E.

    2014-01-01

    One of the primary indicators of volcanic unrest at Mammoth Mountain is diffuse emission of magmatic CO2, which can effectively track this unrest if its variability in space and time and relationship to near-surface meteorological and hydrologic phenomena versus those occurring at depth beneath the mountain are understood. In June–October 2013, we conducted accumulation chamber soil CO2 flux surveys and made half-hourly CO2 flux measurements with automated eddy covariance and accumulation chamber (auto-chamber) instrumentation at the largest area of diffuse CO2 degassing on Mammoth Mountain (Horseshoe Lake tree kill; HLTK). Estimated CO2 emission rates for HLTK based on 20 June, 30 July, and 24–25 October soil CO2 flux surveys were 165, 172, and 231 t d−1, respectively. The average (June–October) CO2 emission rate estimated for this area was 123 t d−1 based on an inversion of 4527 eddy covariance CO2 flux measurements and corresponding modeled source weight functions. Average daily eddy covariance and auto-chamber CO2 fluxes consistently declined over the four-month observation time. Wavelet analysis of auto-chamber CO2 flux and environmental parameter time series was used to evaluate the periodicity of, and local correlation between these variables in time–frequency space. Overall, CO2 emissions at HLTK were highly dynamic, displaying short-term (hourly to weekly) temporal variability related to meteorological and hydrologic changes, as well as long-term (monthly to multi-year) variations related to migration of CO2-rich magmatic fluids beneath the volcano. Accumulation chamber soil CO2 flux surveys were also conducted in the four additional areas of diffuse CO2 degassing on Mammoth Mountain in July–August 2013. Summing CO2 emission rates for all five areas yielded a total for the mountain of 311 t d−1, which may suggest that emissions returned to 1998–2009 levels, following an increase from 2009 to 2011.

  8. Comparison between wavelet transform and moving average as filter method of MODIS imagery to recognize paddy cropping pattern in West Java

    NASA Astrophysics Data System (ADS)

    Dwi Nugroho, Kreshna; Pebrianto, Singgih; Arif Fatoni, Muhammad; Fatikhunnada, Alvin; Liyantono; Setiawan, Yudi

    2017-01-01

    Information on the area and spatial distribution of paddy fields is needed to support sustainable agriculture and food security programmes. Mapping the distribution of paddy cropping patterns is important for maintaining a sustainable paddy field area, and it can be done by direct observation or by remote sensing. This paper discusses remote sensing for paddy field monitoring based on MODIS time-series data. Time-series MODIS data are difficult to classify directly because of temporal noise, so the wavelet transform and the moving average are needed as filter methods. The objective of this study is to recognize paddy cropping patterns in West Java with the wavelet transform and the moving average using MODIS imagery (MOD13Q1) from 2001 to 2015, and then to compare the two methods. The results showed that both methods produced almost the same spatial distribution of cropping patterns. The accuracy of the wavelet transform (75.5%) was higher than that of the moving average (70.5%). Both methods showed that the majority of cropping patterns in West Java follow a paddy-fallow-paddy-fallow pattern with various planting times. Differences in the planting schedule occurred because of the availability of irrigation water.

  9. An Examination of Selected Geomagnetic Indices in Relation to the Sunspot Cycle

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.; Hathaway, David H.

    2006-01-01

    Previous studies have shown geomagnetic indices to be useful for providing early estimates for the size of the following sunspot cycle several years in advance. Examined in this study are various precursor methods for predicting the minimum and maximum amplitude of the following sunspot cycle, these precursors being based on the aa and Ap geomagnetic indices and the number of disturbed days (NDD), days when the daily Ap index equaled or exceeded 25. Also examined is the yearly peak of the daily Ap index (Apmax), the number of days when Ap is greater than or equal to 100, cyclic averages of sunspot number R, aa, Ap, NDD, and the number of sudden storm commencements (NSSC), as well as the cyclic sums of NDD and NSSC. The analysis yields 90-percent prediction intervals for both the minimum and maximum amplitudes for cycle 24, the next sunspot cycle. In terms of yearly averages, the best regressions give Rmin = 9.8+/-2.9 and Rmax = 153.8+/-24.7, equivalent to Rm = 8.8+/-2.8 and RM = 159+/-5.5, based on the 12-mo moving average (or smoothed monthly mean sunspot number). Hence, cycle 24 is expected to be above average in size, similar to cycles 21 and 22, producing more than 300 sudden storm commencements and more than 560 disturbed days, of which about 25 will have Ap greater than or equal to 100. On the basis of annual averages, the sunspot minimum year for cycle 24 will be either 2006 or 2007.

  10. The Ontario printed educational message (OPEM) trial to narrow the evidence-practice gap with respect to prescribing practices of general and family physicians: a cluster randomized controlled trial, targeting the care of individuals with diabetes and hypertension in Ontario, Canada

    PubMed Central

    Zwarenstein, Merrick; Hux, Janet E; Kelsall, Diane; Paterson, Michael; Grimshaw, Jeremy; Davis, Dave; Laupacis, Andreas; Evans, Michael; Austin, Peter C; Slaughter, Pamela M; Shiller, Susan K; Croxford, Ruth; Tu, Karen

    2007-01-01

    Background There are gaps between what family practitioners do in clinical practice and the evidence-based ideal. The most commonly used strategy to narrow these gaps is the printed educational message (PEM); however, the attributes of successful printed educational messages and their overall effectiveness in changing physician practice are not clear. The current endeavor aims to determine whether such messages change prescribing quality in primary care practice, and whether these effects differ with the format of the message. Methods/design The design is a large, simple, factorial, unblinded cluster-randomized controlled trial. PEMs will be distributed with informed, a quarterly evidence-based synopsis of current clinical information produced by the Institute for Clinical Evaluative Sciences, Toronto, Canada, and will be sent to all eligible general and family practitioners in Ontario. There will be three replicates of the trial, with three different educational messages, each aimed at narrowing a specific evidence-practice gap as follows: 1) angiotensin-converting enzyme inhibitors, hypertension treatment, and cholesterol lowering agents for diabetes; 2) retinal screening for diabetes; and 3) diuretics for hypertension. For each of the three replicates there will be three intervention groups. The first group will receive informed with an attached postcard-sized, short, directive "outsert." The second intervention group will receive informed with a two-page explanatory "insert" on the same topic. The third intervention group will receive informed, with both the above-mentioned outsert and insert. The control group will receive informed only, without either an outsert or insert. Routinely collected physician billing, prescription, and hospital data found in Ontario's administrative databases will be used to monitor pre-defined prescribing changes relevant and specific to each replicate, following delivery of the educational messages. Multi-level modeling will be used to study patterns in physician-prescribing quality over four quarters, before and after each of the three interventions. Subgroup analyses will be performed to assess the association between the characteristics of the physician's place of practice and target behaviours. A further analysis of the immediate and delayed impacts of the PEMs will be performed using time-series analysis and interventional, auto-regressive, integrated moving average modeling. Trial registration number Current controlled trial ISRCTN72772651. PMID:18039361

  11. Estimating the Length of the North Atlantic Basin Hurricane Season

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.

    2012-01-01

    For the interval 1945-2011, the length of the hurricane season in the North Atlantic basin averages about 130 +/- 42 days (the +/-1 standard deviation interval), having a range of 47 to 235 days. Runs-testing reveals that the annual length of season varies nonrandomly at the 5% level of significance. In particular, its trend, as described using 10-yr moving averages, generally has been upward since about 1979, increasing from about 113 to 157 days (in 2003). Based on annual values, one finds a highly statistically important inverse correlation at the 0.1% level of significance between the length of season and the occurrence of the first storm day of the season. For the 2012 hurricane season, based on the reported first storm day of May 19, 2012 (i.e., DOY = 140), the inferred preferential regression predicts that the length of the current season likely will be about 173 +/- 23 days, suggesting that it will end about November 8 +/- 23 days, with only about a 5% chance that it will end either before about September 23, 2012 or after about December 24, 2012.

  12. Model-checking techniques based on cumulative residuals.

    PubMed

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
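
    A toy version of the approach (cumulative sums of residuals over a covariate, with the observed process compared against realizations generated under the assumed model) is sketched below on simulated data. For brevity the reference realizations here come from randomly permuted residuals, a crude stand-in for the zero-mean Gaussian-process approximation described in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    x = rng.uniform(0, 3, n)
    y = 1.0 + 0.5 * x**2 + rng.normal(0, 0.3, n)      # data generated from a quadratic model

    # Fit a (deliberately misspecified) straight-line model and compute residuals.
    design = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    residuals = y - design @ beta

    # Observed process: cumulative sum of residuals ordered by the covariate.
    order = np.argsort(x)
    observed = np.cumsum(residuals[order])

    # Reference realizations: cumulative sums of randomly permuted residuals.
    max_observed = np.max(np.abs(observed))
    max_reference = np.array([np.max(np.abs(np.cumsum(rng.permutation(residuals))))
                              for _ in range(500)])
    p_value = np.mean(max_reference >= max_observed)
    print("sup |cumulative residuals|:", round(max_observed, 2), " approximate p-value:", p_value)
    ```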

  13. Plasma Anti-Glial Fibrillary Acidic Protein Autoantibody Levels during the Acute and Chronic Phases of Traumatic Brain Injury: A Transforming Research and Clinical Knowledge in Traumatic Brain Injury Pilot Study.

    PubMed

    Wang, Kevin K W; Yang, Zhihui; Yue, John K; Zhang, Zhiqun; Winkler, Ethan A; Puccio, Ava M; Diaz-Arrastia, Ramon; Lingsma, Hester F; Yuh, Esther L; Mukherjee, Pratik; Valadka, Alex B; Gordon, Wayne A; Okonkwo, David O; Manley, Geoffrey T; Cooper, Shelly R; Dams-O'Connor, Kristen; Hricik, Allison J; Inoue, Tomoo; Maas, Andrew I R; Menon, David K; Schnyer, David M; Sinha, Tuhin K; Vassar, Mary J

    2016-07-01

    We described recently a subacute serum autoantibody response toward glial fibrillary acidic protein (GFAP) and its breakdown products 5-10 days after severe traumatic brain injury (TBI). Here, we expanded our anti-GFAP autoantibody (AutoAb[GFAP]) investigation to the multicenter observational study Transforming Research and Clinical Knowledge in TBI Pilot (TRACK-TBI Pilot) to cover the full spectrum of TBI (Glasgow Coma Scale 3-15) by using acute (<24 h) plasma samples from 196 patients with acute TBI admitted to three Level I trauma centers, and a second cohort of 21 participants with chronic TBI admitted to inpatient TBI rehabilitation. We find that acute patients self-reporting previous TBI with loss of consciousness (LOC) (n = 43) had higher day 1 AutoAb[GFAP] (mean ± standard error: 9.11 ± 1.42; n = 43) than healthy controls (2.90 ± 0.92; n = 16; p = 0.032) and acute patients reporting no previous TBI (2.97 ± 0.37; n = 106; p < 0.001), but not acute patients reporting previous TBI without LOC (8.01 ± 1.80; n = 47; p = 0.906). These data suggest that while exposure to TBI may trigger the AutoAb[GFAP] response, circulating antibodies are elevated specifically in acute TBI patients with a history of TBI. AutoAb[GFAP] levels for participants with chronic TBI (average post-TBI time 176 days or 6.21 months) were also significantly higher (15.08 ± 2.82; n = 21) than healthy controls (p < 0.001). These data suggest a persistent upregulation of the autoimmune response to specific brain antigen(s) in the subacute to chronic phase after TBI, as well as after repeated TBI insults. Hence, AutoAb[GFAP] may be a sensitive assay to study the dynamic interactions between post-injury brain and patient-specific autoimmune responses across acute and chronic settings after TBI.

  14. A python-based docking program utilizing a receptor bound ligand shape: PythDock.

    PubMed

    Chung, Jae Yoon; Cho, Seung Joo; Hah, Jung-Mi

    2011-09-01

    PythDock is a heuristic docking program that uses the Python programming language with a simple scoring function and a population based search engine. The scoring function considers electrostatic and dispersion/repulsion terms. The search engine utilizes a particle swarm optimization algorithm. A grid potential map is generated using the shape information of a bound ligand within the active site. Therefore, the searching area is more relevant to the ligand binding. To evaluate the docking performance of PythDock, two well-known docking programs (AutoDock and DOCK) were also used with the same data. The accuracy of the docked results was measured by the difference between the x-ray structure and the docked pose of the ligand, i.e., average root mean squared deviation values of the bound ligand were compared for fourteen protein-ligand complexes. Since a ligand's rotational flexibility is an important factor affecting docking accuracy, the data set was chosen to have various degrees of flexibility. Although PythDock has a scoring function simpler than those of other programs (AutoDock and DOCK), our results showed that PythDock predicted more accurate poses than both AutoDock4.2 and DOCK6.2. This indicates that PythDock could be a useful tool to study ligand-receptor interactions and could also be beneficial in structure based drug design.

  15. Oronasal masks require higher levels of positive airway pressure than nasal masks to treat obstructive sleep apnea.

    PubMed

    Bettinzoli, Michela; Taranto-Montemurro, Luigi; Messineo, Ludovico; Corda, Luciano; Redolfi, Stefania; Ferliga, Mauro; Tantucci, Claudio

    2014-12-01

    The purpose of this study was to compare the therapeutic pressure determined by an automated CPAP device (AutoCPAP) during the titration period between nasal and oronasal masks, and the residual apnea-hypopnea index (AHI) on a subsequent polygraphy performed with the established therapeutic CPAP. In this retrospective study, 109 subjects with moderate or severe obstructive sleep apnea-hypopnea (apnea-hypopnea index ≥ 15 events/h) were studied. CPAP titration was performed using an auto-titrating device. There was a significant difference in the mean pressure delivered with autoCPAP between the group of patients using the nasal mask (mean 10.0 cmH2O±2.0 SD) and the group which used the oronasal mask (mean 11.2 cmH2O±2.1) (p<0.05). Residual apneas were lower when using a nasal mask: average AHI of 2.6±2.5 compared to 4.5±4.0 using an oronasal mask (p<0.05). On multivariate analysis, the only independent predictor of the level of therapeutic pressure of CPAP was the type of mask used (r = 0.245, p = 0.008). The therapeutic CPAP level for OSAH is higher when administered via an oronasal mask, leaving more residual events. These findings suggest that a nasal mask should be the first choice for OSAH treatment.

  16. Temporal patterns of variable relationships in person-oriented research: longitudinal models of configural frequency analysis.

    PubMed

    von Eye, Alexander; Mun, Eun Young; Bogat, G Anne

    2008-03-01

    This article reviews the premises of configural frequency analysis (CFA), including methods of choosing significance tests and base models, as well as protecting alpha, and discusses why CFA is a useful approach when conducting longitudinal person-oriented research. CFA operates at the manifest variable level. Longitudinal CFA seeks to identify those temporal patterns that stand out as more frequent (CFA types) or less frequent (CFA antitypes) than expected with reference to a base model. A base model that has been used frequently in CFA applications, prediction CFA, and a new base model, auto-association CFA, are discussed for analysis of cross-classifications of longitudinal data. The former base model takes the associations among predictors and among criteria into account. The latter takes the auto-associations among repeatedly observed variables into account. Application examples of each are given using data from a longitudinal study of domestic violence. It is demonstrated that CFA results are not redundant with results from log-linear modeling or multinomial regression and that, of these approaches, CFA shows particular utility when conducting person-oriented research.

  17. Capillary Electrophoresis Sensitivity Enhancement Based on Adaptive Moving Average Method.

    PubMed

    Drevinskas, Tomas; Telksnys, Laimutis; Maruška, Audrius; Gorbatsova, Jelena; Kaljurand, Mihkel

    2018-06-05

    In the present work, we demonstrate a novel approach to improve the sensitivity of the "out of lab" portable capillary electrophoretic measurements. Nowadays, many signal enhancement methods are (i) underused (nonoptimal), (ii) overused (distorts the data), or (iii) inapplicable in field-portable instrumentation because of a lack of computational power. The described innovative migration velocity-adaptive moving average method uses an optimal averaging window size and can be easily implemented with a microcontroller. The contactless conductivity detection was used as a model for the development of a signal processing method and the demonstration of its impact on the sensitivity. The frequency characteristics of the recorded electropherograms and peaks were clarified. Higher electrophoretic mobility analytes exhibit higher-frequency peaks, whereas lower electrophoretic mobility analytes exhibit lower-frequency peaks. On the basis of the obtained data, a migration velocity-adaptive moving average algorithm was created, adapted, and programmed into capillary electrophoresis data-processing software. Employing the developed algorithm, each data point is processed depending on a certain migration time of the analyte. Because of the implemented migration velocity-adaptive moving average method, the signal-to-noise ratio improved up to 11 times for sampling frequency of 4.6 Hz and up to 22 times for sampling frequency of 25 Hz. This paper could potentially be used as a methodological guideline for the development of new smoothing algorithms that require adaptive conditions in capillary electrophoresis and other separation methods.
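
    The key idea described above (an averaging window whose width adapts to the analyte's migration velocity, so that sharp early peaks are smoothed with a narrow window and broad late peaks with a wider one) can be sketched as below. The window-scaling rule, sampling rates, and simulated electropherogram are illustrative assumptions, not the published algorithm's parameters.

    ```python
    import numpy as np

    def velocity_adaptive_moving_average(signal, sampling_hz, base_window_s=0.5, growth_per_s=0.01):
        """Moving average whose window widens with migration time.

        Early points (fast analytes, high-frequency peaks) get a narrow window;
        late points (slow analytes, low-frequency peaks) get a wider one.
        """
        smoothed = np.empty_like(signal, dtype=float)
        for i in range(len(signal)):
            t = i / sampling_hz                                    # migration time in seconds
            half = int(sampling_hz * (base_window_s + growth_per_s * t) / 2)
            lo, hi = max(0, i - half), min(len(signal), i + half + 1)
            smoothed[i] = signal[lo:hi].mean()
        return smoothed

    # Simulated electropherogram: a sharp early peak, a broad late peak, and baseline noise.
    fs = 25.0                                                      # sampling frequency (Hz)
    t = np.arange(0, 600, 1 / fs)
    signal = (np.exp(-((t - 60.0) ** 2) / (2 * 1.5 ** 2))
              + np.exp(-((t - 480.0) ** 2) / (2 * 12.0 ** 2))
              + np.random.default_rng(3).normal(0, 0.05, t.size))
    smoothed = velocity_adaptive_moving_average(signal, fs)
    print(round(float(signal.std()), 3), round(float(smoothed.std()), 3))
    ```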

  18. Comparisons of fully automated syphilis tests with conventional VDRL and FTA-ABS tests.

    PubMed

    Choi, Seung Jun; Park, Yongjung; Lee, Eun Young; Kim, Sinyoung; Kim, Hyon-Suk

    2013-06-01

    Serologic tests are widely used for the diagnosis of syphilis. However, conventional methods require well-trained technicians to produce reliable results. We compared automated nontreponemal and treponemal tests with conventional methods. The HiSens Auto Rapid Plasma Reagin (AutoRPR) and Treponema Pallidum particle agglutination (AutoTPPA) tests, which utilize latex turbidimetric immunoassay, were assessed. A total of 504 sera were assayed by AutoRPR, AutoTPPA, conventional VDRL and FTA-ABS. Among them, 250 samples were also tested by conventional TPPA. The concordance rate between the results of VDRL and AutoRPR was 67.5%, and 164 discrepant cases were all VDRL reactive but AutoRPR negative. In the 164 cases, 133 showed FTA-ABS reactivity. Medical records of 106 among the 133 cases were reviewed, and 82 among 106 specimens were found to be collected from patients already treated for syphilis. The concordance rate between the results of AutoTPPA and FTA-ABS was 97.8%. The results of conventional TPPA and AutoTPPA for 250 samples were concordant in 241 cases (96.4%). AutoRPR showed higher specificity than that of VDRL, while VDRL demonstrated higher sensitivity than that of AutoRPR regardless of whether the patients had been already treated for syphilis or not. Both FTA-ABS and AutoTPPA showed high sensitivities and specificities greater than 98.0%. Automated RPR and TPPA tests could be alternatives to conventional syphilis tests, and AutoRPR would be particularly suitable in treatment monitoring, since results by AutoRPR in cases after treatment became negative more rapidly than by VDRL. Copyright © 2013. Published by Elsevier Inc.

  19. An improved moving average technical trading rule

    NASA Astrophysics Data System (ADS)

    Papailias, Fotis; Thomakos, Dimitrios D.

    2015-06-01

    This paper proposes a modified version of the widely used price and moving average cross-over trading strategies. The suggested approach (presented in its 'long only' version) is a combination of cross-over 'buy' signals and a dynamic threshold value which acts as a dynamic trailing stop. The trading behaviour and performance from this modified strategy are different from the standard approach with results showing that, on average, the proposed modification increases the cumulative return and the Sharpe ratio of the investor while exhibiting smaller maximum drawdown and smaller drawdown duration than the standard strategy.
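
    A compact, long-only illustration of the idea in pandas, assuming a simple fixed-fraction trailing stop stands in for the paper's dynamic threshold (the published rule may differ):

        import numpy as np
        import pandas as pd

        def ma_cross_with_trailing_stop(price: pd.Series, window: int = 50, stop_frac: float = 0.05) -> pd.Series:
            """Return a 0/1 long-only position series.

            Enter when price crosses above its moving average; while long, track the
            highest price since entry and exit if price falls below (1 - stop_frac)
            of that high, so the stop level trails the price upward.
            """
            ma = price.rolling(window).mean()
            position = pd.Series(0, index=price.index, dtype=int)
            in_trade, high_water = False, np.nan
            for i in range(window, len(price)):
                p = price.iloc[i]
                if not in_trade:
                    if p > ma.iloc[i] and price.iloc[i - 1] <= ma.iloc[i - 1]:
                        in_trade, high_water = True, p
                else:
                    high_water = max(high_water, p)
                    if p < (1 - stop_frac) * high_water:
                        in_trade = False
                position.iloc[i] = int(in_trade)
            return position

        # Toy usage on a random-walk price series.
        rng = np.random.default_rng(1)
        px = pd.Series(100 * np.exp(np.cumsum(0.0005 + 0.01 * rng.standard_normal(1000))))
        pos = ma_cross_with_trailing_stop(px)
        strategy_returns = pos.shift(1).fillna(0) * px.pct_change().fillna(0)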

  20. Criterion-Referenced Test Items for Auto Body.

    ERIC Educational Resources Information Center

    Tannehill, Dana, Ed.

    This test item bank on auto body repair contains criterion-referenced test questions based upon competencies found in the Missouri Auto Body Competency Profile. Some test items are keyed for multiple competencies. The tests cover the following 26 competency areas in the auto body curriculum: auto body careers; measuring and mixing; tools and…

  1. Autologous CD34+ cell therapy improves exercise capacity, angina frequency and reduces mortality in no-option refractory angina: a patient-level pooled analysis of randomized double-blinded trials.

    PubMed

    Henry, Timothy D; Losordo, Douglas W; Traverse, Jay H; Schatz, Richard A; Jolicoeur, E Marc; Schaer, Gary L; Clare, Robert; Chiswell, Karen; White, Christopher J; Fortuin, F David; Kereiakes, Dean J; Zeiher, Andreas M; Sherman, Warren; Hunt, Andrea S; Povsic, Thomas J

    2018-01-05

    Autologous CD34+ (auto-CD34+) cells represent an attractive option for the treatment of refractory angina. Three double-blinded randomized trials (n = 304) compared intramyocardial (IM) auto-CD34+ cells with IM placebo injections to affect total exercise time (TET), angina frequency (AF), and major adverse cardiac events (MACE). Patient-level data were pooled from the Phase I, Phase II ACT-34, ACT-34 extension, and Phase III RENEW trials to determine the efficacy and safety of auto-CD34+ cells. Treatment effects for TET were analysed using an analysis of covariance mixed-effects model and for AF using Poisson regression in a log linear model with repeated measures. The Kaplan-Meier rate estimates for MACE were compared using the log-rank test. Autologous CD34+ cell therapy improved TET by 46.6 s [3 months, 95% confidence interval (CI) 13.0-80.3 s; P = 0.007], 49.5 s (6 months, 95% CI 9.3-89.7; P = 0.016), and 44.7 s (12 months, 95% CI -2.7 to 92.1 s; P = 0.065). The relative frequency of angina was 0.78 (95% CI 0.63-0.98; P = 0.032), 0.66 (0.48-0.91; P = 0.012), and 0.58 (0.38-0.88; P = 0.011) at 3, 6, and 12 months in auto-CD34+ compared with placebo patients. Results remained concordant when analysed by treatment received and when confined to the Phase III dose of 1 × 10^5 cells/kg. Autologous CD34+ cell therapy significantly decreased mortality (12.1% vs. 2.5%; P = 0.0025) and numerically reduced MACE (38.9% vs. 30.0%; P = 0.14) at 24 months. Treatment with auto-CD34+ cells resulted in clinically meaningful, durable improvements in TET and AF at 3, 6, and 12 months, as well as a reduction in 24-month mortality in this patient-level meta-analysis. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author(s) 2018. For permissions, please email: journals.permissions@oup.com.

  2. Effect of chromium doping on the structural and vibrational properties of Mn-Zn ferrites

    NASA Astrophysics Data System (ADS)

    Saleem, M.; Varshney, Dinesh

    2018-05-01

    The synthesis of Mn0.5Zn0.5-xCrxFe2O4 (x = 0.0, 0.1, 0.2 and 0.5) via the sol-gel auto-combustion technique is reported. X-ray diffraction analysis revealed a cubic spinel structure for all the prepared spinel ferrite samples, with the space group Fd3m. The structural studies show that both the lattice parameter and the crystallite size decrease with increasing Cr concentration. The Raman spectrum reveals five active phonon modes at room temperature, with the modes shifting toward higher frequencies on moving from Mn-ZnFe2O4 to Mn-CrFe2O4.

  3. Portable Cooler/Warmers

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Early in the space program, NASA recognized the need to replace bulky coils, compressors, and motors for refrigeration purposes by looking at existing thermoelectric technology. This effort resulted in the development of miniaturized thermoelectric components and packaging to accommodate the tight confines of spacecraft. Koolatron's portable electronic refrigerators incorporate this NASA technology. Each of the cooler/warmers employs one or two miniaturized thermoelectric modules. Although each module is only the size of a book of matches, it delivers the cooling power of a 10-pound block of ice. In some models, the cooler can be converted to a warmer. There are no moving parts. The Koolatrons can be plugged into auto cigarette lighters, recreational vehicles, boats or motel outlets.

  4. Minimum Alcohol Prices and Outlet Densities in British Columbia, Canada: Estimated Impacts on Alcohol-Attributable Hospital Admissions

    PubMed Central

    Zhao, Jinhui; Martin, Gina; Macdonald, Scott; Vallance, Kate; Treno, Andrew; Ponicki, William; Tu, Andrew; Buxton, Jane

    2013-01-01

    Objectives. We investigated whether periodic increases in minimum alcohol prices were associated with reduced alcohol-attributable hospital admissions in British Columbia. Methods. The longitudinal panel study (2002–2009) incorporated minimum alcohol prices, density of alcohol outlets, and age- and gender-standardized rates of acute, chronic, and 100% alcohol-attributable admissions. We applied mixed-method regression models to data from 89 geographic areas of British Columbia across 32 time periods, adjusting for spatial and temporal autocorrelation, moving average effects, season, and a range of economic and social variables. Results. A 10% increase in the average minimum price of all alcoholic beverages was associated with an 8.95% decrease in acute alcohol-attributable admissions and a 9.22% reduction in chronic alcohol-attributable admissions 2 years later. A Can$ 0.10 increase in average minimum price would prevent 166 acute admissions in the 1st year and 275 chronic admissions 2 years later. We also estimated significant, though smaller, adverse impacts of increased private liquor store density on hospital admission rates for all types of alcohol-attributable admissions. Conclusions. Significant health benefits were observed when minimum alcohol prices in British Columbia were increased. By contrast, adverse health outcomes were associated with an expansion of private liquor stores. PMID:23597383

  5. STOCK MARKET CRASH AND EXPECTATIONS OF AMERICAN HOUSEHOLDS*

    PubMed Central

    HUDOMIET, PÉTER; KÉZDI, GÁBOR; WILLIS, ROBERT J.

    2011-01-01

    SUMMARY This paper utilizes data on subjective probabilities to study the impact of the stock market crash of 2008 on households’ expectations about the returns on the stock market index. We use data from the Health and Retirement Study that was fielded in February 2008 through February 2009. The effect of the crash is identified from the date of the interview, which is shown to be exogenous to previous stock market expectations. We estimate the effect of the crash on the population average of expected returns, the population average of the uncertainty about returns (subjective standard deviation), and the cross-sectional heterogeneity in expected returns (disagreement). We show estimates from simple reduced-form regressions on probability answers as well as from a more structural model that focuses on the parameters of interest and separates survey noise from relevant heterogeneity. We find a temporary increase in the population average of expectations and uncertainty right after the crash. The effect on cross-sectional heterogeneity is more significant and longer lasting, which implies substantial long-term increase in disagreement. The increase in disagreement is larger among the stockholders, the more informed, and those with higher cognitive capacity, and disagreement co-moves with trading volume and volatility in the market. PMID:21547244

  6. Baseline repeated measures from controlled human exposure studies: associations between ambient air pollution exposure and the systemic inflammatory biomarkers IL-6 and fibrinogen.

    PubMed

    Thompson, Aaron M S; Zanobetti, Antonella; Silverman, Frances; Schwartz, Joel; Coull, Brent; Urch, Bruce; Speck, Mary; Brook, Jeffrey R; Manno, Michael; Gold, Diane R

    2010-01-01

    Systemic inflammation may be one of the mechanisms mediating the association between ambient air pollution and cardiovascular morbidity and mortality. Interleukin-6 (IL-6) and fibrinogen are biomarkers of systemic inflammation that are independent risk factors for cardio-vascular disease. We investigated the association between ambient air pollution and systemic inflammation using baseline measurements of IL-6 and fibrinogen from controlled human exposure studies. In this retrospective analysis we used repeated-measures data in 45 nonsmoking subjects. Hourly and daily moving averages were calculated for ozone, nitrogen dioxide, sulfur dioxide, and particulate matter

  7. Ambient temperature and biomarkers of heart failure: a repeated measures analysis.

    PubMed

    Wilker, Elissa H; Yeh, Gloria; Wellenius, Gregory A; Davis, Roger B; Phillips, Russell S; Mittleman, Murray A

    2012-08-01

    Extreme temperatures have been associated with hospitalization and death among individuals with heart failure, but few studies have explored the underlying mechanisms. We hypothesized that outdoor temperature in the Boston, Massachusetts, area (1- to 4-day moving averages) would be associated with higher levels of biomarkers of inflammation and myocyte injury in a repeated-measures study of individuals with stable heart failure. We analyzed data from a completed clinical trial that randomized 100 patients to 12 weeks of tai chi classes or to time-matched education control. B-type natriuretic peptide (BNP), C-reactive protein (CRP), and tumor necrosis factor (TNF) were measured at baseline, 6 weeks, and 12 weeks. Endothelin-1 was measured at baseline and 12 weeks. We used fixed effects models to evaluate associations with measures of temperature that were adjusted for time-varying covariates. Higher apparent temperature was associated with higher levels of BNP beginning with 2-day moving averages and reached statistical significance for 3- and 4-day moving averages. CRP results followed a similar pattern but were delayed by 1 day. A 5°C change in 3- and 4-day moving averages of apparent temperature was associated with 11.3% (95% confidence interval (CI): 1.1, 22.5; p = 0.03) and 11.4% (95% CI: 1.2, 22.5; p = 0.03) higher BNP. A 5°C change in the 4-day moving average of apparent temperature was associated with 21.6% (95% CI: 2.5, 44.2; p = 0.03) higher CRP. No clear associations with TNF or endothelin-1 were observed. Among patients undergoing treatment for heart failure, we observed positive associations between temperature and both BNP and CRP, which are predictors of heart failure prognosis and severity.
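
    The lagged moving-average exposures used in such analyses are straightforward to construct; a pandas sketch on a hypothetical daily apparent-temperature series (the data and column names are illustrative):

        import numpy as np
        import pandas as pd

        # Hypothetical daily apparent-temperature series for a one-year study window.
        rng = np.random.default_rng(2)
        days = pd.date_range("2005-01-01", periods=365, freq="D")
        apparent_temp = pd.Series(10 + 12 * np.sin(2 * np.pi * days.dayofyear / 365)
                                  + rng.normal(0, 3, len(days)), index=days)

        # 1- to 4-day moving averages ending on each day (the exposure windows in the abstract).
        exposures = pd.DataFrame({
            f"ma_{k}d": apparent_temp.rolling(window=k).mean() for k in range(1, 5)
        })
        # e.g. exposures["ma_3d"] would then be merged with biomarker visits by date
        # and entered into a fixed-effects model per 5 degree-Celsius increment.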

  8. Bus-based park-and-ride system: a stochastic model on multimodal network with congestion pricing schemes

    NASA Astrophysics Data System (ADS)

    Liu, Zhiyuan; Meng, Qiang

    2014-05-01

    This paper focuses on modelling the network flow equilibrium problem on a multimodal transport network with bus-based park-and-ride (P&R) system and congestion pricing charges. The multimodal network has three travel modes: auto mode, transit mode and P&R mode. A continuously distributed value-of-time is assumed to convert toll charges and transit fares to time unit, and the users' route choice behaviour is assumed to follow the probit-based stochastic user equilibrium principle with elastic demand. These two assumptions have caused randomness to the users' generalised travel times on the multimodal network. A comprehensive network framework is first defined for the flow equilibrium problem with consideration of interactions between auto flows and transit (bus) flows. Then, a fixed-point model with unique solution is proposed for the equilibrium flows, which can be solved by a convergent cost averaging method. Finally, the proposed methodology is tested by a network example.

  9. SM91: Observations of interchange between acceleration and thermalization processes in auroral electrons

    NASA Technical Reports Server (NTRS)

    Pongratz, M.

    1972-01-01

    Results from a Nike-Tomahawk sounding rocket flight launched from Fort Churchill are presented. The rocket was launched into a breakup aurora at magnetic local midnight on 21 March 1968. The rocket was instrumented to measure electrons with an electrostatic analyzer electron spectrometer which made 29 measurements in the energy interval 0.5 keV to 30 keV. Complete energy spectra were obtained at a rate of 10/sec. Pitch angle information is presented via three computed averages per rocket spin. The dumped electron average corresponds to averages over electrons moving nearly parallel to the B vector. The mirroring electron average corresponds to averages over electrons moving nearly perpendicular to the B vector. The average was also computed over the entire downward hemisphere (the precipitated electron average). The observations were obtained over an altitude range of 10 km near 230 km.

  10. Kumaraswamy autoregressive moving average models for double bounded environmental data

    NASA Astrophysics Data System (ADS)

    Bayer, Fábio Mariano; Bayer, Débora Missio; Pumi, Guilherme

    2017-12-01

    In this paper we introduce the Kumaraswamy autoregressive moving average models (KARMA), which is a dynamic class of models for time series taking values in the double bounded interval (a,b) following the Kumaraswamy distribution. The Kumaraswamy family of distributions is widely applied in many areas, especially hydrology and related fields. Classical examples are time series representing rates and proportions observed over time. In the proposed KARMA model, the median is modeled by a dynamic structure containing autoregressive and moving average terms, time-varying regressors, unknown parameters and a link function. We introduce the new class of models and discuss conditional maximum likelihood estimation, hypothesis testing inference, diagnostic analysis and forecasting. In particular, we provide closed-form expressions for the conditional score vector and conditional Fisher information matrix. An application to real environmental data is presented and discussed.
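
    A rough simulation sketch of the core mechanism on the unit interval: the median of a Kumaraswamy response is driven through a logit link by an autoregressive term. This is a simplified KAR(1)-type recursion; the full KARMA specification also includes moving-average terms, covariates, and likelihood-based estimation:

        import numpy as np

        def kumaraswamy_sample(median, a, rng):
            """Draw from a Kumaraswamy(a, b) variable on (0, 1), with b chosen so the
            distribution has the requested median: b = ln(0.5) / ln(1 - median**a)."""
            b = np.log(0.5) / np.log(1.0 - median ** a)
            u = rng.uniform()
            return (1.0 - (1.0 - u) ** (1.0 / b)) ** (1.0 / a)   # inverse-CDF sampling

        def simulate_kar1(n=300, alpha=-0.2, phi=0.6, a=2.0, seed=3):
            """Simulate a KAR(1)-type series: logit(median_t) = alpha + phi * logit(y_{t-1})."""
            rng = np.random.default_rng(seed)
            logit = lambda p: np.log(p / (1 - p))
            inv_logit = lambda x: 1 / (1 + np.exp(-x))
            y = np.empty(n)
            y[0] = 0.5
            for t in range(1, n):
                median_t = inv_logit(alpha + phi * logit(y[t - 1]))
                y[t] = kumaraswamy_sample(median_t, a, rng)
            return y

        series = simulate_kar1()   # values stay inside (0, 1), e.g. rates or proportions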

  11. A comparison of four streamflow record extension techniques

    USGS Publications Warehouse

    Hirsch, Robert M.

    1982-01-01

    One approach to developing time series of streamflow, which may be used for simulation and optimization studies of water resources development activities, is to extend an existing gage record in time by exploiting the interstation correlation between the station of interest and some nearby (long-term) base station. Four methods of extension are described, and their properties are explored. The methods are regression (REG), regression plus noise (RPN), and two new methods, maintenance of variance extension types 1 and 2 (MOVE.1, MOVE.2). MOVE.1 is equivalent to a method which is widely used in psychology, biometrics, and geomorphology and which has been called by various names, e.g., 'line of organic correlation,' 'reduced major axis,' 'unique solution,' and 'equivalence line.' The methods are examined for bias and standard error of estimate of moments and order statistics, and an empirical examination is made of the preservation of historic low-flow characteristics using 50-year-long monthly records from seven streams. The REG and RPN methods are shown to have serious deficiencies as record extension techniques. MOVE.2 is shown to be marginally better than MOVE.1, according to the various comparisons of bias and accuracy.

  12. A Comparison of Four Streamflow Record Extension Techniques

    NASA Astrophysics Data System (ADS)

    Hirsch, Robert M.

    1982-08-01

    One approach to developing time series of streamflow, which may be used for simulation and optimization studies of water resources development activities, is to extend an existing gage record in time by exploiting the interstation correlation between the station of interest and some nearby (long-term) base station. Four methods of extension are described, and their properties are explored. The methods are regression (REG), regression plus noise (RPN), and two new methods, maintenance of variance extension types 1 and 2 (MOVE.1, MOVE.2). MOVE.1 is equivalent to a method which is widely used in psychology, biometrics, and geomorphology and which has been called by various names, e.g., 'line of organic correlation,' 'reduced major axis,' 'unique solution,' and 'equivalence line.' The methods are examined for bias and standard error of estimate of moments and order statistics, and an empirical examination is made of the preservation of historic low-flow characteristics using 50-year-long monthly records from seven streams. The REG and RPN methods are shown to have serious deficiencies as record extension techniques. MOVE.2 is shown to be marginally better than MOVE.1, according to the various comparisons of bias and accuracy.
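
    For reference, the MOVE.1 (line of organic correlation) extension described above can be written in a few lines; a sketch assuming both records have been log-transformed before fitting, which is a common but not universal choice:

        import numpy as np

        def move1_fit(short_record, base_record):
            """MOVE.1: slope = sign(r) * s_y / s_x, intercept = ybar - slope * xbar,
            estimated from the concurrent period of the short (y) and base (x) stations,
            so the variance of the observed record is preserved in the extension."""
            x, y = np.asarray(base_record, float), np.asarray(short_record, float)
            r = np.corrcoef(x, y)[0, 1]
            slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
            intercept = y.mean() - slope * x.mean()
            return slope, intercept

        # Concurrent monthly flows (log-transformed) at the short-record and base stations...
        rng = np.random.default_rng(4)
        base_concurrent = rng.normal(4.0, 0.8, 120)
        short_concurrent = 0.9 * base_concurrent + rng.normal(0.3, 0.3, 120)
        m, c = move1_fit(short_concurrent, base_concurrent)

        # ...then extend the short record through the base station's longer history.
        base_extra_years = rng.normal(4.0, 0.8, 240)
        extended = m * base_extra_years + c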

  13. Time-series panel analysis (TSPA): multivariate modeling of temporal associations in psychotherapy process.

    PubMed

    Ramseyer, Fabian; Kupper, Zeno; Caspar, Franz; Znoj, Hansjörg; Tschacher, Wolfgang

    2014-10-01

    Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA-models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a 2nd step, the associations between mechanisms of change (TSPA) and pre- to postsymptom change were explored. TSPA allowed a prototypical process pattern to be identified, where patient's alliance and self-efficacy were linked by a temporal feedback-loop. Furthermore, therapist's stability over time in both mastery and clarification interventions was positively associated with better outcomes. TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy. PsycINFO Database Record (c) 2014 APA, all rights reserved.
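
    A minimal sketch of the individual-level building block that TSPA aggregates: a lag-1 VAR fitted to one hypothetical patient's post-session ratings with statsmodels (the variable names and the simulated feedback loop are illustrative only):

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        # Hypothetical post-session ratings for one patient across 40 sessions.
        rng = np.random.default_rng(5)
        n = 40
        alliance = np.zeros(n)
        self_efficacy = np.zeros(n)
        for t in range(1, n):   # build in a small cross-lagged feedback loop
            alliance[t] = 0.5 * alliance[t - 1] + 0.3 * self_efficacy[t - 1] + rng.normal(0, 0.5)
            self_efficacy[t] = 0.4 * self_efficacy[t - 1] + 0.2 * alliance[t - 1] + rng.normal(0, 0.5)

        data = pd.DataFrame({"alliance": alliance, "self_efficacy": self_efficacy})
        result = VAR(data).fit(maxlags=1)   # session-to-session (lag-1) dynamics
        print(result.params)                # cross-lagged coefficients for this individual
        # In TSPA these individual coefficient matrices are then aggregated across patients
        # to look for prototypical patterns of change.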

  14. Computed tomography synthesis from magnetic resonance images in the pelvis using multiple random forests and auto-context features

    NASA Astrophysics Data System (ADS)

    Andreasen, Daniel; Edmund, Jens M.; Zografos, Vasileios; Menze, Bjoern H.; Van Leemput, Koen

    2016-03-01

    In radiotherapy treatment planning that is only based on magnetic resonance imaging (MRI), the electron density information usually obtained from computed tomography (CT) must be derived from the MRI by synthesizing a so-called pseudo CT (pCT). This is a non-trivial task since MRI intensities are neither uniquely nor quantitatively related to electron density. Typical approaches involve either a classification or regression model requiring specialized MRI sequences to solve intensity ambiguities, or an atlas-based model necessitating multiple registrations between atlases and subject scans. In this work, we explore a machine learning approach for creating a pCT of the pelvic region from conventional MRI sequences without using atlases. We use a random forest provided with information about local texture, edges and spatial features derived from the MRI. This helps to solve intensity ambiguities. Furthermore, we use the concept of auto-context by sequentially training a number of classification forests to create and improve context features, which are finally used to train a regression forest for pCT prediction. We evaluate the pCT quality in terms of the voxel-wise error and the radiologic accuracy as measured by water-equivalent path lengths. We compare the performance of our method against two baseline pCT strategies, which either set all MRI voxels in the subject equal to the CT value of water, or in addition transfer the bone volume from the real CT. We show an improved performance compared to both baseline pCTs suggesting that our method may be useful for MRI-only radiotherapy.
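
    A schematic of the auto-context idea with scikit-learn, assuming per-voxel MRI features have already been extracted into a matrix; the feature set, number of stages, and forest sizes are placeholders, and in practice the context features are generated out-of-fold to avoid overfitting:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

        rng = np.random.default_rng(6)
        n_voxels, n_feats = 5000, 10
        X = rng.normal(size=(n_voxels, n_feats))           # local texture/edge/spatial MRI features
        tissue = rng.integers(0, 3, n_voxels)              # hypothetical labels: air, soft tissue, bone
        ct_value = rng.normal(0, 200, n_voxels) + 600 * (tissue == 2)   # hypothetical CT numbers

        features = X
        for stage in range(2):                             # sequential classification forests
            clf = RandomForestClassifier(n_estimators=50, random_state=stage).fit(features, tissue)
            context = clf.predict_proba(features)          # class posteriors become context features
            features = np.hstack([X, context])             # augment the original features

        reg = RandomForestRegressor(n_estimators=100, random_state=0).fit(features, ct_value)
        pseudo_ct = reg.predict(features)                  # voxel-wise pCT prediction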

  15. Automotive Mechanics as Applied to Auto Body; Auto Body Repair and Refinishing 3: 9037.02.

    ERIC Educational Resources Information Center

    Dade County Public Schools, Miami, FL.

    This is a course in which the student will receive the general information, technical knowledge, basic skills, attitudes, and values required for job entry level as an auto body repair helper. Course content includes general and specific goals, orientation, instruction in service tools and bench skills, and auto mechanics as applied to auto body.…

  16. Neural net forecasting for geomagnetic activity

    NASA Technical Reports Server (NTRS)

    Hernandez, J. V.; Tajima, T.; Horton, W.

    1993-01-01

    We use neural nets to construct nonlinear models to forecast the AL index given solar wind and interplanetary magnetic field (IMF) data. We follow two approaches: (1) the state space reconstruction approach, which is a nonlinear generalization of autoregressive-moving average models (ARMA) and (2) the nonlinear filter approach, which reduces to a moving average model (MA) in the linear limit. The database used here is that of Bargatze et al. (1985).
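
    In the linear limit the second approach reduces to a moving average model and the first to an ARMA-type model; a hedged sketch of fitting such a linear baseline with statsmodels, using synthetic data in place of the AL index and solar-wind drivers:

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        # Synthetic stand-in for an hourly AL-index series with ARMA(2, 1) dynamics.
        rng = np.random.default_rng(7)
        n = 500
        al = np.zeros(n)
        e = rng.normal(0, 40, n)
        for t in range(2, n):
            al[t] = 0.7 * al[t - 1] - 0.2 * al[t - 2] + e[t] + 0.3 * e[t - 1]

        results = ARIMA(al, order=(2, 0, 1)).fit()   # ARMA(p, q) is ARIMA with d = 0
        forecast = results.forecast(steps=24)        # 24-step-ahead linear prediction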

  17. The performance of two automatic servo-ventilation devices in the treatment of central sleep apnea.

    PubMed

    Javaheri, Shahrokh; Goetting, Mark G; Khayat, Rami; Wylie, Paul E; Goodwin, James L; Parthasarathy, Sairam

    2011-12-01

    This study was conducted to evaluate the therapeutic performance of a new auto Servo Ventilation device (Philips Respironics autoSV Advanced) for the treatment of complex central sleep apnea (CompSA). The features of autoSV Advanced include an automatic expiratory pressure (EPAP) adjustment, an advanced algorithm for distinguishing open versus obstructed airway apnea, a modified auto backup rate which is proportional to subject's baseline breathing rate, and a variable inspiratory support. Our primary aim was to compare the performance of the advanced servo-ventilator (BiPAP autoSV Advanced) with conventional servo-ventilator (BiPAP autoSV) in treating central sleep apnea (CSA). A prospective, multicenter, randomized, controlled trial. Five sleep laboratories in the United States. Thirty-seven participants were included. All subjects had full night polysomnography (PSG) followed by a second night continuous positive airway pressure (CPAP) titration. All had a central apnea index ≥ 5 per hour of sleep on CPAP. Subjects were randomly assigned to 2 full-night PSGs while treated with either the previously marketed autoSV, or the new autoSV Advanced device. The 2 randomized sleep studies were blindly scored centrally. Across the 4 nights (PSG, CPAP, autoSV, and autoSV Advanced), the mean ± 1 SD apnea hypopnea indices were 53 ± 23, 35 ± 20, 10 ± 10, and 6 ± 6, respectively; indices for CSA were 16 ± 19, 19 ± 18, 3 ± 4, and 0.6 ± 1. AutoSV Advanced was more effective than other modes in correcting sleep related breathing disorders. BiPAP autoSV Advanced was more effective than conventional BiPAP autoSV in the treatment of sleep disordered breathing in patients with CSA.

  18. Queues with Choice via Delay Differential Equations

    NASA Astrophysics Data System (ADS)

    Pender, Jamol; Rand, Richard H.; Wesson, Elizabeth

    Delay or queue length information has the potential to influence the decision of a customer to join a queue. Thus, it is imperative for managers of queueing systems to understand how the information that they provide will affect the performance of the system. To this end, we construct and analyze two two-dimensional deterministic fluid models that incorporate customer choice behavior based on delayed queue length information. In the first fluid model, customers join each queue according to a Multinomial Logit Model, however, the queue length information the customer receives is delayed by a constant Δ. We show that the delay can cause oscillations or asynchronous behavior in the model based on the value of Δ. In the second model, customers receive information about the queue length through a moving average of the queue length. Although it has been shown empirically that giving patients moving average information causes oscillations and asynchronous behavior to occur in U.S. hospitals, we analytically and mathematically show for the first time that the moving average fluid model can exhibit oscillations and determine their dependence on the moving average window. Thus, our analysis provides new insight on how operators of service systems should report queue length information to customers and how delayed information can produce unwanted system dynamics.
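
    A rough Euler-integration sketch of a two-queue fluid model with delayed multinomial-logit choice; this is an illustrative form of the constant-delay case, not necessarily the paper's exact equations or parameter values:

        import numpy as np

        def simulate_delayed_queues(delta=2.0, lam=10.0, mu=1.0, theta=1.0,
                                    t_end=200.0, dt=0.01):
            """Euler scheme for dq_i/dt = lam * p_i(t) - mu * q_i(t), where the join
            probabilities p_i use queue lengths observed delta time units ago."""
            steps = int(t_end / dt)
            lag = int(delta / dt)
            q = np.zeros((steps, 2))
            q[0] = [5.0, 4.0]                      # slightly asymmetric start
            for k in range(steps - 1):
                q_delayed = q[max(0, k - lag)]     # constant history before t = delta
                w = np.exp(-theta * q_delayed)
                p = w / w.sum()                    # multinomial-logit split of arrivals
                q[k + 1] = q[k] + dt * (lam * p - mu * q[k])
            return q

        trajectory = simulate_delayed_queues()
        # For small delta the two queues settle to a common level; past a critical delay
        # they can begin to oscillate out of phase, as described in the abstract.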

  19. Modeling and roles of meteorological factors in outbreaks of highly pathogenic avian influenza H5N1.

    PubMed

    Biswas, Paritosh K; Islam, Md Zohorul; Debnath, Nitish C; Yamage, Mat

    2014-01-01

    The highly pathogenic avian influenza A virus subtype H5N1 (HPAI H5N1) is a deadly zoonotic pathogen. Its persistence in poultry in several countries is a potential threat: a mutant or genetically reassorted progenitor might cause a human pandemic. Its world-wide eradication from poultry is important to protect public health. The global trend of outbreaks of influenza attributable to HPAI H5N1 shows a clear seasonality. Meteorological factors might be associated with such trend but have not been studied. For the first time, we analyze the role of meteorological factors in the occurrences of HPAI outbreaks in Bangladesh. We employed autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA) to assess the roles of different meteorological factors in outbreaks of HPAI. Outbreaks were modeled best when multiplicative seasonality was incorporated. Incorporation of any meteorological variable(s) as inputs did not improve the performance of any multivariable models, but relative humidity (RH) was a significant covariate in several ARIMA and SARIMA models with different autoregressive and moving average orders. The variable cloud cover was also a significant covariate in two SARIMA models, but air temperature along with RH might be a predictor when moving average (MA) order at lag 1 month is considered.
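
    A sketch of a seasonal ARIMA with a meteorological covariate using statsmodels SARIMAX; the monthly counts, the relative-humidity series, and the (p, d, q)(P, D, Q) orders below are placeholders rather than the fitted models from the study:

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        # Hypothetical monthly outbreak counts and relative-humidity series.
        rng = np.random.default_rng(8)
        months = pd.date_range("2007-01-01", periods=84, freq="MS")
        rh = pd.Series(70 + 15 * np.sin(2 * np.pi * months.month / 12) + rng.normal(0, 3, 84),
                       index=months)
        outbreaks = pd.Series(np.maximum(0, 5 + 4 * np.sin(2 * np.pi * (months.month - 1) / 12)
                                         + 0.1 * (rh.values - 70) + rng.normal(0, 1.5, 84)).round(),
                              index=months)

        results = SARIMAX(outbreaks, exog=rh,
                          order=(1, 0, 1), seasonal_order=(1, 0, 0, 12)).fit(disp=False)
        print(results.summary())   # the exog coefficient plays the role of RH as a covariate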

  20. On the Period-Amplitude and Amplitude-Period Relationships

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.; Hathaway, David H.

    2008-01-01

    Examined are Period-Amplitude and Amplitude-Period relationships based on the cyclic behavior of the 12-month moving averages of monthly mean sunspot numbers for cycles 0-23, both in terms of Fisher's exact tests for 2x2 contingency tables and linear regression analyses. Concerning the Period-Amplitude relationship (same cycle), because cycle 23's maximum amplitude is known to be 120.8, the inferred regressions (90-percent prediction intervals) suggest that its period will be 131 +/- 24 months (using all cycles) or 131 +/- 18 months (ignoring cycles 2 and 4, which have the extremes of period, 108 and 164 months, respectively). Because cycle 23 has already persisted for 142 months (May 1996 through February 2008), based on the latter prediction, it should end before September 2008. Concerning the Amplitude-Period relationship (following cycle maximum amplitude versus preceding cycle period), because cycle 23's period is known to be at least 142 months, the inferred regressions (90-percent prediction intervals) suggest that cycle 24's maximum amplitude will be about less than or equal to 96.1 +/- 55.0 (using all cycle pairs) or less than or equal to 91.0 +/- 36.7 (ignoring statistical outlier cycle pairs). Hence, cycle 24's maximum amplitude is expected to be less than 151, perhaps even less than 128, unless cycle pair 23/24 proves to be a statistical outlier.
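
    Both ingredients, the 12-month moving average of monthly mean sunspot numbers and an ordinary regression with a 90-percent prediction interval, are easy to reproduce; a sketch on synthetic numbers (the actual cycle table is not reproduced here):

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # 12-month moving average used to define cycle maxima and periods.
        rng = np.random.default_rng(9)
        monthly_ssn = pd.Series(80 + 70 * np.sin(2 * np.pi * np.arange(480) / 132)
                                + rng.normal(0, 15, 480))
        smoothed = monthly_ssn.rolling(window=12, center=True).mean()

        # Amplitude-Period regression: next-cycle maximum amplitude vs preceding-cycle period.
        period = np.array([135, 125, 108, 150, 164, 122, 140, 126, 119, 130])        # hypothetical periods (months)
        next_amplitude = np.array([110, 140, 160, 95, 80, 150, 105, 130, 145, 120])  # hypothetical maxima
        fit = sm.OLS(next_amplitude, sm.add_constant(period)).fit()
        pred = fit.get_prediction(np.array([[1.0, 142.0]]))        # design row [const, period]
        print(pred.summary_frame(alpha=0.10))                      # 90% prediction interval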

  1. Application and evaluation of forecasting methods for municipal solid waste generation in an Eastern-European city.

    PubMed

    Rimaityte, Ingrida; Ruzgas, Tomas; Denafas, Gintaras; Racys, Viktoras; Martuzevicius, Dainius

    2012-01-01

    Forecasting of generation of municipal solid waste (MSW) in developing countries is often a challenging task due to the lack of data and the difficulty of selecting a suitable forecasting method. This article aimed to select and evaluate several methods for MSW forecasting in a medium-scaled Eastern European city (Kaunas, Lithuania) with a rapidly developing economy, with respect to affluence-related and seasonal impacts. The MSW generation was forecast with respect to the economic activity of the city (regression modelling) and using time series analysis. The modelling based on social-economic indicators (regression implemented in the LCA-IWM model) showed particular sensitivity (deviation from actual data in the range from 2.2 to 20.6%) to external factors, such as the synergetic effects of affluence parameters or changes in the MSW collection system. For the time series analysis, the combination of autoregressive integrated moving average (ARIMA) and seasonal exponential smoothing (SES) techniques was found to be the most accurate (mean absolute percentage error equal to 6.5). The time series analysis method was very valuable for forecasting the weekly variation of waste generation data (r^2 > 0.87), but the forecast yearly increase should be verified against the data obtained by regression modelling. The methods and findings of this study may assist the experts, decision-makers and scientists performing forecasts of MSW generation, especially in developing countries.

  2. Peripherally Inserted Central Catheter-Related Infections in a Cohort of Hospitalized Adult Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bouzad, Caroline, E-mail: caroline.bouzad@gmail.com; Duron, Sandrine, E-mail: duronsandrine@yahoo.fr; Bousquet, Aurore, E-mail: aurorebousquet@yahoo.fr

    Purpose: To determine the incidence and the risk factors of peripherally inserted central catheter (PICC)-related infectious complications. Materials and Methods: Medical charts of every in-patient that underwent a PICC insertion in our hospital between January 2010 and October 2013 were reviewed. All PICC-related infections were recorded and categorized as catheter-related bloodstream infections (CR-BSI), exit-site infections, and septic thrombophlebitis. Results: Nine hundred and twenty-three PICCs were placed in 644 unique patients, mostly male (68.3 %) with a median age of 58 years. 31 (3.4 %) PICC-related infections occurred during the study period, corresponding to an infection rate of 1.64 per 1000 catheter-days. We observed 27 (87.1 %) CR-BSI, corresponding to a rate of 1.43 per 1000 catheter-days, 3 (9.7 %) septic thrombophlebitis, and 1 (3.2 %) exit-site infection. Multivariate logistic regression analysis showed a higher PICC-related infection rate with chemotherapy (odds ratio (OR) 7.2, 95 % confidence interval (CI) [1.77-29.5]), auto/allograft (OR 5.9, 95 % CI [1.2-29.2]), and anti-coagulant therapy (OR 2.2, 95 % CI [1.4-12]). Conclusion: Chemotherapy, auto/allograft, and anti-coagulant therapy are associated with an increased risk of developing PICC-related infections. Clinical Advance: Chemotherapy, auto/allograft, and anti-coagulant therapy are important predictors of PICC-associated infections. A careful assessment of these risk factors may be important for future success in preventing PICC-related infections.

  3. On-road PM2.5 pollution exposure in multiple transport microenvironments in Delhi

    NASA Astrophysics Data System (ADS)

    Goel, Rahul; Gani, Shahzad; Guttikunda, Sarath K.; Wilson, Daniel; Tiwari, Geetam

    2015-12-01

    PM2.5 pollution in Delhi averaged 150 μg/m3 from 2012 through 2014, which is 15 times higher than the World Health Organization's annual-average guideline. For this setting, we present on-road exposure of PM2.5 concentrations for 11 transport microenvironments along a fixed 8.3-km arterial route, during morning rush hour. The data collection was carried out using a portable TSI DustTrak DRX 8433 aerosol monitor, between January and May (2014). The monthly-average measured ambient concentrations varied from 130 μg/m3 to 250 μg/m3. The on-road PM2.5 concentrations exceeded the ambient measurements by an average of 40% for walking, 10% for cycle, 30% for motorised two wheeler (2W), 30% for open-windowed (OW) car, 30% for auto rickshaw, 20% for air-conditioned as well as for OW bus, 20% for bus stop, and 30% for underground metro station. On the other hand, concentrations were lower by 50% inside air-conditioned (AC) car and 20% inside the metro rail carriage. We find that the percent exceedance for open modes (cycle, auto rickshaw, 2W, OW car, and OW bus) reduces non-linearly with increasing ambient concentration. The reduction is steeper at concentrations lower than 150 μg/m3 than at higher concentrations. After accounting for air inhalation rate and speed of travel, PM2.5 mass uptake per kilometer during cycling is 9 times of AC car, the mode with the lowest exposure. At current level of concentrations, an hour of cycling in Delhi during morning rush-hour period results in PM2.5 dose which is 40% higher than an entire-day dose in cities like Tokyo, London, and New York, where ambient concentrations range from 10 to 20 μg/m3.
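
    The per-kilometre mass-uptake comparison is simple arithmetic once concentration, inhalation rate, and travel speed are fixed; a sketch using the exceedance factors quoted above but with illustrative, assumed inhalation rates and speeds:

        # Per-kilometre PM2.5 dose = on-road concentration * inhalation rate / travel speed.
        # Inhalation rates (m3/h) and speeds (km/h) below are illustrative assumptions,
        # not the study's values.
        ambient = 200.0                      # ug/m3, within the measured monthly range

        modes = {
            #          exceedance vs ambient, inhalation m3/h, speed km/h
            "cycle":   (1.10, 2.0, 14.0),
            "ac_car":  (0.50, 0.6, 20.0),
            "walk":    (1.40, 1.4,  5.0),
        }

        for mode, (factor, vent, speed) in modes.items():
            conc = factor * ambient                   # on-road concentration, ug/m3
            dose_per_km = conc * vent / speed         # ug of PM2.5 inhaled per km travelled
            print(f"{mode:7s}  {dose_per_km:6.1f} ug/km")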

  4. What do UK doctors in training value in a post? A discrete choice experiment.

    PubMed

    Cleland, Jennifer; Johnston, Peter; Watson, Verity; Krucien, Nicolas; Skåtun, Diane

    2016-02-01

    Many individual and job-related factors are known to influence medical careers decision making. Medical trainees' (residents) views of which characteristics of a training post are important to them have been extensively studied but how they trade-off these characteristics is under-researched. Such information is crucial for the development of effective policies to enhance recruitment and retention. Our aim was to investigate the strength of UK foundation doctors' and trainees' preferences for training post characteristics in terms of monetary value. We used an online questionnaire study incorporating a discrete choice experiment (DCE), distributed to foundation programme doctors and doctors in training across all specialty groups within three UK regions, in August-October 2013. The main outcome measures were monetary values for training-post characteristics, based on willingness to forgo and willingness to accept extra income for a change in each job characteristic, calculated from regression coefficients. The questionnaire was answered by 1323 trainees. Good working conditions were the most influential characteristics of a training position. Trainee doctors would need to be compensated by an additional 49.8% above the average earnings within their specialty to move from a post with good working conditions to one with poor working conditions. A training post with limited rather than good opportunities for one's spouse or partner would require compensation of 38.4% above the average earnings within their specialty. Trainees would require compensation of 30.8% above the average earnings within their specialty to move from a desirable to a less desirable locality. These preferences varied only to a limited extent according to individual characteristics. Trainees place most value on good working conditions, good opportunities for their partners and desirable geographical location when making career-related decisions. This intelligence can be used to develop alternative models of workforce planning or to develop information about job opportunities that address trainees' values. © 2016 John Wiley & Sons Ltd.

  5. MARD—A moving average rose diagram application for the geosciences

    NASA Astrophysics Data System (ADS)

    Munro, Mark A.; Blenkinsop, Thomas G.

    2012-12-01

    MARD 1.0 is a computer program for generating smoothed rose diagrams by using a moving average, which is designed for use across the wide range of disciplines encompassed within the Earth Sciences. Available in MATLAB®, Microsoft® Excel and GNU Octave formats, the program is fully compatible with both Microsoft® Windows and Macintosh operating systems. Each version has been implemented in a user-friendly way that requires no prior experience in programming with the software. MARD conducts a moving average smoothing, a form of signal processing low-pass filter, upon the raw circular data according to a set of pre-defined conditions selected by the user. This form of signal processing filter smoothes the angular dataset, emphasising significant circular trends whilst reducing background noise. Customisable parameters include whether the data is uni- or bi-directional, the angular range (or aperture) over which the data is averaged, and whether an unweighted or weighted moving average is to be applied. In addition to the uni- and bi-directional options, the MATLAB® and Octave versions also possess a function for plotting 2-dimensional dips/pitches in a single, lower, hemisphere. The rose diagrams from each version are exportable as one of a selection of common graphical formats. Frequently employed statistical measures that determine the vector mean, mean resultant (or length), circular standard deviation and circular variance are also included. MARD's scope is demonstrated via its application to a variety of datasets within the Earth Sciences.
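
    The core smoothing step is a circular (wrap-around) moving average over angular bins; a small numpy sketch of that step for uni-directional data binned at 10 degrees (MARD itself is distributed as MATLAB, Excel, and Octave code and also offers weighted averaging, bi-directional data, and plotting):

        import numpy as np

        def circular_moving_average(bin_counts, aperture_bins=3):
            """Unweighted moving average of rose-diagram bin counts with wrap-around.

            bin_counts    : frequencies per angular bin (e.g. 36 bins of 10 degrees)
            aperture_bins : half-width of the averaging window, in bins
            """
            counts = np.asarray(bin_counts, float)
            n = counts.size
            smoothed = np.empty(n)
            for i in range(n):
                idx = np.arange(i - aperture_bins, i + aperture_bins + 1) % n   # wrap past 0/360
                smoothed[i] = counts[idx].mean()
            return smoothed

        # Example: 200 strike directions with a preferred trend near 060 degrees.
        rng = np.random.default_rng(10)
        angles = np.concatenate([rng.normal(60, 15, 150), rng.uniform(0, 360, 50)]) % 360
        raw, _ = np.histogram(angles, bins=np.arange(0, 361, 10))
        smooth = circular_moving_average(raw, aperture_bins=2)   # 50-degree total aperture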

  6. Spatiotemporal variability of urban growth factors: A global and local perspective on the megacity of Mumbai

    NASA Astrophysics Data System (ADS)

    Shafizadeh-Moghadam, Hossein; Helbich, Marco

    2015-03-01

    The rapid growth of megacities requires special attention among urban planners worldwide, and particularly in Mumbai, India, where growth is very pronounced. To cope with the planning challenges this will bring, developing a retrospective understanding of urban land-use dynamics and the underlying driving-forces behind urban growth is a key prerequisite. This research uses regression-based land-use change models - and in particular non-spatial logistic regression models (LR) and auto-logistic regression models (ALR) - for the Mumbai region over the period 1973-2010, in order to determine the drivers behind spatiotemporal urban expansion. Both global models are complemented by a local, spatial model, the so-called geographically weighted logistic regression (GWLR) model, one that explicitly permits variations in driving-forces across space. The study comes to two main conclusions. First, both global models suggest similar driving-forces behind urban growth over time, revealing that LRs and ALRs result in estimated coefficients with comparable magnitudes. Second, all the local coefficients show distinctive temporal and spatial variations. It is therefore concluded that GWLR aids our understanding of urban growth processes, and so can assist context-related planning and policymaking activities when seeking to secure a sustainable urban future.

  7. The analysis of optical-electro collimated light tube measurement system

    NASA Astrophysics Data System (ADS)

    Li, Zhenhui; Jiang, Tao; Cao, Guohua; Wang, Yanfei

    2005-12-01

    A new type of collimated light tube (CLT) is presented in this paper, and its analysis and structure are described in detail. The reticle and discrimination board are replaced by an optical-electro graphics generator, a DLP (Digital Light Processor). The DLP produces arbitrary test graphics under computer control, and its emitting surface lies at the focus of the CLT. The rays of light pass through the CLT and the tested products, and the image of the target is received by a variable-focus objective CCD camera. The image can then be processed by computer to obtain basic optical parameters such as optical aberration and image slope. At the same time, a motorized translation stage carries the DLP to simulate a finite object distance, and a grating ruler records the displacement of the DLP. The key technique is optical-electro auto-focus: the best imaging quality is obtained by moving a 6-D motorized positioning stage. Several principal design issues are addressed by this device, for example target generation, the structure of the receiving system, and optical matching.

  8. Dynamic Singularity Spectrum Distribution of Sea Clutter

    NASA Astrophysics Data System (ADS)

    Xiong, Gang; Yu, Wenxian; Zhang, Shuning

    2015-12-01

    Fractal and multifractal theory have provided new approaches for radar signal processing and target detection against an ocean background. However, the related research has mainly focused on the fractal dimension or the multifractal spectrum (MFS) of sea clutter. In this paper, a new dynamic singularity analysis method for sea clutter using the MFS distribution is developed, based on moving detrending analysis (DMA-MFSD). Theoretically, time information is introduced by using the cyclic auto-correlation of sea clutter. For the transient correlation series, the instantaneous singularity spectrum is calculated with the multifractal detrending moving analysis (MF-DMA) algorithm, and the dynamic singularity spectrum distribution of sea clutter is obtained. In addition, we analyze the time-varying singularity exponent ranges and the maximum-position function in the DMA-MFSD of sea clutter. For real sea clutter data recorded in a level III sea state, the analysis shows that radar sea clutter is non-stationary, has time-varying scaling characteristics, and exhibits a time-varying singularity spectrum distribution under the proposed DMA-MFSD method. DMA-MFSD should also provide a reference for nonlinear dynamics and multifractal signal processing.

  9. A multivariate auto-regressive combined-harmonics analysis and its application to ozone time series data

    NASA Astrophysics Data System (ADS)

    Yang, Eun-Su

    2001-07-01

    A new statistical approach is used to analyze Dobson Umkehr layer-ozone measurements at Arosa for 1979-1996 and Total Ozone Mapping Spectrometer (TOMS) Version 7 zonal mean ozone for 1979-1993, accounting for stratospheric aerosol optical depth (SAOD), quasi-biennial oscillation (QBO), and solar flux effects. A stepwise regression scheme selects statistically significant periodicities caused by season, SAOD, QBO, and solar variations and filters them out. Auto-regressive (AR) terms are included in ozone residuals and time lags are assumed for the residuals of exogenous variables. Then, the magnitudes of responses of ozone to the SAOD, QBO, and solar index (SI) series are derived from the stationary time series of the residuals. These Multivariate Auto-Regressive Combined Harmonics (MARCH) processes possess the following significant advantages: (1) the ozone trends are estimated more precisely than with previous methods; (2) the influences of the exogenous SAOD, QBO, and solar variations are clearly separated at various time lags; (3) the collinearity of the exogenous variables in the regression is significantly reduced; and (4) the probability of obtaining misleading correlations between ozone and exogenous time series is reduced. The MARCH results indicate that the Umkehr ozone response to SAOD (not a real ozone response but rather an optical interference effect), QBO, and solar effects is driven by combined dynamical radiative-chemical processes. These results are independently confirmed using the revised Standard models that include aerosol and solar forcing mechanisms with all possible time lags but not by the Standard model when restricted to a zero time lag in aerosol and solar ozone forcings. As for Dobson Umkehr ozone measurements at Arosa, the aerosol effects are most significant in layers 8, 7, and 6 with no time lag, as is to be expected due to the optical contamination of Umkehr measurements by SAOD. The QBO and solar UV effects appear in all layers 4-8, and in total ozone. In order to account for annual modulation of the equatorial winds that affects ozone at midlatitudes, a new QBO proxy is selected and applied to the Dobson Umkehr measurements at Arosa. The QBO proxy turns out to be more effective for filtering the modulated ozone signals at midlatitudes than the most commonly used QBO proxy, the Singapore winds at 30 mb. A statistically significant negative phase relationship is found between solar UV variation and ozone response, especially in layer 4, implying dynamical effects of solar variations on ozone at midlatitudes. Linear negative trends in ozone of -7.8 +/- 1.1 and -5.2 +/- 1.4 [%/decade +/- 2σ] are calculated in layers 7 (~35 km) and 8 (~40 km), respectively, for the period of 1979-1996, with smaller trends of -2.2 +/- 1.0, 1.8 +/- 0.9, and -1.4 +/- 1.1 in layers 6 (~30 km), 5 (~25 km), and 4 (~20 km), respectively. A trend in total ozone (layers 1 through 10) of -2.9 +/- 1.2 [%/decade +/- 2σ] is found over this same period. The aerosol effects obtained from the TOMS zonal means become significant at midlatitudes. QBO ozone contributes to the TOMS zonal means by +/-2 to 4% of their means. The negative solar ozone responses are also found at midlatitudes from the TOMS measurements. The most negative trends from TOMS zonal means are about -6.3 +/- 0.6%/decade at 40-50°N.

  10. Application of Multi-Input Multi-Output Feedback Control for F-16 Ventral Fin Buffet Alleviation Using Piezoelectric Actuators

    DTIC Science & Technology

    2012-03-22

    Power Amplifier (7). A power amplifier was required to drive the actuators. For this research a Trek, Inc. Model PZD 700 Dual Channel Amplifier was used ... while the flight test amplifier was being built. The Trek amplifier was capable of amplifying ... [Figure 3.19: dSpace MicroAutoBox II Digital ...] ... averaging of 25% was used to reduce the errors caused by noise but still maintain accuracy. For the laboratory Trek amplifier, a 100 millivolt input ...

  11. Watershed regressions for pesticides (warp) models for predicting atrazine concentrations in Corn Belt streams

    USGS Publications Warehouse

    Stone, Wesley W.; Gilliom, Robert J.

    2012-01-01

    Watershed Regressions for Pesticides (WARP) models, previously developed for atrazine at the national scale, are improved for application to the United States (U.S.) Corn Belt region by developing region-specific models that include watershed characteristics that are influential in predicting atrazine concentration statistics within the Corn Belt. WARP models for the Corn Belt (WARP-CB) were developed for annual maximum moving-average (14-, 21-, 30-, 60-, and 90-day durations) and annual 95th-percentile atrazine concentrations in streams of the Corn Belt region. The WARP-CB models accounted for 53 to 62% of the variability in the various concentration statistics among the model-development sites. Model predictions were within a factor of 5 of the observed concentration statistic for over 90% of the model-development sites. The WARP-CB residuals and uncertainty are lower than those of the National WARP model for the same sites. Although atrazine-use intensity is the most important explanatory variable in the National WARP models, it is not a significant variable in the WARP-CB models. The WARP-CB models provide improved predictions for Corn Belt streams draining watersheds with atrazine-use intensities of 17 kg/km2 of watershed area or greater.

  12. Time series trends of the safety effects of pavement resurfacing.

    PubMed

    Park, Juneyoung; Abdel-Aty, Mohamed; Wang, Jung-Han

    2017-04-01

    This study evaluated the safety performance of pavement resurfacing projects on urban arterials in Florida using the observational before and after approaches. The safety effects of pavement resurfacing were quantified in the crash modification factors (CMFs) and estimated based on different ranges of heavy vehicle traffic volume and time changes for different severity levels. In order to evaluate the variation of CMFs over time, crash modification functions (CMFunctions) were developed using nonlinear regression and time series models. The results showed that pavement resurfacing projects decrease crash frequency and are found to be more safety effective to reduce severe crashes in general. Moreover, the results of the general relationship between the safety effects and time changes indicated that the CMFs increase over time after the resurfacing treatment. It was also found that pavement resurfacing projects for the urban roadways with higher heavy vehicle volume rate are more safety effective than the roadways with lower heavy vehicle volume rate. Based on the exploration and comparison of the developed CMFunctions, the seasonal autoregressive integrated moving average (SARIMA) and exponential functional form of the nonlinear regression models can be utilized to identify the trend of CMFs over time. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Analysis on the adaptive countermeasures to ecological management under changing environment in the Tarim River Basin, China

    NASA Astrophysics Data System (ADS)

    Yang, Fan; Xue, Lianqing; Zhang, Luochen; Chen, Xinfang; Chi, Yixia

    2017-12-01

    This article explores adaptive utilization strategies for the flow regime, as opposed to traditional practices, in the context of climate change and human activities in an arid area. The study presents a quantitative analysis of the contributions of climatic and anthropogenic factors to streamflow alteration in the Tarim River Basin (TRB) using the Budyko method, and develops adaptive utilization strategies for the eco-hydrological regime by comparing the applicability of an autoregressive moving average (ARMA) model and a combined regression model. Our results suggest that human activities played a dominant role in the streamflow reduction in the mainstream, with a contribution of 120.7%~190.1%, while in the headstreams climatic variables were the primary determinant, accounting for 56.5%~152.6% of the increase in streamflow. The comparison revealed that the combined regression model performed better than the ARMA model, with a qualified rate of 80.49%~90.24%. Based on the streamflow forecasts for different purposes, an adaptive utilization scheme for water flow is established from the perspectives of time and space. Our study presents an effective water resources scheduling scheme for the ecological environment and provides a reference for ecological protection and water allocation in the arid area.

  14. Sequential monitoring and stability of ex vivo-expanded autologous and non-autologous regulatory T cells following infusion in non-human primates

    PubMed Central

    Zhang, H.; Guo, H.; Lu, L.; Zahorchak, A. F.; Wiseman, R. W.; Raimondi, G.; Cooper, D. K. C.; Ezzelarab, M. B.; Thomson, A. W.

    2016-01-01

    Ex vivo-expanded cynomolgus monkey CD4+CD25+CD127− regulatory T cells (Treg) maintained Foxp3 demethylation status at the Treg-Specific Demethylation Region (TSDR), and potently suppressed T cell proliferation through 3 rounds of expansion. When CFSE- or VPD450-labeled autologous (auto) and non-autologous (non-auto) expanded Treg were infused into monkeys, the number of labeled auto-Treg in peripheral blood declined rapidly during the first week, but persisted at low levels in both normal and anti-thymocyte globulin plus rapamycin-treated (immunosuppressed; IS) animals for at least 3 weeks. By contrast, MHC-mismatched non-auto-Treg could not be detected in normal monkey blood or in blood of two out of the three IS monkeys by day 6 post-infusion. They were also more difficult to detect than auto-Treg in peripheral lymphoid tissue. Both auto- and non-auto-Treg maintained Ki67 expression early after infusion. Sequential monitoring revealed that adoptively-transferred auto-Treg maintained similarly high levels of Foxp3 and CD25 and low CD127 compared with endogenous Treg, although Foxp3 staining diminished over time in these non-transplanted recipients. Thus, infused ex vivo-expanded auto-Treg persist longer than MHC-mismatched non-auto-Treg in blood of non-human primates and can be detected in secondary lymphoid tissue. Host lymphodepletion and rapamycin administration did not consistently prolong the persistence of non-auto-Treg in these sites. PMID:25783759

  15. Economic evaluation of epinephrine auto-injectors for peanut allergy.

    PubMed

    Shaker, Marcus; Bean, Katherine; Verdi, Marylee

    2017-08-01

    Three commercial epinephrine auto-injectors were available in the United States in the summer of 2016: EpiPen, Adrenaclick, and epinephrine injection, USP auto-injector. To describe the variation in pharmacy costs among epinephrine auto-injector devices in New England and evaluate the additional expense associated with incremental auto-injector costs. Decision analysis software was used to evaluate costs of the most and least expensive epinephrine auto-injector devices for children with peanut allergy. To evaluate regional variation in epinephrine auto-injector costs, a random sample of New England national and corporate pharmacies was compared with a convenience sample of pharmacies from 10 Canadian provinces. Assuming prescriptions written for 2 double epinephrine packs each year (home and school), the mean costs of food allergy over the 20-year model horizon totaled $58,667 (95% confidence interval [CI] $57,745-$59,588) when EpiPen was prescribed and $45,588 (95% CI $44,873-$46,304) when epinephrine injection, USP auto-injector was prescribed. No effectiveness differences were evident between groups, with 17.19 (95% CI 17.11-17.27) quality-adjusted life years accruing for each subject. The incremental cost per episode of anaphylaxis treated with epinephrine over the model horizon was $12,576 for EpiPen vs epinephrine injection, USP auto-injector. EpiPen costs were lowest at Canadian pharmacies ($96, 95% CI $85-$107). There was price consistency between corporate and independent pharmacies throughout New England by device brand, with the epinephrine injection, USP auto-injector being the most affordable device. Cost differences among epinephrine auto-injectors were significant. More expensive auto-injector brands did not appear to provide incremental benefit. Copyright © 2017 American College of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.

  16. Unintentional Epinephrine Auto-injector Injuries: A National Poison Center Observational Study.

    PubMed

    Anshien, Marco; Rose, S Rutherfoord; Wills, Brandon K

    2016-11-24

    Epinephrine is the only first-line therapeutic agent used to treat life-threatening anaphylaxis. Epinephrine auto-injectors are commonly carried by patients at risk for anaphylaxis, and reported cases of unintentional auto-injector injury have increased over the last decade. Modifications of existing designs and release of a new style of auto-injector are intended to reduce epinephrine auto-injector misuse. The aim of the study was to characterize reported cases of unintentional epinephrine auto-injector exposures from 2013 to 2014 and compare demographics, auto-injector model, and anatomical site of such exposures. The American Association of Poison Control Center's National Poison Data System was searched from January 1, 2013, to December 31, 2014, for cases of unintentional epinephrine auto-injector exposures. Anatomical site data were obtained from all cases reported to the Virginia Poison Center and participating regional poison center for Auvi-Q cases. A total of 6806 cases of unintentional epinephrine auto-injector exposures were reported to US Poison Centers in 2013 and 2014. Of these cases, 3933 occurred with EpiPen, 2829 with EpiPen Jr, 44 with Auvi-Q, and no case reported of Adrenaclick. The most common site of unintentional injection for traditional epinephrine auto-injectors was the digit or thumb, with 58% of cases for EpiPen and 39% of cases with EpiPen Jr. With Auvi-Q, the most common site was the leg (78% of cases). The number of unintentional epinephrine auto-injector cases reported to American Poison Centers in 2013-2014 has increased compared with previous data. Most EpiPen exposures were in the digits, whereas Auvi-Q was most frequently in the leg. Because of the limitations of Poison Center data, more research is needed to identify incidence of unintentional exposures and the effectiveness of epinephrine auto-injector redesign.

  17. Stochastic-Constraints Method in Nonstationary Hot-Clutter Cancellation Part I: Fundamentals and Supervised Training Applications

    DTIC Science & Technology

    2003-04-01

    ...any of the P interfering sources, and H_{kt} is defined below. The P-variate vector \xi_{kt} = [\xi_{kt}^{(1)}, ..., \xi_{kt}^{(P)}]^{T} consists of complex waveforms radiated by... More precisely, the (i, j)th element of the matrix H_{kt} is a complex coefficient which is practically constant over the kth PRI, and is a... multivariate auto-regressive (AR) model of order n:

      Y_{kt} + \sum_{j=1}^{n} B_j Y_{k-j,t} = \xi_{kt}    (25)

    In the above equation, the B_j are the M-variate matrices which are the...
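
    The excerpt above describes an M-variate auto-regressive model of order n driven by an innovation process. As a rough, generic illustration of fitting such a multivariate AR model (not the paper's stochastic-constraints method), the sketch below uses Python's statsmodels VAR class on simulated data; the dimensions, order, and coefficient matrix are placeholders.

      import numpy as np
      from statsmodels.tsa.api import VAR

      rng = np.random.default_rng(0)
      T, M = 500, 3                        # samples and number of channels (placeholders)
      A = np.array([[0.5, 0.1, 0.0],
                    [0.0, 0.4, 0.2],
                    [0.1, 0.0, 0.3]])      # toy VAR(1) coefficient matrix
      y = np.zeros((T, M))
      for t in range(1, T):
          y[t] = A @ y[t - 1] + rng.standard_normal(M)

      fit = VAR(y).fit(maxlags=4, ic='aic')  # AR order selected by AIC
      print(fit.k_ar)                        # estimated order n
      print(fit.coefs[0])                    # estimated B_1, comparable to A
      residuals = fit.resid                  # estimate of the innovation process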

  18. [Exploration of factors influencing herbal medicine prices based on a VAR model].

    PubMed

    Wang, Nuo; Liu, Shu-Zhen; Yang, Guang

    2014-10-01

    Based on a vector auto-regression (VAR) model, this paper uses Granger causality tests, variance decomposition, and impulse response analysis to carry out a comprehensive study of the factors influencing the prices of Chinese herbal medicines, including cultivation costs, acreage, natural disasters, residents' needs, and inflation. The study found Granger-causal relationships between inflation and herbal prices and between cultivation costs and herbal prices. In the variance decomposition of the Chinese herbal medicine price index, the largest contribution comes from the index's own fluctuations, followed by cultivation costs and inflation.
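
    As a generic sketch of the workflow described above (VAR estimation, Granger causality testing, impulse response analysis, and forecast-error variance decomposition), the code below uses Python's statsmodels on placeholder series; the column names stand in for the paper's variables and the simulated data are illustrative only.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.api import VAR

      rng = np.random.default_rng(1)
      raw = rng.standard_normal((200, 3)).cumsum(axis=0)           # toy integrated series
      df = pd.DataFrame(raw, columns=['herb_price', 'cultivation_cost', 'inflation'])
      df = df.diff().dropna()                                      # difference to stationarity

      fit = VAR(df).fit(maxlags=6, ic='aic')

      # Granger causality: does inflation help predict herb_price?
      print(fit.test_causality('herb_price', ['inflation'], kind='f').summary())

      irf = fit.irf(12)       # impulse responses over 12 periods
      fevd = fit.fevd(12)     # forecast-error variance decomposition
      print(fevd.summary())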

  19. A hybrid model for PM₂.₅ forecasting based on ensemble empirical mode decomposition and a general regression neural network.

    PubMed

    Zhou, Qingping; Jiang, Haiyan; Wang, Jianzhou; Zhou, Jianling

    2014-10-15

    Exposure to high concentrations of fine particulate matter (PM₂.₅) can cause serious health problems because PM₂.₅ contains microscopic solid particles or liquid droplets that are small enough to be inhaled deep into human lungs. Thus, daily prediction of PM₂.₅ levels is notably important for regulatory plans that inform the public and restrict social activities in advance when harmful episodes are foreseen. A hybrid EEMD-GRNN (ensemble empirical mode decomposition-general regression neural network) model based on data preprocessing and analysis is proposed for the first time in this paper for one-day-ahead prediction of PM₂.₅ concentrations. The EEMD part is used to decompose the original PM₂.₅ data into several intrinsic mode functions (IMFs), while the GRNN part is used to predict each IMF. The hybrid EEMD-GRNN model is trained using input variables obtained from a principal component regression (PCR) model to remove redundancy. These input variables accurately and succinctly reflect the relationships between PM₂.₅ and both air quality and meteorological data. The model is trained with data from January 1 to November 1, 2013, and validated with data from November 2 to November 21, 2013, for Xi'an, China. The experimental results show that the developed hybrid EEMD-GRNN model outperforms a single GRNN model without EEMD, a multiple linear regression (MLR) model, a PCR model, and a traditional autoregressive integrated moving average (ARIMA) model. The hybrid model, with its fast and accurate results, can be used to develop rapid air quality warning systems. Copyright © 2014 Elsevier B.V. All rights reserved.
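
    A minimal sketch of the hybrid idea described above, not the authors' exact pipeline: decompose the series with EEMD, predict each intrinsic mode function one step ahead with a bare GRNN-style kernel regressor (Nadaraya-Watson), and sum the per-IMF predictions. It assumes the third-party PyEMD package and a synthetic series; the lag length and kernel bandwidth are arbitrary placeholders, and the PCR input-selection step is omitted.

      import numpy as np
      from PyEMD import EEMD   # third-party package, pip install EMD-signal

      def grnn_predict(X_train, y_train, x_new, sigma):
          # Nadaraya-Watson kernel regression, the core operation of a GRNN.
          d2 = np.sum((X_train - x_new) ** 2, axis=1)
          w = np.exp(-d2 / (2.0 * sigma ** 2))
          return float(np.dot(w, y_train) / (np.sum(w) + 1e-12))

      def lagged(series, p):
          # Build (inputs, target) pairs from sliding windows of length p.
          X = np.array([series[i:i + p] for i in range(len(series) - p)])
          return X, series[p:]

      rng = np.random.default_rng(2)
      t = np.arange(400)
      pm25 = 50 + 10 * np.sin(2 * np.pi * t / 30) + 5 * rng.standard_normal(400)  # toy series

      imfs = EEMD().eemd(pm25)       # intrinsic mode functions (plus residue)
      p = 6                          # number of lags fed to the regressor
      forecast = 0.0
      for imf in imfs:               # forecast each IMF separately, then sum
          X, y = lagged(imf[:-1], p)
          forecast += grnn_predict(X, y, imf[-1 - p:-1], sigma=np.std(imf) + 1e-6)
      print('one-step-ahead forecast:', forecast, 'actual:', pm25[-1])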

  20. Periodicity analysis of tourist arrivals to Banda Aceh using smoothing SARIMA approach

    NASA Astrophysics Data System (ADS)

    Miftahuddin, Helida, Desri; Sofyan, Hizir

    2017-11-01

    Forecasting the number of tourists arriving in a region is needed for tourism businesses and for economic and industrial policy, so statistical modeling of arrivals needs to be conducted. Banda Aceh is the capital of Aceh province, where economic activity is driven largely by the services sector, one part of which is tourism. Therefore, prediction of the number of tourist arrivals is needed to develop further policies. The identification results indicate that the foreign tourist arrival data for Banda Aceh contain both trend and seasonal components. The number of arrivals is presumably also influenced by external factors, such as economics, politics, and the holiday season, which cause structural breaks in the data. Trend patterns are detected using polynomial regression with quadratic and cubic terms, while seasonal patterns are detected using periodic polynomial regression with quadratic and cubic terms. To model data with seasonal effects, one of the statistical methods that can be used is SARIMA (Seasonal Autoregressive Integrated Moving Average). The results show that, for smoothing, the preferred method for detecting the trend pattern is the cubic polynomial regression approach with the modified model and a multiplicative periodicity of 12 months, with an AIC of 70.52; the preferred method for detecting the seasonal pattern is the cubic periodic polynomial regression approach, also with the modified model and a multiplicative periodicity of 12 months, with an AIC of 73.37. Furthermore, the best model for predicting the number of foreign tourist arrivals to Banda Aceh in 2017-2018 is SARIMA(0,1,1)(1,1,0), with a MAPE of 26%.
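
    A minimal sketch of fitting the reported SARIMA(0,1,1)(1,1,0) specification with a 12-month seasonal period, using Python's statsmodels on a synthetic monthly arrivals series rather than the Banda Aceh data; the series, forecast horizon, and MAPE calculation are illustrative placeholders.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(3)
      idx = pd.date_range('2010-01-01', periods=84, freq='MS')        # 7 years of months
      arrivals = pd.Series(1000 + 5 * np.arange(84)
                           + 200 * np.sin(2 * np.pi * idx.month / 12)
                           + rng.normal(0, 50, 84), index=idx)

      fit = SARIMAX(arrivals, order=(0, 1, 1), seasonal_order=(1, 1, 0, 12)).fit(disp=False)
      print('AIC:', fit.aic)

      forecast = fit.forecast(steps=24)                               # next two years
      mape = (fit.resid[13:].abs() / arrivals[13:]).mean() * 100      # rough in-sample MAPE
      print(forecast.head())
      print('MAPE (%):', mape)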
